Description
An experiment or study is reproducible or replicable when subsequent experiments confirm the results. This is re-search. However, we can define different types of reproducibility depending on the conditions used to replicate the previous work or on the information available. Our aim is to focus mostly on two kinds [1]:
1. Direct: obtaining the same results using the same experimental conditions, materials, and methods as described in the original experiment. This would be the ideal reproducibility of an experiment; however, it requires a very accurate description of how the original experiment was performed. Some journals are trying to resolve the reproducibility crisis by improving the rigor and excellence of the reported methods and results (e.g. STAR Methods in Cell Press).
2. Systematic: obtaining the same results, but under different conditions; for example, using another cell line or mouse strain, or inhibiting a gene pharmacologically instead of genetically. This opens the door for subsequent studies to find the conditions under which an initial finding holds.
References:
- 1. Lazic SE (2016) Experimental design for laboratory biologists: maximizing information and improving reproducibility. Cambridge University Press.
- 2. Baker M (2016) 1,500 scientists lift the lid on reproducibility. Survey sheds light on the ‘crisis’ rocking research. Nature 533:452-4.
MitoPedia O2k and high-resolution respirometry:
Oroboros QM
MitoPedia topics:
Gentle Science
- But is there a reproducibility crisis? According to a survey of 1,576 researchers conducted by Nature [2], "52 % agree that there is a significant 'crisis' of reproducibility, less than 31 % think that failure to reproduce published results means that the result is probably wrong, and most say that they still trust the published literature" (Baker 2016). Chemistry and biology are the fields with the highest share of failed attempts to reproduce published results.
- When researchers were asked about the causes of this inability to reproduce published results, the top three answers were:
- Selective reporting
- Publication pressure
- Low statistical power and poor analysis (see the power simulation below)
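Low statistical power is easy to underestimate. As a minimal sketch (in Python, with illustrative numbers that are not taken from the survey), the simulation below estimates the power of a two-sample t-test for a medium effect size at a small sample size:

```python
# Minimal simulation of statistical power for a two-sample t-test.
# Effect size and sample size are illustrative assumptions, not survey data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
effect_size = 0.5     # assumed true group difference, in standard deviations
n_per_group = 10      # a typical small laboratory sample size
n_sim = 10_000        # number of simulated experiments
alpha = 0.05

significant = 0
for _ in range(n_sim):
    control = rng.normal(0.0, 1.0, n_per_group)
    treated = rng.normal(effect_size, 1.0, n_per_group)
    _, p = stats.ttest_ind(control, treated)
    significant += p < alpha

print(f"Estimated power: {significant / n_sim:.2f}")  # roughly 0.18 at n = 10
```

Under these assumptions, only about one in five such experiments detects a true effect; raising the sample size to about 64 per group brings the power to the conventional 0.8, which is why underpowered designs so often yield irreproducible findings.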
- The top three countermeasures mentioned were:
- Better understanding of statistics
- Better mentoring and supervision
- More robust design (see the allocation sketch below)
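As one concrete element of more robust design, the hypothetical sketch below randomizes sample-to-group allocation and assigns blinded codes so that group identity stays hidden during measurement (sample IDs and group labels are illustrative assumptions):

```python
# Minimal sketch: randomized, blinded allocation of samples to two groups.
# Sample IDs and group labels are illustrative assumptions.
import random

random.seed(7)  # fixed seed so the allocation itself can be reproduced
samples = [f"S{i:02d}" for i in range(1, 13)]           # twelve samples
groups = ["control", "treatment"] * (len(samples) // 2)
random.shuffle(groups)                                   # random assignment

# The experimenter sees only the blind code; this key is kept separately
# and consulted only after the measurements are complete.
key = {f"B{i:02d}": (sample, group)
       for i, (sample, group) in enumerate(zip(samples, groups), start=1)}
for code, (sample, group) in key.items():
    print(code, sample, group)
```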
Solve the reproducibility crisis
- While it is probably impossible to fully prevent human self-deception and inadequate command of statistical methods, we can minimize sources of error connected to the instrumental equipment and its handling:
- Select instrumental equipment for which appropriate specifications are available.
- Get trained on your equipment and make sure you know what both you and the device you operate are doing in each step of your experiment.
- Avoid black-box performance of software.
- The same applies to data analysis: get trained on the analysis software. Ideally, use the software that comes with your instrument, in order to minimize errors during data transfer and translation (see the provenance sketch below).
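A small habit that helps against black-box analysis is to store software versions and analysis parameters alongside each result, so the computation can be repeated later. The sketch below writes such a provenance record as JSON; the file name, parameters, and result fields are all hypothetical:

```python
# Minimal provenance sketch: record versions and parameters with the result.
# All field names and values here are hypothetical placeholders.
import json
import platform
import sys

import numpy as np

record = {
    "python": sys.version,
    "platform": platform.platform(),
    "numpy": np.__version__,
    "parameters": {"smoothing_window": 5, "baseline_correction": True},
    "results": {"mean_flux": 87.3},  # placeholder value
}

with open("analysis_record.json", "w") as f:
    json.dump(record, f, indent=2)
```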
- An Open Access policy fosters the establishment of an error culture and a culture of transparency in science. In this way, Open Access - as manifested in the Bioblast website (see Gentle Science) - contributes to solving the reproducibility crisis.
- Methods: Identify the methods, apparatus (manufacturer's name and address in parentheses), and procedures in sufficient detail to allow other workers to reproduce the results. Give references to established methods. - Quoted from the International Committee of Medical Journal Editors.
Further links
- Gnaiger E (2019) Editorial: A vision on preprints for mitochondrial physiology and bioenergetics. MitoFit Preprint Arch doi:10.26124/mitofit:190002.v2. - »MitoFit Preprints«
- Reproducibility: let’s get it right from the start. Nature Communications 9: Article number 3716 (2018)
- Validation by THE SCIENCE EXCHANGE NETWORK
- » Validating key experimental results via independent replication
- » Reproducibility Initiative
- How can we increase reproducibility in neuroscience studies? BioTechniques 2018.
- Nelson NC, Ichikawa K, Chung J, Malik MM (2021) Mapping the discursive dimensions of the reproducibility crisis: A mixed methods analysis. PLoS ONE 16:e0254090. https://doi.org/10.1371/journal.pone.0254090
References
Bioblast link | Reference | Year |
---|---|---|
Begley 2015 Circ Res | Begley CG, Ioannidis JPA (2015) Reproducibility in science: improving the standard for basic and preclinical research. Circ Res 116:116-26. https://doi.org/10.1161/CIRCRESAHA.114.303819 | 2015 |
Chiu 2017 PLOS Biol | Chiu K, Grundy Q, Bero L (2017) 'Spin' in published biomedical literature: a methodological systematic review. PLoS Biol 15:e2002173. | 2017 |
Gnaiger 2019 MitoFit Preprints Editorial | Gnaiger E (2019) Editorial: A vision on preprints for mitochondrial physiology and bioenergetics. MitoFit Preprint Arch. https://doi.org/10.26124/mitofit:190002.v2 | 2019 |
BEC 2020.1 doi10.26124bec2020-0001.v1 | Gnaiger E et al ― MitoEAGLE Task Group (2020) Mitochondrial physiology. Bioenerg Commun 2020.1. https://doi.org/10.26124/bec:2020-0001.v1 | 2020 |
Ioannidis 2005 PLoS Med | Ioannidis JPA (2005) Why most published research findings are false. PLoS Med 2:e124. | 2005 |
Ioannidis 2014 Lancet | Ioannidis JPA, Greenland S, Hlatky MA, Khoury MJ, Macleod MR, Moher D, Schulz KF, Tibshirani R (2014) Increasing value and reducing waste in research design, conduct, and analysis. Lancet 383:166-75. https://doi.org/10.1016/S0140-6736(13)62227-8 | 2014 |
Kahneman 2011 Penguin Books | Kahneman D (2011) Thinking, fast and slow. Penguin Books 499 pp. | 2011 |
Stodden 2018 Proc Natl Acad Sci U S A | Stodden V, Seiler J, Ma Z (2018) An empirical analysis of journal policy effectiveness for computational reproducibility. Proc Natl Acad Sci U S A 115:2584-9. | 2018 |
Triggle 2017 Drug Dev Res | Triggle CR, Triggle DJ (2017) From Gutenberg to Open Science: an unfulfilled odyssey. Drug Dev Res 78:3-23. | 2017 |