Research
The hardness of the soft sciences.
For nearly two decades I've measured what we politely don't measure: bias, misconduct, and the uneven epistemic ground between disciplines.
Overview
The work has produced some of the largest empirical surveys of questionable research practices, and a series of papers mapping a hierarchy of the sciences along measurable dimensions of the published literature.
More recently I've been developing K-theory — an approach that treats scientific knowledge as a problem in information compression. The first empirical test, published in 2025, predicted the reproducibility outcomes of the Brazilian Reproducibility Initiative.
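The compression metric itself is defined in the K-theory papers listed under Publications. As a loose illustration of the underlying intuition — that texts sharing information compress well together — here is a minimal sketch using the standard normalized compression distance (NCD) with Python's zlib. The toy "findings" strings and the choice of NCD are illustrative assumptions, not the published metric.

```python
import zlib

def compressed_size(data: bytes) -> int:
    """Length of data after zlib compression: a crude proxy for information content."""
    return len(zlib.compress(data, 9))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for redundant texts, near 1 for unrelated ones."""
    cx, cy, cxy = compressed_size(x), compressed_size(y), compressed_size(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Hypothetical corpora: two redundant descriptions of a result vs. an unrelated one.
a = b"drug X lowers systolic blood pressure in adults " * 30
b = b"drug X lowers systolic blood pressure in adults " * 30
c = b"galaxy rotation curves imply unseen dark matter halos " * 30

# Redundant texts compress jointly almost for free, so their distance is smaller.
assert ncd(a, b) < ncd(a, c)
```

The design intuition carried over from the papers is only this: a body of claims that can be summarized compactly (high joint compressibility) encodes more consolidated knowledge than one that cannot.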
The lab is the Theoretical and Empirical METaknowledge (TEMET) lab at Heriot-Watt, with a parallel fellowship at the LSE Department of Methodology.
The TEMET lab
I lead the Theoretical and Empirical METaknowledge lab at Heriot-Watt — a research group dedicated to studying the structure of knowledge using the tools of metascience, statistics, and information theory. The lab brings together work on bias, misconduct, the hierarchy of the sciences, and K-theory under one roof.
Publications
Meta-research
- Fanelli, D., Voodla, A., Andres, S. & Uusberg, A. (2026) Questionable Research Practices: A Principled Classification and Ranking Based on Survey Data. Science and Engineering Ethics
- Fanelli, D. (2025) Behind the scenes of scientific fraud. Science
- Fanelli, D., Tan, P. B., Amaral, O. B. & Neves, K. (2025) A metric of knowledge as information compression reflects reproducibility predictions for biomedical experiments. Royal Society Open Science
- Bartoš, F., Maier, M., Wagenmakers, E.-J., et al., Fanelli, D., Stanley, T. D. (2024) Footprint of Publication Selection Bias on Meta-Analyses in Medicine, Environmental Sciences, Psychology, and Economics. Research Synthesis Methods
- Martineau, S., Cristea, I. A., Chevance, A., Fanelli, D., Naudet, F. (2023) Are large prospective trials on antidepressants in mental disorders seeding trials? BMJ Open
- Bakker, C., Boughton, S., Faggion, C. M., Fanelli, D., Kaiser, K. A., Schneider, J. (2023) Reducing the residue of retractions in evidence synthesis: Ways to minimize inappropriate citation and use of retracted data. BMJ Evidence-Based Medicine
- Fanelli, D. (2022) The "tau" of science: how to measure, study, and integrate quantitative and qualitative knowledge. MetaArXiv
- Fanelli, D., Tan, P. B., Amaral, O. B., Neves, K. (2022) A metric of knowledge as information compression reflects reproducibility predictions in biomedical experiments. MetaArXiv
- Fanelli, D., Schleicher, M., Fang, F. C., Casadevall, A., Bik, E. M. (2021) Are individual and institutional predictors of misconduct modulated by national publication incentives policy? Results of a matched-control analysis of problematic image duplications. PLOS ONE
- Fanelli, D. & Moher, D. (2019) What difference might retractions make? An estimate of the epistemic impact of retractions on recent meta-analyses. bioRxiv
- Fanelli, D. (2019) A theory and methodology to quantify knowledge. Royal Society Open Science
- Tatsioni, A., Karassa, F. B., Goodman, S. N., Zarin, D. A., Fanelli, D., Ioannidis, J. P. A. (2019) Lost Evidence From Registered Large Long-Unpublished Randomized Controlled Trials: A Survey. Annals of Internal Medicine
- Fanelli, D. (2018) Is science really facing a reproducibility crisis, and do we need it to? PNAS
- Fanelli, D., Costas, R., Fang, F. C., Casadevall, A., Bik, E. M. (2018) Testing hypotheses on risk factors for scientific misconduct via matched-control analysis of papers containing problematic image duplications. Science and Engineering Ethics
- Fanelli, D., Ioannidis, J. P. A., Goodman, S. N. (2018) Improving the integrity of published science: an expanded taxonomy of retractions and corrections. European Journal of Clinical Investigation
- Naudet, F., Sakarovitch, C., Janiaud, P., Cristea, I., Fanelli, D., Moher, D., Ioannidis, J. P. A. (2018) Data sharing and reanalysis of randomized controlled trials in leading biomedical journals with a full data sharing policy. BMJ
- Fanelli, D., Costas, R., Fang, F. C., Casadevall, A., Bik, E. M. (2017) Why do scientists fabricate and falsify data? A matched-control analysis of papers containing problematic image duplications. bioRxiv
- Fanelli, D., Costas, R., Ioannidis, J. P. A. (2017) Meta-assessment of bias in science. PNAS
- Hosseini, M., Hilhorst, M., de Beaufort, I., Fanelli, D. (2017) Doing the Right Thing: A Qualitative Investigation of Retractions Due to Unintentional Error. Science and Engineering Ethics
- Goodman, S., Fanelli, D., Ioannidis, J. P. A. (2016) What does reproducibility mean? Science Translational Medicine
- Fanelli, D. (2016) Set up a 'self-retraction' system for honest errors. Nature
- Fanelli, D. & Larivière, V. (2016) Scientists' individual publication rate has not increased in a century. PLOS ONE
- McCrary, J., Christensen, G. & Fanelli, D. (2016) Conservative Tests under Satisficing Models of Publication Bias. PLOS ONE
- Ioannidis, J. P. A., Fanelli, D., Drake Dunne, D., Goodman, S. N. (2015) Meta-research: Evaluation and improvement of research methods and practices. PLOS Biology
- Fanelli, D., Costas, R. & Larivière, V. (2015) Misconduct Policies, Academic Culture and Career Stage, Not Gender or Pressures to Publish, Affect Scientific Integrity. PLOS ONE
- Fanelli, D. (2015) We need more research on causes and consequences, as well as on solutions. Addiction
- Pupovac, V. & Fanelli, D. (2015) Scientists admitting to plagiarism: a meta-analysis of surveys. Science and Engineering Ethics
- Yu, B. & Fanelli, D. (2014) Classifying Negative Findings in Biomedical Publications. Proceedings of BioNLP 2014, Workshop on Biomedical Natural Language Processing
- Fanelli, D. (2014) Publishing: rise in retractions is a signal of integrity. Nature
- Fanelli, D. & Ioannidis, J. P. A. (2014) Re-analyses actually confirm that US studies may overestimate effect sizes in softer research. PNAS
- Fanelli, D. (2013) Why growing retractions are (mostly) a good sign. PLOS Medicine
- Fanelli, D. & Ioannidis, J. P. A. (2013) US studies may overestimate effect sizes in softer research. PNAS
- Fanelli, D. & Glänzel, W. (2013) Bibliometric evidence for a Hierarchy of the Sciences. PLOS ONE
- Fanelli, D. (2013) Redefine misconduct as distorted reporting. Nature
- Fanelli, D. (2013) Positive results receive more citations, but only in some disciplines. Scientometrics
- Fanelli, D. (2012) Any publicity is better than none: newspaper coverage increases citations, in the UK more than in Italy. Scientometrics
- Fanelli, D. (2012) Negative results are disappearing from most disciplines and countries. Scientometrics
- Fanelli, D. (2010) Do pressures to publish increase scientists' bias? An empirical support from US states data. PLOS ONE
- Fanelli, D. (2010) "Positive" results increase down the Hierarchy of the Sciences. PLOS ONE
- Fanelli, D. (2009) How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data. PLOS ONE
Book chapters & reports
- Fanelli, D. (2022) Is the narrative of a crisis empirically supported? In: Jussim L., Krosnik J. eds. Research Integrity in the Behavioural Sciences. Oxford University Press.
- Fanelli, D. (2020) Institutional pressures to publish: what effects do we see? In: Mario Biagioli & Alexandra Lippman eds. Gaming the Metrics. MIT Press.
- EU Mutual Learning Exercise on Research Integrity (2019). Author of Reports 2 (Incentives) and 4 (Training and Education). Prepared for the European Commission, Directorate-General for Research and Innovation.
- Linee guida del CNR per l'integrità nella ricerca (CNR guidelines for research integrity) (2015). Co-authored with Cinzia Caporale, with input from the CNR Ethics Committee.
- Fanelli, D. (2011) The black, the white and the grey areas — towards an international and interdisciplinary definition of scientific misconduct. In: Nick Steneck & Tony Meyer eds. Promoting Research Integrity on a Global Basis. World Scientific Press.
- Honesty, Accountability and Trust: Fostering Research Integrity in Canada (2010). Co-authored with the Expert Panel on Research Integrity, Council of Canadian Academies.
Projects
Current
- Increasing reproducibility through co-creation of interventions that support a transparent and trustworthy research ecosystem. We're deploying comCensus to involve researchers directly in the co-creation and evaluation of reproducibility policies.
- Beyond Bad Apples: Towards a Behavioral and Evidence-Based Approach to Promote Research Ethics and Research Integrity in Europe. Using K-theory in WP3 to develop a more nuanced approach to QRPs.
- Improving Reproducibility in SciencE.
- Predicting reproducibility via a metric of information compression — collaboration with the Brazilian Reproducibility Initiative. With Wesley Bonifay, University of Missouri.
- A novel theory and methodology to explain and study consciousness through the lens of K-theory.
Concluded
- Innovating retractions to reward self-correction. A METRICS-sponsored project on revolutionising literature-amendment policies.
- A mathematical theory of knowledge, bias, science, and pseudoscience. A personal attempt to unify metascience, cognitive science, and philosophy.
- BayesCAMP: Bayesian Corrections Against Misuses of P-values. With Steven Goodman (Stanford) and Don van Ravenzwaaij (Groningen).
- Why do scientists fabricate and falsify data? A matched-control analysis with Elisabeth Bik, Ferric Fang, Arturo Casadevall, and Rodrigo Costas.
- From countries to individuals (2013–2016) — multilevel meta-meta-analysis with John P. A. Ioannidis (Stanford). NIH/Office of Research Integrity, $165k/year.
- Historical trends of scientists' "productivity" — with Vincent Larivière (Université de Montréal).
- The integrity of self-retracting scientists — qualitative study with Medard Hilhorst (Erasmus MC).
- No study's perfect — cross-disciplinary analysis of published errata. Partly funded by COPE.
- Bias, misconduct and the Hierarchy of the Sciences (2010–2013) — Leverhulme Early-Career Fellowship, ~£78k.
- Quantifying objectivity in the natural and social sciences (2008–2010) — Marie Curie Intra-European Fellowship, ~€160k.
Talks & keynotes
Conferences & public talks (2008–present)
- Scientific Freedom and Responsibility: Challenges and Opportunities, Santiago de Chile, CL, 2025. Invited keynote — Cautionary tales from Metascience.
- SFI Summit 2024, Cork, IE. Keynote — Research integrity: it's complex, and complicated.
- WCRI 2024. Opening session keynote — first pre-registered evidence that K-theory predicts irreproducibility on the BRI data.
- SIPS 2024, Kenya. Talk — How complexity affects psychology research.
- Science Foundation Ireland Summit 2023, Cork, IE. Keynote — Research integrity: it's complex, and complicated.
- 8th European Conference on Academic Integrity and Plagiarism, Porto, PT, 2022. Keynote — Research integrity in a complex world.
- ENRIO 2021 conference, online. Keynote — What challenges lie ahead for research integrity officers?
- REWARD-EQUATOR conference, Berlin, DE, 2020. Keynote — How explaining knowledge helps improve research.
- Meta-Science Symposium 2019, Stanford University, USA. Invited talk — Low reproducibility as divergent information: A K-theory analysis of reproducibility studies.
- Fixing Science: Practical Solutions for the Irreproducibility Crisis, National Association of Scholars, Oakland, CA, USA. Invited plenary — Reproducibility reforms if there is no irreproducibility crisis.
- Fifth AgreenSkills meeting, Brussels, BE, 2019. Keynote — Is science facing a crisis or an opportunity?
- 18th Congress of the International Union for the Study of Social Insects, Guarujá, BR, 2018. Invited talk — Taking the pulse of social insects research.
- 9th International Conference on Complex Systems, Boston, MA, USA, 2018. Talk — Towards a meta-theory of scientific knowledge.
- Research integrity and Data Management, University of Copenhagen, DK, 2017. Invited talk — How and where can data management policies improve reproducibility and integrity?
- ASHG Conference, Orlando, FL, USA, 2017. Invited talk — Making genetics research more reproducible.
- Peer Review Congress, Chicago, IL, USA, 2017. Plenary — Summary effect sizes in meta-analyses after removal of retracted studies.
- 5th World Conference on Research Integrity, Amsterdam, NL, 2017. Keynote — Conceptual challenges concerning re-analysis and replication practices in reproducible research.
- 5th World Conference on Research Integrity, Amsterdam, NL, 2017. Invited talk — How to help scientists own their mistakes.
- National Academy of Sciences Sackler Colloquium: Reproducibility of Research, Washington, D.C., 2017. Invited panellist.
- International conference "Researching with Integrity", Tartu, EE, 2017. Keynote — Benefits and challenges of defining scientific misconduct.
- IV Brazilian Meeting on Research Integrity, Science and Publication Ethics (BRISPE), Goiânia, BR, 2016. Keynote — The mystery of missing negative results.
- Computation+Journalism Symposium, Stanford, USA, 2016. Invited panellist on Reproducible Journalism.
- MAER-Net Colloquium 2016, Little Rock, AR, USA. Talk — Understanding bias via cross-disciplinary multi-level meta-meta-regression.
- ESOF 2016 conference, Manchester, UK. Invited speaker — Going viral: social media and the practice of science in society.
- NRIN Research Conference 2016, Amsterdam, NL. Keynote — Is Dutch science at greater risk?
- Gaming Metrics 2016, UC Davis, CA, USA. Invited talk — Institutional pressures to publish: what effects do we see?
- The 2015 Southampton Conference on the Credibility of Research. Invited talk — An overall empirical perspective of the scientific crisis.
- 15th ISSI Conference, Istanbul, TR, 2015. Talk — Are scientists really publishing more? (co-authored with V. Larivière).
- A new start for Europe: Opening up to an ERA of Innovation, European Commission, Brussels, BE, 2015. Invited talk — What "causes" scientific misconduct?
- World Conference of Science Journalists, Seoul, KR, 2015. Invited speaker — 50 Shades of Scientific Fraud.
- 4th World Conference on Research Integrity, Rio de Janeiro, BR, 2015. Invited talk — Research misconduct: Conceptions and policy solutions.
- 4th World Conference on Research Integrity, Rio de Janeiro, BR, 2015. Talk — From countries to individuals: unravelling the causes of bias and misconduct with multilevel meta-meta-analysis.
- 'Improving scientific practice', University of Amsterdam, 2014. Invited talk — Science 2.0: A System Centred on Accurate Reporting.
- Presentation of the Danish Code of Conduct for Research Integrity, Copenhagen Business School, DK, 2014. Keynote — Growing challenges for our growing integrity.
- 'Circling the square: Research, politics, media and impact', University of Nottingham, 2014. Invited talk — How to maintain integrity yet provide robust knowledge.
- Inauguration of the new Chair in Methodology and Integrity, Vrije Universiteit Amsterdam, NL, 2014. Invited talk — What can research on scientific misconduct tell us, and how can it mislead us?
- 82e Congrès de l'ACFAS, Association francophone pour le savoir, 2014. Invited talk — L'intégrité dans la diversité des disciplines (Integrity across the diversity of disciplines).
- First AgreenSkills meeting, Leuven, BE, 2013. Keynote — Where do false and falsified results grow? How to weed them out?
- 8th World Conference of Science Journalists, Helsinki, FI, 2013. Invited talk — Can we still trust science? Mostly yes.
- 3rd World Conference on Research Integrity, Montreal, CA, 2013. Talk — Statistical studies on errors, bias and fraud: Towards an evidence-based RCR?
- ORI at 20, Baltimore, MD, 2013. Invited talk — Measuring incidence, to understand causes.
- 11th Ethical Forum, University Foundation, Brussels, Belgium, 2012. Keynote — What science tells us about scientific fraud.
- QUEST for Research Excellence, Washington DC, USA, 2012. Talk — Positive-outcome bias is increasing in most disciplines and countries.
- 17th International Conference on Science and Technology Indicators, Montreal, CA, 2012. Poster (with W. Glänzel) — A Bibliometric test of the Hierarchy of the Sciences.
- MAER-Net Colloquium 2011, Cambridge, UK. Invited talk — Unravelling the causes of publication bias.
- COPE, London, UK, 2011. Invited talk — The 'Bulk' of the Iceberg (and what journals can do about it).
- Science College 2010, Ruhr University, Bochum, DE. Keynote — The mystery of the missing negative results.
- 4S Annual Meeting, Tokyo, JP, 2010. Talk — Do pressures to publish increase scientists' bias?
- 2nd World Conference on Research Integrity, Singapore, 2010. Invited talk — The Black, the White and the Grey Areas — Towards an international and interdisciplinary definition of scientific misconduct.
- SEESHOP3 Meeting, Cardiff, UK, 2009. Talk — Expertise and the hierarchy of the sciences.
- 4S Annual Meeting, Washington DC, USA, 2009. Talk — Are behavioural and social sciences really less objective?
- Science in Society conference, Cambridge, UK, 2009. Talk — How many scientists fabricate and falsify research?
- Research Conference on Research Integrity, Niagara Falls, USA, 2009. Invited talk — How many scientists fabricate and falsify research?
Selected seminars & workshops
- The neglected importance of complexity in meta-research. Invited seminar, Inserm workshop 277, Bordeaux, FR.
- How an information metric could bring truce to the statistics wars. Invited seminar, Phil Stat Seminar: The Statistics Wars and Their Casualties, 2021.
- Measuring scientific bias across disciplines and countries. Workshop in Biostatistics, Department of Biomedical Data Science, Stanford University, CA (2017).
- Crisis in social science: Scientific misconduct. Berkeley Initiative for Transparency in Social Sciences Summer Institute, Berkeley, CA (2016).
- Bias and misconduct: How? Why? What can be done? Invited public lecture, Grey Matter event at deBuren, Ghent, BE (2013).
- Bias, misconduct and biomedical research. Invited seminar, MRC Human Genetics Unit, University of Edinburgh (2012).
- Research bias: causes, consequences, solutions. Invited workshop, Science College, Ruhr University, DE (2010).
Media
Interviews
- ScienceGuide (2018, NL) — Spreken over een 'replicatiecrisis' is onjuist en schadelijk (Speaking of a 'replication crisis' is inaccurate and harmful)
- Retraction Watch (2018, USA) — The retraction process needs work. Is there a better way?
- BNR Nieuwsradio (2017, NL) — Grote wetenschapsbladen overdrijven een beetje (The big science journals exaggerate a little)
- Retraction Watch Q&A (2017, USA) — Authors who retract for honest error say they aren't penalized as a result.
- ORF radio (2016, AT)
- Le Devoir (2014, CA) — Nos chercheurs sont-ils intègres? (Do our researchers have integrity?)
- Deutschlandradio (2014, DE) — Zwischen wissenschaftlichem Fehlverhalten und unbeabsichtigten Fehlern (Between scientific misconduct and unintentional errors)
- BBC Radio 4 (2014) — Everything we know is wrong, radio documentary
- Radio3 Scienza (2014) — Contrordine, scienziati (Change of orders, scientists)
- Nature Podcast (2013, UK) — Peer pressure
- RAI RadioTre Scienza (2013, IT) — Il caso di Diederik Stapel (The Diederik Stapel case)
- ABC National (2010, AU) — Publish or perish: scientists under pressure
Representative press coverage (2009–present)
- Quillette (2021) — Lockdown Scepticism Was Never a 'Fringe' Viewpoint
- Times Higher Education (2018, UK) — Is science really facing a reproducibility crisis?
- The Economist (2018, UK) — Are research papers less accurate and truthful than in the past?
- BuzzFeed (2017, USA) — The More Widely Cited A Study Is, The More Likely It Is To Exaggerate
- The Washington Post (2017, USA) — How biased is science, really?
- Ars Technica (2017, USA) — Analysis of meta-analyses identifies where science's real problems lie
- Times Higher Education (2015, UK) — Paper disputes causes of research misconduct
- The Economist (2013, UK) — Trouble at the lab
- The Guardian (2012, UK) — False positives: fraud and misconduct are threatening scientific research
- Nature (2012, UK) — Replication studies: Bad copy
- USA Today (2011, USA) — File drawer effect: Science studies neglecting negative results
- Times Higher Education (2010, UK) — 'Publish or perish' culture distorting research
- The Economist (2009, UK) — Liar! Liar!