Research

The hardness of the soft sciences.

For nearly two decades I've measured what we politely don't measure: bias, misconduct, and the uneven epistemic ground between disciplines.

§ I

Overview

The work has produced some of the largest empirical surveys of questionable research practices, and a series of papers mapping a hierarchy of the sciences along measurable dimensions of the published literature.

More recently I've been developing K-theory — an approach that treats scientific knowledge as a problem in information compression. The first empirical test, published in 2025, predicted the reproducibility outcomes of the Brazilian Reproducibility Initiative.
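The core intuition behind K-theory — that knowledge is whatever lets many findings be described more briefly together than apart — can be sketched with off-the-shelf compression. The following is a toy illustration only, not the published metric: `zlib` stands in crudely for description length, and the example "findings" are invented.

```python
import zlib


def description_length(text: str) -> int:
    """Approximate the description length of a text by its compressed size in bytes."""
    return len(zlib.compress(text.encode("utf-8"), level=9))


def compression_gain(findings: list[str]) -> float:
    """How much shorter a set of findings becomes when described jointly.

    Values near 1 suggest the findings share structure (a theory
    'compresses' them); values near 0 suggest they are unrelated.
    """
    separate = sum(description_length(f) for f in findings)
    joint = description_length("".join(findings))
    return 1 - joint / separate


# Mutually consistent "findings" compress well together...
consistent = ["dose A increases response X"] * 4
# ...while heterogeneous ones gain little from joint description.
mixed = [
    "dose A increases response X",
    "protein B folds at 40C",
    "galaxy C recedes at 70 km/s",
    "allele D raises risk of E",
]

print(compression_gain(consistent) > compression_gain(mixed))  # expect True
```

The published metric is built on a more principled measure of description length, but this sketch conveys the direction of the idea: more "knowledge" corresponds to greater joint compressibility.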

The lab is the Theoretical and Empirical METaknowledge (TEMET) lab at Heriot-Watt, with a parallel fellowship at the LSE Department of Methodology.

§ II

The TEMET lab

I lead the Theoretical and Empirical METaknowledge lab at Heriot-Watt — a research group dedicated to studying the structure of knowledge using the tools of metascience, statistics, and information theory. The lab brings together work on bias, misconduct, the hierarchy of the sciences, and K-theory under one roof.

§ III

Publications

Meta-research

  1. Fanelli, D., Voodla, A., Andres, S. & Uusberg, A. (2026) Questionable Research Practices: A Principled Classification and Ranking Based on Survey Data. Science and Engineering Ethics
  2. Fanelli, D. (2025) Behind the scenes of scientific fraud. Science
  3. Fanelli, D., Tan, P. B., Amaral, O. B. & Neves, K. (2025) A metric of knowledge as information compression reflects reproducibility predictions for biomedical experiments. Royal Society Open Science
  4. Bartos, F., Maier, M., Wagenmakers, E.-J., et al., Fanelli, D., Stanley, T. D. (2024) Footprint of Publication Selection Bias on Meta-Analyses in Medicine, Environmental Sciences, Psychology, and Economics. Research Synthesis Methods
  5. Martineau, S., Cristea, I. A., Chevance, A., Fanelli, D., Naudet, F. (2023) Are large prospective trials on antidepressants in mental disorders seeding trials? BMJ Open
  6. Bakker, C., Boughton, S., Faggion, C. M., Fanelli, D., Kaiser, K. A., Schneider, J. (2023) Reducing the residue of retractions in evidence synthesis: Ways to minimize inappropriate citation and use of retracted data. BMJ Evidence-Based Medicine
  7. Fanelli, D. (2022) The "tau" of science: how to measure, study, and integrate quantitative and qualitative knowledge. MetaArXiv
  8. Fanelli, D., Tan, P. B., Amaral, O. B., Neves, K. (2022) A metric of knowledge as information compression reflects reproducibility predictions in biomedical experiments. MetaArXiv
  9. Fanelli, D., Schleicher, M., Fang, F. C., Casadevall, A., Bik, E. M. (2021) Are individual and institutional predictors of misconduct modulated by national publication incentives policy? Results of a matched-control analysis of problematic image duplications. PLOS ONE
  10. Fanelli, D. & Moher, D. (2019) What difference might retractions make? An estimate of the epistemic impact of retractions on recent meta-analyses. bioRxiv
  11. Fanelli, D. (2019) A theory and methodology to quantify knowledge. Royal Society Open Science
  12. Tatsioni, A., Karassa, F. B., Goodman, S. N., Zarin, D. A., Fanelli, D., Ioannidis, J. P. A. (2019) Lost Evidence From Registered Large Long-Unpublished Randomized Controlled Trials: A Survey. Annals of Internal Medicine
  13. Fanelli, D. (2018) Is science really facing a reproducibility crisis, and do we need it to? PNAS
  14. Fanelli, D., Costas, R., Fang, F. C., Casadevall, A., Bik, E. M. (2018) Testing hypotheses on risk factors for scientific misconduct via matched-control analysis of papers containing problematic image duplications. Science and Engineering Ethics
  15. Fanelli, D., Ioannidis, J. P. A., Goodman, S. N. (2018) Improving the integrity of published science: an expanded taxonomy of retractions and corrections. European Journal of Clinical Investigation
  16. Naudet, F., Sakarovitch, C., Janiaud, P., Cristea, I., Fanelli, D., Moher, D., Ioannidis, J. P. A. (2018) Data sharing and reanalysis of randomized controlled trials in leading biomedical journals with a full data sharing policy. BMJ
  17. Fanelli, D., Costas, R., Fang, F. C., Casadevall, A., Bik, E. M. (2017) Why do scientists fabricate and falsify data? A matched-control analysis of papers containing problematic image duplications. bioRxiv
  18. Fanelli, D., Costas, R., Ioannidis, J. P. A. (2017) Meta-assessment of bias in science. PNAS
  19. Hosseini, M., Hilhorst, M., de Beaufort, I., Fanelli, D. (2017) Doing the Right Thing: A Qualitative Investigation of Retractions Due to Unintentional Error. Science and Engineering Ethics
  20. Goodman, S., Fanelli, D., Ioannidis, J. P. A. (2016) What does reproducibility mean? Science Translational Medicine
  21. Fanelli, D. (2016) Set up a 'self-retraction' system for honest errors. Nature
  22. Fanelli, D. & Larivière, V. (2016) Scientists' individual publication rate has not increased in a century. PLOS ONE
  23. McCrary, J., Christensen, G. & Fanelli, D. (2016) Conservative Tests under Satisficing Models of Publication Bias. PLOS ONE
  24. Ioannidis, J. P. A., Fanelli, D., Dunne, D. D., Goodman, S. N. (2015) Meta-research: Evaluation and improvement of research methods and practices. PLOS Biology
  25. Fanelli, D., Costas, R. & Larivière, V. (2015) Misconduct Policies, Academic Culture and Career Stage, Not Gender or Pressures to Publish, Affect Scientific Integrity. PLOS ONE
  26. Fanelli, D. (2015) We need more research on causes and consequences, as well as on solutions. Addiction
  27. Pupovac, V. & Fanelli, D. (2015) Scientists admitting to plagiarism: a meta-analysis of surveys. Science and Engineering Ethics
  28. Yu, B. & Fanelli, D. (2014) Classifying Negative Findings in Biomedical Publications. Proceedings of BioNLP 2014, Workshop on Biomedical Natural Language Processing
  29. Fanelli, D. (2014) Publishing: rise in retractions is a signal of integrity. Nature
  30. Fanelli, D. & Ioannidis, J. P. A. (2014) Re-analyses actually confirm that US studies may overestimate effect sizes in softer research. PNAS
  31. Fanelli, D. (2013) Why growing retractions are (mostly) a good sign. PLOS Medicine
  32. Fanelli, D. & Ioannidis, J. P. A. (2013) US studies may overestimate effect sizes in softer research. PNAS
  33. Fanelli, D. & Glänzel, W. (2013) Bibliometric evidence for a Hierarchy of the Sciences. PLOS ONE
  34. Fanelli, D. (2013) Redefine misconduct as distorted reporting. Nature
  35. Fanelli, D. (2013) Positive results receive more citations, but only in some disciplines. Scientometrics
  36. Fanelli, D. (2012) Any publicity is better than none: newspaper coverage increases citations, in the UK more than in Italy. Scientometrics
  37. Fanelli, D. (2012) Negative results are disappearing from most disciplines and countries. Scientometrics
  38. Fanelli, D. (2010) Do pressures to publish increase scientists' bias? An empirical support from US states data. PLOS ONE
  39. Fanelli, D. (2010) "Positive" results increase down the Hierarchy of the Sciences. PLOS ONE
  40. Fanelli, D. (2009) How many scientists fabricate and falsify research? A systematic review and meta-analysis of survey data. PLOS ONE

Book chapters & reports

  • Fanelli, D. (2022) Is the narrative of a crisis empirically supported? In: Jussim L., Krosnik J. eds. Research Integrity in the Behavioural Sciences. Oxford University Press.
  • Fanelli, D. (2020) Institutional pressures to publish: what effects do we see? In: Mario Biagioli & Alexandra Lippman eds. Gaming the Metrics. MIT Press.
  • EU Mutual Learning Exercise on Research Integrity (2019). Author of Report 2 (Incentives) and 4 (Training and Education). Prepared for the European Commission, Directorate-General for Research and Innovation.
  • Linee guida del CNR per l'integrità nella ricerca (CNR guidelines for research integrity) (2015). Co-authored with Cinzia Caporale, with input from the CNR Ethics Committee.
  • Fanelli, D. (2011) The black, the white and the grey areas — towards an international and interdisciplinary definition of scientific misconduct. In: Nick Steneck & Tony Meyer eds. Promoting Research Integrity on a Global Basis. World Scientific Press.
  • Honesty, Accountability and Trust: Fostering Research Integrity in Canada (2010). Co-authored with the Expert Panel on Research Integrity, Council of Canadian Academies.

§ IV

Projects

Current

TRUSTparency Horizon Europe · CSA

Increasing reproducibility through co-creation of interventions that support a transparent and trustworthy research ecosystem. We're deploying comCensus to involve researchers directly in the co-creation and evaluation of reproducibility policies.

BEYOND Horizon Europe · CSA

Beyond Bad Apples: Towards a Behavioral and Evidence-Based Approach to Promote Research Ethics and Research Integrity in Europe. Using K-theory in WP3 to develop a more nuanced approach to questionable research practices (QRPs).

iRISE Horizon Europe · CSA

Improving Reproducibility in SciencE.

K-theory and the BRI Collaboration

Predicting reproducibility via a metric of information compression — collaboration with the Brazilian Reproducibility Initiative.

Statistical model selection with an expanded metric of complexity

Collaboration with Wesley Bonifay, University of Missouri.

Consciousness as Metaknowledge Theory

A novel theory and methodology to explain and study consciousness through the lens of K-theory.

Concluded

  • Innovating retractions to reward self-correction. A METRICS-sponsored project on revolutionising literature-amendment policies.
  • A mathematical theory of knowledge, bias, science, and pseudoscience. A personal attempt to unify metascience, cognitive science, and philosophy.
  • BayesCAMP: Bayesian Corrections Against Misuses of P-values. With Steven Goodman (Stanford) and Don van Ravenzwaaij (Groningen).
  • Why do scientists fabricate and falsify data? A matched-control analysis with Elisabeth Bik, Ferric Fang, Arturo Casadevall, and Rodrigo Costas.
  • From countries to individuals (2013–2016) — multilevel meta-meta-analysis with John P. A. Ioannidis (Stanford). NIH/Office of Research Integrity, $165k/year.
  • Historical trends of scientists' "productivity" — with Vincent Larivière (Université de Montréal).
  • The integrity of self-retracting scientists — qualitative study with Medard Hilhorst (Erasmus MC).
  • No study's perfect — cross-disciplinary analysis of published errata. Partly funded by COPE.
  • Bias, misconduct and the Hierarchy of the Sciences (2010–2013) — Leverhulme Early-Career Fellowship, ~£78k.
  • Quantifying objectivity in the natural and social sciences (2008–2010) — Marie Curie Intra-European Fellowship, ~€160k.

§ V

Talks & keynotes

Conferences & public talks (2008–present)

  1. Scientific Freedom and Responsibility: Challenges and Opportunities, Santiago de Chile, CL, 2025. Invited keynote — Cautionary tales from Metascience.
  2. SFI Summit 2024, Cork, IE. Keynote — Research integrity: it's complex, and complicated.
  3. WCRI 2024. Opening session keynote — first pre-registered evidence that K-theory predicts irreproducibility on the BRI data.
  4. SIPS 2024, Kenya. Talk — How complexity affects psychology research.
  5. Science Foundation Ireland Summit 2023, Cork, IE. Keynote — Research integrity: it's complex, and complicated.
  6. 8th European Conference on Academic Integrity and Plagiarism, Porto, PT, 2022. Keynote — Research integrity in a complex world.
  7. ENRIO 2021 conference, online. Keynote — What challenges lie ahead for research integrity officers?
  8. REWARD-EQUATOR conference, Berlin, DE, 2020. Keynote — How explaining knowledge helps improve research.
  9. Meta-Science Symposium 2019, Stanford University, USA. Invited talk — Low reproducibility as divergent information: A K-theory analysis of reproducibility studies.
  10. Fixing Science: Practical Solutions for the Irreproducibility Crisis, National Association of Scholars, Oakland, CA, USA, 2020. Invited plenary — Reproducibility reforms if there is no irreproducibility crisis.
  11. Fifth AgreenSkills meeting, Brussels, BE, 2019. Keynote — Is science facing a crisis or an opportunity?
  12. 18th Congress of the International Union for the Study of Social Insects, Guarujá, BR, 2018. Invited talk — Taking the pulse of social insects research.
  13. 9th International Conference on Complex Systems, Boston, MA, USA, 2018. Talk — Towards a meta-theory of scientific knowledge.
  14. Research integrity and Data Management, University of Copenhagen, DK, 2017. Invited talk — How and where can data management policies improve reproducibility and integrity?
  15. ASHG Conference, Orlando, FL, USA, 2017. Invited talk — Making genetics research more reproducible.
  16. Peer Review Congress, Chicago, IL, USA, 2017. Plenary — Summary effect sizes in meta-analyses after removal of retracted studies.
  17. 5th World Conference of Research Integrity, Amsterdam, NL, 2017. Keynote — Conceptual challenges concerning re-analysis and replication practices in reproducible research.
  18. 5th World Conference of Research Integrity, Amsterdam, NL, 2017. Invited talk — How to help scientists own their mistakes.
  19. National Academy of Sciences Sackler Colloquium: Reproducibility of Research, Washington, D.C., 2017. Invited panellist.
  20. International conference "Researching with Integrity", Tartu, EE, 2017. Keynote — Benefits and challenges of defining scientific misconduct.
  21. IV Brazilian Meeting on Research Integrity, Science and Publication Ethics (BRISPE), Goiânia, BR, 2016. Keynote — The mystery of missing negative results.
  22. Computation+Journalism Symposium, Stanford, USA, 2016. Invited panellist on Reproducible Journalism.
  23. MAER-Net Colloquium 2016, Little Rock, AR, USA. Talk — Understanding bias via cross-disciplinary multi-level meta-meta-regression.
  24. ESOF 2016 conference, Manchester, UK. Invited speaker — Going viral: social media and the practice of science in society.
  25. NRIN Research Conference 2016, Amsterdam, NL. Keynote — Is Dutch science at greater risk?
  26. Gaming Metrics 2016, UC Davis, CA, USA. Invited talk — Institutional pressures to publish: what effects do we see?
  27. The 2015 Southampton Conference on the Credibility of Research. Invited talk — An overall empirical perspective of the scientific crisis.
  28. 15th ISSI Conference, Istanbul, TR, 2015. Talk — Are scientists really publishing more? (co-authored with V Larivière).
  29. A new start for Europe: Opening up to an ERA of Innovation, European Commission, Brussels, BE, 2015. Invited talk — What "causes" scientific misconduct?
  30. World Conference of Science Journalists, Seoul, KR, 2015. Invited speaker — 50 Shades of Scientific Fraud.
  31. 4th World Conference on Research Integrity, Rio de Janeiro, BR, 2015. Invited talk — Research misconduct: Conceptions and policy solutions.
  32. 4th World Conference on Research Integrity, Rio de Janeiro, BR, 2015. Talk — From countries to individuals: unravelling the causes of bias and misconduct with multilevel meta-meta-analysis.
  33. 'Improving scientific practice', University of Amsterdam, 2014. Invited talk — Science 2.0: A System Centred on Accurate Reporting.
  34. Presentation of the Danish Code of Conduct for Research Integrity, Copenhagen Business School, DK, 2014. Keynote — Growing challenges for our growing integrity.
  35. 'Circling the square: Research, politics, media and impact', University of Nottingham, 2014. Invited talk — How to maintain integrity yet provide robust knowledge.
  36. Inauguration of the new Chair in Methodology and Integrity, Vrije Universiteit Amsterdam, NL, 2014. Invited talk — What can research on scientific misconduct tell us, and how can it mislead us?
  37. 82e Congrès de l'ACFAS, Association francophone pour le savoir, 2014. Invited talk — L'intégrité dans la diversité des disciplines (Integrity across the diversity of disciplines).
  38. First AgreenSkills meeting, Leuven, BE, 2013. Keynote — Where do false and falsified results grow? How to weed them out?
  39. 8th World Conference of Science Journalists, Helsinki, FI, 2013. Invited talk — Can we still trust science? Mostly yes.
  40. 3rd World Conference on Research Integrity, Montreal, CA, 2013. Talk — Statistical studies on errors, bias and fraud: Towards an evidence-based RCR?
  41. ORI at 20, Baltimore, MD, 2013. Invited talk — Measuring incidence, to understand causes.
  42. 11th Ethical Forum, University Foundation, Brussels, BE, 2012. Keynote — What science tells us about scientific fraud.
  43. QUEST for Research Excellence, Washington DC, USA, 2012. Talk — Positive-outcome bias is increasing in most disciplines and countries.
  44. 17th International Conference on Science and Technology Indicators, Montreal, CA, 2012. Poster (with W. Glänzel) — A Bibliometric test of the Hierarchy of the Sciences.
  45. MAER-Net Colloquium 2011, Cambridge, UK. Invited talk — Unravelling the causes of publication bias.
  46. COPE, London, UK, 2011. Invited talk — The 'Bulk' of the Iceberg (and what journals can do about it).
  47. Science College 2010, Ruhr University, Bochum, DE. Keynote — The mystery of the missing negative results.
  48. 4S Annual Meeting, Tokyo, JP, 2010. Talk — Do pressures to publish increase scientists' bias?
  49. 2nd World Conference on Research Integrity, Singapore, 2010. Invited talk — The Black, the White and the Grey Areas — Towards an international and interdisciplinary definition of scientific misconduct.
  50. SEESHOP3 Meeting, Cardiff, UK, 2009. Talk — Expertise and the hierarchy of the sciences.
  51. 4S Annual Meeting, Washington DC, USA, 2009. Talk — Are behavioural and social sciences really less objective?
  52. Science in Society conference, Cambridge, UK, 2009. Talk — How many scientists fabricate and falsify research?
  53. Research Conference on Research Integrity, Niagara Falls, USA, 2009. Invited talk — How many scientists fabricate and falsify research?

Selected seminars & workshops

  • The neglected importance of complexity in meta-research. Invited seminar, Inserm workshop 277, Bordeaux, FR.
  • How an information metric could bring truce to the statistics wars. Invited seminar, Phil Stat Seminar: The Statistics Wars and Their Casualties, 2021.
  • Measuring scientific bias across disciplines and countries. Workshop in Biostatistics, Department of Biomedical Data Science, Stanford University, CA (2017).
  • Crisis in social science: Scientific misconduct. Berkeley Initiative for Transparency in Social Sciences Summer Institute, Berkeley, CA (2016).
  • Bias and misconduct: How? Why? What can be done? Invited public lecture, Grey Matter event at deBuren, Ghent, BE (2013).
  • Bias, misconduct and biomedical research. Invited seminar, MRC Human Genetics Unit, University of Edinburgh (2012).
  • Research bias: causes, consequences, solutions. Invited workshop, Science College, Ruhr University, DE (2010).

§ VI

Media

Interviews

  • ScienceGuide (2018, NL) — Spreken over een 'replicatiecrisis' is onjuist en schadelijk (Speaking of a 'replication crisis' is wrong and harmful)
  • Retraction Watch (2018, USA) — The retraction process needs work. Is there a better way?
  • BNR Nieuwsradio (2017, NL) — Grote wetenschapsbladen overdrijven een beetje (Big science journals exaggerate a little)
  • Retraction Watch Q&A (2017, USA) — Authors who retract for honest error say they aren't penalized as a result.
  • ORF radio (2016, AT)
  • Le Devoir (2014, CA) — Nos chercheurs sont-ils intègres? (Do our researchers have integrity?)
  • Deutschlandradio (2014, DE) — Zwischen wissenschaftlichem Fehlverhalten und unbeabsichtigten Fehlern (Between scientific misconduct and unintentional errors)
  • BBC Radio 4 (2014) — Everything we know is wrong, radio documentary
  • Radio3 Scienza (2014, IT) — Contrordine, scienziati (Counter-order, scientists)
  • Nature Podcast (2013, UK) — Peer pressure
  • RAI RadioTre Scienza (2013, IT) — Il caso di Diederik Stapel (The Diederik Stapel case)
  • ABC National (2010, AU) — Publish or perish: scientists under pressure

Representative press coverage (2009–present)

  • Quillette (2021) — Lockdown Scepticism Was Never a 'Fringe' Viewpoint
  • Times Higher Education (2018, UK) — Is science really facing a reproducibility crisis?
  • The Economist (2018, UK) — Are research papers less accurate and truthful than in the past?
  • BuzzFeed (2017, USA) — The More Widely Cited A Study Is, The More Likely It Is To Exaggerate
  • The Washington Post (2017, USA) — How biased is science, really?
  • Ars Technica (2017, USA) — Analysis of meta-analyses identifies where science's real problems lie
  • Times Higher Education (2015, UK) — Paper disputes causes of research misconduct
  • The Economist (2013, UK) — Trouble at the lab
  • The Guardian (2012, UK) — False positives: fraud and misconduct are threatening scientific research
  • Nature (2012, UK) — Replication studies: Bad copy
  • USA Today (2011, USA) — File drawer effect: Science studies neglecting negative results
  • Times Higher Education (2010, UK) — 'Publish or perish' culture distorting research
  • The Economist (2009, UK) — Liar! Liar!