Technology makes it increasingly practical to deploy experiments quickly and to run large numbers of people through them. The upshot is that, today, a fixed amount of effort produces work of a much higher level of scientific rigor than 100, 50, or even 10 years ago. Some scientists have focused their steely gazes on applying this new, better technology to foundational findings of the past, triggering a replication crisis that has made researchers throughout the human sciences question the very ground they walk on. John Ioannidis has been prominent in bringing attention to the replication crisis, with new methods and an admirable devotion to the thankless work of replication.
In the provocatively titled “Why Most Published Research Findings Are False”, Ioannidis draws six corollaries about scientific practice in the experimental human sciences:
- The smaller the studies conducted in a scientific field, the less likely the research findings are to be true.
- The smaller the effect sizes in a scientific field, the less likely the research findings are to be true.
- The greater the number and the lesser the selection of tested relationships in a scientific field, the less likely the research findings are to be true.
- The greater the flexibility in designs, definitions, outcomes, and analytical modes in a scientific field, the less likely the research findings are to be true.
- The greater the financial and other interests and prejudices in a scientific field, the less likely the research findings are to be true.
- The hotter a scientific field (with more scientific teams involved), the less likely the research findings are to be true.
His argument, and arguments like it, have produced a great effort at quantifying the effects of these various forms of bias. Excellent work has already gone into the first three or four. But the most mysterious, damning, dangerous, and intriguing of these is #5, prejudice. And, if you dig through the major efforts at pinning these various effects down, you’ll find that they all gloss over #5, understandably, because it seems impossible to measure. That said, Ioannidis gives us a little hint about how we’d measure it. He briefly entertains the idea of a whole scientific discipline built on nothing, which nevertheless finds publishable results in 1 in 2, 4, 10, or 20 cases. If such a discipline existed, it would help us estimate the relative impact of preconceived notions on scientific outputs.
Having received much of my training in psychology, I can say that there are quite a few cases of building a discipline on nothing. They’re not at the front of our minds because psychology pedagogy tends to focus more on its successes, but if you peer between the cracks you’ll find scientific, experimental, quantitative, data-driven sub-fields of psychology that persisted for decades before fading with the last of their proponents, and that are remembered now as false starts, dead ends, and quack magnets. A systematic review of the published quantitative findings of these areas, combined with a possibly unfair assumption that they were based entirely on noise, could help us estimate the specific frequency at which preconceived bias creates Type I (false positive) errors.
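To make the baseline concrete: if a field really were built on nothing, and its researchers ran honest tests at the conventional 5% significance threshold, we’d expect roughly 1 in 20 studies to yield a “publishable” result by chance alone. Any excess over that baseline in the field’s published record would be a rough signal of the extra push from bias. Here is a minimal simulation sketch of that null baseline (all names are my own; the critical value 2.0 approximates the two-tailed 5% cutoff for two groups of 30):

```python
import random
import statistics

random.seed(1)

def null_study(n=30, crit=2.0):
    """One 'study' in a field built on nothing: a two-sample t statistic
    comparing two groups of pure noise. crit ~ 2.0 approximates the
    two-tailed 5% critical value for df = 2n - 2 = 58."""
    a = [random.gauss(0, 1) for _ in range(n)]
    b = [random.gauss(0, 1) for _ in range(n)]
    # Standard error of the difference in means, equal group sizes.
    se = ((statistics.variance(a) + statistics.variance(b)) / n) ** 0.5
    t = (statistics.mean(a) - statistics.mean(b)) / se
    return abs(t) > crit

trials = 10_000
hits = sum(null_study() for _ in range(trials))
print(f"'Publishable' results from pure noise: {hits}/{trials} (~{hits / trials:.1%})")
```

The simulated rate hovers around 5%, the textbook Type I error rate. The interesting empirical question is how far a dead field’s actual ratio of published positive results departed from that floor.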
What disciplines am I talking about? Introspection, phrenology, hypnosis, and several others are the first that came to mind. More quantitative areas of psychoanalysis, if they exist, and if they’re ridiculous, could also be fruitful. In case I or anyone else wants to head down this path, I collected a bunch of resources for where I’d start digging. My goal would be to find tables of numbers, or ratios of published to unpublished manuscripts, or some way to distinguish true results from true non-results from false results from false non-results.
- The archives of Titchener (at Cornell) and Wundt
- Boring’s paper “A History of Introspection” https://pdfs.semanticscholar.org/1191/4d0d6987fa13d7f75c0717441d1457b969f3.pdf
- Bem’s pilots
- https://www.newyorker.com/magazine/2010/12/13/the-truth-wears-off (ironically written by Jonah Lehrer)
- Orne’s “On the social psychology of the psychological experiment”
- Other dead theories:
  - Dictionary of Theories, Laws, and Concepts in Psychology (https://books.google.com/books?id=6mu3DLkyGfUC&pg=PA49)