Many issues in bioethics are influenced by “data” supplied by psychology: how IVF children socialise, whether patients who request euthanasia are depressed, whether surrogate mothers enjoy their work, and so on. So the state of social psychology matters deeply for bioethics. And social psychology, many psychologists say, is in a state of crisis.
Nothing illustrates this more starkly than the news earlier this month that 60 out of 100 psychology experiments failed to replicate. This study from the Reproducibility Project at the Center for Open Science in Charlottesville, Va., was published in the leading journal Science, which is hardly a marginal, counter-cultural outlet for crank science.
For a layman, it is disturbing that the results of experiments reported in peer-reviewed journals could not be reproduced. But should it be?
In a New York Times op-ed, Lisa Feldman Barrett, a psychologist at Northeastern University, in Boston, says that there is no cause for alarm. Rather, the public needs to understand that hypothesis testing is precisely how science works. Failure to replicate may be due to a misunderstanding of the conditions under which the results were obtained.
“ … when physicists discovered that subatomic particles didn’t obey Newton’s laws of motion, they didn’t cry out that Newton’s laws had 'failed to replicate'. Instead, they realized that Newton’s laws were valid only in certain contexts, rather than being universal, and thus the science of quantum mechanics was born.”
She concludes:
“As with any scientific field, psychology has some published studies that were conducted sloppily, and a few bad eggs who have falsified their data. But contrary to the implication of the Reproducibility Project, there is no replication crisis in psychology.”
However, Oxford bioethicist Brian D. Earp thinks that Feldman Barrett is a bit too optimistic. In the past couple of years, several high-profile psychologists have been accused of fraud, and there is a feeling that the field is being flooded with low-quality papers. Far too often, psychologists consign studies that fail to replicate to the backs of their file drawers, because such research is almost impossible to get published:
“I think that she understates the problem facing psychology (and other ‘messy’ disciplines, like medicine), with respect to the real crisis of confidence within those fields. Publication bias is a serious problem. The ‘file drawer’ problem is a serious problem. Questionable research practices, sloppy (or just plain wrong) statistics, and ineffective peer review are all serious problems. Finally, the typical rarity of conducting ‘direct’ replications–much less writing them up and submitting them for publication–is an ongoing, deep-seated problem for many areas in science. For its role in trying to address this problem systematically, the Reproducibility Project deserves a huge round of applause.”
This article is published by BioEdge under a Creative Commons licence. You may republish it or translate it free of charge with attribution for non-commercial purposes, following these guidelines. If you teach at a university, we ask that your department make a donation. Commercial media must contact us for permission and fees. Some articles on this site are published under different terms.