Sir Robin Murray, a distinguished British professor of psychiatry, recently published a paper in Schizophrenia Bulletin titled, “Mistakes I Have Made in My Research Career.” He describes the evolution of his thinking regarding the concept of schizophrenia, including the problems with the neurodevelopmental model, the limitations of the drugs used to treat the condition, and his failure to pay adequate attention to the role of social factors in the etiology of psychotic states. These ideas are not new to anyone who has read Anatomy of an Epidemic. Sir Robin’s paper could be read as a synopsis of Chapter 6, “A Paradox Revealed.”
When a commentary on this paper was published on Mad in America, it garnered significant attention (over 100 comments in its first week). I, too, was fascinated. What struck me most was captured in the title. It harkens back to the book by Carol Tavris and Elliot Aronson, “Mistakes Were Made (But Not by Me).” That book was about the concept of cognitive dissonance and its role in making errors in our thinking difficult to acknowledge. But notice the difference in titles. Professor Murray did not distance himself from error. He could have called his paper “Mistakes Psychiatry Made.” Instead, he titled it “Mistakes *I* Have Made” (italics mine). Kudos to him. In his paper and in his comment on MIA, he distinguishes himself as a gracious man.
I perceived his paper to be a self-reflection rather than an indictment of psychiatry. It is an example of the impact of cognitive dissonance and the various biases to which we are all susceptible. When Murray writes about his belated acknowledgement of the role of social factors in the etiology of psychosis, he states, “The truth was that my preconceptions [regarding the neurodevelopmental model] had made me blind to the influence of the social environment.”
One type of bias is known as confirmation bias. This is the tendency to pay more attention to information that supports one’s opinion and discount or ignore data that contradict it. According to a leader in this field, Daniel Kahneman, these tendencies are not a reflection of limited intelligence or moral failing but are a core component of human cognition. (Kahneman’s work is summarized in his outstanding book, Thinking, Fast and Slow.) We have an amazing ability to make rapid inferences that are often correct. But this comes at a cost and carries risk.
In the history of science and medicine, there have been repeated instances of highly valued theories elevated prematurely to accepted doctrine only to give way to new ideas and theories. Every young medical student has the experience of thinking smugly about discredited ideas once highly revered that now seem, in retrospect, quaint. We value science and logic and believe they will protect us from the same fate. But time marches on and it is a rare medical or scientific career that is untouched by the experience of having to give up an idea once “known” to be valid.
Ideas may give way under the weight of new data; the scientific process continues to hold its value. But it is not impervious to error, and errors can persist for decades. Although we might conclude our careers with our most deeply held beliefs intact, who knows what our successors will retain and discard?
I wonder what leads Robin Murray to acknowledge his mistakes when others seem to hunker down. I also wonder how I can know when I am misled in my assumptions. I have changed my mind multiple times in my career. I do not know if my current ideas are any truer than earlier ones. I had a conversation recently with a colleague in which I was articulating these thoughts about Professor Murray’s essay and she responded that my confirmation bias had led me to draw faulty, negative conclusions about the usefulness of antipsychotic drugs. We were at an impasse. If we all have confirmation bias, how do we decide who is correct? How do we know what is true?
In recent years, as my own conclusions have diverged from those of so many of my colleagues, I often find myself caught up in an inner dialogue about knowledge and truth. Peter Zachar’s book, A Metaphysics of Psychopathology, has been another important resource for me. It concludes with a philosophical examination of psychiatric nosology that is preceded by a careful examination of epistemology — how do we determine what is true, and who has the authority to judge it? I think of it as a philosophical companion to Kahneman’s book. Zachar discusses the radical empiricists’ notion of the coherence theory of truth. He writes that “new propositions that seem to readily cohere with what we already believe are going to be accepted more easily than propositions that contradict currently accepted knowledge.” This makes it hard to shift in a more discordant — radical — direction.
At the same time, philosophical modernism led us to question authority, or at least acknowledge that authorities can be wrong. Zachar writes about literalism — a philosophical stance that is often associated with fundamentalist interpretations of the Bible but can be found elsewhere. One literalist assumption, according to Zachar, is that “It is important to discern which ‘experts’ should be granted epistemic authority.” We live in a world that is so complex; in many areas, none of us can ever know all of the available evidence. So while we may value our ability to think for ourselves and form our own conclusions, we still rely on authorities. I believe in the reality of climate change not out of a deep understanding of climatology but mostly because I have decided which experts I will trust.
I am preoccupied with “truth” claims in one very narrow area of our universe — namely, what is the best way to think about the antipsychotic drugs. I have decided that Joanna Moncrieff’s drug-centered model is most useful because it helps me to consider these drugs not as a specific treatment for a specific disorder but as psychoactive drugs that may have some benefits for some people at some times. This strikes me as a paradigm that is more consistent with the available data on the subject. The Kraepelinian concept of schizophrenia has not held up well over the century since it was posited. There is no unified theory of etiology or pathogenesis, its genetic risk factors are multiple and variable from one individual to the next, and there are vastly different outcomes for those assigned the diagnosis. It is at best a category that entails considerable heterogeneity.
The drugs classified as “antipsychotic” are similarly varied in their actions in the brain. In recent years, their proposed targets have broadened. The idea that antipsychotic drugs treat a specific pathological process inherent to all who merit the diagnosis of schizophrenia simply makes no sense. And yet these compounds seem fairly consistently to reduce the intensity of voices and lessen the intrusiveness of unusual thoughts. The notion that their impact results from their inducing a state of cognitive indifference, as Laborit observed so many years ago, seems to explain why they can be helpful in some circumstances. The cognitive indifference reduces the intrusion and intensity of voices. That is consistent with what people tell me when they take the drugs.
At the same time, it makes sense that this same effect might contribute to the negative impact on functional outcomes when people take them over years. Their drive to get out of bed, engage with people, and find jobs may be diminished by the effects of the drugs. It also makes sense that the drugs might have this impact on anyone who takes them, regardless of diagnosis. That is to say, it is likely better conceived as a drug effect and not a disease-specific effect. This does not lead me to a conclusion that they should never be prescribed but it leads me to be more cautious in their administration.
My views overlap considerably with Professor Murray’s, but they also diverge. For instance, I do not accept that “there is no doubt that antipsychotics are necessary in acute active psychosis,” although I often find them to be helpful in that setting. What is remarkable is that while my views are shared by other colleagues, I find it hard to find anyone with whom my perspective overlaps fully. This seems to be less a reflection of my “rightness” or their “wrongness” than of the complexity of the evidence and the question at hand.
If I understand Kahneman, his antidote to cognitive bias is to slow down and do the hard work of thinking through problems. I wish I could borrow his supple mind. While I like to think of myself as deriving my conclusions through careful research and reason, I am also told — repeatedly — that I am passionate, and some have suggested my passion clouds my reason. I like to think my passion drives me to reasoned inquiry, but who knows?
The photo shown here — of the many books I have read in recent years — at least speaks to my attempt at reasoned inquiry.
Thank you, Professor Murray. I would much rather engage critically with accepted dogma at the risk of making mistakes than sit back and accept what the reigning authorities assure me is true.
This blog is dedicated to Mickey Nardo, MD, a psychiatrist and blogger who embodies careful thinking and a deep humanity. His blog, 1boringoldman.com, has done the heavy lifting of careful analysis of drug studies that has been sorely missing in other quarters. He has taught me and many others and I look forward to reading more of his blogs in the months and years to come.
Dr. Sandra Steingard is Medical Director at HowardCenter, a community mental health center where she has worked for the past 21 years. She is also Clinical Associate Professor of Psychiatry at the College of Medicine of the University of Vermont. For more than 25 years, her clinical practice has primarily included patients who have experienced psychotic states. Dr. Steingard serves as Board Chair of the Foundation for Excellence in Mental Health Care.