There is a great podcast by economist Tim Harford about things that have gone terribly wrong and what we can learn from them. The podcast is called “Cautionary Tales,” and, believe me, they are. If you’re the kind of person who learns from their mistakes and the mistakes of others, you’ll like this podcast.
Two of the recent stories really resonated with me. The first is called “The Deadly Airship Race,” and it’s about two airships from the early 1900s that were pitted against each other. One airship, the R101, was pressed into service by an overzealous British Lord (Lord Thomson) to beat the other, the R100, in a race to prove which was the better airship. Along the way, many signs that the R101 was not airworthy were ignored by those with the authority to ground it, resulting in tragedy. You can listen to it here: https://megaphone.link/CAD1187741475
As the podcast points out, one of the big factors that led to the tragedy is the human impulse to push through adversity when there are large stakes at hand. (Read “Large Stakes and Big Mistakes.”) One such example is the concept of “get-there-itis,” which many aircraft pilots exhibit, sometimes to their own demise. The principle basically says that we will push toward a goal even in the face of danger if the reward is big enough, or if we see others doing the same thing. For many pilots, “get-there-itis” has led them not to deviate from their flight plan even when they’re low on fuel, flying into a storm, or dealing with a mechanical failure. For some, it has cost them dearly.
The other cautionary tale that stuck with me is about the cognitive dissonance we humans tend to show when we’re deeply invested in a program or idea. The podcast episode, titled “Buried by the Wall Street Crash,” told the story of two economists, each successful in his own way after taking risks with investments. When the stock market crashed in 1929, both were hit hard by the consequences of their “gambling,” but one of them was more resilient than the other.
As it turns out, John Maynard Keynes was not afraid to change his mind when given new data. He was flexible enough to protect his wealth from the crash. On the other hand, Irving Fisher stuck to his guns and continued to invest in companies that were crashing. Embedded in the episode was the story of cult members who were waiting for the world to end. When the world did not end, and their savior aliens did not appear, they took one of two paths: they either admitted that they were wrong and moved on, or they found a way to rationalize their error. Those who rationalized it attributed the world’s survival to the strength of their own faith. Admitting they were wrong would have caused them too much mental discomfort.
Here is that podcast episode: https://megaphone.link/CAD8216744261
I’ve experienced cognitive dissonance myself at different points in my life, and I’d like to think that I have dealt with it rationally. As Keynes is said to have said (maybe), I took in new data and changed my mind accordingly. But, man, it is hard to accept that you’re wrong when you’re in too deep on something.
The most painful of all of those experiences was the one girlfriend who drove me into the ground in more ways than one, about 15 years ago. Like many romantic relationships, that one was at first all about the physicality. Slowly but surely, the physicality was replaced with actual discussions about stuff, and I started to realize that I was probably making a mistake in staying together. Not only that, but she did not like cooking at all, so we found ourselves eating out every single night. Between that and other expenditures, I quickly found myself without cash. I ended up overextending my credit because I didn’t want to see that the relationship was costing me more and more each week.
Lucky for me, a hurricane of circumstances allowed me to walk away, breaking off the relationship and opening my eyes to what had happened. I had changed in many ways, and my bank account was deep in the red. Slowly, I made my way back to mental and financial health, and I shudder to think of where I would be had I continued to cling to something that clearly wasn’t working.
It would be exactly three years of no dating before I’d go on my next first date, my last first date.
I see the effects of cognitive dissonance today when I take up arguments with anti-vaccine people online. When presented with facts that are incontrovertible, you can see them twitch a little. They may even clutch something on their person, or the person next to them. They reach out for something, anything, to steady themselves and their belief. It hurts too much — or tickles a little bit — to think that they could possibly be wrong.
I also see it in people who’ve been at their jobs for decades and can’t possibly imagine a new and better way to get something done… And I’m not talking about reinventing the wheel, either. I’m talking about realizing that the wheel can be created with new and interesting technologies. They have an almost allergic reaction to that.
So I must say that I do find it refreshing when I meet people who share with me the lack of fear about changing one’s mind. There’s nothing wrong with adjusting the course, especially if there are rocks ahead. Doing so can be painful, though, so I get it if you don’t do it right away, or if you don’t do it at all… But I wish you did, and I wish you will.
René F. Najera, DrPH
I'm a Doctor of Public Health, having studied at the Johns Hopkins University Bloomberg School of Public Health.
All opinions are my own and in no way represent anyone else or any of the organizations for which I work.
About History of Vaccines: I am the editor of the History of Vaccines site, a project of the College of Physicians of Philadelphia. Please read the About page on the site for more information.
About Epidemiological: I am the sole contributor to Epidemiological, my personal blog to discuss all sorts of issues. It also has an About page you should check out.