The COVID-19 pandemic exacerbated many preexisting problems. One of them is how to vet the avalanche of information flowing down at us on a daily basis. Another is how to properly assess the risks that we face.
Most of the questions I had to answer in the early days of the pandemic were formulated as real-life scenarios punctuated by a question mark: I want to go shopping, I was told, and try on clothes. Will I catch the coronavirus this way? As vaccines were rolled out, some people started to wonder how safe these expedited miracles really were given the chatter that accompanied them. While assessing risk is crucial to our survival, it turns out that we are not always very rational about it.
In the 1970s, one hundred and eleven students on the campus of the University of Oregon volunteered for a study on how people judge risk. They were presented with pairs of causes of death and asked which one was more likely, in general, in the United States and by how much. When compared to actual data from the U.S. government, the students held “many, often severe, misconceptions,” the researchers wrote. The students thought accidents killed 25 times more people than strokes; in fact, strokes at the time caused 85% more deaths than all accidents put together. Students also thought that an American had equal chances of dying from a disease as from an accident; in reality, diseases killed 15.4 times more people than accidents. Meanwhile, tornadoes were judged to be a graver threat than asthma, even though dying from asthma was actually 20 times more likely. Dramatic events that were out of someone’s immediate control and that often made the news were perceived as causing more deaths than plain old diseases.
These failures to properly assess risk should not come as a complete surprise. We have often heard that people tend to be more scared of flying in an airplane than of riding in a car: estimates of how common flying-related anxiety is vary widely from study to study, but it is far from rare. Given the prevalence of car crashes, it’s the drive to the airport that is actually the more dangerous stretch of the journey. Unfortunately, when it comes to risk perception, our brain is not the cool-as-a-cucumber number cruncher we’d like to think it is. As journalist Dan Gardner wrote in his book Risk: The Science and Politics of Fear, certain things press our buttons and make a risk appear worse in our mind than it really is.
Dr. Paul Slovic, a psychologist interested in how we make decisions in the face of risk, initially saw a number of these buttons emerge in his research over time. If something has catastrophic potential, it is often perceived as inherently riskier even if the risk itself is very low. This is why nuclear power is commonly seen as much more dangerous than it actually is: people remember Chernobyl. If something is unfamiliar, its perceived risk also goes up. If we don’t feel in control of something’s potential for harm, we will think it riskier. Ditto when children are involved.
Media coverage can also make rare events very salient in the mind and thus skew our perception of how likely they are to happen. Many of us remember the dramatic reporting over anthrax scares, Satanic cults, and terrorism. Speaking of the latter, the German psychologist Gerd Gigerenzer calculated that, in the year following the 9/11 terrorist attacks in the United States, fewer Americans flew on airplanes and more took to the rural interstate highways, resulting in a surge of additional deaths on the road. By his estimate, these extra fatalities amounted to roughly six times the number of passengers who died in the four fatal flights on 9/11. The fear of flying following a catastrophic, highly publicized attack potentially led to more deaths as people sought to avoid what they believed to be the bigger risk: dying in an airplane taken over by terrorists.
Over time, this list of factors that predispose us to inflate risk got trimmed down to a single, very strong predictor of how people felt about the risk of a particular technology or activity: the dread factor. It is a gut-level, immediate reaction of fear toward a potential risk, and it rapidly takes over our more rational way of thinking.
The COVID-19 vaccines have certainly elicited their share of dread among some people. Early reports of individuals fainting during vaccination led to a bit of a panic, even though vasovagal syncope (the medical term for this type of fainting) is well known; it can be triggered by the fear of needles, sharp pain, or the sight of blood, and it has long been seen with other vaccines as a reaction to the act of injection rather than to what is in the syringe.
The risk of myocarditis (an inflammation of the heart muscle) with the mRNA-based COVID-19 vaccines appears to be incredibly low, and myocarditis itself can be caused by common viruses and is thought to be under-diagnosed, since mild cases can easily go unnoticed. But intense media coverage of this very rare side effect can skew our perception of its risk.
The same goes for the very rare blood clots associated with AstraZeneca’s COVID-19 vaccine. They are very real and serious, but the unrelenting media coverage of “the first person in this city/province/state/country to die from a blood clot after receiving the vaccine,” mixed with the dread some people already felt toward this new vaccine, blew a very rare risk into a gargantuan worry. When dread sets in, statistics and studies may not be enough to recalibrate our risk perception.
None of this should be taken as a rallying cry for us to suppress our emotions and follow the Vulcan path of pure logic. Emotions are often healthy, but we need to be aware that itty-bitty risks can get magnified under the lens of our emotional state, especially if these risks push specific buttons in us, like loss of control and our fear of catastrophe. But the reverse is also true. Some risks pass us by without so much as a touch of concern. The flu is far deadlier than we tend to assume, a reminder frequently made by medical experts during the COVID-19 pandemic in response to the faulty argument that COVID-19 was just a bad flu. We generally don’t think of the flu as a major killer, but it is.
If we want to get better at assessing risks, we need to learn to slow down our thinking. Our initial gut reaction following the perception of a potential threat is fed by emotions and by the headlines we have read on social media. We need to give our brain time to move the heavy gears of our reasoning. If we hear that AstraZeneca’s COVID-19 vaccine resulted in 17 people getting blood clots, we should ask ourselves: out of how many? Out of 100? That would be very scary. Out of a million (which is much closer to the truth)? That’s a very different risk. To not ask this question is to fall victim to “denominator neglect.” The top number, or numerator, in a fraction is important, but so is the bottom number, or denominator.
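To make the arithmetic behind denominator neglect concrete, here is a minimal sketch in Python. The denominators of 100, 10,000 and one million people are illustrative placeholders, not actual vaccination figures:

```python
# Denominator neglect, illustrated: the same numerator (17 blood clots)
# translates into wildly different risks depending on the denominator.

def risk_per_million(cases: int, people: int) -> float:
    """Convert a raw case count into a rate per million people."""
    return cases / people * 1_000_000

CASES = 17  # the numerator reported in the headline

# Illustrative denominators only -- not real vaccination counts.
for people in (100, 10_000, 1_000_000):
    print(f"{CASES} cases out of {people:>9,} people "
          f"= {risk_per_million(CASES, people):,.0f} per million "
          f"({CASES / people:.4%})")
```

Seventeen out of one hundred is a 17% risk; seventeen out of a million is about 0.0017%, a ten-thousand-fold difference that disappears the moment we stop asking “out of how many?”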
Thinking more slowly and more deliberately about risk is a useful skill to develop. Otherwise, we run the risk of fretting over small risks and ignoring the bigger ones.
Take-home message:
-Our perception of a risk can be inflated because of a quick, gut-level dread that is not based on facts
-Thinking more slowly and more deliberately can allow the more reasonable part of our brain to kick in and do a better job at assessing risk
-When someone tells you how many people experienced harm following a particular activity, don’t forget to ask “out of how many?” to have a better idea of how risky the activity may be