Coronavirus skeptics, deniers: Why some of us stick to deadly beliefs
In recent weeks, several conservative media personalities, political and business leaders and other influencers have publicly shrugged off warnings about the dangers of the novel coronavirus, calling it no deadlier than the flu. While some of them have walked back their comments in the face of rising global and U.S. COVID-19 cases and deaths, many of their fans continue to believe that the contagion is “fake” or overblown. Meanwhile, many young adults are defying social distancing rules to stay 6 feet apart. A group in Kentucky even threw a coronavirus party, which helped spread the virus.
What causes certain people to stick to their beliefs and act with skepticism despite overwhelming contradictory evidence? Berkeley News asked Celeste Kidd, a UC Berkeley computational cognitive scientist who studies false beliefs, curiosity and learning. Here is what Kidd had to say:
Berkeley News: So, why do some people ignore scientific or other evidence to follow authoritarian ideology or to confirm their own biases?
Celeste Kidd: As humans, we rely on other people to inform our opinions. It’s the strength of our species and the reason we have modern medicine and technologies like smartphones, the internet, robots and vaccines. We especially pay attention to authority figures and majority opinions. We also pay more attention to the beliefs of people we like than to those of people we dislike.
People in positions of authority have a special duty to be careful with their words for this reason. Their words, by virtue of their position and stature, are more likely to be adopted as beliefs, and on a larger scale, than the words of everyone else. They can use that power to do a lot of good if they are careful or do a lot of damage if they are not.
For example, President Trump recently and repeatedly suggested, with confidence but without scientific evidence, that an old malaria treatment, chloroquine, could treat the coronavirus. He said it was “safe,” that he had a “good feeling” about it and that it could be “one of the biggest game changers in the history of medicine,” even after Dr. Anthony Fauci, director of the National Institute of Allergy and Infectious Diseases, corrected him. An Arizona man, heeding Trump’s advice, ingested a fish tank cleaner that contained chloroquine phosphate and died. The words of authority figures, like heads of state, hold particular weight in influencing people’s beliefs and can be deadly.
Why do we choose to believe some things and not others?
Most of us like to think of ourselves as rational agents who can make decisions and form beliefs that make sense. But the world is far too big and complex for us to have the time or attentional bandwidth to know about everything, so we have to pick and choose what to attend to. The scientific name for this is “sampling,” and it works well in a dynamic world where the approximate truth is usually good enough to make everyday decisions.
We’re also built to favor investigating the things we feel uncertain about. This tendency pushes us to expand and update our knowledge base. Once we feel like we know everything, we disengage and move on to the next thing. This prevents us from wasting time on what we already know so we can learn something new.
The problem arises when we believe that we know everything there is to know, but we are wrong. When this happens, we are less open to changing our minds: we don’t seek out new information, and we are more inclined to ignore it when we do encounter it.
Ultimately, who and what influences our beliefs?
If everyone around us appears to believe something, we’re more likely to believe it. And that feedback loop matters, especially early on. For example, if we are forming an opinion about something we’re not completely sure about, we are more likely to make up our minds based on the first pieces of evidence we see. All of this is unconscious, and that’s how learning systems work.
Let’s say your neighbor mentions that she’s on a new activated charcoal diet to rid her body of toxins. Maybe you leave that conversation unsure about whether that diet is legitimate, and you hop online to do some sleuthing. If you search for the phrase “activated charcoal,” you’re likely to see a bunch of pseudoscientific health and wellness content about how activated charcoal is wonderful for all kinds of things—clearing your skin, curing your hangover, calming indigestion.
If the first couple of things you see echo that viewpoint, you tend to quickly adopt that belief with high certainty. In the case of the activated charcoal diet, that may not be prudent. There is no good evidence that activated charcoal can do any of those things. And believing that it can, at best, robs you of the opportunity to discover other evidence-based methods that would actually help. At worst, it can be dangerous.
Are some personality types more prone than others to sticking to beliefs despite contradictory evidence?
All of us stick to beliefs in the face of contradictory evidence. All of us have beliefs that do not match reality. It is unavoidable. But it’s possible that some people are better than others at keeping an open mind. Our previous research suggests that uncertainty makes people more willing to change their minds. The downside is that constant uncertainty can make us less willing to make decisions and act, which would make it hard to navigate life.
What we do know from work in our lab is that how certain you feel is not a good indicator of how certain you should feel based on the strength of the evidence. Research led by Louis Martí in our lab measured people’s confidence and accuracy as they were in the process of learning a new concept. What we found was that people’s certainty was not predicted by the strength of the evidence. Instead, it was predicted by feedback. If people guessed the answer to a question and got it right by some fluke, their confidence remained high even when they got subsequent answers wrong. That early positive feedback created high certainty that couldn’t be shaken.
Let’s say you hear that sunlight heals sick people, so the next time you get a cold, you spend some time outside or open the bedroom blinds, and you do seem to heal quicker. You tell your friend, he tries it, and he says he got better quicker, too. Now you feel certain that sunlight cures sickness. In reality, maybe sunlight made you less attentive to your symptoms, or maybe you had a minor cold that ran its course. But because you are certain about the healing powers of sunlight, you aren’t open to subsequent data. So, if someone suggests that maybe sunlight does not cure illness, you are not really interested. Why should you be? You figured it out.
What kind of leadership is needed right now, given our belief systems and what is at stake?
We all need to be more intellectually humble. We all need to recognize that how certain we feel is irrelevant to how certain we should be. We need to recognize that there are scientists and medical experts out there who have the knowledge and expertise we need to make smart decisions, and they are willing and able to share that information with us.
We need our leaders especially right now to understand the role they have in all of this. Words aren’t just words. Words are the basis of beliefs, and beliefs drive our behavior. People who don’t believe the pandemic is real or that it will spread put themselves and everyone else at risk by not doing what needs to be done to stop it.
The media and online platforms that disseminate information also have a critical role to play. A lot of online sites repost and recycle content, which could artificially distort the apparent level of agreement or disagreement that exists in the population. It’s important that people creating and curating this information, and managing delivery platforms, recognize the profound role they play in shaping beliefs and changing minds. They must be cognizant of the responsibility they bear in times of crisis.
Can people be taught to be more open-minded?
It’s possible. Beliefs are not stable, finished things because we are constantly taking in new data and updating. It may seem disheartening that people shut down once they become certain. But I see hope in the fact that people are fundamentally social and that they seek to engage with one another. People are sensitive to the beliefs of those around them. When those beliefs change, people may reconsider their positions. That’s why talking about what is happening is important, and informed people who know the most should be talking the loudest.
As for behavioral and cognitive scientists, we don’t yet know whether the tendency to hold onto dubious beliefs can be trained out of people. It’s something a lot of researchers, including us, are interested in right now. If people can be made aware of their fallibility, they might be taught to moderate their behavior accordingly. We are investigating the viability of that idea. We’ll test it and see, because that’s how science works.