
Your Mind’s Built-in Biases Insulate Your Beliefs From Contradictory Facts

By Jay Maddock, Texas A&M University

Back in 2008, a rumor circulated that Barack Obama was not born in the U.S. At the time, I was the chair of the Hawaii Board of Health. The state’s health director and deputy health director, both appointed by a Republican governor, checked Obama’s birth certificate in state records and confirmed it was real.

I would have thought that this evidence would settle the matter, but it didn’t. Many people dismissed the birth certificate as a forgery. To this day, many still believe that President Obama was not born in the United States.

More recently, I was listening to a Science Friday podcast episode about the anti-vaccination movement. A woman called in who did not believe vaccines were safe, despite the overwhelming scientific evidence supporting them. The host asked her how much evidence she would need to believe vaccines are safe. Her answer: No amount of scientific evidence could change her mind.

As a psychologist, I was disturbed by this exchange, but not shocked. There are several well-known mechanisms in human psychology that enable people to hold onto their beliefs even in the face of conflicting information.

Cognitive shortcuts lead to biases

In its early days, the science of psychology assumed that people make rational decisions. Over the decades, however, it has become apparent that many of the decisions people make – about romantic partners, finances, risky health behaviors like unsafe sex, and health-promoting behaviors – are not made rationally.

Instead, the human mind is prone to several cognitive biases. These are systematic errors in the way you think about the world. Given the complexity of the world around you, your brain cuts a few corners so it can process complex information quickly.

For example, availability bias refers to the tendency to rely on information that comes to mind quickly. This is helpful when ordering ice cream at a shop with 50 flavors: you don’t have to deliberate over all of them, just pick one you recently tried and liked. Unfortunately, these shortcuts can also lead you to irrational decisions.

One related psychological mechanism is known as cognitive dissonance. This is the feeling of discomfort you may experience when your beliefs are inconsistent with your actions or with new information. In this state, people can reduce their dissonance in two ways: by changing their beliefs to match the new information, or by reinterpreting the new information to justify their original beliefs. In many cases, people choose the latter, whether consciously or not.

For example, you might think of yourself as active – no couch potato at all – yet spend all Saturday lying on the couch watching reality TV. You can either start thinking about yourself in a new way, or justify your behavior by telling yourself, perhaps, that you had a very busy week and need to rest up for tomorrow’s workout.

Confirmation bias is another process that helps you justify your beliefs. It is the tendency to favor information that supports your beliefs and to downplay or ignore information to the contrary. Some researchers have called this “myside bias” – people see the flaws in arguments that contradict their own position but cannot see the weaknesses on their own side. Imagine fans of a soccer team that finished the season 7-9 insisting their team is really quite strong, spotting the mistakes other teams make but never their own.

With the decline of mass media over the past few decades and the rise of niche and social media, it has become easier to surround yourself with news you already agree with while minimizing your exposure to news you don’t. These information bubbles reduce cognitive dissonance, but they also make it difficult to change your mind if you’re wrong.

When beliefs are about who you are

It can be especially difficult to change beliefs that are central to how you understand yourself – that is, to who you think you are. For example, if you believe you are a kind person but cut someone off in traffic, rather than concluding you might not be so nice after all, it is easier to believe the other person was driving like an idiot.

This link between belief and self-image can be reinforced through membership in groups such as political parties, cults, or other communities of like-minded thinkers. These groups often form belief bubbles in which most members hold the same views and repeat them to one another, strengthening the sense that their beliefs are correct.

Researchers have found that people generally think they are better informed about topics than they really are. This has been shown in a multitude of studies on subjects ranging from vaccination to the Russian invasion of Ukraine to how toilets work. These ideas are then passed from person to person without being grounded in fact. For example, 70% of Republicans say they believe the 2020 presidential election was not free and fair, even though there is no evidence of widespread electoral fraud.


Belief bubbles and defenses against cognitive dissonance can be difficult to break. And they can have important downstream effects. For example, these psychological mechanisms influenced the way people decided whether or not to follow public health guidelines on social distancing and mask-wearing during the COVID-19 pandemic, sometimes with fatal consequences.

Changing people’s minds is difficult. Given confirmation bias, evidence-based arguments that contradict what someone already believes are likely to be discounted. The best place to start changing minds is with your own. Think as openly as possible about why you believe what you do. Do you really understand the issue? Could you think about it differently?

As a professor, I have my students debate ideas from the side they personally disagree with. This tactic tends to produce a deeper understanding of the issues and leads them to question their own beliefs. Try it honestly. You might be surprised where you end up.

Jay Maddock, Professor of Public Health, Texas A&M University

This article is republished from The Conversation under a Creative Commons license. Read the original article.
