The Un-Comfort Zone With Robert Wilson

Cognitive Bias Is the Loose Screw in Critical Thinking

Recognizing your biases enhances understanding and communication

Published: Tuesday, March 1, 2022 - 13:02

When I was a kid, I was enamored of cigarette-smoking movie stars. When I was a teenager, some of my friends began to smoke; I wanted to smoke too, but my parents forbade it. I was also intimidated by the ubiquitous anti-smoking commercials I saw on television warning me that smoking causes cancer. As much as I wanted to smoke, I was afraid of it.

When I started college as a pre-med major, I also started working in a hospital emergency room. I was shocked to see that more than 90 percent of the nurses working there were smokers, but that was not quite enough to convince me that smoking was OK. It was the doctors. Eleven of the 12 emergency room physicians I worked with were smokers. That was all the convincing I needed. If actual medical doctors thought smoking was safe, then so did I.

I started smoking without concern because I had fallen prey to an authority bias, which is a type of cognitive bias. Fortunately for my health, I wised up and quit smoking 10 years later.

It’s likely you’re unaware of these habits

Have you ever thought someone was intelligent simply because they were attractive? Have you ever dismissed a news story because it ran in a media source you didn’t like? Have you ever thought or said, “I knew that was going to happen!” in reference to a team winning, a stock going up in value, or some other unpredictable event occurring? If you answered yes to any of these, then you may be guilty of relying on a cognitive bias.

In a recent article, I wrote about the importance of critical thinking and how, in today’s information age, no one has an excuse for living in ignorance. Since then, I have recalled a huge impediment to critical thinking: cognitive bias. We’re all guilty of leaning on these mental crutches, even though we don’t do it intentionally.

What are cognitive biases?

The Cambridge English Dictionary defines “cognitive bias” as the way a particular person understands events, facts, and other people, based on their own particular set of beliefs and experiences. It may not be reasonable or accurate.

PhilosophyTerms.com calls it a bad mental habit that gets in the way of logical thinking.

PositivePsychology.com describes it this way: “We are often presented with situations in life when we need to make a decision with imperfect information, and we unknowingly rely on prejudices or biases.”

According to Alleydog.com, a cognitive bias is an involuntary pattern of thinking that produces distorted perceptions of people, surroundings, and situations around us.

In brief, a cognitive bias is a shortcut to thinking. And it’s completely understandable; the onslaught of information we are exposed to every day necessitates some kind of time-saving method. It’s simply impossible to process everything, so we make quick decisions. Most people don’t have the time to thoroughly think through everything they’re told. Nevertheless, as understandable as depending on biases may be, it’s still a serious impediment to critical thinking.

Here’s what to watch out for

Wikipedia lists 197 different cognitive biases. I’m going to share a few of the more common ones, so that in the future you’ll be aware of the ones you may be using.

Confirmation bias is when you prefer media and information sources that are in alignment with your current beliefs. People do this because it helps maintain their confidence and self-esteem when the information they receive supports their knowledge set. Exposing yourself to opposing views and opinions can cause cognitive dissonance and mental stress. On the other hand, exposing yourself to new information and different viewpoints helps open up new neural pathways in your brain, which will enable you to think more creatively (see my article “Surprise: Creativity is a Skill not a Gift!”).

Anchoring bias occurs when you become committed or attached to the first thing you learn about a particular subject. A first impression of something or someone is a good example (see my article “Sometimes You Have to Rip the Cover Off the Book”). Similar to anchoring is the halo effect, which is when you assume that a person’s positive or negative traits in one area will be the same in some other aspect of their personality. For example, you might think that an attractive person will also be intelligent without seeing any proof to support it.

Hindsight bias is the inclination to see some events as more predictable than they are. It’s also known as the “I knew it all along” reaction. Examples of this bias would be believing that you knew who was going to win an election, a football or baseball game, or even a coin toss after it occurred.

Misinformation effect is when your memories of an event become affected or influenced by information you received after the event occurred. Researchers have shown that memory is unreliable because it’s vulnerable to revision when you receive new information.

Actor-observer bias is when you attribute your actions to external influences and other people’s actions to internal ones. You might think you missed a business opportunity because your car broke down, but that your colleague failed to get a promotion because of incompetence.

False consensus effect is when you assume more people agree with your opinions and share your values than actually do. This happens because you tend to spend most of your time with others, such as family and friends, who actually do share beliefs similar to yours.

Availability bias occurs when you believe the information you possess is more important than it actually is. This happens when you watch or listen to media news sources that run dramatic stories without sharing any balancing statistics on how rare such events may be. For example, if you see several stories on fiery plane crashes, you might start to fear flying because you assume they occur with greater frequency than they actually do.

Bandwagon effect, also known as herd mentality or groupthink, is the propensity to accept beliefs or values because many other people hold them as well. This is a conformity bias that occurs because most people desire acceptance, connection, and belonging with others, and fear rejection if they hold opposing beliefs. Most people won’t think through an opinion and will assume it’s correct because so many others agree with it.

Authority bias is when you accept the opinion of an authority figure because you believe they know more than you. You might assume that they have already thought through an issue and made the right conclusion. And, because they are an authority in their field, you grant more credibility to their viewpoint than you would to anyone else. This is especially true in medicine, where experts are frequently seen as infallible. An example would be an advertisement showing a doctor, wearing a lab coat, touting the advertiser’s product.

Negativity bias is when you pay more attention to bad news than good. This is a natural bias that dates back to humanity’s prehistoric days, when noticing threats, risks, and other lethal dangers could save your life. In today’s civilized world, this bias isn’t as necessary (see my article “Fear: Lifesaver or Manipulator”).

Illusion of control is the belief that you have more control over a situation than you actually do. An example of this is when a gambler believes he or she can influence a game of chance.

Understand more and communicate better

Learning these biases, and being on the alert for them when you make a decision to accept a belief or opinion, will help you become more effective at critical thinking.

About The Author

The Un-Comfort Zone With Robert Wilson

Robert Evans Wilson Jr. is an author, humorist, and innovation consultant. He works with companies that want to be more competitive and with people who want to think like innovators. Wilson is also the author of the humorous children’s book The Annoying Ghost Kid, which was self-published in 2011. For more information on Wilson, visit www.jumpstartyourmeeting.com.