How Our Brains Trick Us into Believing the Wrong Things
Watching past presidential elections, it is easy to find protests and demonstrations where huge crowds of supporters argue with the opposing side, blaming them for the mess they have brought to the country.
Supporters limit themselves to seeing only the good policies while turning a blind eye to those that harm society. And that’s how the confrontation begins.
Have you ever wondered why there is such a large discrepancy between the two sides? It is not merely a difference in political views; it is confirmation bias at play.
Reason for discrepancy: Confirmation bias
Confirmation bias is a psychological phenomenon in which people tend to seek out information that reinforces their existing beliefs. It is also known as myside bias: the tendency to favor our own group’s ideas when we are part of a larger collective.
Why is confirmation bias so harmful to us? It blinds us from looking objectively at the facts. Facts that oppose our beliefs. Facts that can prove us wrong. As a result, we become irrational and incapable of proper reasoning.
Confirmation bias comes in three dimensions: biased search for information, biased interpretation, and biased memory. Each contributes to our misjudgment in a different way.
1. Biased Search for Information – testing in a one-sided way
This refers to the tendency to test hypotheses in a one-sided way. Put more simply, we only look for evidence consistent with our hypotheses. The phenomenon has been confirmed in numerous experiments.
For example, in one study, participants were asked to rate another person on the introversion-extroversion scale based on an interview they conducted with that person. They were also given a list of interview questions to choose from. [1]
Interestingly, when the interviewee was introduced as an introvert or an extrovert, the interviewer picked questions that presumed that personality. When the interviewee was introduced as an introvert, for instance, questions like “What do you find unpleasant about noisy parties?” were more likely to be asked, leaving the interviewee little room to show otherwise.
The selection of questions served to strengthen the belief that the interviewee was an introvert or an extrovert. And all of this was done subconsciously.
2. Biased Interpretation – interpreting in a way that supports our beliefs
We also tend to interpret information in a way that favors our beliefs. Even when given the same piece of evidence, people with opposing stances can view it entirely differently. [2]
During the 2004 presidential election, a study was conducted on people with strong feelings about the two parties. They were shown contradictory statements made by a Republican figure, a Democratic figure, and a politically neutral figure, followed by statements that made the contradictions seem reasonable. The results showed that participants were much more likely to rate the figure from the opposing party as contradictory, even when given the same evidence.
3. Biased Memory – remembering selectively to support our beliefs
Also known as “selective recall”, this is our tendency to remember information selectively in ways that reinforce our beliefs. There are two competing views of this bias: one suggests that memory consistent with prior expectations is stored more easily, while the other suggests that surprising information is more memorable. Both views have been supported by studies. What is certain is that our memory is selective.
In one study, participants were asked to recall the traits of a person in a job application scenario. When told the applicant was seeking a job as a librarian, participants recalled more traits related to introversion. When told it was an application for a real estate sales position, they recalled more extroverted traits. [3]
Confirmation bias makes us believe our faulty beliefs even more.
By now, we know that our minds are biased. But what does that actually do to us?
In science, we often look for cause-and-effect relationships. When confirmation bias is in play, we are likely to fall into the trap of affirming faulty hypotheses.
Researchers are sometimes guilty of confirmation bias, setting up experiments or framing their data in ways that tend to confirm their hypotheses. It is common to observe one event following another.
But does that mean there is a causal relationship? Not necessarily. Yet when researchers are actively looking for such a relationship, they are likely to falsely recognize one.
When it comes to business decision-making, a lack of objectivity is also dangerous. People often overlook information that could substantially change a decision simply because it does not align with their view.
For example, when an executive team is devising a new strategy, it is likely to magnify even the tiniest hint of success. Downsides and contrary results are put aside and disregarded, or dismissed as exceptional cases that deserve little attention. Such selective blindness in decision-making can severely harm a business.
The same applies in daily life, say, when we want to lose weight. We pick a diet and follow it, and our weight changes. If the weight drops, we credit the magical effect of the diet. If the weight later rebounds, confirmation bias tells us it is just a fluctuation and the diet is still working perfectly.
To defeat confirmation bias, adopt these practices and stop blindly trusting your instincts.
Everyone has this bias, and it clearly affects us. How can we fight against it?
Prove ourselves wrong instead
Write down our hypotheses and, instead of seeking evidence in favor of our view, look for the opposite. Set out to find as much opposing evidence as we can.
There is always a reason why disconfirming evidence can be found. In fact, finding an opposing view is a strong hint that there are flaws in our hypotheses.
Independent thinking in groups
In group decision-making, always aim to gather information from each member independently, so that members do not rely on one another. We should strive to remove any influence that could sway an individual’s judgment, and welcome people with opposing ideas so that we get a clearer picture of our own.
This is actually what Abraham Lincoln did to clear his mind. He invited rival politicians and welcomed debate and discussion, despite their completely contradictory opinions.
The same method is used in police investigations. Witnesses are not allowed to discuss the case with one another, so that their accounts are not influenced.
Expect the unexpected
If we encounter unexpected situations or surprising results, never treat them as “special” or “exceptional” cases. They are not!
Try to explain why the incident occurred and come up with three possible reasons. Why three, and not five, six, or seven? Research suggests that three is the ideal number: listing more or fewer causes is at best equally effective, and possibly less effective, at analyzing the problem. [4]
The more plausible reasons we identify, the more likely we are to find the underlying cause of the unexpected result.
References
[1] Mark Snyder and William B. Swann Jr., “Hypothesis-Testing Processes in Social Interaction,” Communication Cache.
[2] “Neural Bases of Motivated Reasoning: An fMRI Study of Emotional Constraints on Partisan Political Judgment in the 2004 U.S. Presidential Election,” Web Archive.
[3] “Testing Hypotheses About Other People: The Use of Historical Knowledge,” University of Minnesota.
[4] “Auditor’s Hypothesis Testing in Inference Tasks,” JSTOR.