What is confirmation bias in psychology?

Consider the debate over gun control. Suppose Sally supports gun control. She seeks out news stories and opinion pieces that reaffirm the need for limits on gun ownership, and when she hears stories about shootings in the media, she interprets them in a way that supports her existing beliefs. Henry, on the other hand, is adamantly opposed to gun control, and he seeks out news sources that are aligned with his position.

When he comes across news stories about shootings, he interprets them in a way that supports his current point of view. These two people have very different opinions on the same subject, and their interpretations are shaped by their beliefs. Even if they read the same story, their bias tends to color the way they perceive the details, further confirming what they already believe. Beginning in 1960, cognitive psychologist Peter Cathcart Wason conducted a number of experiments known as Wason's rule discovery task.

He demonstrated that people have a tendency to seek information that confirms their existing beliefs. Unfortunately, this type of bias can prevent us from looking at situations objectively. It can also influence the decisions we make and lead to poor or faulty choices. During an election season, for example, people tend to seek positive information that paints their favored candidates in a good light.

They will also look for information that casts the opposing candidate in a negative light. By not seeking out objective facts, interpreting information in a way that only supports their existing beliefs, and only remembering details that uphold these beliefs, they often miss important information.

These details and facts might otherwise have influenced their decision about which candidate to support. James Goodwin gives an example of confirmation bias as it applies to extrasensory perception: believers in ESP vividly remember the occasions when they were thinking about someone just before that person called, while forgetting the far more numerous occasions when the thought and the call did not coincide.

We are also more likely to remember and repeat stereotype-consistent information and to forget or ignore stereotype-inconsistent information, which is one way stereotypes are maintained even in the face of disconfirming evidence. Confirmation bias is not only found in our personal beliefs; it can affect our professional endeavors as well.

In the book Psychology, Peter O. Gray offers this example of how confirmation bias may affect a doctor's diagnosis: a doctor who has jumped to a particular hypothesis about what disease a patient has may then ask questions and look for evidence that tends to confirm that diagnosis, while overlooking evidence that would tend to disconfirm it. Gray suggests that awareness of this tendency would lead to fewer diagnostic errors; a good diagnostician will test his or her initial hypothesis by searching for evidence against that hypothesis.

Unfortunately, we all have confirmation bias. Even if you believe you are very open-minded and only observe the facts before coming to conclusions, it's very likely that some bias will shape your opinion in the end. It's very difficult to combat this natural tendency. That said, if we know about confirmation bias and accept the fact that it does exist, we can make an effort to recognize it by working to be curious about opposing views and really listening to what others have to say and why.

When we make decisions, this bias is most likely to occur when we are gathering information. It is also likely to occur subconsciously, meaning that we are probably unaware of its influence on our decision-making.

As such, the first step to avoiding confirmation bias is being aware that it is a problem. By understanding its effect and how it works, we are more likely to identify it in our decision-making. Psychology professor and author Robert Cialdini suggests approaches to recognizing when these biases are influencing our decision-making. Because the bias is most likely to occur early in the decision-making process, we should focus on starting with a neutral fact base.

This can be achieved by having one or, ideally, multiple third parties gather facts to form a more objective body of information.

In addition, when hypotheses are being drawn from the assembled data, decision-makers should consider holding interpersonal discussions that explicitly aim at identifying individual cognitive biases in hypothesis selection and evaluation.

While it is likely impossible to eliminate confirmation bias completely, these measures may help us manage cognitive bias and make better decisions in light of it.

Confirmation bias was known to the ancient Greeks. The phenomenon was first described as confirmation bias by Peter Wason in 1960. In his rule discovery experiment, Wason gave subjects a set of three numbers, 2, 4, 6, and asked them to work out the rule that had generated it. To find out what the rule was, they could propose various other sets of numbers to see if they also satisfied the rule. An examiner would tell them whether the proposed numbers satisfied the rule or not.

Most subjects proposed that the rule was a sequence of even numbers, and tested this hypothesis by doubling the given numbers. However, this was not the rule Wason had in mind. The rule was simply that the numbers in the set were increasing. The experiment showed that most subjects formed a similar hypothesis and only tried number sequences that confirmed it, rather than considering sequences that would have disproved it.
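The asymmetry between confirming and disconfirming tests in this task can be sketched in a few lines of Python. This is purely illustrative; the specific triples and function names are invented for the example, not taken from Wason's materials:

```python
# Sketch of Wason's rule discovery task.

def hidden_rule(triple):
    """The examiner's actual rule: the three numbers are strictly increasing."""
    a, b, c = triple
    return a < b < c

# A subject who hypothesizes "even numbers, doubling each time" tends to
# propose only tests that their hypothesis predicts will PASS.
confirming_tests = [(4, 8, 12), (6, 12, 18), (10, 20, 30)]

# Every confirming test passes, so the wrong hypothesis is never challenged.
print(all(hidden_rule(t) for t in confirming_tests))  # True

# A disconfirming test -- one the hypothesis predicts should FAIL -- is far
# more informative: (1, 2, 3) is neither even nor doubling, yet it satisfies
# the rule, which would reveal that the "even doubling" hypothesis is wrong.
print(hidden_rule((1, 2, 3)))  # True
```

The point of the sketch is that a battery of confirming tests can all pass while the hypothesis is still wrong; only a test designed to fail under the hypothesis can expose the error.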

A major study carried out by researchers at Stanford University in 1979 explored the psychological dynamics of confirmation bias. The study recruited undergraduate students who held opposing viewpoints on the topic of capital punishment, and asked them to evaluate two fictitious studies on the topic.

One of the false studies given to participants provided data in support of the argument that capital punishment deters crime, while the other supported the opposite view that capital punishment had no appreciable effect on overall criminality in the population.

After being confronted with both evidence that supported capital punishment and evidence that refuted it, both groups reported feeling more committed to their original stance.

The net effect of having their position challenged was a re-entrenchment of their existing beliefs.

A related phenomenon online is the filter bubble. The term was coined by internet activist Eli Pariser to describe the intellectual isolation that can occur when websites use algorithms to predict the information a user would want to see, and then provide information to the user according to this prediction.

This means that as we use particular websites and content networks, those networks become more likely to serve us content that we prefer, while excluding content that our browsing patterns suggest runs contrary to our preferences. We normally prefer content that confirms our beliefs because it requires less critical reflection. So, filter bubbles may favour information that confirms your existing opinions and exclude disconfirming evidence from your online experience.
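The basic mechanism can be sketched with a toy ranking function. This is a deliberately simplified illustration; the data, scoring scheme, and function names are invented here, and real platforms use far more complex models:

```python
# Toy sketch of how preference-driven ranking can create a filter bubble.

def recommend(articles, click_history, k=2):
    """Rank articles by how often the user has clicked content with the
    same stance, and return only the top k."""
    def score(article):
        return click_history.get(article["stance"], 0)
    return sorted(articles, key=score, reverse=True)[:k]

articles = [
    {"title": "Study supports view A", "stance": "pro"},
    {"title": "Another win for view A", "stance": "pro"},
    {"title": "Evidence against view A", "stance": "con"},
]

# A user whose history is dominated by "pro" clicks...
history = {"pro": 9, "con": 1}

# ...is shown only confirming content; the disconfirming article never
# makes it into the top-k results.
for article in recommend(articles, history):
    print(article["title"])
```

Nothing in the ranking function is malicious; it simply optimizes for predicted preference, and the exclusion of disconfirming content falls out as a side effect.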

Pariser recounts asking two friends to search for "BP" on Google during the 2010 Gulf of Mexico oil spill. The results they saw were quite different. One of his friends saw investment information about BP; the other saw news. For one, the first page of results contained links about the oil spill; for the other, there was nothing about it except a promotional ad from BP. If this had been the only source of information these women were exposed to, they would surely have formed very different conceptions of the BP oil spill.

The search engine showed each of them information tailored to the beliefs implied by their past searches, picking results predicted to fit their reaction to the oil spill. Unbeknownst to them, it facilitated confirmation bias.

While the implications of this particular filter bubble may have been harmless, filter bubbles on social media platforms have been shown to influence elections by tailoring the content of campaign messages and political news to different subsets of voters.

Confirmation bias describes our underlying tendency to notice, focus on, and give greater credence to evidence that fits with our existing beliefs.

Evaluating evidence takes time and energy, so our brain looks for shortcuts to make the process more efficient. We look for evidence that best supports our existing hypotheses because the most readily available hypotheses are the ones we already hold. Another reason why we sometimes show confirmation bias is that it protects our self-esteem.

No one likes feeling bad about themselves, and realizing that a belief they valued is false can have this effect. A study by Stanford researchers found that after being confronted with equally compelling evidence in support of capital punishment and evidence that refuted it, subjects reported feeling more committed to their original stance on the issue.

Websites use algorithms to predict the information a user wants to see, and then provide information accordingly. So, filter bubbles might exclude information that clashes with your existing opinions from your online experience. Filter bubbles and the confirmation bias they produce have been shown to influence elections and may inhibit the constructive discussion that democracy rests on. Confirmation bias is most likely to occur when we are gathering the information needed to make decisions.

A related phenomenon is the halo effect. Research on this effect was pioneered by American psychologist Edward Thorndike, who in 1920 described how officers rated their soldiers on different traits based on first impressions (Neugaard). Experiments have shown that when positive attributes are presented first, a person is judged more favorably than when negative traits are presented first.

This is a subtype of confirmation bias because it leads us to structure our thinking about other information around only the initial evidence.

In short, confirmation bias happens when a person gives more weight to evidence that confirms their beliefs and undervalues evidence that could disprove them.

About the author: Iqra Noor is a student at Harvard University, where she is involved with cultural, advocacy, and tutoring organizations.

References:

Agarwal, P.

American Psychological Association. Confirmation bias. In APA Dictionary of Psychology.

Casad, B.

Fyock, J. The role of memory biases in stereotype maintenance. The British Journal of Social Psychology, 33(3).

Neugaard, B. Halo effect.

Noor, I. Confirmation bias. Simply Psychology.

Snyder, M. Testing hypotheses about other people: The use of historical knowledge. Journal of Experimental Social Psychology, 15(4).
