
An Expert’s an Expert Only When We Agree

When faced with information that contradicts our beliefs, we not only reinforce our position but also question the credibility of the source itself.

In a study showing that we only agree that there is scientific consensus if that consensus agrees with our viewpoint, researchers from the Cultural Cognition Project also found that if an expert’s opinion is antithetical to our own, we consider them to be objectively less knowledgeable, credible and trustworthy than their peers.

It seems that expert opinion is only expert opinion when it agrees with our opinion. This study found that people more readily count someone as an expert when that person endorses a conclusion that fits their cultural predispositions. The study calls this cultural cognition — individuals tend to form perceptions of risk that reflect and reinforce one or another idealized vision of how society should be organized. Thus, according to the study, generally speaking, persons who subscribe to individualistic values tend to dismiss claims of environmental risks, because acceptance of such claims implies the need to regulate markets, commerce, and other outlets for individual strivings.

Corrections and When They Work

A correction only serves its purpose (to correct our falsely held beliefs) if we are predisposed to believe the correction itself. If we disagree with the correction, it can instead reinforce our incorrect beliefs (the “backfire effect”).

That’s the conclusion drawn from research conducted by Brendan Nyhan, looking at how we avoid cognitive dissonance in the face of corrective information (pdf).

Brendan’s research on cognitive dissonance and corrections has been nicely summarised by Ryan Sager in a couple of posts: one that looks briefly at the effect of corrections on misinformation, and another looking in great detail at the roots of the anti-vaccine movement.

We find that responses to corrections in mock news articles differ significantly according to subjects’ ideological views.  As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases. […]

Test subjects read mock news articles featuring misleading statements about well-known but ideologically contentious subjects such as the presence of weapons of mass destruction in Iraq prior to the U.S. invasion. Half of their subjects read articles including only the misleading statements; half read articles that also included a correction.

By comparing the two groups of respondents, [it was] determined that the ideology of the subjects tended to predict reactions. Efforts to correct misperceptions were more likely to succeed among those ideologically sympathetic to the correction, such as liberals to the notion that WMD were never found in Iraq after Saddam Hussein was deposed. But the corrections tended to “boomerang” among those ideologically predisposed to believe the erroneous information. Thus, conservative subjects who had read the correction were even more likely to believe that Iraq had WMD than those who had read only the misleading statements.

Every article Sager points to in these posts is worth reading, especially Is Health Care Turnaround a Bad Bet?, How Facts Backfire and Persistence of Myths Could Alter Public Policy Approach.

Hand Washing Leads to Rational Evaluations

Postdecisional dissonance, an extremely close relative of both post-purchase rationalisation and the choice-supportive bias, is the phenomenon whereby, once we have made a decision, we perceive our chosen option as the most attractive choice and the discarded alternatives as less attractive, regardless of the evidence.

Some intriguing recent research suggests that the physical act of cleaning one’s hands helps us rationally evaluate our past decisions: cleaning our hands cleans our minds, too.

After choosing between two alternatives, people perceive the chosen alternative as more attractive and the rejected alternative as less attractive. This postdecisional dissonance effect was eliminated by cleaning one’s hands. Going beyond prior purification effects in the moral domain, physical cleansing seems to more generally remove past concerns, resulting in a metaphorical “clean slate” effect.

The article is behind the Science paywall, but there is an interesting conversation in the comments of Overcoming Bias (via).

Self-Awareness and the Importance of Feedback

It comes as no surprise to hear that we are poor at perceiving how others view us and poor at recognising the true personality traits of those we observe, but it’s the extent to which this is true, and the methods we can use to overcome these ‘personality blind spots’, that I find interesting.

When people are asked how long they think their romantic relationship will last, they’re not very good at estimating the right answer. Their friends, it turns out, fare far better. But if you ask people how satisfied they are in a relationship, their ratings accurately predict how long they’ll stay together. In many cases, we have the necessary information to understand things as they are—but our blind spots don’t allow us to take it into account.

After looking at some of our biases that make this so (e.g. the illusion of transparency and the spotlight effect) and what traits we are able to discern in ourselves and in others with some accuracy, the article goes on to suggest that the best way to learn more about ourselves is to solicit feedback.

How you’re seen does matter. Social judgment forms the basis for social interaction itself. Almost every decision others make about you, from promotions to friendships to marriages, is based on how people see you. So even if you never learn what you’re really like, learning how others perceive you is a worthwhile goal.

The solution is asking others what they see. The best way to do this is to solicit their opinions directly—though just asking your mom won’t cut it. You’ll need to get feedback from multiple people—your friends, coworkers, family, and, if you can, your enemies. Offer the cloak of anonymity without which they wouldn’t dare share the brutal truth.

Forer Experiments: Your Personalised Personality Profile

Here is the ‘personalised’ personality profile as used in a 1948 experiment by Bertram Forer:

You have a great need for other people to like and admire you. You have a tendency to be critical of yourself. You have a great deal of unused capacity which you have not turned to your advantage. While you have some personality weaknesses, you are generally able to compensate for them. Your sexual adjustment has presented problems for you. Disciplined and self-controlled outside, you tend to be worrisome and insecure inside. At times you have serious doubts as to whether you have made the right decision or done the right thing. You prefer a certain amount of change and variety and become dissatisfied when hemmed in by restrictions and limitations. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. You have found it unwise to be too frank in revealing yourself to others. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved. Some of your aspirations tend to be pretty unrealistic. Security is one of your major goals in life.

On a scale of 0 (very poor) to 5 (excellent), participants in the study rated the accuracy of the above profile at a mean of 4.26. Only after these ratings were collected did Forer reveal that every participant had received the exact same statement.

It was after this experiment that Forer famously described the personal validation fallacy (also known as the Forer effect).

In Tricks of the Mind (an excellent Christmas present for those interested in such things, by the way), Derren Brown discusses an updated version of this experiment that he conducted for his TV show of the same name. The fifteen participants (from the U.K., U.S. and Spain) provided personal items to Brown (a traced outline of their hand, the time and date of their birth, and a small, everyday ‘personal object’), and in return each received a personality profile like the one above and was asked to mark its accuracy out of 100.

Three participants scored it poorly, between 40 and 50, while the remaining twelve rated the profile as highly accurate. One rated it 99% accurate; another was so drawn into the profile that she believed the TV crew had secretly read her diary; and two more felt so exposed by the statement that they refused to discuss their profiles on film.

Even though all participants in Brown’s experiment expected to receive a series of “vague and ambiguous statements” that could apply widely, they all still fell foul of the personal validation fallacy.

No matter how much we know, we seem unable to account for our biases and beliefs.