Tag Archives: cognitive-bias

An Expert’s an Expert Only When We Agree

In the face of information that is contradictory to our beliefs, not only do we reinforce our position, but we also question the credibility of the source itself.

In a study showing that we only agree that there is scientific consensus if that consensus agrees with our viewpoint, researchers from the Cultural Cognition Project also found that if an expert’s opinion is antithetical to our own, we consider them to be objectively less knowledgeable, credible and trustworthy than their peers.

It seems that expert opinion is only expert opinion when it agrees with our opinion. This study found that people more readily count someone as an expert when that person endorses a conclusion that fits their cultural predispositions. The study calls this cultural cognition: individuals tend to form perceptions of risk that reflect and reinforce one or another idealized vision of how society should be organized. Thus, according to the study, generally speaking, persons who subscribe to individualistic values tend to dismiss claims of environmental risks, because acceptance of such claims implies the need to regulate markets, commerce, and other outlets for individual strivings.

Corrections and When They Work

A correction only serves its purpose (to correct our falsely held beliefs) if we are predisposed to believe the correction itself. If we disagree with the correction, however, it instead acts to reinforce our incorrect beliefs (the “backfire effect”).

That’s the conclusion drawn from research conducted by Brendan Nyhan, looking at how we avoid cognitive dissonance in the face of corrective information (pdf).

Brendan’s research on cognitive dissonance and corrections has been nicely summarised by Ryan Sager in a couple of posts: one that looks briefly at the effect of corrections on misinformation, and another looking in great detail at the roots of the anti-vaccine movement.

We find that responses to corrections in mock news articles differ significantly according to subjects’ ideological views. As a result, the corrections fail to reduce misperceptions for the most committed participants. Even worse, they actually strengthen misperceptions among ideological subgroups in several cases. […]

Test subjects read mock news articles featuring misleading statements about well-known but ideologically contentious subjects such as the presence of weapons of mass destruction in Iraq prior to the U.S. invasion. Half of the subjects read articles including only the misleading statements; half read articles that also included a correction.

By comparing the two groups of respondents, [it was] determined that the ideology of the subjects tended to predict reactions. Efforts to correct misperceptions were more likely to succeed among those ideologically sympathetic to the correction, such as liberals to the notion that WMD were never found in Iraq after Saddam Hussein was deposed. But the corrections tended to “boomerang” among those ideologically predisposed to believe the erroneous information. Thus, conservative subjects who had read the correction were even more likely than those who had not to believe that Iraq possessed WMD.

Every article Sager points to in these posts is worth reading, especially Is Health Care Turnaround a Bad Bet?, How Facts Backfire and Persistence of Myths Could Alter Public Policy Approach.

Hand Washing Leads to Rational Evaluations

Postdecisional dissonance, an extremely close relative of both post-purchase rationalisation and the choice-supportive bias, is the phenomenon whereby, once we have made a decision, we perceive our chosen option as the most attractive choice and the discarded alternatives as less attractive, regardless of the evidence.

Some intriguing recent research suggests that the physical act of cleaning one’s hands helps us rationally evaluate our past decisions: cleaning our hands cleans our minds, too.

After choosing between two alternatives, people perceive the chosen alternative as more attractive and the rejected alternative as less attractive. This postdecisional dissonance effect was eliminated by cleaning one’s hands. Going beyond prior purification effects in the moral domain, physical cleansing seems to more generally remove past concerns, resulting in a metaphorical “clean slate” effect.

The article is behind the Science paywall, but there is an interesting conversation in the comments of Overcoming Bias (via).

Self-Awareness and the Importance of Feedback

It comes as no surprise to hear that we are poor at perceiving how others view us, and poor at recognising the true personality traits of those we observe. But it’s the extent to which this is true, and the methods we can use to overcome these ‘personality blind spots’, that I find interesting.

When people are asked how long they think their romantic relationship will last, they’re not very good at estimating the right answer. Their friends, it turns out, fare far better. But if you ask people how satisfied they are in a relationship, their ratings accurately predict how long they’ll stay together. In many cases, we have the necessary information to understand things as they are, but our blind spots don’t allow us to take it into account.

After looking at some of the biases that make this so (e.g. the illusion of transparency and the spotlight effect), and at what traits we are able to discern in ourselves and in others with some accuracy, the article goes on to suggest that the best way to learn more about ourselves is to solicit feedback.

How you’re seen does matter. Social judgment forms the basis for social interaction itself. Almost every decision others make about you, from promotions to friendships to marriages, is based on how people see you. So even if you never learn what you’re really like, learning how others perceive you is a worthwhile goal.

The solution is asking others what they see. The best way to do this is to solicit their opinions directly, though just asking your mom won’t cut it. You’ll need to get feedback from multiple people: your friends, coworkers, family, and, if you can, your enemies. Offer the cloak of anonymity without which they wouldn’t dare share the brutal truth.

Forer Experiments: Your Personalised Personality Profile

Here is the ‘personalised’ personality profile as used in a 1948 experiment by Bertram Forer:

You have a great need for other people to like and admire you. You have a tendency to be critical of yourself. You have a great deal of unused capacity which you have not turned to your advantage. While you have some personality weaknesses, you are generally able to compensate for them. Your sexual adjustment has presented problems for you. Disciplined and self-controlled outside, you tend to be worrisome and insecure inside. At times you have serious doubts as to whether you have made the right decision or done the right thing. You prefer a certain amount of change and variety and become dissatisfied when hemmed in by restrictions and limitations. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. You have found it unwise to be too frank in revealing yourself to others. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved. Some of your aspirations tend to be pretty unrealistic. Security is one of your major goals in life.

On a scale of 0 (very poor) to 5 (excellent), participants in the study rated the accuracy of the above statement as 4.26 (mean). Only after these ratings were provided did Forer reveal to the participants that all of them had been given the exact same statement.

It was after this experiment that Forer famously described the personal validation fallacy (or: the Forer effect).

In Tricks of the Mind (an excellent Christmas present for those interested in such things, by the way), Derren Brown discusses an updated version of this experiment that he conducted for his TV show of the same name. The fifteen participants in this experiment (from the U.K., U.S. and Spain) provided personal items to Brown (a traced outline of their hand, the time and date of their birth, and a small, everyday ‘personal object’), and in return were given personality profiles such as that above and were asked to mark their accuracy out of 100.

Three participants scored it poorly, between 40 and 50, while the remaining twelve rated the profile as highly accurate. One rated it as 99% accurate, while another was so drawn into the profile that she believed the TV crew had secretly read her diary. Two more felt so revealed by the statement that they refused to discuss their profile on film.

Even though all participants in Brown’s experiment expected to receive a series of “vague and ambiguous statements” that could apply widely, they all still fell foul of the personal validation fallacy.

No matter how much we know, we seem unable to account for our biases and beliefs.