First We Believe, Then We Evaluate

When presented with a piece of information for the first time, do we first understand the message before carefully evaluating its truthfulness and deciding whether to believe it, or do we instead immediately and automatically believe everything we read?

An article that traces the history of this question (Descartes argued that “understanding and believing are two separate processes” while Spinoza thought that “the very act of understanding information was believing it”) describes an ingenious experiment conducted almost twenty years ago by Daniel Gilbert, author of Stumbling on Happiness, suggesting that Spinoza was correct: when we first encounter information we believe it immediately and without thought, only to fully evaluate its truthfulness moments later, provided we are not distracted.

It is obviously important to be aware of this behaviour: being distracted while reading critical information of questionable veracity could lead us to evaluate it only partially, or not at all. According to the article, the behaviour has further implications and may “explain other behaviours that people regularly display”, including:

  • Correspondence bias: people assume that others’ behaviour reflects their personality, when really it reflects the situation.
  • Truthfulness bias: people tend to assume that others are telling the truth, even when they are lying.
  • The persuasion effect: when people are distracted it increases the persuasiveness of a message.
  • Denial-innuendo effect: people tend to positively believe in things that are being categorically denied.
  • Hypothesis testing bias: when testing a theory, instead of trying to prove it wrong people tend to look for information that confirms it.

The Optimal Level of Trust

How much we trust people influences much more than just our interpersonal relationships; it can even cause us considerable financial harm.

The study concluding this (pdf) suggests that too much or too little trust has a financial cost equivalent to that of not attending university: trust too much and we assume too much social risk; trust too little and we give up potentially profitable opportunities:

Highly trustworthy individuals think others are like them and tend to form beliefs that are too optimistic, causing them to assume too much social risk, to be cheated more often and ultimately perform less well than those who happen to have a trustworthiness level close to the mean of the population. On the other hand, the low-trustworthiness types form beliefs that are too conservative and thereby avoid being cheated, but give up profitable opportunities too often and, consequently, under-perform. Our estimates imply that the cost of either excessive or too little trust is comparable to the income lost by foregoing college. Furthermore, we find that people who trust more are cheated more often by banks as well as when purchasing goods second hand, when relying on the services of a plumber or a mechanic and when buying food.

via Tim Harford