One of my favourite reads, the British Psychological Society's (BPS) Research Digest, has recently published its 150th issue. To mark the occasion, the Digest asked twenty-three psychologists what they still don't understand about themselves.
I’ve mentioned a number of the featured psychologists here on Lone Gunman before, including Robert Cialdini, Alison Gopnik and Richard Wiseman.
As Vaughan notes, many of those contributing to the article “bemoan their inability to apply their research findings to their own life”. An example that I’m sure many of us can relate to comes from David Buss: the inability to ‘override’ our well-known biases (related: the bias blind spot).
One nagging thing that I still don’t understand about myself is why I often succumb to well-documented psychological biases, even though I’m acutely aware of these biases. One example is my failure at affective forecasting, such as believing that I will be happy for a long time after some accomplishment (e.g. publishing a new book), when in fact the happiness dissipates more quickly than anticipated. Another is succumbing to the male sexual overperception bias, misperceiving a woman’s friendliness as sexual interest. A third is undue optimism about how quickly I can complete work projects, despite many years of experience in underestimating the time actually required. One would think that explicit knowledge of these well-documented psychological biases and years of experience with them would allow a person to cognitively override the biases. But they don’t.
To avoid cognitive dissonance you have a number of choices. Primarily: selective exposure and/or confirmation bias. Researchers from a number of US universities are now attempting to quantify these phenomena, looking at how we seek validation as opposed to correctness.
The researchers found that people are about twice as likely to select information that supports their own point of view (67 percent) as to consider an opposing idea (33 percent). Certain individuals, those with close-minded personalities, are even more reluctant to expose themselves to differing perspectives […] They will opt for the information that corresponds to their views nearly 75 percent of the time.
The researchers also found, not surprisingly, that people are more resistant to new points of view when their own ideas are associated with political, religious or ethical values.
[…] Perhaps more surprisingly, people who have little confidence in their own beliefs are less likely to expose themselves to contrary views than people who are very confident in their own ideas.
As an author of the study (pdf) suggests, maybe those who fall victim to selective exposure and the confirmation bias do so because the new information “might prevent them from living the lives they’re living”. Sounds almost like an evolutionary response to prevent dissonance.
Be wary of advice and forecasts from economic ‘experts’, says economist Zachary Karabell: not because they are trying to sell their services or because they are lying, but because they truly believe their (unintentionally) skewed opinions.
Being wrong in the past is not much of a liability as long as one is right in the present. […]
There may be “experts” who knowingly skew their analysis to serve their own bottom line. But I believe they are rare. The issue is less the integrity of those selling their wares than the market forces that choose them. When times are good and people feel confident, experts who support that view find more traction, and more demand, than those who don’t. When times turn troubled, as they most certainly are now, those whose perspective rhymes with the prevailing gloom appear wiser than those who do not.
Prominent experts, therefore, are often simply those whose voices are in harmony with today’s mood and who have an easier time selling their stories. That doesn’t mean that the analysis is inherently flawed, only that it is inherently market-driven.
Obvious, but it’s good to have this reiterated every now and again.
In response to Jane O’Grady’s Open Democracy article critiquing the ‘neuro-social-sciences’, Julian Sanchez outlines his thoughts on the perils of pop psychology:
There are arguments that simply can’t be made in the span of even a longish newspaper or magazine article. If one is writing for a lay audience, in fact, I feel pretty confident that it’s not even possible to clearly lay out the contested questions, or what precisely the various positions on them are, in that allotment of space. At best, an untrained reader of O’Grady’s piece would come away simply befuddled and unsure what she was getting on about. Some, to judge by the comments, appear to believe they have learned something from it, which suggests that O’Grady has given them the unhealthy illusion of knowing something.
Pop psychology and philosophy succeed only in furthering confirmation bias and the Dunning-Kruger effect among readers, Sanchez believes:
This brings us around to some of my longstanding ambivalence about blogging and journalism more generally: “Discourse at this level can’t possibly accomplish anything beyond giving people some simulation of justification for what they wanted to believe in the first place.”
[…] People who actually know something are more likely to be fairly tentative and circumspect, while people ill-informed enough to think everything is quite simple will be confident they know all they need to.
How can we test our rationality and various biases?
Shouldn’t you get more rationality credit if you spend more time studying common biases, statistical techniques, and the like? Well this would be good evidence of your rationality if you were in fact pretty rational about your rationality, i.e., if you knew that when you read or discussed such issues your mind would then systematically, broadly, and reasonably incorporate those insights into your reasoning processes.
But what if your mind is far from rational? What if your mind is likely to just go through the motions of studying rationality to allow itself to smugly believe it is more accurate, or to bond you more closely to your social allies?
So knowing all of the cognitive biases and fallacies doesn’t mean you’ll avoid falling victim to the bias blind spot, or that you’ll actually become more rational.
This puts me in mind of this (paraphrased) quote from an anonymous advertising executive:
Those who claim to be well versed in the ‘psychology of advertising’ and to therefore be ‘immune’ not only don’t know much about psychology or advertising, but are our ideal targets.