Tag Archives: logical-fallacies

Forer Experiments: Your Personalised Personality Profile

Here is the ‘personalised’ personality profile as used in a 1948 experiment by Bertram Forer:

You have a great need for other people to like and admire you. You have a tendency to be critical of yourself. You have a great deal of unused capacity which you have not turned to your advantage. While you have some personality weaknesses, you are generally able to compensate for them. Your sexual adjustment has presented problems for you. Disciplined and self-controlled outside, you tend to be worrisome and insecure inside. At times you have serious doubts as to whether you have made the right decision or done the right thing. You prefer a certain amount of change and variety and become dissatisfied when hemmed in by restrictions and limitations. You pride yourself as an independent thinker and do not accept others’ statements without satisfactory proof. You have found it unwise to be too frank in revealing yourself to others. At times you are extroverted, affable, sociable, while at other times you are introverted, wary, reserved. Some of your aspirations tend to be pretty unrealistic. Security is one of your major goals in life.

On a scale of 0 (very poor) to 5 (excellent), participants in the study rated the accuracy of the above statement at a mean of 4.26. Only after these ratings were collected did Forer reveal that every participant had received exactly the same statement.

It was after this experiment that Forer famously described the personal validation fallacy (also known as the Forer effect).

In Tricks of the Mind (an excellent Christmas present for those interested in such things, by the way), Derren Brown discusses an updated version of this experiment that he conducted for his TV show of the same name. The fifteen participants (from the U.K., U.S. and Spain) each gave Brown some personal items (a traced outline of their hand, the time and date of their birth, and a small, everyday ‘personal object’) and in return received a personality profile much like the one above, which they were asked to mark for accuracy out of 100.

Three participants scored it poorly, between 40 and 50, while the remaining twelve rated the profile as highly accurate: one marked it 99% accurate, while another was so drawn into the profile that she believed the TV crew had secretly read her diary. Two participants felt so exposed by the statement that they refused to discuss their profile on film.

Even though all participants in Brown’s experiment expected to receive a series of “vague and ambiguous statements” that could apply widely, most still fell foul of the personal validation fallacy.

No matter how much we know, we seem unable to account for our biases and beliefs.

Planning for the Worst-Case Scenario

Eliezer Yudkowsky on planning for the abyss.

Never mind hindsight on the real-estate bubble – there are lots of things that could potentially trigger financial catastrophes.  I’m willing to bet the American government knows what it will do in terms of immediate rescue operations if an atomic bomb goes off in San Francisco.  But if the US government had any advance idea of under which circumstances it would nationalize Fannie Mae or guarantee Bear Stearns’s counterparties, this plan was not very much in evidence as various government officials gave every appearance of trying to figure everything out on the fly. […]

It’s questionable whether the government should be in the position of trying to forecast the abyss – to put a probability on financial meltdown in any given year due to any given cause.  But advance abyssal planning isn’t about the probability, as it would be in investing.  It’s about the possibility.  If you can realistically imagine global financial meltdowns of various types being possible, there’s no excuse for not war-gaming them.  If your brain doesn’t literally cease to exist upon facing systemic meltdowns at the time, you ought to be able to imagine plausible systemic meltdowns in advance.

This got me thinking about planning for our own abyss (be it employment or health). Why don’t we have plans for the worst-case scenario? After all, as the Financial Times’ Tim Harford notes, “[A] recently published research paper [shows] that most unemployed people are too cocky about their prospects of finding a new job. On average, they expect seven weeks of unemployment, but eventually endure 23 weeks. And this is using data from the mid-1990s, not recession years”.

A case of the planning fallacy?

List of Logical Fallacies

First cognitive science, now logic: a list of logical fallacies.

A fallacy is a component of an argument which, being demonstrably flawed in its logic or form, renders the argument invalid in whole.

I prefer the many informal fallacies: an important one is that correlation does not imply causation (cum hoc ergo propter hoc); a favourite is the gambler’s fallacy (a quick simulation of which is sketched below).
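To see why the gambler’s fallacy fails, it can help to check the numbers directly. Here is a minimal, hypothetical Python sketch (my own illustration, not taken from the linked list) that simulates a fair coin and estimates how often heads follows a run of five consecutive tails; if the fallacy held, heads would be ‘due’ and that frequency would exceed one half, but independence keeps it at roughly 0.5.

```python
import random

def heads_after_tail_run(num_flips=1_000_000, run_length=5, seed=42):
    """Estimate P(heads | the previous `run_length` flips were all tails).

    The gambler's fallacy predicts this should exceed 0.5 ("heads is due");
    because flips are independent, it stays at about 0.5.
    """
    rng = random.Random(seed)
    flips = [rng.random() < 0.5 for _ in range(num_flips)]  # True = heads

    follows_run = 0   # flips that come immediately after a long run of tails
    heads_after = 0   # how many of those flips landed heads
    tail_streak = 0   # length of the current run of consecutive tails

    for flip in flips:
        if tail_streak >= run_length:
            follows_run += 1
            if flip:
                heads_after += 1
        tail_streak = 0 if flip else tail_streak + 1

    return heads_after / follows_run if follows_run else float("nan")

if __name__ == "__main__":
    # Typically prints a value very close to 0.5
    print(f"P(heads | five tails in a row) ≈ {heads_after_tail_run():.3f}")
```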

Also: on logic, I’ve previously written about this list of paradoxes.