Be stingy with praise for moral behaviour, Robin Hanson suggests: when praise is more difficult to obtain, people will strive to be more moral in order to win it.
In support of this “stingy school of thought on moral praise”, Hanson points to studies of contradictory behaviour known as “moral licensing”: these studies show how small, seemingly moral acts prevent us from doing further good deeds and may actually increase the odds of us doing immoral deeds.
It seems that we have a good/bad balance sheet in our heads that we’re probably not even aware of. For many people, doing good makes it easier — and often more likely — to do bad. It works in reverse, too: Do bad, then do good. […]
From a theoretical perspective, the research has shown that “it’s like we can withdraw from our moral bank accounts,” [Benoît Monin, a social psychologist who studies moral licensing at Stanford University] said. “It’s a lens through which you see the rest of your behavior. But it may not even be conscious.”
This seemingly contradictory behavior is all around us, but it is probably most apparent, and easy to lampoon, in the greening of America. […]
People who bought green products were more likely to cheat and steal than those who bought conventional products. […] After getting high-efficiency washers, consumers increased clothes washing by nearly 6 percent. Other studies show that people leave energy-efficient lights on longer. A recent study […] showed that of 500 people who had greened their homes, a third saw no reduction in bills. […]
Moral licensing behavior extends, in a different way, into dieting. […] People eat more chocolate while drinking Diet Coke than while drinking more sugary fare.
The Macbeth effect is the tendency for people who have acted or thought in an immoral or unethical manner to want to clean themselves physically as a kind of surrogate for actual moral cleansing.
Researchers studying this effect wondered about other characteristics of moral psychology, which led them to devise a test for implicit links between colours and morality. The colour black, for example, is commonly associated with evil, and white with good, and the researchers wondered whether this association would show up in behaviour. It did:
Psychologists have long known that if people are presented with, say, the word “blue” printed in a blue font, they will be able to state the colour of the font much faster than if the word “red” is printed in the same blue font.
The study conducted by Mr Sherman and Dr Clore presented words of moral goodness, like “virtuous” and “honesty”, and of badness, like “cheat” and “sin”, in either black or white fonts on a computer screen. As they report in Psychological Science, the two researchers found that when “good” words were presented in black it took the participants about 510 milliseconds to state the colour of the word. When these same words were presented in white it took roughly 480 milliseconds, a significant difference. A similar effect was seen with “bad” words. Responding to white ones took around 525 milliseconds, whereas black ones needed only about 500. These results are remarkably similar to those found when words are printed in colours that clash with their meaning.
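The pattern in those numbers can be made concrete with a short sketch. Using only the approximate mean reaction times reported above, it computes the "congruence effect" for each word type: how much faster participants named the font colour when it matched the word's moral valence (good–white, bad–black). The variable names are mine, not the study's.

```python
# Approximate mean reaction times (milliseconds) as reported for
# Sherman & Clore's moral-Stroop study, keyed by (word valence, font colour).
rt = {
    ("good", "black"): 510,
    ("good", "white"): 480,
    ("bad", "white"): 525,
    ("bad", "black"): 500,
}

# Congruence effect: incongruent minus congruent reaction time.
# Positive values mean the mismatched pairing slowed colour naming.
good_effect = rt[("good", "black")] - rt[("good", "white")]
bad_effect = rt[("bad", "white")] - rt[("bad", "black")]

print(good_effect, bad_effect)  # → 30 25
```

Both effects come out at roughly 25–30 ms in the same direction, which is why the authors compare the result to the classic colour-word interference finding.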
We have based our society on the assumption that deciding to lie or to tell the truth is within our conscious control. But […] this assumption may be flawed and […] honesty may instead be the result of controlling a desire to lie (a conscious process) or of not feeling the temptation to lie in the first place (an automatic process).
An intriguing idea, and one with far-reaching consequences, especially given that our entire judicial system is based on this assumption. Can someone fairly be punished for a genetic trait (innate lying)?
So is the desire to lie (or, conversely, the desire to be honest) innate, and if so, what does this mean?
What they found is that honesty is an automatic process, but only for some people. Comparing scans from tests with and without the opportunity to cheat, the scientists found that for honest subjects, deciding to be honest took no extra brain activity. But for others, the dishonest group, both deciding to lie and deciding to tell the truth required extra activity in the areas of the brain associated with critical thinking and self-control.
One surprising finding from this study reveals the complexity [we] face in trying to dissect moral behavior: The decision to lie for personal gain turns out to be a strikingly unemotional choice. Some moral dilemmas Greene studies, like the trolley problem, trigger emotional processing centers in our brains. In his coin toss experiment, there was no sign at all that emotions factored into a subject’s decision to lie or to tell the truth. “Moral judgment is not a single thing,” Greene concludes, suggesting that although we often lump them together under the heading of “morality,” deciding what’s right or wrong and deciding to tell the truth or to tell a lie may, in some situations, be entirely disconnected processes.
On a related note: the classic Good Samaritan study.