I’m currently watching Carl Sagan’s excellent Cosmos: A Personal Voyage. I feel compelled to post the following quote from episode four, Heaven and Hell, as it stood out for its elegant argument for the strength of scientific ideas and for not rejecting uncomfortable (if incorrect) ideas:
There are many hypotheses in science which are wrong. That’s all right. It’s the aperture to finding out what’s right. Science is a self-correcting process. To be accepted, new ideas must survive the most rigorous standards of evidence and scrutiny.
The worst aspect of the Velikovsky affair is not that many of his ideas were wrong or silly or in gross contradiction to the facts. Rather, the worst aspect is that some scientists attempted to suppress Velikovsky’s ideas.
The suppression of uncomfortable ideas may be common in religion or in politics, but it is not the path to knowledge. And there is no place for it in the endeavour of science.
We do not know beforehand where fundamental insights will arise from about our mysterious and lovely solar system. And the history of our study of the solar system shows clearly that accepted and conventional ideas are often wrong, and that fundamental insights can arise from the most unexpected sources.
And if you think this only applies to wacky astronomical ideas or insights about our solar system… well, then you’re deluding yourself.
I can’t wait for the updated Cosmos presented by Neil deGrasse Tyson; it’ll be the best thing on TV since sliced bread.
Why do we blindly follow experts when their advice is so often so wrong*? How can we differentiate between good advice and bad? These are just two of the questions David Freedman attempts to answer in Wrong: Why Experts Keep Failing Us (a book that sounds like it could be a nice complement to Kathryn Schulz’s book, mentioned previously).
In an interview with Time, Freedman discusses topics related to his thesis, such as our reaction when confronted with experts, experts and the confirmation bias, the “Wizard of Oz” effect, how animal experiments help to advance science but don’t always provide suitable advice for humans, and what he knows about bad advice and how to recognise it:
Bad advice tends to be simplistic. It tends to be definite, universal and certain. But, of course, that’s the advice we love to hear. The best advice tends to be less certain — those researchers who say, ‘I think maybe this is true in certain situations for some people.’ We should avoid the kind of advice that tends to resonate the most — it’s exciting, it’s a breakthrough, it’s going to solve your problems — and instead look at the advice that embraces complexity and uncertainty. […]
It goes against our intuition, but we have to learn to force ourselves to accept, understand and even embrace that we live in a complex, very messy, very uncertain world.
*Some depressing facts from Freedman’s book, as chosen by Time:
About two-thirds of the findings published in the top medical journals are refuted within a few years. […] As much as 90% of physicians’ medical knowledge has been found to be substantially or completely wrong. In fact, there is a 1 in 12 chance that a doctor’s diagnosis will be so wrong that it causes the patient significant harm. And it’s not just medicine. Economists have found that all studies published in economics journals are likely to be wrong. Professionally prepared tax returns are more likely to contain significant errors than self-prepared returns. Half of all newspaper articles contain at least one factual error.
When faced with a large-scale, wide-ranging problem, we shouldn’t attempt to solve it with an equally large solution; instead, we should break the issue down and find outlying successes to replicate.
That’s the wisdom of Dan and Chip Heath–authors of Made to Stick and Switch–who say that to solve complex problems we should shift our thinking to ‘bright-spot’ analysis and attempt to scale small successes.
That’s the first step to fixing everything from addiction to corporate malaise to malnutrition. A problem may look hopelessly complex. But there’s a game plan that can yield movement on even the toughest issues. And it starts with locating a bright spot — a ray of hope. […]
Our rational brain has a problem focus when it needs a solution focus. If you are a manager, ask yourself, What is the ratio of the time you spend solving problems versus scaling successes?
We need to switch from archaeological problem solving to bright-spot evangelizing. […] Even in failure there is success. […]
These flashes of success, these bright spots, can provide our road map for action — and the hope that change is possible.
After Helen Thomas’s forced retirement over her controversial comments on Israel and Palestine, Felix Salmon discusses how being wrong–and, more importantly, the willingness to be wrong–is an admirable trait that should be applauded.
In discussing this, Salmon points to a conversation between Tyler Cowen and Will Wilkinson, in which Cowen proposes:
Take whatever your political beliefs happen to be. Obviously the view you hold you think is most likely to be true, but I think you should give that something like 60-40, whereas in reality most people will give it 95 to 5 or 99 to 1 in terms of probability that it is correct. Or if you ask people what is the chance this view of yours is wrong, very few people are willing to assign it any number at all. Or if you ask people who believe in God or are atheists, what’s the chance you’re wrong – I’ve asked atheists what’s the chance you’re wrong and they’ll say something like a trillion to one, and that to me is absurd, that even if you think all of the strongest arguments for atheism are correct, your estimate that atheism is in fact the correct point of view shouldn’t be that high, maybe you know 90-10 or 95 to 5, at most.
I try hard to believe […] that many if not most of my opinions are wrong (although of course I have no idea which they are), and that many of the most interesting and useful things I write come out of my being wrong rather than being right. This is not, as Wilkinson noted to Cowen, an easy intellectual stance to hold: he calls it “a weird violation of the actual computational constraints of the human mind”.
via The Browser
Discussing how many great stories “hinge on people being wrong”, Kathryn Schulz interviews This American Life host Ira Glass on the benefits of being wrong.
I feel like being wrong is really important to doing decent work. To do any kind of creative work well, you have to run at stuff knowing that it’s usually going to fail. You have to take that into account and you have to make peace with it. […] In my experience, most stuff that you start is mediocre for a really long time before it actually gets good. And you can’t tell if it’s going to be good until you’re really late in the process. So the only thing you can do is have faith that if you do enough stuff, something will turn out great and really surprise you. […]
I had this experience a couple of years ago where I got to sit in on the editorial meeting at the Onion. Every Monday they have to come up with like 17 or 18 headlines, and to do that, they generate 600 headlines per week. I feel like that’s why it’s good: because they are willing to be wrong 583 times to be right 17. […]
If you do creative work, there’s a sense that inspiration is this fairy dust that gets dropped on you, when in fact you can just manufacture inspiration through sheer brute force. You can simply produce enough material that the thing will arrive that seems inspired.
This fantastically comprehensive interview is one of the best I’ve read in a while and is part of a series of interviews on the subject of ‘wrongness’ following the publication of Schulz’s book, Being Wrong: Adventures in the Margin of Error.
Previous interviewees include Anthony Bourdain, Joe Posnanski, Diane Ravitch and Alan Dershowitz (part two).
via Intelligent Life