Monday, 5 May 2014

The science of risk and you


'I follow the evidence. People who go to checkups: do fewer of them die from heart disease? From cancer? Or from any cause? The answer, three times: no. They just get more treatment, take more medication, and worry more often'


Advice on stock market crashes, plane disasters and bad weather. Can you risk not reading this piece?

In his new book Risk Savvy, psychologist Gerd Gigerenzer argues that when it comes to taking risks in life, we are often much better off following our instincts than expert advice
Travel: after 9/11, many Americans chose to drive rather than fly. But there were an extra 1,600 road deaths in 12 months. Photograph: Kevork Djansezian

At 66, the moustachioed psychologist Gerd Gigerenzer exudes strapping good health – but that's not because he goes regularly to the doctor for checkups. "I follow the evidence," he says. "People who go to checkups: do fewer of them die from heart disease? From cancer? Or from any cause? The answer, three times: no. They just get more treatment, take more medication, and worry more often."
Risk Savvy: How To Make Good Decisions, by Gerd Gigerenzer
The Bavarian-born Gigerenzer – though once a professional banjo player – has spent decades studying risk, and he long ago concluded that the ways we attempt to cope with life's uncertainties – including medical checkups – can make matters worse. These days, when he is in an upmarket restaurant, he won't even bother opening the menu: asking the waiter what he or she would order is the only way to get what's best, he insists. For research purposes, he once tested an unlikely strategy for managing financial risk: instead of trusting the experts, as most people might, what if you stopped pedestrians at random, gave them a list of companies, asked which ones they had heard of, then just invested in those?
"I try as hard as I can to live by my principles, so I put in a large sum of my own money," recalls Gigerenzer, who lives in Berlin but today is sipping coffee in the New York offices of his American publisher. "It was one of the most lucrative things I've ever done."
For the rest of us – as Gigerenzer demonstrates in his new book, Risk Savvy – things regularly don't turn out so well. We hear a terrifying news story involving aeroplanes, so we switch to car travel instead, even though it's vastly more dangerous: in the 12 months following 9/11, that choice killed an estimated 1,600 Americans, unacknowledged victims of al-Qaida. Or we're told that taking the contraceptive pill "doubles" the risk of thrombosis – as the Department of Health notoriously announced in 1995 – but nobody explains what that really means: a doubling from one woman in every 7,000 to two in 7,000. That report scared so many women off the pill, it's been calculated, that there were 13,000 additional abortions in England and Wales the following year.
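To see the arithmetic behind that scare in one place, here is a minimal sketch (the per-7,000 figures are the ones quoted above; everything else is just illustration):

```python
# Relative versus absolute risk, using the figures quoted for the 1995 pill scare.
baseline = 1 / 7000      # thrombosis risk without the pill
with_pill = 2 / 7000     # thrombosis risk with the third-generation pill

relative_increase = (with_pill - baseline) / baseline   # 1.0 -> "the risk doubles"
absolute_increase = with_pill - baseline                # one extra case per 7,000 women

print(f"relative increase: {relative_increase:.0%}")
print(f"absolute increase: {absolute_increase * 1000:.2f} extra cases per 1,000 women")
```

The headline figure and the absolute figure describe exactly the same data; only the framing differs.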
And then there is the tale of the American weather forecaster who warned of a 50% chance of rain on Saturday, then a 50% chance on Sunday – meaning that the likelihood of rain that weekend, or so he claimed, was 100%. (Don't chuckle too hard: do you know what phrases such as "a 30% chance of rain tomorrow" really mean? In one study, most Berliners said it meant it would rain for 30% of the time the following day.)
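The forecaster's sum is easy to check. The sketch below assumes, purely for illustration, that rain on Saturday and rain on Sunday are independent events:

```python
# Chance of rain at some point over the weekend, given 50% on each day.
p_sat, p_sun = 0.5, 0.5
p_weekend = 1 - (1 - p_sat) * (1 - p_sun)   # P(rain on at least one of the two days)
print(f"{p_weekend:.0%}")                   # 75%, not 100%
```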
At first glance, Risk Savvy looks like yet another of those books that have become bestsellers recently by telling us we're much more foolish than we thought. We have learned that we are "predictably irrational": that our decisions are influenced by factors as seemingly irrelevant as the height of the ceiling, the weather, or the strength of the car salesman's handshake; and that we do stupid things with money, such as travelling across town to save £5 on a cheap kettle, but not bothering to make the trip when buying an expensive new laptop, even though the saving is the same. Yet one driving motivation behind Gigerenzer's work is to show that the thrust of this research is wrong: that we are not idiots, chronically misled by our instincts. In fact, he argues, we would handle risk far better if we knew when to trust our guts more, and when to spurn expert advice in favour of simple rules of thumb.
Gerd Gigerenzer: 'We can teach kids to understand risk.' Photograph: Oliver Hartung/The New York Times

"The error my dear colleagues make," Gigerenzer says, is that they begin from the assumption that various "rational" approaches to decision-making must be the most effective ones. Then, when they discover that is not how people operate, they define that as making a mistake: "When they find that we judge differently, they blame us, instead of their models!" This is mainly a reference to Gigerenzer's long-running and mainly friendly dispute with Daniel Kahneman, the Nobel prize-winner and author of the hugely successful Thinking, Fast and Slow. Kahneman maintains that we have two inner "systems" for making decisions, the fast but error-prone unconscious system one, and the calculating, conscious system two, on which we ought to rely more.
Gigerenzer, a director at the Max Planck Institute for Human Development in Berlin – his wife, the American historian Lorraine Daston, runs another of the numerous Max Planck institutes – thinks that distinction is absurdly vague, and that it is Kahneman who is error-prone. But far worse, he argues, are the political implications of this outlook. If we are hopeless bunglers, forever making bad decisions, it is easy to conclude that what is needed instead is a paternalistic society in which we surrender to experts: "The idea is that if people can't be trusted to deal with risk and uncertainty, then someone else needs to do it." The approach known as "nudging", which grew directly from Kahneman's work, is just the latest example: it takes it as a given that our urges lead us astray, then asks how those urges might be channelled in healthier ways. "But this isn't a vision for the 21st century – to guide people along from birth to death like sheep!" Gigerenzer says. His stance may make for some awkward conversations next month, when he visits David Cameron's behavioural insights team, AKA the Nudge Unit. ("They wrote to me that they much admired my work," he says, a bit wryly.)
In reality, though, experts may be guilty of more risk-related errors than the rest of us – or more consequential ones, anyhow. Gigerenzer recalls the surreal week in 2007 when Goldman Sachs executives blamed their firm's implosion on a sequence of "25-sigma events". To put that in perspective, a five-sigma event is one you would expect to have occurred once between the end of the last Ice Age and today; a 25-sigma event is as likely as winning the national lottery 21 times in a row. And yet, as John Lanchester writes in his book I.O.U.: "Goldman was claiming to experience them several days in a row. That is so wrong you can't put it into words. It shouldn't be possible to be that wrong."
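A rough sense of those magnitudes can be had from the normal distribution itself. The sketch below is the usual back-of-envelope calculation behind such comparisons, not necessarily the one Gigerenzer or Lanchester used; it assumes daily moves and roughly 252 trading days a year:

```python
import math

def upper_tail(sigma):
    """Probability that a standard normal variable exceeds `sigma`."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p5 = upper_tail(5)                 # about 2.9e-07 per day
years_between = 1 / p5 / 252       # roughly 14,000 years between 5-sigma days
print(f"5-sigma: once every {years_between:,.0f} years of daily trading")

p25 = upper_tail(25)               # around 3e-138 -- effectively never
print(f"25-sigma: probability about {p25:.1e} per day")
```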
But the underlying mistake it had made was fairly simple, Gigerenzer thinks. Goldman thought it was operating in a world of calculable risks; in fact, it was operating in a world of true uncertainty, where the risk of different outcomes couldn't be known. "The financial crisis had many causes, but one of them is this illusion that you could calculate the risk," he says. "You have these very nice models, and they work, assuming that the world is stable and nothing in particular happens" – which is, by definition, precisely not the case in a crisis. The banks' mathematical risk models gave them a fatal sense of security: "It's like having an airbag in your car that works all the time, except when you have an accident."
Even when you can calculate the probabilities, trusting experts can be a terrible idea. Gigerenzer's research has shown that many doctors don't grasp the pros and cons of procedures such as cancer screening – or that they do, but have ulterior motives, such as not wanting to get sued if a patient declines screening then dies of cancer. Take mammograms: according to Risk Savvy, for every 1,000 women aged 50 or older who don't get routine screening, about five will die from breast cancer within a decade. For every 1,000 who do get screened, the figure's about four. Hardly a huge difference. And then there are the downsides of screening: for every 1,000 women, 100 will experience false alarms or other distress, while five will undergo unnecessary treatments, including mastectomy. Yet you're still more likely to see leaflets describing the benefits as "a 20% risk reduction", or just dispensing with numbers in favour of condescending slogans: "Why should I have a mammogram? Because you're a woman." Gigerenzer's team campaigns for fact-boxes setting out the upsides and downsides of each course of action; in Austria, they have already been adopted.
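The same numbers, laid out as the kind of fact-box arithmetic Gigerenzer's team campaigns for (a sketch, using the per-1,000 figures quoted above):

```python
# Mammography screening per 1,000 women aged 50+, over roughly ten years.
deaths_without_screening = 5     # breast-cancer deaths among the unscreened
deaths_with_screening = 4        # breast-cancer deaths among the screened
false_alarms = 100               # false alarms or other distress among the screened
unnecessary_treatments = 5       # unnecessary treatments, including mastectomy

absolute_reduction = deaths_without_screening - deaths_with_screening    # 1 per 1,000
relative_reduction = absolute_reduction / deaths_without_screening       # 0.2

print(f"absolute benefit: {absolute_reduction} fewer death per 1,000 women")
print(f"the same benefit as a slogan: a {relative_reduction:.0%} risk reduction")
print(f"harms per 1,000: {false_alarms} false alarms, {unnecessary_treatments} unnecessary treatments")
```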
The pill: research says taking it doubles the chance of thrombosis. But this means 1 in 7,000 becomes 2 in 7,000. Photograph: Lehtikuva Oy/Rex Features

The consequences of misunderstanding risk can sometimes be more horrifying. In the early days of HIV testing, when the diagnosis felt like a death sentence, 22 blood donors in Florida were informed that they had tested positive. Was there any hope the tests might be wrong? Suppose, says Gigerenzer, that about five in 100,000 HIV tests administered to low-risk women result in false positives. That sounds tiny, and wouldn't provide much comfort. But there is a crucial additional fact: only about 10 in 100,000 women, in the US, have HIV anyway. So an average woman, receiving a positive result, has a one in three chance of being fine. But in the Florida case, before the possibility of false positives could be investigated, seven of the 22 donors had reportedly killed themselves.
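Worked through as natural frequencies, the Florida arithmetic takes a few lines. The sketch uses the figures quoted above and assumes, as such examples usually do, that the test catches essentially every true infection:

```python
# Among 100,000 low-risk US women tested for HIV (figures as quoted above):
with_hiv = 10          # women who actually have HIV (assumed all to test positive)
false_positives = 5    # tests that come back positive although the woman is fine

all_positives = with_hiv + false_positives
p_fine_given_positive = false_positives / all_positives   # 5 / 15 = 1 in 3

print(f"chance a positive result is wrong: {p_fine_given_positive:.0%}")
```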
That is a case where more information would have been much better, but the surprising conclusion of much of Gigerenzer's work is the opposite: that we are often best advised to go with less information and rely on those simple rules of thumb, conscious or unconscious, that psychologists call "heuristics". Recall those pedestrians, stopped at random and asked which companies they had heard of. This is known as the "recognition heuristic", and it is a surprisingly good way to pick stocks, because there is a good correlation between a firm's performance and its prominence. (It is far from a flawless method, of course; the point is that it is less flawed than cleverer-seeming strategies.) In another study, Germans and Americans were asked which of two American cities, Detroit or Milwaukee, had the larger population. The Germans did much better than the Americans: they were much less likely to have heard of Milwaukee, so they (correctly) picked Detroit. The Americans knew too much: they got bogged down analysing possible reasons for either answer.
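A toy version of the heuristic fits in a dozen lines; the "recognized" set and the company names below are invented for illustration:

```python
# The recognition heuristic: when only one of two options is recognized,
# infer that the recognized one scores higher on the criterion.
recognized = {"Detroit", "Apple", "Siemens", "Coca-Cola"}

def recognition_pick(option_a, option_b):
    """Return the recognized option, or None if the heuristic cannot decide."""
    a_known, b_known = option_a in recognized, option_b in recognized
    if a_known and not b_known:
        return option_a
    if b_known and not a_known:
        return option_b
    return None            # both or neither recognized: the heuristic is silent

print(recognition_pick("Detroit", "Milwaukee"))   # Detroit -- the Germans' answer

# The stock-picking version: an equal-weight portfolio of recognized names only.
candidates = ["Apple", "Siemens", "Obscure Holdings AG", "Coca-Cola"]
print([c for c in candidates if c in recognized])
```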
In some parts of life – such as the arts, or romance – we are usually happy to trust our intuition. If a friend told you he had used data-gathering and number-crunching to conclude that he preferred Mozart over Beethoven, you would think him rather odd. "And if the woman you desire has a spreadsheet, with all the possible names and consequences, and she does a calculation and selects you … well, would you really want to have been selected in this way?" Gigerenzer wonders. "Probably not."
Playing a musical instrument well draws similarly on intuition as much as intellect – as Gigerenzer knows first-hand, having paid his way through college by playing the banjo in a German Dixieland band. And in cricket and baseball, fielders don't catch high-flying balls by calculating trajectories in their heads. Rather, they unconsciously use the "gaze heuristic": they fix their eyes on the ball, then adjust their running speed so as to keep the angle of their gaze constant – which leaves them in the right place when the ball approaches the ground.
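The geometry of the gaze heuristic can be checked with a toy simulation. The one below is heavily idealized (the launch parameters are invented, running speed is unlimited, and the fielder only applies the rule once the ball is descending), but it shows why holding the gaze angle constant puts the fielder where the ball lands:

```python
import math

G, DT = 9.81, 0.01   # gravity (m/s^2) and time step (s)

def simulate(ball_speed=25.0, launch_deg=70.0, fielder_start=55.0):
    """Ball flies in a vertical plane; the fielder runs along the ground so that
    the tangent of the gaze angle to the ball stays fixed during the descent."""
    vx = ball_speed * math.cos(math.radians(launch_deg))
    vy = ball_speed * math.sin(math.radians(launch_deg))
    t, fielder, gaze_tan = 0.0, fielder_start, None
    while True:
        t += DT
        x = vx * t
        y = vy * t - 0.5 * G * t * t
        if y <= 0:                       # the ball has landed
            return x, fielder
        if vy - G * t < 0:               # the ball is descending
            if gaze_tan is None:
                gaze_tan = y / (fielder - x)     # fix the gaze angle once
            fielder = x + y / gaze_tan           # keep tan(angle) constant

landing, fielder = simulate()
print(f"ball lands at {landing:.1f} m; fielder ends at {fielder:.1f} m")
```

As the ball's height goes to zero, keeping the angle constant forces the fielder's position toward the landing point, without any trajectory being computed.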
But in business and politics, gut feelings are taboo: they are used all the time, but nobody dares admit it. "On average, for big decisions – say, whether to set up a new factory in Shanghai or not – every other decision is based on gut feeling," Gigerenzer says. "But executives won't admit this. So instead you find reasons after the fact. You send an employee on a two-week trip to find reasons to present to shareholders. Or you hire expensive consultants, who'll provide a 200-page document to justify the gut feeling, without mentioning that that's what they're doing." (The paperwork responsibilities piled on doctors, academics and others often fulfil a similar function.) In the worst cases, decisions get taken solely on the basis of whether they can be justified with data, which usually means a hyper-cautious adherence to the status quo.
Hence another of Gigerenzer's rules of thumb: if an experienced person with a good track record has a strong hunch about some decision, listen to that person, and don't demand that she or he justifies the hunch with facts. That is the point about hunches: they operate at a level inaccessible to the conscious mind of the person who has them. What if the culture of Goldman Sachs had permitted its most senior managers to say "I've got a bad feeling about this", and for that to be taken seriously?
In Germany, thanks largely to Gigerenzer's efforts, risk literacy is included on school curriculums in the early stages of education, and he's optimistic that the approach will spread more widely. He wrote Risk Savvy, he says, "as an alternative to this flood of popular books that say we're foolish, irrational, and there's not much that can be done about us … But the assumption that people commit all these errors is only partly correct. And the assumption that there's no way to help them is strictly incorrect. We have experimental evidence that we can teach kids to understand risk. In fact," he adds, eyebrows bouncing with amusement, "we can even teach doctors."

Gut instinct
Gerd Gigerenzer's top risk tips

1 Always ask: "What is the absolute risk increase?"
Journalists are fond of referring to a "100% risk increase", a "fivefold" increase, and so on – but the absolute risk might be tiny. How many more people per thousand are actually affected?
2 Don't buy financial products you don't understand
That is not the same as being risk-averse. But it is the only reliable way to avoid falling prey to banks' conflicts of interest – or being sold something even the bank staff don't understand.
3 Set your "aspiration level". Then pick the first option that satisfies it and stop searching.
This is "satisficing", as opposed to "maximising", and it can eliminate huge amounts of worry and wasted time. If you are buying, say, a new mobile phone, decide what matters most – cost, features etc – then purchase the first one that ticks those boxes. In principle, at least, it needn't be confined to small choices: why not use it to pick whom to marry?
4 Don't ask an expert what they recommend for you; ask them what they would do, or how they would advise a close relative.
This triggers a shift in perspective, which helps focus things on the real risks and benefits of whatever is being discussed.
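Tip 3 is, in effect, a two-line algorithm. A minimal sketch (the phones and the aspiration level are invented):

```python
# Satisficing: set an aspiration level, take the first option that meets it, stop.
phones = [
    {"name": "Phone A", "price": 520, "battery_hours": 18},
    {"name": "Phone B", "price": 310, "battery_hours": 26},
    {"name": "Phone C", "price": 290, "battery_hours": 30},
]

def good_enough(phone, aspiration):
    """Does this option clear the aspiration level on every criterion?"""
    return (phone["price"] <= aspiration["max_price"]
            and phone["battery_hours"] >= aspiration["min_battery_hours"])

def satisfice(options, aspiration):
    """Return the first good-enough option and stop searching."""
    for option in options:
        if good_enough(option, aspiration):
            return option
    return None

choice = satisfice(phones, {"max_price": 350, "min_battery_hours": 24})
print(choice["name"])   # Phone B
```

A maximiser would keep comparing and end up with Phone C, which is cheaper and lasts longer; the satisficer trades that marginal gain for a much shorter search.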
• The picture caption on this article was amended on 5 May 2014. An earlier version said there were 1,600 road deaths in the US in the 12 months after 9/11. It has been estimated that there were an extra 1,600 road deaths.
