Risk Savvy


A new eye-opener on how we can make better decisions—by the author of Gut Feelings

In this age of big data we often trust that expert analysis—whether it’s about next year’s stock market or a person’s risk of getting cancer—is accurate. But, as risk expert Gerd Gigerenzer reveals in his latest book, Risk Savvy, most of us, including doctors, lawyers, and financial advisors, often misunderstand statistics, leaving us misinformed and vulnerable to exploitation.

Yet there’s hope. In Risk Savvy, Gigerenzer gives us an essential guide to the science of good decision making, showing how ordinary people can make better decisions for their money, their health, and their families. Here, Gigerenzer delivers the surprising conclusion that the best results often come from considering less information and listening to your gut.

GERD GIGERENZER is director of the Max Planck Institute for Human Development in Berlin, and lectures around the world on the importance of risk education for everyone from children to prominent doctors, bankers, and politicians.

Praise for Gerd Gigerenzer’s Work

“Logic be damned! . . . Gigerenzer delivers a convincing argument for going with your gut.”

—Men’s Health

“All innumerates—buyers, sellers, students, professors, doctors, patients, lawyers and their clients, politicians, voters, writers, and readers—have something to learn from Gigerenzer.”

—Publishers Weekly

“Gladwell drew heavily on Gigerenzer’s research. But Gigerenzer goes a step further by explaining just why our gut instincts are so often right.”

—Businessweek

“[Gigerenzer] has the gift of exposition and several times gives the reader that Eureka! feeling.”

—The Telegraph (UK)

“Gerd Gigerenzer, director of the Max Planck Institute for Human Development in Berlin, locates specific strategies that the unconscious mind uses to solve problems. These are not impulsive or capricious responses, but evolved methods that lead to superior choices.”

—The Boston Globe

PENGUIN BOOKS

RISK SAVVY

Gerd Gigerenzer is the author of Gut Feelings. He is currently the director of the Max Planck Institute for Human Development in Berlin, Germany, and lectures around the world on the importance of proper risk education for everyone from school-age children to prominent doctors, bankers, and politicians.

Creativity requires the courage to let go of certainties.

Erich Fromm

To be alive at all involves some risk.

Harold Macmillan

1

Are People Stupid?

Knowledge is the antidote to fear.

Ralph Waldo Emerson

Remember the volcanic ash cloud over Iceland? The subprime disaster? How about mad cow disease? Each new crisis makes us worry until we forget and start worrying about the next one. Many of us found ourselves stranded in crowded airports, ruined by vanishing pension funds, or anxious about tucking into a yummy beef steak. When something goes wrong, we are told that the way to prevent further crises is better technology, more laws, and bigger bureaucracy. How to protect ourselves from the next financial crisis? Stricter regulations, more and better advisers. How to protect ourselves from the threat of terrorism? Homeland security, full body scanners, further sacrifice of individual freedom. How to counteract exploding costs in health care? Tax hikes, rationalization, better genetic markers.

One idea is absent from these lists: risk-savvy citizens. And there is a reason.

“Human beings are fallible: lazy, stupid, greedy and weak,” an article in the Economist announced.1 We are said to be irrational slaves to our whims and appetites, addicted to sex, smoking, and electronic gadgets. Twenty-year-olds drive with their cell phones glued to their ears, oblivious to the fact that doing so slows their reaction time to that of a seventy-year-old. A fifth of Americans believe that they are in the top 1 percent income group and just as many believe that they will soon be there. Bankers have little respect for people’s ability to invest money, and some doctors tell me that most of their patients lack intelligence, making it pointless to disclose health information that might be misunderstood in the first place. All of this points to the conclusion that Homo sapiens (“man the wise”) is a misnomer. Something has gone wrong in our genes. Evolution seems to have cheated us with shabby mental software and miswired our brains. In short, John and Jane Q. Public need continuous guidance, as a child needs a parent. Although we live in the high-tech twenty-first century, some form of paternalism is the only viable strategy: Close the doors, collect the experts, and tell the public what’s best for them.

This fatalistic message is not what you will read in this book.2 The problem is not simply individual stupidity, but the phenomenon of a risk-illiterate society.

Literacy—the ability to read and write—is the lifeblood of an informed citizenship in a democracy. But knowing how to read and write isn’t enough. Risk literacy is the basic knowledge required to deal with a modern technological society. The breakneck speed of technological innovation will make risk literacy as indispensable in the twenty-first century as reading and writing were in previous centuries. Without it, you jeopardize your health and money, or may be manipulated into unrealistic fears and hopes. One might think that the basics of risk literacy are already being taught. Yet you will look in vain for it in most high schools, law schools, medical schools, and beyond. As a result, most of us are risk illiterate.

When I use the general term risk savvy I refer not just to risk literacy, but also more broadly to situations where not all risks are known and calculable. Risk savvy is not the same as risk aversion. Without taking risks, innovation would end, as would fun, and courage would belong to the past. Nor does risk savvy mean turning into a reckless daredevil or BASE jumper, denying the possibility of landing on one’s nose. Without a beneficial degree of caution, humans would have ceased to exist long ago.

You might think, why bother if there are experts to consult? But it isn’t that simple. Bitter experience teaches that expert advice may be a dangerous thing. Many doctors, financial advisers, and other risk experts themselves misunderstand risks or are unable to communicate them in an understandable way. Worse, quite a few have conflicts of interest or are so afraid of litigation that they recommend actions to clients they would never recommend to their own families. You have no choice but to think for yourself.

I’d like to invite you into the world of uncertainty and risk, beginning with weather reports and a very humble hazard, getting soaked.

Chances of Rain

A weathercaster on U.S. television once announced the weather this way:

The probability that it will rain on Saturday is 50 percent. The chance that it will rain on Sunday is also 50 percent. Therefore, the probability that it will rain on the weekend is 100 percent.

Most of us will smile at this.3 But do you know what it means when the weather report announces a 30 percent chance of rain tomorrow? 30 percent of what? I live in Berlin. Most Berliners believe that it will rain tomorrow 30 percent of the time; that is, for seven to eight hours. Others think that it will rain in 30 percent of the region; that is, most likely not where they live. Most New Yorkers think both are nonsense. They believe that it will rain on 30 percent of the days for which this announcement is made; that is, there will most likely be no rain at all tomorrow.4

Are people hopelessly confused? Not necessarily. Part of the problem is the experts who never learned how to explain probabilities in the first place. If they clearly stated the class to which a chance of rain refers, the confusion would disappear. Time? Region? Days? What meteorologists intend to say is that it will rain on 30 percent of the days for which this prediction is made. And “rain” refers to any amount above some tiny threshold, such as 0.01 inches.5 Left on their own, people intuitively fill in a reference class that makes sense to them, such as how many hours, where, or how heavily it rains. More imaginative minds will come up with others still. As one woman in New York said, “I know what 30 percent means: Three meteorologists think it will rain, and seven not.”
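For readers who like to see the arithmetic, here is a minimal sketch of both points: why the weathercaster’s percentages cannot simply be added, and what the reference class behind a forecast looks like. The 50 percent figures come from the quote above; treating the two days as independent is an added assumption, made only for illustration.

```python
# The weathercaster's error: chances of rain do not add up,
# because it can rain on both days. Assuming (for illustration
# only) that Saturday and Sunday are independent:
p_sat, p_sun = 0.5, 0.5
p_weekend = 1 - (1 - p_sat) * (1 - p_sun)  # P(rain Sat or Sun)
print(p_weekend)  # 0.75, not 1.0

# The reference class behind "30% chance of rain tomorrow":
# of all days on which this forecast is issued, it rains
# (above some tiny threshold, e.g., 0.01 inches) on about 30%.
forecast_days = 1000               # hypothetical forecast record
rainy_days = 300                   # days with measurable rain
print(rainy_days / forecast_days)  # 0.3 -- percent OF forecast days
```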

Here is my point. New forecasting technology has enabled meteorologists to replace mere verbal statements of certainty (“it will rain tomorrow”) or chance (“it is likely”) with numerical precision. But greater precision has not led to greater understanding of what the message really is. The confusion over probabilities of rain has in fact persisted since they were first broadcast to the public in the United States in 1965. This confusion is not limited to rain, but occurs whenever a probability is attached to a single event—such as “if you take an antidepressant, you have a 30 percent chance of developing a sexual problem.” Does that mean that 30 percent of all people will develop a sexual problem, or that you yourself will have a problem in 30 percent of your sexual encounters? The solution to clearing this widespread and long-standing muddle is surprisingly simple:

Always ask for the reference class: Percent of what?

If weathercasters were taught how to communicate to the public, you wouldn’t even have to ask.

Getting soaked is a minor risk, although for some, from the farmer to Ferrari, the chances of rain matter. Before a Formula 1 Grand Prix, one of the most-discussed issues is the weather forecast—choosing the right tires is key to winning the race. The same holds for NASA: The weather forecast is essential for approving or canceling a space shuttle launch, as the Challenger disaster tragically illustrates. Yet for most of us, all that is at stake is canceling a family outing unnecessarily or getting wet feet. People may not make a special effort to understand chances of rain simply because the hazards are modest. Are we more risk savvy when something truly important is at stake?

Figure 1-1. What does a “30 percent chance of rain tomorrow” mean? Some believe it will rain tomorrow 30 percent of the time (upper panel). Others believe it will rain tomorrow in 30 percent of the region (middle panel). Finally, some believe that three meteorologists think that it will rain and seven do not (lower panel). What meteorologists in fact intend to say is something different: that it will rain on 30 percent of the days for which this announcement is made. The problem is not simply in people’s minds, but in the failure of experts to state clearly what they mean.

Pill Scare

Great Britain has many traditions, one of them being the contraceptive pill scare. Since the early 1960s, women have been alarmed every couple of years by reports that the pill can lead to thrombosis, potentially life-threatening blood clots in the legs or lungs. In the most famous scare, the UK Committee on Safety of Medicines issued a warning that third-generation oral contraceptive pills increased the risk of thrombosis twofold—that is, by 100 percent. How much more certain can you get? This terrifying information was passed on in “Dear Doctor” letters to 190,000 general practitioners, pharmacists, and directors of public health and was presented in an emergency announcement to the media. Alarm bells rang around the country. Distressed women stopped taking the pill, which caused unwanted pregnancies and abortions.6

Just how big is 100 percent? The studies on which the warning was based had shown that of every seven thousand women who took the earlier, second-generation pill, about one had a thrombosis; and that this number increased to two among women who took third-generation pills. That is, the absolute risk increase was only one in seven thousand, whereas the relative risk increase was indeed 100 percent. As we see, in contrast to absolute risks, relative risks appear threateningly large and can cause a great stir. Had the committee and the media reported the absolute risks, few women would have panicked and dropped the pill. Most likely, no one would have even cared.
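A minimal sketch of the calculation, using only the one-in-seven-thousand figures cited above, shows how the same data yield a frightening relative number and a reassuring absolute one:

```python
# Second-generation pill: about 1 thrombosis per 7,000 women.
# Third-generation pill: about 2 per 7,000.
baseline = 1 / 7000
new_risk = 2 / 7000

relative_increase = (new_risk - baseline) / baseline  # risk doubled
absolute_increase = new_risk - baseline               # 1 in 7,000

print(f"relative increase: {relative_increase:.0%}")  # 100%
print(f"absolute increase: {absolute_increase:.4%}")  # ~0.0143%

# Put differently: about 7,000 women would have to switch pills
# for one additional case of thrombosis to be expected.
print(round(1 / absolute_increase))                   # 7000
```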

This single scare led to an estimated thirteen thousand (!) additional abortions in the following year in England and Wales. But the fallout lasted longer than one year. Before the alert, abortion rates had been in steep decline, but afterward this trend reversed and abortion rates increased for years to come. Women’s confidence in oral contraceptives was undermined, and pill sales fell sharply. Not all unwanted pregnancies were aborted; for every abortion there was also one extra birth. The increase in both abortions and births was particularly pronounced among girls under sixteen, with some eight hundred additional conceptions.

Ironically, pregnancies and abortions are associated with a risk of thrombosis that exceeds that of the third-generation pill. The pill scare hurt women, hurt the National Health Service, and even brought down the stocks of the pharmaceutical industry. The resulting increase in costs to the National Health Service for abortion provision has been estimated at £4–6 million. Among the few who profited were the journalists who got the story on the front page.

An unwanted pregnancy and abortion is not something to be taken lightly. As one woman reports:

When I learned that I was pregnant, my partner and I had been together for two years. His first reaction was: “Come back when it’s gone.” I threw him out and tried to find a solution. I wanted so much to start college. I fought to build a future for us, but I began to realize there was none. The one thing I did not want was to become dependent on the government, or—even worse—on a man. So at the last minute I decided on an abortion. That was two days ago, and I have one nervous breakdown after the other. My mind says it was the best decision, but my heart weeps.

The tradition of pill scares continues to the present day, and always with the same trick. The solution is not better pills or more sophisticated abortion technology, but risk-savvy young women and men. It would not be so difficult to explain to teenagers the simple distinction between a relative risk (“100 percent”) and an absolute risk (“one in seven thousand”). After all, batting averages and other sports statistics are common knowledge for many, young and old. Yet to the present day journalists have succeeded in causing scares with BIG numbers, and the public predictably panics, year after year.

Once again, the remedy is a simple rule:

Always ask: What is the absolute risk increase?

Journalists are not the only ones who play on our emotions with the help of numbers. Top medical journals, health brochures, and the Internet also inform the public in terms of relative changes, because bigger numbers make better headlines. In 2009 the prestigious British Medical Journal published two articles on oral contraceptives and thrombosis: One made the absolute numbers transparent in its abstract, while the other again touted relative risks, reporting that “oral contraceptives increased the risk of venous thrombosis fivefold.”7 The “fivefold” increase of course made for more spectacular headlines, and some newspapers, such as the London Evening Standard, didn’t even bother mentioning the absolute numbers. Although we have high-tech medicine, understandable information for patients and doctors remains the exception, not the rule.

It should be the ethical responsibility of every editor to enforce transparent reporting and it should be on the agenda of every ethics committee and every department of health. But it is not. After publication of my book Calculated Risks, which explains how to help both the public and doctors understand numbers, the neuroscientist Mike Gazzaniga, then dean of the faculty at Dartmouth College, paid me a visit. Outraged by the tricks played on the public by the use of relative risks and other means, he said that he would propose this issue to the President’s Council on Bioethics, of which he was a member. After all, he argued, misleading the public with numbers happens in the United States just as often as in the United Kingdom, and is one of the few ethical problems to which a solution is known. Other less clear-cut issues, such as abortion, stem cells, and genetic testing, tend to occupy the council with endless discussions. I am grateful to Gazzaniga for trying. Yet the ethics committee did not recognize misleading the public as a significant issue and never took it up.

If ethics committees don’t protect people, why don’t doctors do it? The surprising answer is that many doctors themselves don’t know how to communicate risks, a skill rarely taught at medical schools. The damaging effect of the “Dear Doctor” letters illustrates that many of them were taken in by relative risks. Once again, the experts are in need of training. Otherwise, when the next pill scare arrives, they and those affected may be as unprepared as ever.

I have explained the difference between relative and absolute risks to hundreds of journalists, and many have stopped alarming the public and reported absolute risks—only to see their editors often reintroduce the BIG numbers. We may not always be able to halt those who like to play with our fears, but we can learn to see through their tricks.

Terrorists Use Our Brains

Most of us remember exactly where we were on September 11, 2001. The pictures of the planes crashing into the twin towers of the World Trade Center have been ingrained into our memories. By now, everything appears to have been said about the tragic attack. The 9/11 Commission Report, which appeared three years later, focused on how al-Qaeda terrorism evolved and on diplomatic strategies, legal reform, and technological measures. The one measure the 636-page report paid no attention to, however, was risk-savvy citizens.

Let us turn the clock back to December 2001. Imagine you live in New York and want to travel to Washington. Would you fly or drive?

We know that after the attack, many Americans stopped flying. Did they stay home or jump into their cars? I have looked for an answer in the transportation statistics. In the months after the attack, the miles driven increased substantially. The increase was particularly strong on the rural interstate highways where long-distance travel happens, jumping by as much as 5 percent in the three months after the attack.8 For comparison, in the months before the assault (January to August), monthly vehicle miles were up by less than 1 percent compared to 2000, reflecting the normal increase from year to year. All this extra driving lasted for twelve months; thereafter, car driving went back to normal. By then the images of the burning twin towers were no longer a daily feature in the media.

The increase in road travel had sobering consequences. Before the attack, the number of fatal traffic accidents remained close to the average of the previous five years (the zero line in Figure 1-2). Yet in each of the twelve months after 9/11, the number of fatal crashes was above average, and most of the time even higher than anything seen in the previous five years. All in all, an estimated sixteen hundred Americans lost their lives on the road due to their decision to avoid the risk of flying.

Figure 1-2. Terrorists’ second strike. After the attacks on September 11, 2001, the number of fatal traffic accidents increased in the United States for a period of twelve months, resulting in an estimated sixteen hundred Americans losing their lives on the road in the attempt to avoid the risk of flying. Numbers are expressed as deviations from the five-year baseline 1996–2000 (the zero line). Before September 2001, the monthly fatal crashes were close to the zero line. In the twelve months following the attack, the number of fatal crashes was higher than the zero line for every month, and in most cases exceeded the maximum of the previous years (the vertical bars show the maximum and minimum). The peaks after 9/11 correspond to terrorism alerts.

Source: Gigerenzer (2004, 2006).

This death toll is six times higher than the total number of passengers (256) who died on board the four fatal flights. Every one of those traffic victims might still be alive had they flown instead. From 2002 to 2005, 2.5 billion passengers took to the air on U.S. commercial flights. Not a single one died in a major airline crash. Thus, although the 9/11 attacks were reported to have cost the lives of about three thousand Americans, the true toll is at least half again as high.
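The comparisons in this paragraph follow directly from the figures given; a quick check:

```python
road_deaths = 1600     # estimated extra traffic deaths after 9/11
flight_deaths = 256    # passengers on board the four fatal flights
reported_toll = 3000   # commonly reported 9/11 death toll

print(road_deaths / flight_deaths)  # ~6.3: about six times higher
print(road_deaths / reported_toll)  # ~0.53: roughly half again as much
```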

Let’s give the statistic a face, but a lucky one—one who barely escaped death.

Justin Klabin, a twenty-six-year-old competitive rugby player and volunteer firefighter, watched the twin towers collapse from across the Hudson River. With his fire department, he rushed to Ground Zero. After this deeply emotional experience, he decided to stop flying. A month later, he and his girlfriend went on a trip to Florida—by car. Their pickup truck made the thousand-mile trip. But at the end of a long day on the road back home, they heard a loud pop: Both front tires turned toward each other, like snowplowing skis. The tie-rod that connected the steering column to the wheel had snapped, and the truck could not drive a foot farther. They were lucky that the failure happened as they pulled into a parking space in South Carolina. Had the rod snapped minutes earlier on the highway at a speed of seventy miles per hour, Klabin and his girlfriend would likely have joined those unfortunate travelers who lost their lives by avoiding the risk of flying.

Terrorists strike twice. First they assault us with physical force, and then they assault us with the help of our own brains. The first strike gains all the attention. Billions of dollars have been poured into developing gigantic bureaucracies, including Homeland Security, and new technologies, such as full body scanners that make visible the nude surface of skin beneath clothing. The second strike, in contrast, has received almost no attention. In fact, when I gave talks on risk management to international intelligence services and counterterrorism agencies across the world, from Singapore to Wiesbaden, my hosts were repeatedly surprised, having never even considered it. Osama bin Laden once explained with relish how little money he had used to cause such huge damage: “Al-Qaeda spent $500,000 on the event, while America, in the incident and its aftermath, lost—according to the lowest estimate—more than $500 billion, meaning that every dollar of al-Qaeda defeated a million dollars.”9 It is hard to prevent terrorists’ suicide missions, but it should be easier to put a stop to the dangerous, fear-driven reactions that their attacks set off within us afterward.

What exactly is our brain’s psychology that terrorists exploit? Low-probability events in which many people are suddenly killed, so-called dread risks,10 trigger an unconscious psychological principle:

If many people die at one point in time, react with fear and avoid that situation.

Note that the fear is not about dying per se. It is about dying in a specific manner, namely together at one point in time, or in a short interval. When many die spectacularly at one point in time, as in the 9/11 attacks, our evolved brain reacts with great anxiety. But when as many or more die distributed over time, such as in car and motorbike accidents, we are less likely to be afraid. In the United States alone, where about 35,000 people die on the road every year, few worry about dying while driving. What matters psychologically is not, as sometimes claimed, that people have control when driving but not when flying. Passengers sitting next to the driver, not to speak of those in the back seats, have no control either, yet show little fear. We don’t really fear dying in the steady stream of everyday incidents; we fear dying together suddenly with lots of others. We dread the rare nuclear power plant accident, not the steady death toll caused by pollution from coal power plants. We dreaded the swine flu pandemic after hearing the forecast of possibly tens of thousands of deaths—which never occurred—while few worry about being among the actual tens of thousands of people killed every year by the regular flu.

Where does this tendency to fear dread risks come from? In human history, it was likely a rational response. For most of our evolution, humans lived in small hunter-gatherer bands of perhaps twenty to fifty individuals, rarely exceeding one hundred people, similar to such bands in the world today. In small bands, the sudden loss of many lives could increase the risk of predation and starvation, and thus threaten survival of the whole group.11 But what was rational in the past is not rational today. In modern societies, an individual’s survival is no longer dependent on the support and protection of a small group or tribe. Yet the psychological response can still be easily elicited. To this day, real or imagined catastrophes have the potential to trigger panicky reactions.

The “old-brain” fear of dread risks can suppress any flash of thought in the new parts of our brains. As a professor from Loyola University Chicago wrote to me, “After 9/11, I explained the greater risk of driving compared to flying to my wife; that did not do the trick.” Rational argument does not always win over old-brain fear, particularly if one spouse tries to educate the other. Yet there is a simple rule of thumb that could have helped that professor:

If reason conflicts with a strong emotion, don’t try to argue. Enlist a conflicting and stronger emotion.

One such emotion that conflicts with dread-risk fear is parental concern. The professor might remind his wife that by making them drive long distances she puts the lives of her children—not just that of her husband—at risk. Parental emotions stand a better chance of overcoming the lingering fear of flying. A smart “new brain” can play one evolved fear against another to better survive in a modern world. Evolution is not destiny.

Terrorists’ second strike goes beyond the story told here. It has led to an erosion of civil liberties: Before 9/11, strip searches without probable cause were seen to violate human rights; they are now seen as citizens’ duty. Dread-risk fear makes us willing to tolerate long lines at the airport, put liquids in plastic bags, remove our shoes and belts and jackets, have our bodies touched by strangers. Higher security expenses in turn have gone hand in hand with reduced service and cramped seating, as if the airlines were competing for the worst customer service. People have become less lighthearted and more fearful. Last but not least, the wars in Afghanistan and Iraq have cost more than a trillion dollars together with the lives of thousands of soldiers and many more civilians. This financial strain also likely played a part in the financial crisis of 2008.12

If a similar attack ever repeats itself, we should not let our brains be misused again for a second strike. Only when we are risk savvy can we resist terrorist manipulation and create a safer and more resilient society. To get there, three tools are essential: understanding the nature of dread-risk fear, controlling it by enlisting conflicting emotions if reasons don’t work, and knowing the actual risk of flying.

Let’s go back to the question I posed before: Should you fly or drive? Assume again you live in New York and want to travel to Washington. You have only one goal, to arrive alive. How many miles would you have to drive by car before the risk of dying equals that of a nonstop flight? I have put this question to dozens of expert audiences. The answers are all over the place: one thousand miles, ten thousand miles, driving three times around the world. The best estimate, however, is twelve miles. Yes, only twelve. If your car makes it safely to the airport, the most dangerous part of your trip is likely already behind you.
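How can such a break-even point be computed? A sketch follows; both rates in it are illustrative assumptions of mine, not the book’s actual inputs: roughly one driving death per hundred million vehicle miles, and a fatality risk of about one in eight million per nonstop commercial flight. Whatever the exact inputs, the logic is the same: divide the per-flight risk by the per-mile risk.

```python
# Illustrative assumptions (not the author's actual inputs):
risk_per_mile = 1 / 100_000_000   # assumed driving deaths per vehicle mile
risk_per_flight = 1 / 8_000_000   # assumed deaths per nonstop flight

# Break-even: miles of driving that carry the same risk of dying
# as boarding one nonstop flight.
break_even_miles = risk_per_flight / risk_per_mile
print(break_even_miles)  # 12.5 -- on the order of the book's twelve miles
```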

Are People Hopeless in Dealing with Risk?

How can so many people not notice that they don’t understand probabilities of rain? Or end up with unwanted pregnancies and abortions because they don’t know the difference between relative and absolute risks? After all, probabilities of rain and pill scares have been around since the mid-1960s, and the fear of dread risks repeats itself with every new threat, from mad cow disease to SARS to bird flu, in an apparently endless circle. Why don’t people learn?

Many experts think the answer is that people are basically incapable of understanding such things. Attempts to educate people out of their errors, so the argument goes, have mostly failed. Based on this dismal view of the general public, a publication by Deutsche Bank Research features a list of errors that we “Homer Simpsons” commit against rationality.13 Popular books rehearse this message, portraying Homo sapiens as “predictably irrational” and in need of “nudges” into behaving sensibly by the few sane people on earth.14

My story is different. People aren’t stupid. The problem is that our educational system has an amazing blind spot concerning risk literacy. We teach our children the mathematics of certainty—geometry and trigonometry—but not the mathematics of uncertainty, statistical thinking. And we teach our children biology but not the psychology that shapes their fears and desires. Even experts, shockingly, are not trained in how to communicate risks to the public in an understandable way. And some have a vested interest in scaring people: to get an article on the front page, to persuade people to relinquish civil rights, or to sell a product. All these outside causes contribute to the problem.

The good news is that there is a solution. Who would have thought, a few hundred years ago, that so many people on earth would learn to read and write? We will see that everybody who wants to can also become risk savvy. Based on my own and my colleagues’ research, I will argue that:

1. Everyone can learn to deal with risk and uncertainty. In this book, I will explain principles that are easily understood by everyone who dares to know.

2. Experts are part of the problem rather than the solution. Many experts themselves struggle with understanding risks, lack skills in communicating them, and pursue interests not aligned with yours. Giant banks go bust for exactly these reasons. Little is gained when risk-illiterate authorities are placed in charge of guiding the public.

3. Less is more. When we face a complex problem, we look for a complex solution. And when it doesn’t work, we seek an even more complex one. In an uncertain world, that’s a big error. Complex problems do not always require complex solutions. Overly complicated systems, from financial derivatives to tax systems, are difficult to comprehend, easy to exploit, and possibly dangerous. And they do not increase the trust of the people. Simple rules, in contrast, can make us smart and create a safer world.

“Savvy” means acute, astute, and wise. But being risk savvy is more than being well informed. It requires courage to face an uncertain future as well as to stand up to authority and ask critical questions. We can take the remote control for our emotions back into our own hands. Using one’s mind without another’s guidance entails an inner psychological revolution. Such a revolt makes life more enlightening and less anxiety-ridden. I have written this book to encourage risk-savvy citizens.

Becoming Risk Savvy

In his essay “What Is Enlightenment?” the philosopher Immanuel Kant begins thus:15

Enlightenment is man’s emergence from his self-imposed nonage. Nonage is the inability to use one’s own understanding without another’s guidance. This nonage is self-imposed if its cause lies not in lack of understanding but in indecision and lack of courage to use one’s mind without another’s guidance. Dare to know!

Freedom of speech, the right to vote, and protection against harm are among the most important achievements since the Enlightenment. These liberties are a treasure. They refer to what doors are open to you, to your opportunities. Today, every Internet user has free access to more information than humankind ever had before. Yet the idea of open doors is a passive or “negative” concept of liberty. Positive liberty, in contrast, entails more than free access. The question is whether you are able to walk through these doors, whether you can master your life without the constant guidance of others.16 Now that people in democratic societies have vastly enlarged their opportunities, positive liberty has become the next challenge.

Risk-savvy citizens are indispensable pillars of a society that is ready for positive liberty. Whether the context is a weather forecast, a medical decision, or a large-scale disaster, being risk savvy requires a basic knowledge of our intuitive psychology as well as an understanding of statistical information. Only with both skills, and a portion of curiosity and courage, will we be able to take our lives in our own hands.

2

Certainty Is an Illusion

Nothing will ever separate us. We will probably be married another ten years.

Elizabeth Taylor, 1974, five days before she and Richard Burton announced their divorce

We think of uncertainty as something we don’t want. In the best of all worlds, all things should be certain, absolutely certain. So we buy insurance against everything, swear by horoscopes, or pray to God. We collect terabytes of information to turn our computers into crystal balls. Yet think of what would happen if our wishes were granted. If we knew everything about the future with certainty, our lives would be drained of emotion. No surprise and pleasure, no joy or thrill—we knew it all along. The first kiss, the first proposal, the birth of a healthy child would be about as exciting as last year’s weather report. If our world ever turned certain, life would be mind-numbingly dull.

The Illusion of Certainty

Nonetheless many of us ask for certainty from our bankers, our doctors, and our political leaders. What they deliver in response is the illusion of certainty, the belief that something is certain even when it isn’t. Every year we support a multibillion-dollar industry that produces predictions of the future, mostly erroneous, from market tips to global flu pandemics. Many of us smile at old-fashioned fortune-tellers. But when the soothsayers work with computer algorithms rather than tarot cards, we take their predictions seriously and are prepared to pay for them. The most astounding part is our collective amnesia: Most of us are still anxious to see stock market predictions even if they have been consistently wrong year after year.

Throughout history, humans have created belief systems that promise certainty, such as astrology and divination. A glance at the Internet reveals that these systems are still in high demand. Modern technology has added further vehicles of apparent certainty, from genetic tests to personalized medicine to risk measures in banking.

Blind Faith in Tests

If a genetic test shows that the defendant’s DNA matches the traces found on the murder victim, isn’t this certain evidence that he is the murderer? If a pregnant woman takes an HIV test and the test comes out positive, isn’t this certain evidence that she—and likely her baby—is infected? In a word, no. To find out how widespread the illusion of certainty actually is, I surveyed a representative sample of one thousand German adults. They were asked in face-to-face interviews: “Which of the following tests are absolutely certain?” The result is shown in Figure 2-1.

Figure 2-1. Which test is absolutely certain? Among a representative sample of one thousand Germans, 4 percent believed that an expert horoscope is absolutely certain. When modern technology is involved, the illusion of certainty is amplified. All of these tests make errors.

When an astrologer calculates an expert horoscope for you and foretells that you will develop a serious illness and might even die at age forty-nine, will you tremble when the date approaches? Some 4 percent of Germans would; they believe that an expert horoscope is absolutely certain.1 Yet there is no evidence that horoscopes do better than a good friend asked to predict your future. But when technology is involved, the illusion of certainty is amplified. Forty-four percent of people surveyed think that the result of a screening mammogram is certain. In fact, mammograms fail to detect about ten percent of cancers, and the younger the women being tested, the more error-prone the results, because their breasts are denser.

Finally, nearly two thirds of Germans believe that HIV tests and fingerprints are absolutely certain, and an even higher number place their faith in DNA tests. These tests are indeed much more accurate than mammograms, but none of their results are certain. Fingerprints, for instance, are unique features of an individual, even for identical twins who share the same genes. If the fingerprints of a suspect matched those found at the scene of a crime, what jury would acquit the suspect? But is our system for fingerprint identification infallible? Fingerprints were believed to be “foolproof” until 1998, when the FBI sent two fingerprints found on a getaway car, along with the fingerprints of the convicted perpetrator, to labs at several U.S. state law enforcement agencies. Of the thirty-five laboratories, eight could not match one of the prints and six more found no match for the other.2 Clearly, this is not the exact science many believe it to be.

Not understanding a new technology is one thing. Believing that it delivers certainty is another. For those of us who suffer from the illusion of certainty, there is a simple remedy. Always remember what Benjamin Franklin said:

“In this world nothing can be said to be certain, except death and taxes.”

My Security Blanket, Please

Humans appear to have a need for certainty, a motivation to hold on to something rather than to question it. People with a high need for certainty are more prone to stereotypes than others and are less inclined to remember information that contradicts their stereotypes.3 They find ambiguity confusing and have a desire to plan out their lives rationally. First get a degree and a car, then a career, find the perfect partner, buy a home, and have beautiful babies. But then the economy breaks down, the job is lost, the partner has an affair, and one finds oneself packing boxes to move to a cheaper place. In an uncertain world, we cannot plan everything ahead; we can only cross each bridge when we come to it, not beforehand. The very desire to plan and organize everything may be part of the problem, not the solution. There is a Yiddish joke: “Do you know how to make God laugh? Tell him your plans.”

To be sure, illusions have their function. Small children often need security blankets to soothe their fears. Yet for the mature adult, a high need for certainty can be a dangerous thing. It prevents us from learning to face the uncertainty pervading our lives. As hard as we try, we cannot make our lives risk-free the way we make our milk fat-free.

At the same time, a psychological need is not entirely to blame for the illusion of certainty. Manufacturers of certainty play a crucial role in cultivating the illusion. They delude us into thinking that our future is predictable, as long as the right technology is at hand. Yet the future can be one damned thing after another. False certainty is disseminated by many an expert, and sometimes shamelessly. “I am sure I have found the Holy Grail,” a financial expert divulged to an eager-looking client at a fancy Zurich hotel in such a bellowing baritone that I could not help but listen. After an hour of plugging a supposedly fail-safe investment without a flicker of doubt, he won over the client—and his money.

The quest for certainty is an old human endeavor. Magical cults, soothsayers, and authority figures who know what’s right and wrong are its proponents. Similarly, for centuries many philosophers have been misled by looking for certainties where none exist, equating knowledge with certainty and belief with uncertainty, as John Dewey, the great pragmatist philosopher, pointed out.4 Today, modern technologies, from mathematical stock prediction methods to medical imaging machines, compete for the confidence promised by religion and authority.

The quest for certainty is the biggest obstacle to becoming risk savvy. While there are things we can know, we must also be able to recognize when we cannot know something. We know almost for sure that Halley’s Comet will return in the year 2062, but we can rarely predict natural disasters and stock crashes. “Only fools, liars, and charlatans predict earthquakes,” said Charles Richter, namesake of the scale that measures their magnitude.5 Similarly, an analysis of thousands of forecasts by political and economic experts revealed that they rarely did better than dilettantes or dart-throwing chimps.6 But what the experts were extremely talented at was inventing excuses for their errors (“I was almost right”). The problem is that false certainty can do tremendous damage. As we will see, blind faith in tests and financial forecasts can lead to misery. Not only can it endanger your physical and mental health, but it can also ruin your bank account and the economy as a whole. We have to learn to live with uncertainty. It’s time to face up to it. A first step toward doing so is to understand the distinction between known risks and unknown risks.

Risk and Uncertainty

Two magnificently dressed young women sit upright on their chairs, calmly facing each other. Yet neither takes notice of the other. Fortuna, the fickle, wheel-toting goddess of chance, sits blindfolded on the left while human figures desperately climb, cling to, or tumble off the wheel in her hand (Figure 2-2). Sapientia, the calculating and vain deity of science, gazes into a hand-mirror, lost in admiration of herself. These two allegorical figures depict a long-standing polarity: Fortuna brings good or bad luck, depending on her mood, but science promises certainty.

Figure 2-2. Fortuna, the wheel-toting goddess of chance (left), facing Sapientia, the divine goddess of science (right). In this sixteenth-century woodcut, the two women are pictured in their traditional opposition: Fortune’s luck makes people climb and fall from her wheel, while science promises certainty. A century later, in one of the greatest scientific revolutions, chance became tamed and science lost its certainty. Courtesy of the Bridgeman Art Library, London.

This sixteenth-century woodcut was carved a century before one of the greatest revolutions in human thinking, the “probabilistic revolution,” colloquially known as the taming of chance. Its domestication began in the mid-seventeenth century. Since then, Fortuna’s opposition to Sapientia has evolved into an intimate relationship, not without attempts to snatch each other’s possessions. Science sought to liberate people from Fortuna’s wheel, to banish belief in fate, and replace chances with causes. Fortuna struck back by undermining science itself with chance and creating the vast empire of probability and statistics.7 After their struggles, neither remained the same: Fortuna was tamed, and science lost its certainty.

Today, we live in the mesmerizing world these two allegorical figures created. Our minds have become crowded with numbers and probabilities. Baseball grew out of sandlots and city streets, supported by a culture of working men and farm boys. Now it is unthinkable without statistics: batting averages, strikeout averages, and playing the percentages. If forced to choose, many a fan would prefer seeing the numbers to the game. Markets and trading emerged from daring, worldly-wise men who voyaged across empires and made their fortunes, surpassing the ruling aristocracy in wealth and eventually initiating a revolution so that others without titles of nobility could live a decent life. Today, traders no longer venture to make their fortunes on the road but on their high-speed computers with the help of mathematical models aimed at predicting the stock market. All the while blindfolded Fortuna is still at work, calmly spinning her wheel, fooling forecasters and plunging Nobel laureates’ hedge funds into ruin.

The twilight of uncertainty comes in different shades and degrees. Beginning in the seventeenth century, the probabilistic revolution gave humankind the skills of statistical thinking to triumph over Fortuna, but these skills were designed for the palest shade of uncertainty, a world of known risk, in short, risk (Figure 2-3, center). I use this term for a world where all alternatives, consequences, and probabilities are known. Lotteries and games of chance are examples. Most of the time, however, we live in a changing world where some of these are unknown: where we face unknown risks, or uncertainty (Figure 2-3, right). The world of uncertainty is huge compared to that of risk. Whom to marry? Whom to trust? What to do with the rest of one’s life? In an uncertain world, it is impossible to determine the optimal course of action by calculating the exact risks. We have to deal with “unknown unknowns.” Surprises happen. Even when calculation does not provide a clear answer, however, we have to make decisions. Thankfully we can do much better than frantically clinging to and tumbling off Fortuna’s wheel. Fortuna and Sapientia had a second brainchild alongside mathematical probability, which is often passed over: rules of thumb, known in scientific language as heuristics.8 When making decisions, two sets of mental tools are required:

   • RISK: If risks are known, good decisions require logic and statistical thinking.
   • UNCERTAINTY: If some risks are unknown, good decisions also require intuition and smart rules of thumb.

Most of the time, a combination of both is needed. Some things can be calculated, others not, and what can be calculated is often only a crude estimate.

Figure 2-3. Certainty, risk, and uncertainty. In everyday language, we make a distinction between “certainty” and “risk,” but the terms “risk” and “uncertainty” are mostly used as synonyms. They aren’t. In a world of known risks, everything, including the probabilities, is known for certain. Here, statistical thinking and logic are sufficient to make good decisions. In an uncertain world, not everything is known, and one cannot calculate the best option. Here, good rules of thumb and intuition are also required.

Known Risk

The taming of chance created mathematical probability. I will use the term known risk or simply risk for probabilities that can be measured empirically, as opposed to uncertainties that cannot.9 Probabilities of rain, for instance, can be measured on the basis of observed frequencies, as can batting averages and the risk of thrombosis. Originally, the word “risk” referred not just to dangers or harms but to both good and bad fortune at Fortuna’s hands: A risk can be a threat or a hope. I will retain the original use of the word. After all, without risk taking there would be little innovation. And in many situations, a negative outcome can be viewed as positive from another perspective: A probability of rain can refer to a dangerous event, such as heavy rain causing car accidents, but also to a positive outcome, such as rain ending drought and famine. The risk of losing your fortune in a gambling casino is a calamity for you but a welcome one for the casino owners.

The Three Faces of Probability

One important fact is often overlooked. Probability is not one of a kind; it was born with three faces: frequency, physical design, and degrees of belief.10 And these have persisted to this day.

Frequency. In the first of its identities, probability is about counting. Counting the number of days with rainfall or the number of hits a baseball player makes, and dividing these by the total number of days or times at bat, results in probabilities that are relative frequencies. Their historical origins lie in seventeenth-century mortality tables, from which life insurers calculated probabilities of death.

Physical Design. Second, probability is about constructing. For example, if a die is constructed to be perfectly symmetrical, then the probability of rolling a six is one in six. You don’t have to count. Similarly, mechanical slot machines are physically designed to pay out, say, 80 percent of what people throw in, and electronic machines have software that determines the probabilities. Probabilities by design are called propensities. Historically, games of chance were the prototype for propensity. These risks are known because people crafted, not counted, them.

Degrees of Belief. Third, probability is about degrees of belief. A degree of belief can be based on anything from experience to personal impression. Historically, its origin is in eyewitness testimony in courts and, more spectacularly, in the Judeo-Christian lore of miracles.11 To this day, the testimony of two independent witnesses counts more than that of two who talked with each other beforehand, and the testimony of a witness who did not know the defendant counts more than that of the defendant’s brother. But how can these intuitions be quantified? That was the question that gave rise to degrees of belief expressed as probabilities.
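The difference between the first two faces can be made concrete: a symmetric die’s design fixes its propensity at one in six with no counting needed, while the frequency view can only estimate that number from observed rolls, with sampling noise. A minimal sketch (the number of rolls is arbitrary):

```python
import random

# Physical design (propensity): a symmetric die needs no data.
propensity = 1 / 6

# Frequency: estimate the same probability by counting outcomes.
rolls = 10_000
sixes = sum(1 for _ in range(rolls) if random.randint(1, 6) == 6)
frequency = sixes / rolls

print(f"propensity: {propensity:.4f}")  # 0.1667, known by design
print(f"frequency:  {frequency:.4f}")   # close to 0.1667, with noise

# A degree of belief has no such recipe: it may rest on experience
# or impression, which makes it flexible but also subjective.
```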

Unlike known risks based on measurable frequencies or physical design, degrees of belief can be quite subjective and variable. Frequencies and design limit probability to situations involving large amounts of data or a design that is clearly understood. Degrees of belief, in contrast, are more expansive, suggesting that probability can be applied to any and every problem. The danger is that by extending probability to everything, it is easy to be seduced into thinking that one tool—calculating probabilities—is sufficient for dealing with all kinds of uncertainty. As a consequence, other important tools, such as rules of thumb, are left in the cupboard.

Does this multiplicity of identities matter? Not much when playing dice, but it certainly does when it comes to modern technology. The risk of a major accident in a nuclear power plant can be estimated by counting earlier accidents, or by the physical design of the plant, or by experts’ degrees of belief, or some mixture of these. The resulting estimates can be strikingly different. While counting nuclear accidents is straightforward, propensities are hard to determine for the design of a power plant, allowing for widely diverging estimates that may depend on the estimators’ political attitudes and on their financial backers. For that reason, it is always important to ask how the risk of a nuclear meltdown, or any other risk, was actually calculated.

The Art of Risk Communication
