The outbreak of food-borne disease in Europe offers an interesting lesson in the psychology of risk perception, and how that psychology can contribute to the overall risk. To be sure, the danger from this outbreak is real. It has tragically killed nearly two dozen people so far, and sickened more than 2,000, hundreds of whom may suffer lifelong kidney damage. In addition there is the possibility that this is a new and more dangerous strain of harmful E. coli, a reminder of the constant battle between medicine and public health on one side and, on the other, the phenomenal ability of germs to mutate and resist our controls.
Certainly this risk is far more real than, say, the hypothesized human health risks from genetically modified foods, or the disproved risk that vaccines can cause autism, other threats that demonstrate how our response to risk is more emotional than purely evidence-based. The number of dead and ill from the food-borne disease outbreak is already higher than the likely lifetime mortality and morbidity caused by the Fukushima nuclear power plant accidents in Japan, based on what we know so far about the dosages of radiation released (though that event is still unfolding).
But the actual danger to any vegetable-eating European, even in Hamburg or other places where the cases have been concentrated, is low. Statistically. Scientifically. But then, we don't just use scientific evidence or statistical probabilities to figure out what's dangerous and how to protect ourselves. Risk perception is a mix of facts AND feelings, intellect AND instinct, reason AND gut reaction. And in many cases, the feelings/instinct/gut reaction has the greater influence. This is neither right nor wrong, smart nor stupid, rational nor irrational. It's simply the reality of how we go about protecting ourselves, using the few facts we have and applying a set of subconscious mental tools and instinctive risk perception 'fear factors' that help us gauge, quickly and automatically, how scary those few hints and clues feel.
The problem is, as good a job as this instinctive system has done getting us this far through evolution's gauntlet, it can make mistakes. Dangerous mistakes. We can fear too much (vaccines), or too little (particulate pollution from coal-burning power plants), relative to the evidence, and our perceptions can create risks all by themselves. Excessive fear of vaccines is allowing previously nearly-eradicated diseases to spread. Inadequate concern about coal as a fuel for electricity generation has contributed to energy policy in the past few decades that has favored coal over scarier nuclear power, raising the risk of sickness and death from particulate air pollution for millions.
So watching this living lesson of "Killer Cucumbers On the Loose" is important, and instructive. Why, if the actual risk for any given person in this case is so low, does it feel so scary to so many? The study of risk perception has found:
· Uncertainty raises fear. We are uncertain about this risk for two reasons. First, science doesn't yet have all the answers about which foods are risky, where they came from, and so on. Second, any invisible/odorless/tasteless risk like this that we can't detect with our own senses is scary because we don't know all we need to know to protect ourselves. Add to that the unknown nature of the organism itself, and the difficulty of tracking down where it originated. That's a lot of unknowns, which make the risk scarier.
· If you think a risk can happen to you, it doesn't matter what the numbers say! Many risk communication experts (Gerd Gigerenzer, Steve Woloshin, Lisa Schwartz) work hard to find clearer ways to help people understand risk numbers, as though that will make us think about those numbers more rationally. But even if a risk is only, say, one in a million, if you think you could be the one, you are likely to worry at least a little, because your job is to keep yourself alive, not the other 999,999.
· High awareness increases fear. Subconsciously, the danger-detection systems in the brain give extra weight to information that's coming in all the time, or that can be readily recalled. This "availability heuristic" then feeds on itself in a positive feedback loop (what Cass Sunstein has called "availability cascades"). We pay more attention to information that could mean we are at risk, and the news media, in fierce competition to bring us the information we want, feed this appetite, and feed our fears.
These are just three among many specific components of our affective/instinctive risk perception system that can lead to what in my book I call "The Perception Gap": the gap between our fears and the facts, which can be a risk in and of itself. How so, in this case? There are a lot of people who aren't eating vegetables, any vegetables. That's not good for their health. Hundreds of thousands of people are more worried than necessary, and more worried than normal, and chronic worry (worry that lasts more than several days) produces the myriad damaging health effects of stress, including a weakened immune system, which makes us more vulnerable to the very bacterial infections people are worried about in the first place. This outbreak will cost a huge amount of money, and damage the livelihoods, and lives, of thousands of people engaged in the produce and food industries across Europe.
Again, this is not a criticism that people are irrational about risk. True, risk perception is not a purely fact-based way of assessing things, but purely fact-based assessment is simply not how we operate. Judging risk perception as irrational is irrational in and of itself, because it denies all that science has taught us about how inescapably instinctive and emotional the system is. But it is valuable to observe, using this current teaching moment, that the way we perceive and respond to risk can itself be a risk. Understanding that, and understanding the specific elements that make a given risk feel more or less frightening than the facts alone suggest, is the first step toward avoiding the dangers of the Perception Gap, and toward making healthier choices for ourselves and for society.
(This post originally ran on The Guardian)