It’s important to know how your emotions influence risk perception

This article originally appeared at Forbes and has been republished here with permission of the author.

Science applied to risks, whether of products, processes or activities, can be baffling to non-experts. Because people want certainty in their lives, the provisional nature of the scientific method, in which what we “know” holds only until it is disproven by new data, can be worrying. The public wants simple black-or-white answers rather than nuanced, qualified advice about relative risk, and may become frustrated when it gets the latter. That is where the “emotional dimension” of concerns about the potential risks of a new technology or activity comes in: Emotions matter because they can have a profound impact on acceptance by consumers.

Menninger Foundation psychiatrist Glenn Swogger wrote that as people make decisions about consumer products, fear and intimidation may distort the accurate assessment of risks, benefits and possible alternatives. That can lead to “regrettable substitutions”: the replacement of safe ingredients or processes with inferior or actually harmful ones, prompted by the blandishments of activists. And that in turn can have unfortunate consequences from both an economic and a humanitarian perspective.

In the late 1980s, for example, in response to a widespread media campaign waged primarily by the Natural Resources Defense Council, a radical U.S. environmental group, the EPA pressured apple growers to abandon Alar, a plant growth regulator that permits apples to ripen uniformly, increases yield and keeps fruit firm and full-colored beyond its natural shelf life (and, not coincidentally, lowers prices). The carefully orchestrated campaign of fear-mongering caused apple growers to abandon the product, and the resulting collapse of the apple market caused many farms to fold.

A few other groups have been trying in a similar way to get government agencies to over-regulate out of existence foods improved with the molecular techniques of genetic engineering, in spite of the fact that they have been widely consumed for 30 years, without causing even a single documented tummy ache. And in any case, the techniques are an extension, or refinement, of earlier, more primitive ones that have given rise to virtually our entire food supply. Molecular genetic engineering boasts an incomparable record of safety and contributions to medicine and agriculture. And yet, some consumers still have concerns, based primarily on a sense of unease and a lack of knowledge.

Understanding the emotional dimension can help opinion leaders and consumers to address misapprehensions, to make more clear-headed decisions and to remain free from manipulation. Dr. Swogger listed several factors that can cloud thinking about risks, all of which have been prominent in various controversies about genetic engineering:

Uncertainty and ambiguity

Studies of risk perception have shown that people tend to overestimate risks that are unfamiliar, hard to understand, invisible, involuntary, and/or potentially catastrophic–and vice versa. Thus, they tend to underestimate risks that are relatively clear and comprehensible in their nature, such as using a chain saw or riding a motorcycle. On the other hand, they overestimate invisible “threats” such as electromagnetic radiation or trace amounts of pesticides in foods. Contributing to these emotions-driven fears are poor scientific literacy in general and unfamiliarity with the statistical aspects of risk in particular. How many people realize, for example, that there are radioisotopes (of elements like hydrogen, carbon and potassium) in every bite of food we take? And how significant is it if an individual learns that his high fat diet increases the probability of bowel cancer by 17%?

In the case of genetic engineering, several factors create unease in non-experts. Few are aware, for example, of the long, safe history of “conventional” genetic engineering to produce vaccines, enzymes, vitamins and antibiotics, as well as virtually all our domesticated crops. In fact, unless your diet is limited to wild berries, wild mushrooms, game, fish and shellfish, it’s almost impossible to get through a day without eating food that has been genetically improved in some way.

The desire to return to a childlike world of purity and innocence

This romantic, puerile view of the physical world, which reflects a wish to escape from complex realities and choices, can give rise to a kind of puritanical, anti-technological view of the world. We see this in various advertising terms and slogans, such as “natural,” “GMO-free” and “Simply Raised.” Purity and simplicity become desired ends in themselves, to the exclusion of other goals such as feeding and sheltering the inhabitants of the planet.

Manipulation of environmental anxieties

The hidden agenda of many of those who have attempted the “greening” of American society and government–environmental organizations, certain political leaders and media–is their own self-interest. But a by-product of their misinformation campaigns is progressively more widespread acceptance of junk science. Clouding the public’s understanding of the development of new genetically engineered crop plants, certain environmental organizations, organic agriculture interests and the media have raised disinformation to an art form. As part of an apparent decades-long anti-genetic engineering crusade, the New York Times even coined the term, “Frankenfood.” What has been lost is the ability to discriminate between plausibility and reality, and between what one politician called “real facts” and “made-up facts.”

What are the take-home lessons, then, for food and health professionals, opinion leaders and scientists who need to communicate the risks and benefits of new products?

First, emotional responses to questions of technological risk are powerful but they can be tempered with knowledge.

Second, that knowledge needs to be imparted in a way that is scrupulously honest but also simple enough to be understood. Concrete examples, especially relevant historical analogies, are often useful.

Third, in both public forums and (especially) as advisers to government, experts should insist on the inextricable linkage between science and public policy. At every opportunity, they should reinforce the importance of science and the scientific method; science is organized knowledge, and knowledge is power.

Fourth, there has been far too much tolerance of outright misrepresentation and mendacity in what are fundamentally scientific dialogues. Too often, government policymakers have welcomed anti-technology activists to their advisory committees, hearings, conferences and bosoms. We have even seen this with panels of the National Academy of Sciences concerned with various aspects of public policy toward genetic engineering. Biologist Donald Kennedy, former FDA Commissioner and former Stanford University president, chided policymakers:

Frequently decision-makers give up the difficult task of finding out where the weight of scientific opinion lies, and instead attach equal value to each side in an effort to approximate fairness. In this way, extraordinary opinions, even those like [activist Jeremy] Rifkin’s, are promoted to a form of respectability that approaches equal status.

Always the consummate gentleman, Kennedy was too charitable. In the arena of biotechnology-related policymaking, the policymakers are not confused; rather, they have cynically used the high-profile demands of anti-science groups and the public’s confusion as cover to justify unscientific, excessive, hugely burdensome regulation.

How can the public guard against being bamboozled? Emotional responses to questions of technological risk may be inevitable, but they can and should be tempered with knowledge. To distinguish genuine health or environmental concerns from fear-mongering, non-experts should try to ascertain what scientists actually know about an issue and seek out reliable sources of information. For genetic engineering, a good start would be Biofortified, the American Council on Science and Health and my articles. Above all, everyone should heed Sherlock Holmes’ admonition in “A Scandal in Bohemia” that “it is a capital mistake to theorize before one has data.”

Henry I. Miller, a physician, is the Robert Wesson Fellow in Scientific Philosophy & Public Policy at Stanford University’s Hoover Institution. He was the founding director of the FDA’s Office of Biotechnology. Follow him on Twitter @henryimiller.

