Abstract
Despite remarkable advances over the past 35 years in the field of toxicology generally, and the development of a vast body of knowledge detailing the nature and degree of many human and environmental toxicological risks, the excessive fear of anything connected with chemicals that some refer to as “chemonoia” persists. So too, unfortunately, does the rationalist belief that once the facts are all in, everyone will agree on what those facts say. This article examines the roots of what is essentially a cultural conflict, explains what various bodies of social science research reveal about the psychological roots of that conflict, and offers suggestions on how to move forward.
Introduction: A look back
By 1981, the year in which this journal first published, public concern about the environment in the United States, Western Europe, and Asia was near an all-time high. Phrases like “hazardous waste,” “acid rain,” and “ozone depletion” had become part of the public lexicon. Otherwise obscure places like Love Canal, Times Beach, Seveso, Minamata, and the Cuyahoga River had become known around the world. Google’s Ngram Viewer reveals that words like “pesticides” and “environmental” exploded into the common vernacular of a public that in just 20 years had gone from a naive faith in technology and science to a growing concern about what progress was doing to our natural world.
DDT, PCBs, and dioxin: the stream of new chemical threats seemed never-ending. The word “chemicals” itself had become deeply stigmatized. In 1962, Rachel Carson wrote in Silent Spring that “chemicals are the sinister and little-recognized partners of radiation entering into living organisms, passing from one to another in a chain of poisoning and death.” In the mid-1970s, when people were asked by risk perception research pioneer Paul Slovic what word first came to mind when they heard the word chemicals, top responses included “dangerous,” “toxic,” “deadly,” and “cancer.” A raft of US federal programs with chemically related acronyms—TSCA, FIFRA, and CERCLA—had been or were about to be established in the United States, to respond to the actual threat these substances might pose but also in large measure to respond to the public’s chemonoia, and there were similar moves in the European Community and internationally through the World Health Organization.
Many industries, threatened by financial losses and inherently resistant to regulation, fought back against that fear in all sorts of ways. A fierce publicity campaign was waged to discredit Silent Spring. In 1976, Monsanto famously advertised that “Without Chemicals, Life Itself Would be Impossible.” Too often the chemical industry followed the lead of the tobacco industry, hiding what it knew of the potential health risks of its products and processes, and repeatedly claiming that “the science” did not warrant chemophobic alarm.
In the United States, government agencies were created to deal with rising concerns about chemicals: the Environmental Protection Agency in 1970 and the National Toxicology Program in 1979. The scope of other agencies and laws created earlier to deal with the more visible problems of water and air pollution was expanded to sort out potential chemical threats. Into this explosive area of concern and complexity and controversy, toxicology was called on as never before to figure out the facts.
By 1981, the field of toxicology had itself grown explosively. Academic programs and professional organizations had sprung up. Regulatory, occupational, and environmental toxicology became firmly established disciplines. In the same year this journal began to publish, so did four other scientific publications devoted to toxicology. The Ngram Viewer shows that the word “toxicology” appeared in books five times more frequently in 1985 than it did in 1960.
The search was on for answers about what our post-World War II industrial economy and its new synthetic chemicals were doing to human and environmental health. Toxicology was supposed to supply those facts. As the first editor of this journal wrote in the premier issue, “Politicians cannot be expected to come to rational and acceptable decisions without adequate impartial and objective information, and toxicologists have grave responsibilities to produce such information.”
It is nearly 35 years since those optimistic words were published, and in those three and a half decades, the toxicological sciences have produced a vast body of impartial and objective information about what chemicals do, or don’t do, to humans and the environment. Yet the conflict over the health effects of chemicals, mostly synthetic chemicals rather than those that occur naturally, rages on. Chemonoia runs as deep as ever. The promise implicit in those hopeful words from 1981—that toxicology would ride to the rescue and provide The Facts and thus help politicians produce “rational and acceptable decisions”—remains unfulfilled.
Sadly, that promise seems more naive now than it did when it was first made, because while toxicology has advanced dramatically, so has our understanding of the inherently subjective nature of human cognition. As surely as the study of poisons is moving toward more powerful in vitro and in silico and toxicogenomic approaches, our understanding of how people perceive and respond to risk is moving away from the assumption that the facts, alone, can produce perfectly objective evidence-based rationality. A vast and growing body of evidence (brilliantly summarized in Daniel Kahneman’s “Thinking, Fast and Slow”) has shown that our post-Enlightenment faith in the power of human reason to overcome our passions and make “rational” decisions is blind. Nothing so clearly demonstrates that as the fact that the fiery disputes about various chemical risks that gave rise to this journal, and to modern toxicology itself, burn on, decades after the facts became available that should have settled many of those disputes.
What is new?
The fight about chemical regulation, now more than 50 years old, continues to put toxicology in the middle of a battlefield that is clearly about something more than just the facts alone. To be sure, there are new factors. We have new chemicals to worry about—phthalates and bisphenol A, acrylamide and 4-methylimidazole, and perfluorooctanoic acid. We have new technologies that allow us to modify the genes of the food we grow and to manipulate matter at the atomic level, technologies that raise toxicological concerns. We have more than a dozen different versions of a precautionary principle, the effort to embed in law a common sense “Better Safe Than Sorry” approach and the default assumption that “chemicals are guilty until proven innocent.” It is telling that the momentum for a precautionary approach toward chemical regulation developed in the late 1980s in the face of, and quite likely in response to, a spate of decisions about chemicals in the United States and Europe that seemed to assume that “chemicals are innocent until proven guilty,” a “sound science” approach favored by industry.
Also new is what many in the field of toxicology have described as the ugliest and most personal dispute they have ever witnessed: the fight between competing interpretations of the evidence regarding endocrine disruptors, with personal invective flying in all directions, ad hominem attacks on the honesty and integrity of any scientist who dares offer a view with which someone else disagrees, and rejection of governmental decisions not just as wrong but as corrupt when those decisions fail to satisfy one party or another.
In the past 35 years, we have also developed rich new insights into the workings of human cognition that explain why intelligent people can see the same facts in such radically different ways and why, though some of the details may be new, the decades-old fight about chemical regulation continues. As toxicology was advancing, the study of human cognition was also making remarkable discoveries, and one of the most disturbing findings from that research is that the human brain prefers things nice and easy. Kahneman, who won the 2002 Nobel Memorial Prize in Economic Sciences for his work on judgment and decision making, calls this cognitive ease.
It takes calories to think, to pay attention, to learn, to keep an open mind and not just default to what we already know and believe, and the brain instinctively avoids spending those precious calories unless it needs to. The brain weighs a few pounds but uses as much as 20% of the calories a resting body burns, a rate that increases in short bursts when circumstances call on us to “pay” attention. As our modern brains were evolving, we weren’t sure when the next meal might come, so we developed a suite of cognitive shortcuts—heuristics and biases—to do our thinking with as little effort as was needed.
This bodes poorly for the sort of effortful careful thinking that complex chemical risk issues require. The new Internet age, despite offering vastly more information, makes things worse because it offers information in ways that play to our instinctively lazy brains. The Information Age dumbs us down as much as it smartens us up.
The news media provide shorter stories, with important details about dose and exposure missing, so we never learn some of the critical details we need to understand complex chemical risk issues that require more knowledge and careful thinking, not less. The Internet provides a global megaphone to anyone who wants to advocate a point of view, and since it’s easier for the lazy brain to seek affirmation than information, we end up with our own views reinforced and minds that are more closed about issues that require more open-minded reflection. The immediacy and global reach of the media mean that we hear about the latest risk du jour as soon as the first alarm sounds, long before all the facts are in, and often from social media sources that are less than trustworthy for accuracy or objectivity. This is also dangerous for intelligent risk assessment because research about judgment and decision making has revealed that we instinctively scan early information to see if it portends danger, and once we’ve made our first quick judgment, subsequent information has to compete against our initial assessment. We jump to conclusions, and it’s cognitively easier just to stick to them.
Another new aspect of our understanding of human cognition is the theory of cultural cognition developed by Dan Kahan and colleagues. This research has found that we shape our views on many issues so they agree with those of the groups with which we most closely identify. This establishes us as a member in good standing of our group, our tribe, and reinforces tribal cohesion and solidarity. Both are important for the survival of a social animal like humans.
Kahan defines “group” around four worldviews that describe the general norms by which people prefer society to operate. These worldviews shape how people see issues like chemical risk. Individualists prefer a society that leaves most decision making and control up to the individual. Communitarians prefer a society based on a “we’re all in it together” ethos, in which individuals willingly cede some decision-making authority to the collective in the name of the greater common good. Hierarchists prefer a society ordered by unchanging and predictable hierarchies of social and economic status. Egalitarians prefer a society that is more flexible, in which opportunity is not bounded by such class constraints and in which government intervenes when necessary to ensure fairness.
Kahan has found that on many environmental issues, particularly more polarized issues such as climate change and nuclear power, the underlying worldviews of the group shape individuals’ views. Communitarians and Egalitarians hold more classically environmentalist views: society-wide pollution and other environmental-scale risks require the “we’re all in this together” government response favored by Communitarians, and government regulation of industry to reduce pollution shakes up the status quo in the name of protecting the general public, which appeals to how an Egalitarian wants society to operate.
This culturally informed way of seeing the world does not diminish as people get smarter. In fact, at least on the issue of climate change, research by Kahan and colleagues found just the opposite. In “Motivated Numeracy and Enlightened Self-Government,” subjects were tested for numeracy and on the Cognitive Reflection Test, which measures the amount of effort people are willing to put into figuring out tough questions. The people who scored higher on these two measures actually held more polarized views on the climate change issue, particularly those who deny the evidence about climate change. Some of the people who see the same facts about chemical risk in such dramatically different ways are among the smartest people out there.
What is not new … and why the battle rages on
These cognitive processes operate subconsciously and quickly, and they powerfully influence how we feel about the facts. So too do a number of psychological characteristics that help us subconsciously gauge whether something might be a risk, and how worried we ought to be. These characteristics have been identified by decades of research that began alongside the growth of toxicology in the late 1970s, research that was triggered by the same issue … chemonoia.
Prompted in part by regulators and scientists who wanted to understand why people were more worried about chemical and radiation risks than seemed warranted by the evidence, Paul Slovic, Baruch Fischhoff, and many others identified a number of affective characteristics that help explain deep and persistent fear of chemical risk. They found that:
- We are more afraid of human-made risks than natural threats.
- We are more afraid of threats we can’t detect with our own senses or which are complex and hard to understand. Either circumstance produces uncertainty, which leaves us feeling powerless and threatened because we don’t know what we need to know to protect ourselves.
- We are more afraid of “dread” risks that involve greater pain and suffering, like cancer, the central focus of concern about many synthetic chemicals.
- We are more worried by risks that are imposed on us, like contaminants and pollutants, than risks we engage in voluntarily.
- Our judgments about risk depend on whom we trust. We trust environmental organizations that, despite their advocacy of certain points of view, are believed to be on the public’s side more than big companies that profit from the products and processes that expose us (involuntarily) to synthetic (human-made) chemicals.
(The rich body of evidence on risk perception is explained in greater depth in the book “How Risky Is It, Really? Why Our Fears Don’t Always Match the Facts.”)
These influential risk perception characteristics help us quickly gauge whether something might be a risk, and how worried about it we should be, before we have all the facts. They have helped us survive, and as Slovic has noted, still work reasonably well. We mostly get risk right. But across all risk domains, not just chemicals, these “fear factors” sometimes cause us to worry more than the evidence warrants or sometimes to worry less than the evidence warns. This phenomenon has been called The Risk Perception Gap, a gap between our fears and the facts that can produce risks all by itself.
Examples of the Risk Perception Gap are abundant; chemonoia is just one. People worry less about smoking than they should, less about obesity than they should, and less about climate change than the evidence clearly tells us we should. Some people worry more than the evidence warrants about vaccines, more than the evidence warrants about ionizing radiation, and more about child abduction than the evidence supports.
Disturbingly, however, the research on risk perception psychology, confirmed again and again by robust real-world evidence across domains, is dismissed by parties on all sides in the fight over chemical safety. Those sounding the most dire alarms about chemical risks dismiss this evidence as “BS” and “just merchants of doubt spin” designed to undermine their alarmist interpretation of the evidence. (Those quotes are from personal correspondence with a leader in the community raising the most dramatic alarms about endocrine disruption, who shall remain unnamed as a matter of courtesy.) Those with vested interests in synthetic chemicals who think the fears are overblown say that the evidence that “risk is a feeling,” as Slovic has put it, just means that people are irrational and that with adequate information and effective risk communication, they can be educated and calmed down. This denies robust evidence from the study of risk, health, and science communication, research that has firmly established the naivete of the rationalist belief in the “deficit model”—the idea that people just need to be educated and that as soon as they understand the facts they will “get it.”
Sadly, the view that “they just need to be educated” is also held by many in the toxicological sciences who, by training and culture, hold to the belief in human reason and the idea that The Facts alone can produce rational decision making. Curiously, research on the risk perceptions of toxicologists themselves demonstrates the fallacy of this assumption. The views of these ostensibly objective scientists are also based on more than just the facts alone. The authors of “Intuitive Toxicology: Expert and Lay Judgment of Chemical Risks” reported that the general public and toxicologists rate risks differently. But among toxicologists, there were differences as well. They wrote “We also found an affiliation bias: toxicologists working for industry saw chemicals as more benign than did their counterparts in academia and government.” They suggested that “… controversies over chemical risks may be fueled as much by … disagreements among experts as by public misconceptions.”
Moving forward, making progress
For all these reasons, 35 years after Human and Experimental Toxicology began publishing, and despite all we have learned about what chemicals do and don’t do to people and the environment, chemonoia and the fight over The Facts persist. How can we temper the stigmatized fear of chemicals in general, and how might we at least moderate the fierce and nasty conflict over how chemical risks should be regulated?
To combat chemonoia, scientists must engage far more actively in public communication and education. This is particularly true for toxicologists who are uniquely qualified to help the public understand chemical issues. They must become more prominent in the public conversation and not just quietly publish in academic journals or speak only to the policy makers and regulators.
One way they can help inform the public is to engage more with journalists. This can be accomplished in several ways:
- Most journalists who report about chemical issues don’t know the central elements of risk assessment—dose and exposure and the critical details in each half of the risk formula—that have to be included in a story so that readers can make an informed judgment about how risky something might be. Journalists need to be educated about what basic questions to ask when reporting on a story involving chemical risk, and many are eager to learn. I offered such training, in a program called “Improving Media Coverage of Risk,” to nearly 400 journalists in newsrooms, journalism conferences, and journalism classes worldwide. Toxicologists should also find ways to offer this knowledge to journalists, whether through in-person seminars, face-to-face in-newsroom meetings, or printed or Web-based resource material.
- Scientists should identify the journalists whose coverage of their issues has been solid, intelligent, and mature, and reach out to them, offering to be a source of information should stories arise where their expertise might be of use. This assistance can be offered on the record or as an off-the-record background resource to explain scientific and technical issues. Journalists love this and call on such sources actively.
- When a new chemical risk issue arises (acrylamide), or new evidence develops about an ongoing issue (bisphenol A), toxicologists should consider it their responsibility to reach out to journalists to add their expertise to the coverage.
In general, scientists should speak less as advocates than as educators, offering not a point of view or an argument but expertise and insight about the evidence that can help the public differentiate between the facts of an issue and the values debate about those facts. This has already begun in the United Kingdom around agricultural biotechnology, helping influence British attitudes about genetically modified food.
In some circumstances, however, it is entirely appropriate for scientists to offer their perspectives and opinions about controversial issues such as chemical risks or agricultural biotechnology. When this is done, care must be taken to avoid sounding arrogant, intellectually superior, or dismissive of the views of those less educated. Dismissing people’s views as “irrational” is particularly counterproductive.
To maintain trust, it is vital for scientists to reveal all potential conflicts of interest, particularly when taking a position on a controversial issue.
Critically, scientists must be willing to compromise their own cultural norms about what public communication should look like. The communication must be done, and messages must be crafted, with the information needs of the audience in mind, not the needs of the communicator. Brevity, simplicity, clarity, jargon-free language, and letting go of the need to include all the qualifiers important in careful scientific communication are just a few of the adjustments scientists must be willing to make in order to communicate with the general public more effectively.
More proactive participation in public information can help moderate chemonoia. Scientists can also help address the other problem, the emotional fight over the facts and chemical risk regulation. The first thing they can do to end this fight is to accept that they can’t.
Not altogether, anyway. As the evidence laid out above clearly establishes, outside the world of science, risk is a feeling. The subjective fear factors we use to gauge whether something is a threat are built-in. They help us feel safe. Group identity/affiliation is literally part of how we protect ourselves. These are intrinsic elements of the risk perception system we rely on to survive. It is rational to use them, in the sense that rational means using every available tool to stay safe.
These instinctive emotional tools have more influence on our perceptions than cold hard reason. Neuroscientific research on fear by Joseph LeDoux and others has discovered that the wiring and chemistry of the brain are such that emotion and instinct play a larger role in how we perceive and respond to risk than does dispassionate rational fact-based analysis.
So continuing to assume that rational argument and The Facts can entirely overcome our innately subjective perceptions of risk is either arrogant or simply unaware of the wealth of academic research and real-world evidence that belies such assumptions. It’s clear that the deficit model of communication, as in “Here Are the Facts. Now You Can Make An Intelligent (i.e. rational) Decision,” is insufficient.
Indeed it is often counterproductive. A message that suggests “Stop feeling the way you do, and feel about these facts the way I do,” disrespects the other person’s feelings. It’s not only not persuasive. It’s destructive, breeding mistrust, defensiveness, and hostility to the message, rather than openness and consideration.
This is particularly true when someone’s views about chemicals are shaped to reflect their group’s view. Espousing those views is important for that person’s group/tribal identity and affiliation, so challenging those views presents a tangible threat to their safety. Asking a devout environmentalist to change their views on pesticides is tantamount to asking them to abandon their tribe, and that is literally dangerous. It triggers a fight, flight, or freeze response, a biological process that includes the release of hormones like adrenaline, norepinephrine, and glucocorticoids, all of which make the argument that follows even more about emotion and self-protection than it was when it began.
So a fresh approach to discourse about chemical risk and regulation is needed. Of course The Facts must still inform the debate, but anyone trying to move the consensus view of those facts in their direction must be honest about the values and emotions that inform their own view of the facts and respectfully acknowledge the values that inform how the other person feels. That doesn’t mean agreeing with those other views. It just means showing respect for, rather than combativeness towards, the affective reasons why the other person sees the facts in a different way.
The conversation must be a true dialogue, which means not just giving the other person a chance to speak but demonstrating that they are being heard. It means what any good therapist would call “active listening”—demonstrating a respect for the other person’s perspective (their feelings) even while trying to advance your own.
Let’s not be naive, however. Given the intrinsically subjective and often tribal nature of our views, this fresh approach can only help a bit. Fears about chemicals arise in part from deeply held views among some groups about the damage that industry and technology and progress have unquestionably done to our natural world. Those who dismiss the risk of chemicals often do so through the lenses of their personal vested interests. Those stakes are huge, on both sides. No amount of respectful and empathetic communication can disabuse people of these deeply held attitudes.
Even when the roots of general public chemonoia are not driven by group values and fundamental worldviews, but arise simply from the instinctive psychological characteristics that make some risks feel scarier than others, no amount of communication can entirely overcome these deep and ancient instincts. An uncertain risk that might kill you in a really painful way and that is imposed on you by a greedy company is going to feel scary, even when the probability is 1 in a million. That risk will still feel scary even when compared to other risks that are statistically more likely. (Probabilistic risk comparisons are a lousy form of risk communication. They appeal to mathematical reason, when risk is more about our feelings than the facts alone. They imply that someone whose fears don’t match the statistics is less than rational and should feel about the risk the way the communicator does, which is disrespectful and breeds mistrust and defensiveness.) Ultimately, even to scientists, risk is a matter of how we feel about the facts more than just what the facts tell us all by themselves, and not even perfect risk communication can make those feelings disappear entirely.
Conclusion
The passionate disagreement over what the evidence tells us about the threat of industrial chemicals will persist. Even as the facts continue to come in from toxicologists and other scientists, the fight over what those facts mean will rage on. Society will continue to struggle to separate facts from feelings, as it tries to make policy choices about chemical risk that are both rational and evidence based, and at the same time politically acceptable, which is about how the evidence feels more than what it objectively says.
That is as it should be in a democracy. Values and feelings must have a voice. The problem is that policy driven by emotion can sometimes produce laws and regulations that are politically acceptable, but which fail to actually protect us in the smartest and most effective way. The Risk Perception Gap, worrying too much or too little, is a risk in and of itself.
That is why, in addition to the knowledge that toxicology and journals like Human and Experimental Toxicology can provide, more must be done to address the public’s chemonoia. More must be done to defuse the destructive values-based enmity poisoning the debate over how to regulate chemical risk. We can never take the emotion out of risk perception. But the more we can give careful objective reason and The Facts a louder voice in how we think these issues through, the greater our chances of making more informed and healthier choices for ourselves and for the global community to which we all belong.
Footnotes
Conflict of interest
The author has taught at and consulted for many organizations involved in a wide range of chemical and other risk issues, including corporate, government, civic, environmental, and academic institutions, around the world.
Funding
The author received no financial support for the research, authorship, and/or publication of this article.
