The description of uncertainty below details research that Branden Johnson and Paul Slovic have conducted. These findings are now being extended in a new study that will examine how people in both Japan and the United States (California) react to uncertainty in forecasts of earthquake risks. Seismic forecasts already contain cues to probabilities (for example, "10% chance of a magnitude 8 or 9 earthquake along this fault within the next 5 years"); this research will examine how reactions to such forecasts differ when seismologists add an explicit qualitative statement stressing how uncertain these forecasts are. Trust in experts is a critical factor in how their statements are received by the public, and it is unclear whether such statements would raise ("at least they are being honest") or lower ("they clearly don't know what they are talking about") trust in a group of seismologists.
What Difference Does It Make If Authorities Communicate the Uncertainty in Their Estimates of Risk Magnitude?
Government agencies, corporations, and other institutions announce numbers about the things they deal with, often as if these numbers are certain (e.g., unemployment is such-and-such, or profits are this much), although this is rarely true. Expressions of uncertainty have become quite common, however, in risk magnitude estimates, which are risk assessors' calculations of how "bad" a hazard might be (e.g., "the added lifetime cancer risk of consuming this substance in drinking water would be 1 in 1 million"), even though people tend to react badly to them. The uncertainty usually arises from the inherent difficulty of calculating such risks: most or all of the scientific data come from animals rather than humans, humans can react in varying degrees to the same amount of a cancer-causing substance, and the degree to which they are exposed to it may not be well understood, among other challenges to accurate and precise estimates.
Several studies presented people with a scenario in which an agency or a company announced that the "most likely" estimate of added cancer risk was 1 in 1 million, but that it could be as high as X or as low as Y (X and Y varied across studies; for example, the low number could be 1 in 10 million or zero). Respondents were then asked to indicate their level of agreement or disagreement with statements about this scenario, including whether "discussion of the range of possible risk levels makes [the organization] seem more honest" and "discussion of the range of possible risk levels makes [the organization] seem less competent." The table shows that people had mixed reactions, tending to emphasize that such a message would indicate honesty but perhaps incompetence as well. Other questions revealed that people tended to attribute uncertainty either to scientific incompetence ("scientists are supposed to know the answers") or to lying that serves the self-interest of the scientists and their employers, rather than to the inherent difficulty of this kind of scientific calculation.
Communication of uncertainty about risk estimates thus elicits both positive (honest) and negative (incompetent) reactions from citizens, posing a conundrum for hazard managers. Such communication seems to be ethical behavior in a democratic society, as citizens cannot make informed decisions without being informed. However, if it leads people to dismiss an agency's risk estimates because they believe the agency is scientifically incompetent, it could hinder efforts to lower public risks or to get people to focus on more important hazards.