Lessons from Montecito’s Mudslides: Science’s Credibility Is At Stake

Reported by WIRED:

For applied scientists—that intrepid cadre who get their hands dirty in the sometimes dangerous world beyond the ivory tower, participating in difficult decisions with little time and major consequences—getting the right answer is only half the battle. The other half is successfully explaining what they’ve found, and what it means. This winter’s debris flows in the posh community of Montecito, California, which led to more than 20 deaths, provided examples of success and failure on both counts. And those successes and failures have ramifications far beyond managing geophysical risks.

Predicting debris flows—you might know them better as mudslides—is literally more difficult than rocket science, involving deep concepts in complexity theory like nonlinear feedbacks, fractal dynamics, and self-organized criticality.

A flood-landslide hybrid, debris flows are the most dangerous natural hazard you’ve probably never heard of. Yet they’re common in mountain regions worldwide, and the associated destruction can be spectacular: One night in 1999, debris flows on Venezuela’s Caribbean coast left 30,000 dead. And some landscape alterations, including wildfires, can increase their likelihood or severity. This is where Montecito’s problems began. The community spans the coastal mountains down to the Pacific shore, and it had endured the Thomas Fire—the largest wildfire in California history—only one month earlier. Experts saw a recipe for disaster.

Scientists and emergency managers accurately predicted the Montecito disaster and told folks living there to get out of the way. The experts absolutely nailed it, issuing evacuation notices for the precise locations where the debris flows later happened. This represented a huge success for applied scientists, getting the right answer to a very hard question and effectively communicating it to community leaders. In other ways, though, it all went horribly wrong: Many did not evacuate, increasing the death toll. Why?

It will take years to disentangle the complex web of cause and effect that culminated in this disturbing tragedy. Early media reports pointed (as they often do) to simple explanations: For example, mandatory evacuation orders were apparently issued only for steep headwater areas, with less compelling voluntary evacuation notices covering the flatter downstream locations where the debris flows, running out from the mountains, ultimately took their heaviest toll. This seems likely to have played some role in enlarging the scale of the disaster.

But an evacuation notice is an evacuation notice, and few instincts top survival. I suspect that if residents believed they were at mortal risk, they would have run. It then follows that they doubted the predictions of disaster, perhaps without even fully realizing it. To phrase this in the language of risk management and disaster psychology, the decisions of individuals to comply with calls for evacuation are multifaceted and depend on the perceived credibility of the source of risk information. The choice by some Montecito residents in the voluntary evacuation zone to decline science-based advice may point to a more fundamental issue, with implications far beyond southern California.

To see these connections, consider that the credibility of scientists as a source of relevant information has always ebbed and flowed. Despite what prevailing views might indicate, skepticism of science isn’t limited to some working-class folks who may not have embraced university-bred ideas on evolution and climate change. Rather, profound ambivalence about the intellectual and moral reliability of scientists has an old and refined pedigree—one that extends even to the residents of wealthy and highly educated places like Montecito, and that could have played a quiet but powerful role in some of the decisions people made there.

Literature and film often record the promise offered by science (Independence Day, The Day After Tomorrow). But a parallel, often highbrow, strand of literary and cinematic expression is deeply skeptical of the scientific enterprise and its impacts (Frankenstein, Jurassic Park, Dr. Strangelove, Neuromancer). The common thread is that science can be exciting, beautiful, and helpful, but also irresponsible, perverse, and destructive.

The artistic elite is not alone in its long-standing worry about what some scientists do behind closed doors without seeking the permission of society as a whole. Even giants of science and technology like Elon Musk and the late Stephen Hawking have raised the alarm about particular directions in artificial intelligence research, for example, and public suspicion of GMO foods remains as acute as ever.

This persistent image problem is exacerbated by public relations gaffes. For instance, exaggerated research claims and visible political affiliations compromise scientists’ credibility as objective sources of accurate technical expertise.

Worse still, even the most earnest communication efforts often demonstrate, to paraphrase The Big Bang Theory, how dumb smart people can be. At a recent physics conference, I heard an outreach expert tell us how as scientists we need to “teach people how to think” (a classist and anti-democratic notion), that only science allows us to “process complex issues” (evidently, Hemingway, Rachmaninoff, and Picasso brought nothing to the table), and that a society’s degree of spirituality gauges its collective level of ignorance (effectively demanding that traditional peoples from Haida Gwaii to Tibet choose between indigenous culture and Western science).

Such prejudiced, tone-deaf, and increasingly strident megaphoning is echoed by some of the best-known STEM communicators. Even the recent and well-intentioned March for Science wasn’t immune: its sloganeering included T-shirts reading, “We are scientists. Ask us anything!” The organizers presumably thought this would be cute, and it sort of is, but it also reinforces the stereotype of scientists as obnoxious know-it-alls. That’s precisely the kind of image we shouldn’t put on public display—least of all when populism, a worldview that’s inherently skeptical of the integrity of experts, is taking hold across the entire political spectrum.

The lesson? The public’s willingness to base their lives on expert scientific advice is proportional to how positively scientists are viewed. People may not understand what’s happening in a particle accelerator, genetics lab, or climate supercomputing center, but they understand other people, and scientists are just other people. No one likes being told what to do by arrogant and prejudicial experts, and people treated this way are likely to reject both the messenger and the message. Conversely, everyone loves high achievers—in sports, business, and even science and engineering—who balance their confidence with demonstrations of humility and respect toward others. Put simply, it’s a popularity contest, and scientists aren’t winning it right now.

Failure to understand this lesson, and to act thoughtfully and proactively on it, will incur incalculable costs. Our world faces a dark constellation of threats: overcrowding, dwindling resources, and collapsing ecosystem function, to name a few. The American tradition of profound scientific discovery, undertaken in a context of deeply individualistic liberal democracy, is our best hope. But the most brilliant technical performance is socially useless if the public doesn’t have enough faith in us to act on our recommendations. Credibility is a valuable and fragile thing. The penalty for losing it, as Montecito may help remind us, can be death.

WIRED Opinion publishes pieces written by outside contributors and represents a wide range of viewpoints.
