The Science of Crisis Communication

Early in the film Minority Report, police arrest a man for murder. It is not a killing he has actually carried out, but rather a “future murder” he has been predicted to commit. The premise of the 2002 movie, and the short story on which it is based, is that society in the mid-twenty-first century has the means to predict and prevent such unwanted events, a capability many businesses and organizations today would surely welcome. And while this may have seemed like science fiction just a decade ago, it is fast becoming science fact. But which science?

Past (and Present) as Prelude

On-screen, the source of this extraordinary foresight is a trio of young psychics called “pre-cogs,” who can envision wrongdoings before they occur. Off-screen, it takes a combination of data mining, text analysis and predictive modeling. Leveraging these tools to dissect vast amounts of information about the past and present, data scientists are increasingly able to make their own astute projections about the future.

Researchers who work in the emerging field of “culturomics” – a form of computational linguistics that scrutinizes digitized text to study human behavior – claim to have correctly augured the Arab Spring after applying geographic and tone analysis to 30 years of global news archives. IBM’s Smarter Cities unit has built a system in Rio de Janeiro that exploits real-time data to anticipate an array of urban problems. And in an example of life imitating art, police in Santa Cruz, California, have adapted models originally designed to forecast earthquake aftershocks to foresee potential crimes.
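
To make the idea concrete, here is a minimal sketch of the kind of “tone analysis” such systems run over news archives. Everything in it (the word lists, the headlines, the scoring rule) is invented for illustration; real culturomics pipelines work from far larger lexicons and decades of articles.

```python
# Toy tone analysis: score text by counting positive vs. negative words.
# Word lists and headlines are hypothetical, for illustration only.

POSITIVE = {"stability", "agreement", "growth", "peace", "reform"}
NEGATIVE = {"unrest", "protest", "crackdown", "shortage", "violence"}

def tone(text: str) -> float:
    """Return a score from -1 (all negative) to +1 (all positive)."""
    words = text.lower().split()
    pos = sum(w in POSITIVE for w in words)
    neg = sum(w in NEGATIVE for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total

# A sustained slide in tone across months is the kind of early-warning
# signal the culturomics researchers describe.
archive = {
    "2010-10": "leaders reach agreement on economic reform",
    "2010-11": "food shortage sparks protest in the capital",
    "2010-12": "unrest spreads as crackdown follows protest",
}
for month, headline in archive.items():
    print(month, f"{tone(headline):+.2f}")
```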

It is only a matter of time, then, before other enterprises use the same systems and technologies to divine all kinds of latent troubles. Yet crises rarely, if ever, happen in hermetically sealed environments in which the type and amount of available data can be easily controlled. Given the ever-expanding size of the digital universe, and the fact that information now arrives from more sources and at greater speeds than ever before, successfully managing emergencies requires a working knowledge of at least one other science: complexity.

Complexity is Everywhere

In complex systems, writes Microsoft researcher Duncan Watts in his book Everything Is Obvious, “the best we can hope for is to correctly predict the probability that something will happen.” This is because such systems are composed of many separate yet interdependent parts. The more parts there are, the more complex a system becomes. And because those parts interact in nonlinear ways, the consequences are often unexpected.
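
Watts’s point can be made concrete with a small simulation. The toy system below is invented for this sketch, not drawn from his book: its parts’ failure odds rise nonlinearly as other parts fail, so no single run is predictable, and the best the code can do is estimate a probability over many runs.

```python
import random

def run_once(parts: int = 100) -> bool:
    """One run of a toy system of interdependent parts. Each failure
    raises, nonlinearly, the odds that the remaining parts fail too."""
    failed = 0
    for _ in range(parts):
        p = 0.01 * (1 + failed) ** 1.5  # pressure grows with prior failures
        if random.random() < min(p, 1.0):
            failed += 1
    return failed >= parts // 2  # call it a crisis if half the parts fail

# No single outcome can be foretold; we can only estimate how often
# the bad one occurs.
trials = 10_000
crises = sum(run_once() for _ in range(trials))
print(f"estimated crisis probability: {crises / trials:.3f}")
```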

The 2008 global financial crisis has become the poster child for complexity gone bad. In the years leading up to the disaster, traders introduced an array of new algorithms, formulas and models, some of which they barely understood. And because the world’s banks and related institutions had become so densely interconnected, when things went wrong the chain reaction unfolded too rapidly for analysts to prevent an international meltdown.
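
The chain-reaction mechanism itself is easy to sketch. The four “banks” below, with made-up exposures and capital buffers, are purely illustrative (not a model of 2008), but they show how losses ripple outward until institutions that withstood the first shock are overwhelmed by the second:

```python
# Toy contagion model: a bank fails when its losses on loans to already-
# failed banks exhaust its capital buffer. All numbers are invented.

exposures = {          # who has lent how much to whom
    "A": {"B": 0.6, "C": 0.5},
    "B": {"C": 0.7, "D": 0.4},
    "C": {"D": 0.8},
    "D": {"A": 0.5},
}
buffers = {"A": 0.6, "B": 0.6, "C": 0.6, "D": 0.6}

failed = {"C"}         # one institution goes under...
spreading = True
while spreading:       # ...and its losses ripple outward
    spreading = False
    for bank, loans in exposures.items():
        if bank in failed:
            continue
        loss = sum(v for debtor, v in loans.items() if debtor in failed)
        if loss >= buffers[bank]:
            failed.add(bank)
            spreading = True

print("failed after the cascade:", sorted(failed))  # C takes down B, then A
```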

To make matters worse, errors in nonlinear systems can grow exponentially. Even the smallest mistake at the start of a process (such as a misplaced digit or decimal point) can produce outsized outcomes that might once have seemed unimaginable. This is the basis of chaos theory and, as Nate Silver notes in his book The Signal and the Noise, “in complex systems, mistakes are not measured in degrees, but in whole orders of magnitude.”
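
A classic worked example of this sensitivity, the logistic map (a standard chaos-theory demonstration, not one of Silver’s), shows how an error in the ninth decimal place snowballs:

```python
def logistic(x: float, steps: int, r: float = 4.0) -> float:
    """Iterate the logistic map x -> r * x * (1 - x), chaotic at r = 4."""
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

a, b = 0.400000000, 0.400000001   # a "misplaced digit" nine places in
for steps in (10, 20, 30, 40):
    diff = abs(logistic(a, steps) - logistic(b, steps))
    print(f"after {steps:2d} steps, trajectories differ by {diff:.9f}")
```

The gap roughly doubles each step: invisible at first, it spans whole orders of magnitude within a few dozen iterations, exactly the kind of error growth Silver describes.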

Thus, it is ill-advised to rely too heavily on machines and massive amounts of data to foretell the future. As Silver points out, “in data analysis, humans improve the accuracy of forecasts.” But in doing so, they also create the need for yet another body of knowledge: cognitive science.

The Future is a State of Mind

Cognition is how people process information, and it significantly shapes the ways in which we gather, evaluate and understand data. Underlying much of cognition is a host of biases (flaws in perception) that color how we see the world and influence many of our most important strategic decisions.

Cognitive biases can result from relying on limited intelligence, as was the case in Japan in 2011, when government officials failed to look far enough back in time for geological evidence of past tsunamis on the scale of the one that crippled three reactors at the Fukushima Daiichi nuclear plant. Preconceived beliefs can likewise inhibit impartial judgment, which may account for why several Republican pundits misread the data from the 2012 presidential election. Whatever form it takes, misguided reasoning can pose the same serious risks as misused data.

One possible solution to both problems is the advent of cognitive computing systems like IBM’s Watson, which in 2011 beat its flesh-and-blood opponents in the Jeopardy! challenge by studying and extracting meaning from myriad books, newspapers, magazines, websites and social media. Future Watsons will discern interactions between people and machines and draw insights from them, regularly reprogramming themselves accordingly. Rather than simply executing stored software, they will sense, learn and, at least in theory, provide sage advice.

In the meantime, humans will continue to handle that responsibility. And though it is unlikely communication practitioners will ever don lab coats, expanding their knowledge across fields such as data science, complexity and cognition will enable them to address ever more daunting crises.
