
How alternative narratives emerge

An exploration of our increasing vulnerability to disinformation in online contexts

Recent attention around “fake news” has highlighted the growing challenge of determining information veracity online. | iStock/kostenkodesign

In a recent seminar at the Stanford Center for International Security and Cooperation, Kate Starbird, ’97, professor in the Department of Human Centered Design & Engineering at the University of Washington, discussed her research into the way rumors and disinformation spread in online contexts, especially in the wake of crisis events.

Starbird’s work bridges computer science and social science to explore how human cognition and information communication technologies interact. She explained that, while rumors certainly existed before the internet, the ways people make sense of the world and get information in online environments mean we “may be particularly vulnerable to the spread of rumors and misinformation in these online spaces.”

Starbird traced the flow of information on platforms like Twitter and Facebook after events like the Boston Marathon bombing, the downing of flight MH17 and mass shootings from Paris to San Bernardino.

While each event was unique, Starbird said that “we started seeing the same kind of rumors show up over and over and over again.”

In short, she saw strikingly similar patterns of information being twisted into disinformation, or deliberate misinformation, meant to sow confusion and doubt.

Those patterns included appeals to critical thinking even while subverting it, the use of leading questions and speculative language, and acting out extreme caricatures of all sides of the political spectrum. And it wasn’t just happening on the fringes, Starbird noted: “As 2016 progressed, we began to recognize that some of the themes we were seeing in these rumors—which we thought were really marginal—were being repeated by people in places of power in ways that we hadn’t expected.”

By tracing those patterns, mapping out networks of users and performing website content analyses, Starbird came to see the deeper goal behind the strategic spread of online disinformation. “The purpose of disinformation is to confuse, to create muddled thinking across society,” Starbird said. “A society that stops trusting information, stops trusting the media … is one that is easy to manipulate.”