
What can Mary Shelley’s classic, Frankenstein, teach us today?

Two hundred years after the book’s publication, we’re still grappling with how science affects society, and the role of scientists and engineers in managing that relationship.

Is it possible to get the upside of technology without the downside? | Illustration by Stefani Billings

In a celebration of the 200th anniversary of Mary Shelley’s Frankenstein, Philosophy Talk radio show hosts and Stanford philosophers Ken Taylor and Josh Landy were joined by Persis Drell, university provost and former dean of the School of Engineering, for a conversation about the responsibility of scientists to weigh the impact of their inventions.

At turns sprawling and incisive, uproarious and deadly serious, the discussion explored the dark side of technology, from nuclear weapons, social media and self-driving cars to biotechnologies, global warming and artificial intelligence. Throughout, the speakers returned again and again to the fundamental issue of balancing the freedom to innovate with the social responsibility to prevent “monstrous outcomes,” whether they come in the form of workforce displacement, democratic erosion or global calamity. Is it possible, they wondered, to get the upside of technology without the downside?

“It’s not as though everything is the Wild West, and we just have to throw up our hands,” said Landy. “We could decide as a society we want to shift some of these areas of technological innovation more in the direction of things that have a little bit of in-built policing and responsibility.”

Drell agreed, noting that since we don’t want to stop innovation, it’s incumbent on the creators of technology to be leaders and think through the full range of outcomes of their work. “It’s absolutely critical for the technologists themselves to have a sense of moral and social responsibility toward what they see themselves developing,” she said. For example, while social media companies have largely been slow to police themselves, she said, researchers working with technologies like recombinant DNA have placed clear limits on their own work in the name of social good.

The first step to mitigating the risks of technology is to understand the nature of the threat, Drell said. And while the culture of Silicon Valley in particular has been to build things first and ask for forgiveness later, Drell said she believes we’re starting to move away from that. “I see, optimist that I am, the beginnings of that development of social responsibility. We see that in our students here as well.”

“Engineers need to be educated not just in engineering,” Drell said. “They need to care about the impacts of the technologies they’re going to be involved in inventing. They don’t get that by taking more physics classes or more math classes or more engineering classes. They get that by taking a philosophy course, or by taking a literature course.”