
Analyzing the ethical and societal impacts of proposed research

A group of scholars comes together to create the Ethics and Society Review, which considers how proposed research could have harmful effects on society as well as positive ones.
The initiative aims to make consideration of the ethical implications of research second nature. | Illustration by the Laundry Room.

An interdisciplinary Stanford team has created and seeks to scale a new Ethics and Society Review (ESR) that prompts researchers seeking funding to consider the ethical and societal impacts of their research and how to mitigate potential harms.

Currently, Institutional Review Boards (IRBs) are the main form of ethical review for research done in the United States that involves human subjects. IRBs are groups designated to review and monitor research and to ensure the rights and welfare of the people taking part in a study. According to the U.S. Food and Drug Administration’s regulations, IRBs have the authority to approve, deny, or require modifications to a research project, but their scope is limited to assessing the impact of the research on the individuals in the study.

The newly proposed ESR fills a critical need by considering how proposed research could have harmful effects on society as well as positive ones. Consider, for example, the effects of AI algorithms on fairness in sentencing or on who is prioritized for treatment, or the effects of a proposed technology on privacy. If there are risks or known negative effects, how might these be anticipated and mitigated?

Earlier this year, the ESR was tested in a pilot program that reviewed proposals submitted by researchers seeking funding from the Stanford Institute for Human-Centered Artificial Intelligence (HAI). The first faculty review panel included experts from fields including anthropology, communication, computer science, history, management science and engineering, medicine, philosophy, political science, and sociology. A paper published December 28 in the Proceedings of the National Academy of Sciences (PNAS) details the findings and how the ESR could be applied in other areas of research and institutions elsewhere.

Here, four of the paper’s six co-authors, Michael Bernstein, associate professor of Computer Science in the School of Engineering; Margaret Levi, the Sara Miller McCune Director of the Center for Advanced Study in the Behavioral Sciences (CASBS); David Magnus, the Thomas A. Raffin Professor of Medicine and Biomedical Ethics at Stanford Medicine; and Debra Satz, the Vernon R. and Lysbeth Warren Anderson Dean of the School of Humanities and Sciences, discuss how the ESR came to be, why it’s needed, and the impact they hope it will have.

""
Michael Bernstein, Margaret Levi, David Magnus, and Debra Satz

What is the process for the ethics and society review that you propose?

Bernstein: The engine that we usually associate with ethics review—the Institutional Review Board, or IRB—is explicitly excluded from considering long-range societal impact. So, for example, artificial intelligence projects can be pursued, published, and shared without engaging in any structured ethical or societal reflection. But even if many of these projects do not need to engage with IRBs, they need to apply for funding. The ESR is designed as a gate to funding: funding from collaborating grant programs isn't released until the researchers complete the ESR process.

Levi: The ESR depends on a partnership with a funding group that is willing to release funds to successful proposals only after the project investigators provide a statement outlining any problematic ethical implications or societal consequences of their research. Of particular interest to the review panel are mitigation strategies. If the outline is adequate, the funds are released. If the panel deems it necessary, there is iterated discussion with the panel to help figure out where there are problems, trade-offs that need to be addressed, and appropriate mitigation steps. This is more of a collaborative than a compliance model.

Why do we need an ethics review and why is the focus on potential impacts to society important?

Satz: Our current review processes do a good job of protecting individuals from unnecessary risks or harms. But some of our social problems do not show up directly as harms or risks to individuals; instead, they show up as harms to social institutions and the general social fabric that knits our lives together. New technologies are upending the way we work and live in both positive and negative ways. Some of the negative effects are not inevitable; they depend on design choices that we can change.

Magnus: Because this is not part of the IRB process, it is easy for researchers to focus solely on the risks to individual participants without considering the broader implications of their research. For example, a project developing wearable robotic devices did a great job of identifying the relevant risks that research participants would be exposed to and how to mitigate them. But the team did not consider the literature on the importance of taking the technology's downstream implications into account in the design process, such as privacy issues that are likely to arise when the devices are deployed in real-world settings but do not arise in the laboratory.

An interdisciplinary group of authors worked on this paper. How did that come about?

Satz: The problems posed by new technologies require input from many fields of knowledge, working together. The problems cannot be adequately addressed by ethicists or philosophers pronouncing from “on high”—removed from those creating and thinking about technology and science. We have found that deliberation among computer scientists, philosophers, political scientists, and others yields a deeper understanding of the challenges and provides better guidance for improving our practices.

Levi: All four of the faculty have been active—in different domains—in promoting standards for research that take into account ethical and societal implications, not just harms to individual subjects and participants. Within Stanford’s Ethics, Society, and Technology Hub, CASBS has been coordinating the implementation and evaluation of the ESR. Betsy Rajala, the program director of CASBS, and Charla Waeiss, a consultant to CASBS, have been the key players and are full partners in the writing of the PNAS paper.

Bernstein: What initially catalyzed this effort was an email that Debra Satz sent about a (rejected) grant that we were on, where she mentioned that IRBs were focused on risks to human subjects rather than risks to human society. Her comment gave words to much of the uncertainty I had faced in my career as a computer scientist, and it rattled around in my brain until I translated it into the basic concept of the ESR—ethics and societal review connected to grant funding. I quickly connected with Margaret Levi, who had been pursuing similar goals in the social sciences and had a strong interest in societal impacts of AI. We pitched it to the leadership of Stanford's HAI; they connected us with David Magnus, who has vast experience in ethics review, and the four of us were off to the races.

Did any of the results surprise you?

Bernstein: Two results surprised me. First, I expected substantial pushback from researchers along the lines of "you're adding red tape!" However, all the respondents to our survey were willing to submit to the ESR again in the future. Second, over half of researchers felt that the process had positively influenced the design of their research project. For a fairly lightweight process to benefit the design of half of projects was a huge—and very pleasant—surprise to me.

What’s next for the ESR?

Magnus: The biggest challenge is to find a way to make this scalable. It is one thing to do an ESR for 35 or 40 proposals; it is quite another to do 400 or 4,000. We hope this scaffolding will make it easier for researchers to think through the ethical and social issues raised by their research and identify strategies to mitigate any problems. [We also hope this process] becomes a routine part of research.

Levi: We also are eager to collaborate with other universities and firms to see how best to transfer our process broadly. In addition, we are considering ways to help researchers when they discover new ethical implications or societal consequences in the course of their research. In terms of improving the ESR, our plans are two-fold. First, we are determining ways to staff and support the faculty panels so that we do not misuse faculty time or place excessive demands on it. Second, and perhaps most importantly, we are building the scaffolding that will inform and transform thinking so that considering the ethical implications and societal consequences becomes second nature.
