Research & Ideas

Mehran Sahami: The evolution of computer science education

An expert takes stock of the future of the CS curriculum.

CS is not just about sitting in a cube programming, it’s about solving social problems through computation. | Illustration by Kevin Craft

Once the core American curriculum meant reading, writing and arithmetic, but Stanford professor Mehran Sahami says we might soon have to add a fourth skill to that list, “coding.”

Sahami thinks deeply about such matters. He’s the leading force behind recent changes in Stanford’s computer science curriculum. He notes that it may not be surprising that more students are choosing to major in computer science than ever before, but what might turn heads is the changing face and intellectual landscape of the field. With concerted effort, more women and minorities, and even students from traditional liberal arts and sciences backgrounds, are venturing into computer science.

Sahami says coding has become more than just video games, social media and smartphone apps. The field is an intellectual endeavor taking on the biggest issues of our day — particularly in its influence on data-driven decision making, personal privacy, artificial intelligence and autonomous systems, and the role of large platforms like Google, Facebook and Apple on free speech issues.

Sahami says that computers and algorithms are now part of the fabric of everyday life and how the future plays out will depend upon realizing more cultural and gender diversity in computer science classrooms and encouraging multidisciplinary thinking throughout computer science.

Join host Russ Altman and expert in computer science education Mehran Sahami for an inspiring journey through the computer science curriculum of tomorrow. You can listen to The Future of Everything on SiriusXM Insight Channel 121, iTunes, Google Play, SoundCloud, Spotify, Stitcher or via Stanford Engineering Magazine.

Full Transcript

Russ Altman: Today on The Future of Everything, the future of computer science education. Let’s think about it. Computer science is the toast of the town. Students are flocking to learn how to program computers and what the underpinnings of computational systems are, how they work, how they should be designed, implemented, evaluated. At Stanford University and many other places, computer science has become the number one major, in some cases eclipsing really popular traditional majors, like economics, psychology, biology.

The job market seems great for these students who have skills that are needed in almost every industry. It’s not just about creating software for PCs or iPhones, but increasingly it’s about building systems that interact with the physical world.

Think about self-driving cars, robotic assistants and other things like that. AI, artificially intelligent systems, have also become powerful with voice recognition, like Siri and Alexa, the ability to translate, the ability to recognize faces, even in our cell phones, and the kind of big data mining that is transforming the financial community, real estate, entertainment, sports, news, even healthcare. These systems promise efficiencies, but do add some worry about the loss of jobs and displaced workers.

Professor Mehran Sahami is a professor of computer science at Stanford and an expert in computer science education. Before coming to Stanford, he worked at Google, and he has led national committees that have created guidelines for computer science programs internationally.

Mehran, there is a boom in interest in computer science as an area of study. Of course, students are always encouraged to follow their passion when they choose their majors. But should we worry about whether there are enough English majors, and history majors, and all these traditional majors that I mentioned before? Is this a blip or is this a change in the ecosystem that we’re expecting now for a long time to come?

Mehran Sahami: Sure, that’s a great question. I do think it’s a sea change. I think, looking forward, there’s a real difference in terms of the decisions students are making, in terms of where they wanna go, the kinds of fields they wanna pursue. I do think we would lament it if we lost all the English majors and lost all the economics majors, because what we’re actually seeing now is more problems require multidisciplinary expertise to really solve, and so we need people across the board.

But I think what students have seen, especially in the last 10 years is that computer science is not just about sitting in a cube programming 80 hours a week. It’s about solving social problems through computation, and so that’s really brought the ability of students from computing and the ability of students in other areas to come together and solve bigger problems.

Russ Altman: Are you seeing an increase in interest in kind of joint majors where people have two feet in two different camps, say English and computer science, or the arts and computer science? Is that a thing?

Mehran Sahami: That is a thing. We actually even had a specialized program called CS+X, where X was a choice of many different humanities majors at Stanford. But rather than going through that program, we saw that students were just choosing anyway to do double majors with computer science, and lots of students do minors with computer science, and vice versa: they’ll major in something else and minor in computer science. So many students are already making this choice to combine fields.

Russ Altman: We kinda jumped right into it, but let’s step back. Tell me about a computer science education. You’re an expert at this. What does a computer science education look like? I think everybody would say, “Well, they learn how to program computers.” But I suspect, in fact, I know that it’s more than that. Can you give us a kind of thumbnail sketch of what a computer science training should look like?

Mehran Sahami: Sure. I think most people, like you said, think of computer science as just programming. And what a lot of students see when they get here is that there is far more richness to it as an intellectual endeavor. There is mathematics and logic behind it. There are the notions of building larger systems, the kinds of algorithms you can build, how efficient they are. Artificial intelligence, which you alluded to, has seen a huge boom in the last few years, because it’s allowed us to solve problems in ways that potentially were even better than could’ve been hand-crafted by human beings.

When you see this whole intersection of stuff coming together in computing, how humans interact with machines, trying to solve problems in biology, for example, as you’ve done for many years, the larger scale impact of that field becomes clear.

What students do isn’t just about programming, but it’s about understanding how programming is a tool to solve a lot of problems, and there’s lots of science behind how those tools are built and how they’re used.

Russ Altman: Great. That’s actually very exciting. It means that we’re giving them a tool set that’s gonna last them for their whole life, even as the problem of the day changes. As we think about the future, how are we doing in terms of diversity in computer science? And I mean diversity along all axes, sex and gender, underrepresented minorities, different socioeconomic groups. Are they all feeling welcome to this big tent, or do we still have a recruitment issue?

Mehran Sahami: Well, we still have a recruitment issue, but the good news is that it’s getting better. For many years, and it’s still true now, there’s a large gender imbalance in computing. It’s actually gotten quite a bit better. At Stanford now, for example, more than a third of the undergraduate majors are women, which is more than double the percentage that we had 10 years ago. The field in general is seeing more diversity. And along the lines of underrepresented minorities, different socioeconomic classes, we’re also seeing some movement there. Again, those numbers are nowhere near where we’d like them to be, which would be representative of the population as a whole. We still have a lot of work to do. But directionally, things are moving the right way. And I think part of that is also that earlier in the pipeline, in K through 12 education, there are more opportunities for computing.

Russ Altman: Actually, I’m glad you mentioned that because I did wanna ask you.

At the K through 12 level, if you’re a parent, and I don’t know if this is even fair, but in my head this is all confused with the rise of video games. Because I know there’s an issue of young people playing video games far beyond what I would’ve ever imagined and certainly far beyond what was available to me as a youth.

But what is the best advice about the appropriate level of computer exposure for somebody in their K through 12 education? Should parents be introducing kids to programming early? Can they just wait and let it evolve as a natural interest? I think of it as like, is it the new reading, writing, arithmetic, and coding? Is that the new fourth area of basic competency for grammar school in K through 12? And I know you’ve thought about these issues. What’s the right answer? What’s our current best understanding?

Mehran Sahami: Well, one of the things that’s touted a lot is a notion of what’s called computational thinking, which is a notion that would encompass some amount of programming, but also understanding just how computational techniques work and what they entail. So understanding something about data and how that might get used in a program without necessarily programming the actual program yourself.

And that brings up lots of social issues as well, like privacy. How do you protect your data? How do you think about the passwords that you have?

For a lot of these things, generally, it’s not too early to have kids learn something about it. As a parent myself, I worry about how much time they spend on screens. And the current thinking there is it’s not just about how much time is actually spent in front of a screen, but what that time is spent doing. And there are even lots of activities that don’t involve spending time in front of the screen.

So, sometimes people ask, what’s the notion of having a kindergartner or a first grader program? Can we even do that?

We say, well, the notion of programming there isn’t about sitting in front of a computer and typing. It’s about making a peanut butter and jelly sandwich. So, what does that mean? It means it’s a process, and you think about, well, you need to get out the bread, you need to get out the peanut butter, you open the jar. There’s this set of steps you have to go through. And if you don’t follow the steps properly, you don’t get the result you want. And that gets kids to think about what an algorithmic process is, without actually touching a computer.
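The sandwich analogy maps directly onto what a program is: an ordered sequence of steps where skipping or reordering breaks the result. A minimal Python sketch of that idea might look like this (the step names are invented for illustration, not from the interview):

```python
# The sandwich "algorithm": a fixed sequence of steps that must run
# in order to get the intended result. (Step names are hypothetical.)

STEPS = [
    "get out the bread",
    "get out the peanut butter",
    "open the jar",
    "spread the peanut butter",
    "put the slices together",
]

def make_sandwich(steps):
    """Perform each step in order and return the sequence completed.

    Each append stands in for actually doing the step; the point is
    that order matters and no step can be skipped.
    """
    completed = []
    for step in steps:
        completed.append(step)
    return completed

# Following every step in order yields the intended outcome.
assert make_sandwich(STEPS) == STEPS
```

The payoff of the analogy is that a child can reason about ordering and missing steps ("you can't spread the peanut butter before opening the jar") long before ever touching a keyboard.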

Russ Altman: This is The Future of Everything. I’m Russ Altman. I’m speaking with Professor Mehran Sahami about computer science and, just in the last few moments, computer science for young people. On this issue of diversity and the pipeline, is there evidence that we need new ways? Are there emerging ways of teaching computer science that are perhaps more palatable to these groups that have not traditionally been involved in computer science? I’m wondering if curricula are evolving to attract, for example, young women who might not have traditionally been attracted, or, again, underrepresented minorities. Do we see opportunities for changing the way things are taught to make it more welcoming?

Mehran Sahami: Sure, and it has to do both with the content and the culture. So, from the standpoint of content, I think one of the things that’s happened in the last few years that’s helped with increasing the diversity numbers is more people understanding the social impact of computing. And there are studies that have shown, for example, that in terms of the choice of activities, women tend to be drawn more toward majors where they see the social impact of the work.

Then, in terms of thinking about the examples you use in computer science classes and the kinds of problems that you could actually solve, you integrate in more of the social problem-solving aspect. How can we think about using computers, for example, to try to understand or treat diseases, or to understand climate change? That brings in a wider swath of people than previously came.

The other is culture. I think there’s been, and this is well documented, a lot of sexist culture in the computing community. A bright light has been shined on that in the past few years, and slowly the culture is beginning to change. And so, when you change that culture, you make it more welcoming, and you have a community of people who now feel as though they’re not on the margin but actually right in the middle of the field. It helps bring other people in who are from similar communities.

Russ Altman: So I really find that interesting. I’m not an organizational behavior expert at all, but I’ve heard that a lot of cultural change often needs to come from the top. There needs to be leadership committed to changing the culture.

So in this distributed world of education, who is the one who’s charged with changing culture? Is it the faculty? Is it the student leadership? Who does that fall to, in terms of changing the culture, and what is their to-do list?

Mehran Sahami: Yeah, that’s a great question. It actually requires many levels of engagement. Certainly in the university, faculty need to be supportive. They need to think about the curriculum that they define, the examples that they use, the language they use, how welcoming they are to students. One of the things we’ve seen here at Stanford is that the students have been very active in terms of student organizations, creating social events to bring people together, and helping not only create the community, but show others that the community exists.

But if you think in the larger scope, in industry, leaders of companies need to show that those companies are welcoming, that they’re taking steps toward diversity, that they’re really listening to their employees. That’s the place where, in fact, it changes on a larger cultural level.

Russ Altman: So, that really does make sense, and it’s great to hear that we’re making some progress in terms of the numbers. That 30%: I remember when I was in training, it was less than 10%, and it was a big problem. That was ancient history.

I know that one of the new things that you’ve been involved with, and I definitely want to have some time to talk about it, is an ethics class, ethics for computer science students. I don’t recall if it’s required or just recommended.

Tell me about why. You’re very famous, I didn’t mention this in my introduction, but you are well-known as one of the most distinguished teachers at Stanford, with a great record of getting people fired up about the material in your classes. Why did you turn your attention to ethics, and what are the special challenges in teaching ethics to this group? Do they get it? Are they excited about it, or are they like, “Why are we doing this?”

Mehran Sahami: Right, well, first of all, you’re too kind in your assessment.

But I would say, for many years, there have been classes on ethics with computing at Stanford. What we’ve done in this most recent revision, and this is with collaborators Rob Reich and Jeremy Weinstein, who are both in the political science department. Rob is a moral philosopher. Jeremy is a public policy expert.

And then I, as the computer scientist, came together with them to say: what we wanna do is a modernized revision of this class, where the problems we look at are the problems that students are gonna be grappling with in sort of the zero to 10-year time span from the time they graduate, and where it brings together these different aspects. The computing is one of them, but we also need to understand things philosophically.

What are the societal outcomes we want? What are the moral codes we wanna live by? How do we think about value trade-offs?

And then from the public policy standpoint, what can we do not only as engineers but as potentially leaders of companies, as academics, as citizens, in order to help see these kinds of changes through so we actually get the outcomes we like.

Russ Altman: This is The Future of Everything. I’m Russ Altman. I’m speaking with Mehran Sahami, and we just turned our attention to a new course in ethics, looking at big-picture ethics. So this is not “you shouldn’t cheat, you shouldn’t steal stuff, you should make sure that your authors are all recognized.” These are the big-ticket items in the ethics of computer science. Can you give me a feeling for some of the hot-button issues? I love what you said about it being zero to 10, which means literally these are issues that could be big today or tomorrow. How did you decide which issues need to come to their attention as young trainees in computer science?

Mehran Sahami: Sure. I mean, first, we sat down at the whiteboard and wrote out a bunch of topics that we thought were relevant, and quickly realized the full set was far larger than we could cover in one class. So we focused on four that we could really do deep dives into. The first one was algorithmic decision making: that computers and algorithms are used more and more to make meaningful decisions that impact our lives, for example, whether or not we get loans or mortgages, or whether, if we have some trouble with the criminal justice system, we get bail or not.

Russ Altman: And these may be without a human in the loop of that decision making?

Mehran Sahami: For some of them. Some of them have a human in the loop that’s required. For example, in the financial industry, there are some decisions that a human has to take responsibility for, but there are some micro-transactions, for example, when you try to run your credit card, where the decision might get made to deny it without a human being involved. That was the first area.

Then we looked at issues around data privacy: what kinds of policies different companies have, and the different views, say, in the United States versus Europe around privacy, so we could also look at different cultural norms.

The third unit was around AI and autonomous systems. So, our deep dive was mainly on autonomous vehicles, something that students are looking at now and society in general is gonna have to deal with, both from the technological standpoint, but more so from the issues around economics: job displacement, what automation is gonna mean in the long term, how we turn our attention to thinking about what sorts of automation we wanna build, in terms of weighing the positives and negatives in society, safety versus job displacement.

And then the last unit was on the power of large platforms, say, the Facebooks, and Apples, and Googles of the world, where now, for example, the questions around who has free speech are governed more by the large platforms, who can determine who’s on them or not, versus governments.

We’re seeing these changes of things that previously happened in the public sphere moving to the private sphere. And how do we think about that? Because there isn’t the same kind of electoral recourse if you don’t like a policy that Facebook wants to implement for who can say what on their platform.

Russ Altman: So that sounds really exciting. So tell me who signed up for this class? Was it your folks from computer science? Was it a bunch of other people saying, “Wow, I might be able to contribute to this conversation”? I mean it sounds like an incredibly open set of issues that a lot of people could contribute to, but who actually signed up?

Mehran Sahami: Yeah, the vast majority of students who signed up are computer science majors or related majors, electrical engineering, something like that.

Russ Altman: Was it required? I forgot to ask.

Mehran Sahami: It satisfies certain requirements, but it’s not the only class that does.

Russ Altman: Gotcha.

Mehran Sahami: So it’s a popular choice, though, for a lot of students. But we did get students from many different disciplines. Certainly we got some from political science, some students from the law school. We got students from other areas like anthropology in the humanities, and so there were lots of different perspectives that were brought into the class.

Russ Altman: Without putting too fine a point on it, did you find that the computer science students were well-prepared to have these conversations that were not about technical stuff? Were you pleased to see their level of preparation, or did it highlight for you the need to do more of this kind of training for everybody?

Mehran Sahami: Yeah, there’s a spectrum. There are some students who were hugely engaged, who actually made different decisions about, for example, what they wanted to pursue for their career as a result of some of the things they learned in that class, very deeply engaged with the social issues and thinking about what can I do as an engineer as well as a citizen to try to address these problems. And then we had some students who were seeing some of these things, the social issues, for the first time. And so, we’re trying to make it as broad as we can, to have something of interest to everyone who’s in there.

Russ Altman: This is The Future of Everything. I’m Russ Altman. More with Professor Mehran Sahami about computer science, ethics, and the future of education, next on SiriusXM Insight 121.

Welcome back to The Future of Everything. I’m Russ Altman. I’m speaking with Professor Mehran Sahami about computer science and ethics training and ethics education.

So, Mehran, at the end of the last segment, you talked about these four kind of pillars of your course, and they were great. And the fourth one, which I wanna dig into a little bit more, was the power of platforms, and you said something very intriguing about how in some ways the First Amendment is now adjudicated not by the courts or by the government, but by these huge platforms, such that if they turn me off, my voice on Twitter or on Facebook is gone.

How do you set that up for the students and how do you lead the discussion about the responsibilities and obligations of computer scientists and others in society in the light of this new phenomenon?

Mehran Sahami: Sure. So, one of the things we do in the class is, for each one of our areas, we have a case study that we’ve actually gotten professional writers to help write for us. And there, in the power of platforms, one of the things we look at are cases where, for example, people have been banned from particular platforms, like, say, Alex Jones on Twitter. And so, part of that is: what are the guidelines that these platforms have? How do they get applied? How can they do it in a way that scales? And so, you find all kinds of interesting phenomena there.

Sometimes a platform will keep information on the platform even if it may not be information that they deem entirely trustworthy, because they have to make that determination, and that becomes a very strange place, with the technology companies now making determinations about what is correct information.

There are also a lot of automated techniques that go into it. And so the engineers need to build something that they think can actually detect hateful speech, or things that may be beyond the boundary of the acceptable guidelines for the company.

That brings up the deeper question of what the acceptable guidelines are. They don’t have to be in line necessarily with government practices. And sometimes there are also individual human reviewers who will look at content and see whether or not —

Russ Altman: Yeah, these have been in the news recently, because some of them have very stressful jobs because of the content that they’re looking at.

Mehran Sahami: Exactly. Imagine spending eight hours a day looking at videos of people being beheaded, all kinds of horrible things that people are posting out there. In some cases, it’s actually been reported that some of these workers have things like post-traumatic stress disorder from doing this job all day.

You get into this tension between what the platform’s responsibility is and how citizens can potentially affect what the platforms are doing, and in many cases, because of the voting structures of the shares of the platform, there’s not actually a lot that individuals can do. But how do we think meaningfully about whether there should be legislation, whether there should be regulation, and at what point too much moves out of the public sphere into private decision making?

Russ Altman: So yeah, I’m struck by this problem, because there are so many facets to it. One of them is that Facebook, all of these platforms that you mentioned, are international platforms. Yes, there are some countries that might ban them. But even though they’re, in many cases, sitting in the US, and in fact, in the case of Facebook, a couple of miles from where you and I are sitting right now, they have an international audience where the laws are different. We might talk about data privacy later, but there are new laws in Europe that are quite different from the laws in the US.

How do you train the students to think about international-level issues that have to be adjudicated sometimes at a national or even sub-national level?

Mehran Sahami: Right. On one level, it’s understanding what the cultural reasons are why some of these different norms exist.

And then secondly, it’s understanding that the platforms do need to abide by particular policies in different countries, and those policies may be different.

For example, what Google can show in search results in Germany, where they have restrictions on showing information related to Nazis, is different than in the United States. And certainly, as we saw with Google eventually pulling out of China, the kinds of policies that they had to abide by if they wanted to continue to provide search there were outside of what they felt comfortable doing.

And that becomes a decision the company certainly has to make, but it also becomes a decision for individuals who, say, wanna work at those companies or support those companies: how do they make their voice heard with respect to the companies choosing to make particular policy decisions about what they do in different locales?

Russ Altman: It’s interesting. What I hear you saying is that even though Google is a global platform, it has different flavors in different countries. One of the choices is to pull out of a country because the rules are just incompatible with something that they wanna do. As an engineer for these companies, you might be working on a product that will never be deployed in country X, but will be used a lot in country Y, and you have to think about the implications and your comfort level with building these technologies that may or may not wind up being used in different settings.

I can see it. How much do you empower the students to actually voice their opinions to the people who are signing their paychecks? It’s tricky. You don’t wanna train a bunch of folks who wind up coming back and saying, “Oh, by the way, I was fired because of all the great things that I learned in this class.”

Mehran Sahami: Yeah, but at the same time you want students to have their own personal moral responsibility. You want them to make decisions that they feel are in line with their own personal ethics.

But at the same time, there are a lot of decisions that get made at the engineering level that have far-reaching consequences. So if you’re the engineer who’s working on how do I filter results out of, say, the search engine in China or in Germany, there are decisions you’re making deep down in the code, in terms of what algorithms you might be using, what techniques, what kind of data, that are gonna have real impact on the information people see. And that’s at a level that affects individuals, but it’s more granular than the decisions that are being made by the executives of the company.

And so, the engineers themselves need to be aware of that when they’re building these systems.

Russ Altman: This is The Future of Everything. I’m Russ Altman. I’m speaking with Professor Mehran Sahami now about these great scenarios that you’re using in your teaching. So, if you’ll allow me, I would just love to move to another one of your areas and find out what kind of cases you’re using, for example, in data privacy.

Mehran Sahami: Sure. In data privacy, one of the big things we’re looking at is facial recognition. These days you see articles pretty much every day on the use of the technology, which localities have it, which don’t.

San Francisco, for example, recently banned the use of facial recognition by the public sector at least, whereas many airlines are now moving to using facial recognition to check you in on a plane. As you can imagine, that brings up these tensions between privacy and security. At one level, why would we use facial recognition to get people on the plane? It’s only estimated to save a couple of minutes of time when you’re boarding a 747.

Russ Altman: It’s the overhead luggage that they should work on, not the facial recognition.

Mehran Sahami: Exactly. But the real reason is the folks who are responsible for airline security wanna be able to detect if there’s someone getting on the plane who shouldn’t be getting on the plane.

The flip side is your personal privacy. To what extent do you take facial recognition to an extreme where everyone can be tracked?

London, for example, has half a million closed-circuit TV cameras around the city. You combine that with facial recognition, and you can get a pretty good map of what most of the people in the city are doing at a given time, those who are outdoors at least.

How do we trade off the privacy implications versus security? Different people will make that decision differently. And part of the public policy question, which is why we look at the multidisciplinary facets of this class, is how as a society do we decide what we wanna do?

Russ Altman: So you must’ve had… I’m presuming you had small-group discussion sessions, because I know you had a big registration in this class, a couple of hundred people perhaps, and you can’t have a meaningful discussion among 200 people, I would think. So, did you actually break them down into smaller discussion groups?

Mehran Sahami: Yeah. There were weekly small discussions. And then one of the things we would do, which actually worked better than you would think, is we took 250 students. We had this large room that had a bunch of tables that seat about eight or 10. And so we could get them in there, seated around these tables, after they read the case study, and sort of give them guiding questions for discussion. So then they could have the discussion, and we could have call-outs, so they could share their findings or their insights across the whole class.

Russ Altman: So when you have discussion sessions, who leads them? Are these computer scientists or… You said there were a lot of political scientists involved in the class.

It’s not clear to me at all who I would want leading that discussion, because what you need to know, and the frameworks of ethics, are quite diverse. So, who leads those discussions?

Mehran Sahami: Yeah, it’s a great question. We had a large number of TAs that spanned a bunch of areas. So we had some computer scientists. We had some law students. We had students with backgrounds in philosophy and anthropology, a bunch of different areas. And in some cases, the sections were co-taught by a computer scientist, say, and someone from the law school. And so you would get these different perspectives to really bring out the richness in the conversation.

Russ Altman: And I would guess that because of the international nature of the students, people are bringing very different perspectives on issues of privacy, state power, individual human rights. I would guess there’s a huge diversity in the student body on those issues.

Mehran Sahami: Absolutely. And it’s also a nice way for students to be able to connect, because when you hear about the different kinds of issues in different countries, it’s easy to just think about them in an abstract sense and not understand why. But when you actually have someone sitting across the table saying, “I grew up here and I can tell you about why we believe the particular things we do,” it makes it much more meaningful in terms of that student engagement.

Russ Altman: Well, there you have it, the next generation of computer scientist trained in the ethics of their technologies.

Russ Altman: Today on The Future of Everything, the future of computer science education. Let's think about it. Computer science is the toast of the town. Students are flocking to learn how to program computers, what the underpinnings of computational systems are, how they work, and how they should be designed, implemented, and evaluated. At Stanford University and many other places, computer science has become the number one major, in some cases eclipsing really popular traditional majors, like economics, psychology, and biology.

The job market seems great for these students who have skills that are needed in almost every industry. It’s not just about creating software for PCs or iPhones, but increasingly it’s about building systems that interact with the physical world.

Think about self-driving cars, robotic assistants and other things like that. AI, artificially intelligent systems, have also become powerful, with voice recognition like Siri and Alexa, the ability to translate, the ability to recognize faces, even in our cell phones, and the kind of big data mining that is transforming the financial community, real estate, entertainment, sports, news, even healthcare. These systems promise efficiencies, but they also add some worry about the loss of jobs and displaced workers.

Professor Mehran Sahami is a professor of computer science at Stanford and an expert in computer science education. Before coming to Stanford, he worked at Google, and he has led national committees that have created guidelines for computer science programs internationally.

Mehran, there is a boom in interest in computer science as an area of study. Of course, students are always encouraged to follow their passion when they choose their majors. But should we worry about whether there will be enough English majors, and history majors, and all these traditional majors that I mentioned before? Is this a blip, or is this a change in the ecosystem that we're expecting now for a long time to come?

Mehran Sahami: Sure, that's a great question. I do think it's a sea change. Looking forward, there's a real difference in terms of the decisions students are making, in terms of where they wanna go and the kinds of fields they wanna pursue. I do think we would lament it if we lost all the English majors and lost all the economics majors, because what we're actually seeing now is that more problems require multidisciplinary expertise to really solve, and so we need people across the board.

But I think what students have seen, especially in the last 10 years is that computer science is not just about sitting in a cube programming 80 hours a week. It’s about solving social problems through computation, and so that’s really brought the ability of students from computing and the ability of students in other areas to come together and solve bigger problems.

Russ Altman: Are you seeing an increase in interest in kind of joint majors where people have two feet in two different camps, say English and computer science, or the arts and computer science? Is that a thing?

Mehran Sahami: That is a thing. We actually even had a specialized program called CS+X, where X was a choice of many different humanities majors at Stanford. But rather than going through that program, we saw that students were just choosing anyway to do double majors with computer science, or minoring in computer science, and vice versa: they'll major in something else and minor in computer science. So many students are already making this choice to combine fields.

Russ Altman: We kinda jumped right into it, but let's step back. Tell me about computer science education. You're an expert at this. What does a computer science education look like? I think everybody would say, "Well, they learn how to program computers." But I suspect, in fact, I know, that it's more than that. Can you give us a kind of thumbnail sketch of what computer science training should look like?

Mehran Sahami: Sure. I think most people, like you said, think of computer science as just programming. And what a lot of students see when they get here is that there is far more richness to it as an intellectual endeavor. There is mathematics and logic behind it. There are the notions of building larger systems, the kinds of algorithms you can build, how efficient they are. Artificial intelligence, which you alluded to, has seen a huge boom in the last few years, because it's allowed us to solve problems in ways that are potentially even better than could've been handcrafted by human beings.

When you see this whole intersection of stuff coming together in computing, how humans interact with machines, trying to solve problems in biology, for example, as you’ve done for many years, the larger scale impact of that field becomes clear.

What students do isn’t just about programming, but it’s about understanding how programming is a tool to solve a lot of problems, and there’s lots of science behind how those tools are built and how they’re used.

Russ Altman: Great. That’s actually very exciting. It means that we’re giving them a tool set that’s gonna last them for their whole life, even as the problem of the day changes. As we think about the future, how are we doing in terms of diversity in computer science? And I mean diversity along all axes, sex and gender, underrepresented minorities, different socioeconomic groups. Are they all feeling welcome to this big tent, or do we still have a recruitment issue?

Mehran Sahami: Well, we still have a recruitment issue, but the good news is that it's getting better. For many years, and it's still true now, there's been a large gender imbalance in computing. It's actually gotten quite a bit better. At Stanford now, for example, more than a third of the undergraduate majors are women, which is more than double the percentage that we had 10 years ago. The field in general is seeing more diversity. And along the lines of underrepresented minorities and different socioeconomic classes, we're also seeing some movement there. Again, those numbers are nowhere near where we'd like them to be, which would be representative of the population as a whole. We still have a lot of work to do. But directionally, things are moving the right way. And I think part of that is also that earlier in the pipeline, in K through 12 education, there are more opportunities for computing.

Russ Altman: Actually, I’m glad you mentioned that because I did wanna ask you.

At the K through 12 level, if you're a parent, in my head, and I don't know if this is even fair, this is all confused with the rise of video games. Because I know there's an issue of young people using video games far beyond what I would've ever imagined, and certainly far beyond what was available to me as a youth.

But what is the best advice about the appropriate level of computer exposure for somebody in their K through 12 education? Should parents be introducing kids to programming early? Can they just wait and let it evolve as a natural interest? I think of it as like, is it the new reading, writing, arithmetic, and coding? Is that the new fourth area of basic competency for grammar school in K through 12? And I know you’ve thought about these issues. What’s the right answer? What’s our current best understanding?

Mehran Sahami: Well, one of the things that's touted a lot is the notion of what's called computational thinking, which is a notion that would encompass some amount of programming, but also understanding just how computational techniques work and what they entail. So, understanding something about data and how that might get used in a program, without necessarily writing the actual program yourself.

And that brings up lots of social issues as well, like privacy. How do you protect your data? How do you think about the passwords that you have?

For a lot of these things, generally, it's not too early to have kids learn something about it. As a parent myself, I worry about how much time my kids spend on screens. And the current thinking there is that it's not just about how much time is actually spent in front of a screen, but what that time is spent doing. And there are even lots of activities that don't involve spending time in front of the screen.

So, sometimes people think about what’s the notion of having a kindergartner or a first grader program? Can we even do that?

We say, well, the notion of programming there isn’t about sitting in front of a computer and typing. It’s about making a peanut butter and jelly sandwich. So, what does that mean? It means it’s a process and you think about, well, you need to get out the bread, you need to get out the peanut butter, you open the jar. There’s these set of steps you have to go through. And if you don’t follow the steps properly, you don’t get the result you want. And that gets kids to think about what an algorithmic process is, without actually touching a computer.
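That sandwich exercise can be sketched in a few lines of code. This is purely illustrative, and not from the class itself: the step names and prerequisites below are invented, but the sketch captures the point that an algorithm only yields the intended result when its steps run in a workable order.

```python
# Toy sketch of the sandwich-making exercise: each step has prerequisites,
# and the "program" succeeds only if every prerequisite ran before its step.
# Step names and prerequisites are invented for illustration.

PREREQS = {
    "get out the bread": set(),
    "open the peanut butter jar": set(),
    "spread the peanut butter": {"get out the bread",
                                 "open the peanut butter jar"},
    "add the jelly": {"spread the peanut butter"},
}

def follows_the_recipe(steps):
    """Return True only if each step's prerequisites were done before it."""
    done = set()
    for step in steps:
        if not PREREQS[step] <= done:
            return False  # a required earlier step was skipped or reordered
        done.add(step)
    return True

in_order = ["get out the bread", "open the peanut butter jar",
            "spread the peanut butter", "add the jelly"]
out_of_order = ["add the jelly", "get out the bread",
                "open the peanut butter jar", "spread the peanut butter"]
```

Running the steps in order satisfies every prerequisite; starting with the jelly fails immediately, which is exactly the lesson the exercise teaches without a computer.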

Russ Altman: This is The Future of Everything. I'm Russ Altman. I'm speaking with Professor Mehran Sahami about computer science and, just in the last few moments, computer science for young people. On this issue of diversity and the pipeline, is there evidence that we need new ways? Are there emerging ways of teaching computer science that are perhaps more palatable to these groups that have not traditionally been involved in computer science? I'm wondering if curricula are evolving to attract, for example, young women who might not have traditionally been attracted, or, again, underrepresented minorities. Do we see opportunities for changing the way things are taught to make it more welcoming?

Mehran Sahami: Sure, and it has to do both with the content and the culture. From the standpoint of content, I think one of the things that's happened in the last few years that's helped with increasing the diversity numbers is more people understanding the social impact of computing. And there are studies that have shown, for example, that in terms of the choice of activities, women tend to be drawn more toward majors where they can see the social impact of the work.

Then, in terms of the examples you use in computer science classes and the kinds of problems that you could actually solve, you integrate in more of the social problem-solving aspect. How can we think about using computers, for example, to try to understand or treat diseases, or to understand climate change? That brings in a wider swath of people than previously came.

The other is culture. I think there's been, and this is well documented, a lot of sexist culture in the computing community. A bright light has been shined on that in the past few years, and slowly the culture is beginning to change. And when you change that culture, you make it more welcoming, and you have a community of people who now feel as though they're not on the margin but actually right in the middle of the field. That helps bring in other people from similar communities.

Russ Altman: So I really find that interesting. I'm not an organizational behavior expert at all, but I've heard that a lot of cultural change often needs to come from the top. There needs to be leadership committed to changing the culture.

So in this world of, in this distributed world of education, who is the one who’s charged with changing culture? Is it the faculty? Is it the student leadership? Who does that fall to, in terms of changing the culture, and what is their to-do list?

Mehran Sahami: Yeah, that's a great question. It actually requires many levels of engagement. Certainly in the university, faculty need to be supportive. They need to think about the curriculum that they define, the examples that they use, the language they use, how welcoming they are to students. One of the things we've seen here at Stanford is that the students have been very active in terms of student organizations, creating social events to bring people together, and helping not only create the community, but show others that the community exists.

But if you think in the larger scope in industry, leaders of companies need to show that those companies are welcoming, that they’re taking steps toward diversity. They’re really listening to their employees. That’s the place where, in fact, it’s changed on a larger cultural level.

Russ Altman: So, that really does make sense, and it's great to hear that we're making some progress in terms of the numbers. That 30-plus percent: I remember when I was in training, it was less than 10 percent, and it was a big problem. But that was ancient history.

I know that one of the new things that you've been involved with, and I definitely want to have some time to talk about it, is an ethics class for computer science students. I don't recall if it's required or just recommended.

Tell me about why. I didn't mention this in my introduction, but you're also well known as one of the most distinguished teachers at Stanford, with a great record of getting people fired up about the material in your classes. Why did you turn your attention to ethics, and what are the special challenges in teaching ethics to this group? Do they get it? Are they excited about it, or are they like, "Why are we doing this?"

Mehran Sahami: Right, well, first of all, you’re too kind in your assessment.

But I would say, for many years, there have been classes on ethics and computing at Stanford. What we've done in this most recent revision is with collaborators Rob Reich and Jeremy Weinstein, who are both in the political science department. Rob is a moral philosopher. Jeremy is a public policy expert.

And then I, as the computer scientist, came together with them to say: what we wanna do is a modernized revision of this class, where the problems we look at are the problems that students are gonna be grappling with in sort of the zero to 10-year time span from the time they graduate, and that brings together these different aspects. The computing is one of them, but we need to understand things philosophically.

What are the societal outcomes we want? What are the moral codes we wanna live by? How do we think about value trade-offs?

And then from the public policy standpoint, what can we do not only as engineers but as potentially leaders of companies, as academics, as citizens, in order to help see these kinds of changes through so we actually get the outcomes we like.

Russ Altman: This is The Future of Everything. I'm Russ Altman. I'm speaking with Mehran Sahami, and we just turned our attention to a new course in ethics, looking at big-picture ethics. So this is not "you shouldn't cheat, you shouldn't steal stuff, you should make sure that your authors are all recognized." These are the big-ticket items in the ethics of computer science. Can you give me a feeling for some of the hot-button issues? I love what you said about it being zero to 10 years out, which means literally these are issues that could be big today or tomorrow. How did you decide which issues needed to come to their attention as young trainees in computer science?

Mehran Sahami: Sure. I mean, first, we sat down at the whiteboard and wrote out a bunch of topics that we thought were relevant, and quickly realized the full set was far larger than we could cover in one class. So we focused on four that we could really do deep dives into. The first one was algorithmic decision making: computers and algorithms are used more and more to make meaningful decisions that impact our lives, for example, whether or not we get loans or mortgages, or, if we have some trouble with the criminal justice system, whether or not we get bail.

Russ Altman: And these may be without a human in the loop of that decision making?

Mehran Sahami: For some of them. Some of them have a human in the loop that's required. For example, in the financial industry, there are some decisions that a human has to take responsibility for. But there are some micro-transactions, for example, when you try to run your credit card, where the decision might get made to deny it without a human being involved. That was the first area.
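To make the micro-transaction example concrete, here is a hypothetical sketch of a purely rule-based charge check with no human in the loop. All the thresholds and field names are invented for illustration; real card issuers use far more sophisticated, often learned, models.

```python
# Hypothetical, rule-only card-charge check: approve or deny is decided
# entirely by fixed rules, with no human involved. Thresholds and field
# names are invented for illustration.

def decide_charge(amount, home_country, merchant_country, spent_today):
    """Return 'approve' or 'deny' for a single card charge, by rule only."""
    if spent_today + amount > 1000:
        return "deny"      # daily spending cap exceeded
    if merchant_country != home_country and amount > 200:
        return "deny"      # unusually large foreign charge
    return "approve"       # everything else goes through automatically

# A routine domestic purchase is approved with no review...
print(decide_charge(40, "US", "US", spent_today=120))   # approve
# ...while a large foreign charge is denied the same way, automatically.
print(decide_charge(350, "US", "FR", spent_today=0))    # deny
```

The point of the sketch is the one made above: both outcomes, including the denial, are produced by code alone, which is exactly the kind of decision the class asks students to examine.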

Then we looked at issues around data privacy: what kinds of policies different companies have, and the different views, say, in the United States versus Europe around privacy, so we could also look at different cultural norms.

The third unit was around AI and autonomous systems. Our deep dive was mainly on autonomous vehicles, something that students are looking at now and society in general is gonna have to deal with, both from the technological standpoint and, more so, from the issues around economics: job displacement, what automation is gonna mean in the long term, and how we think about what sorts of automation we wanna build, weighing the positives and negatives for society, safety versus job displacement.

And then the last unit was on the power of large platforms, say the Facebooks, Apples, and Googles of the world, where now, for example, the questions around who has free speech are governed more by the large platforms, who can determine who's on them or not, than by governments.

We're seeing these changes where things that previously happened in the public sphere are moving to the private sphere. And how do we think about that? Because there isn't the same kind of electoral recourse if you don't like a policy that Facebook wants to implement for who can say what on their platform.

Russ Altman: So that sounds really exciting. So tell me who signed up for this class? Was it your folks from computer science? Was it a bunch of other people saying, “Wow, I might be able to contribute to this conversation”? I mean it sounds like an incredibly open set of issues that a lot of people could contribute to, but who actually signed up?

Mehran Sahami: Yeah, the vast majority of students who signed up were computer science majors or related majors, like electrical engineering, something like that.

Russ Altman: Was it required? I forgot to ask.

Mehran Sahami: It satisfies certain requirements, but it’s not the only class that does.

Russ Altman: Gotcha.

Mehran Sahami: So it's a popular choice, though, for a lot of students. But we did get students from many different disciplines. Certainly we got some from political science, some students from the law school. We got students from other areas, like anthropology from the humanities, and so there were lots of different perspectives that were brought into class.

Russ Altman: Without putting too fine a point on it, did you find that the computer science students were well prepared to have these conversations that were not about technical stuff? Were you pleased to see their level of preparation, or did it highlight for you the need to do more of this kind of training for everybody?

Mehran Sahami: Yeah, there's a spectrum. There were some students who were hugely engaged, who actually made different decisions about, for example, what they wanted to pursue for their career as a result of some of the things they learned in that class, who were very deeply engaged on the social issues and thinking about what they can do as engineers as well as citizens to try to address these problems. And then we had some students who were seeing some of these things, the social issues, for the first time. And so we're trying to make it as broad as we can, to have something of interest to everyone who's in there.

Russ Altman: This is The Future of Everything. I’m Russ Altman. More with Professor Mehran Sahami about computer science, ethics, and the future of education, next on SiriusXM Insight 121.

Welcome back to The Future of Everything. I’m Russ Altman. I’m speaking with Professor Mehran Sahami about computer science and ethics training and ethics education.

So, Mehran, at the end of the last segment, you talked about these four kind of pillars of your course, and they were great. And the fourth one, which I wanna dig into a little bit more, was the power of platforms, and you said something very intriguing about how in some ways the First Amendment is now adjudicated not by the courts or by the government, but by these huge platforms, so that if they turn me off, my voice on Twitter or on Facebook is gone.

How do you set that up for the students and how do you lead the discussion about the responsibilities and obligations of computer scientists and others in society in the light of this new phenomenon?

Mehran Sahami: Sure. So, one of the things we do in the class is we have for each one of our areas a case study that we’ve actually gotten professional writers to help write for us. And there in the power of platforms, one of the things we look at are cases where, for example, people have been banned from particular platforms, like say Alex Jones on Twitter. And so, part of that is what are the guidelines that these platforms have? How did they get applied? How can they do it in a way that scales? And so, you find all kinds of interesting phenomena there.

In some cases, a platform will keep information up even if it may not be information that they deem entirely trustworthy, because they have to make that determination. And that becomes a very strange place, with the technology companies now making determinations about what is correct information.

There are also a lot of automated techniques that go into it. And so the engineers need to build something that they think can actually detect hateful speech, or things that may be beyond the boundary of the acceptable guidelines for the company.
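As a toy illustration of that kind of automated screening (everything here is invented; real platforms use trained classifiers, not word lists), a filter might auto-remove clear violations and route borderline posts to a human reviewer:

```python
# Purely illustrative content screen: count hits against a blocklist,
# auto-remove clear violations, and escalate borderline posts to a
# human reviewer. The placeholder tokens and thresholds are invented;
# production systems use learned models, not word lists.

BLOCKLIST = {"badword1", "badword2", "threatword"}  # placeholder tokens

def screen_post(text, review_at=1, remove_at=2):
    """Return 'allow', 'human_review', or 'remove' for one post."""
    hits = sum(1 for word in text.lower().split() if word in BLOCKLIST)
    if hits >= remove_at:
        return "remove"        # clearly over the line: removed automatically
    if hits >= review_at:
        return "human_review"  # borderline: a person makes the call
    return "allow"
```

Even in a sketch this crude, the design questions from the conversation show up: someone has to choose the blocklist and thresholds, and those choices, made by engineers, determine what speech the system tolerates.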

That brings up the deeper question of what the acceptable guidelines are. They don't have to be in line necessarily with government practices. And sometimes there are also individual human reviewers who will look at content and see whether or not —

Russ Altman: Yeah, these have been in the news recently, because some of them have very stressful jobs because of the content that they’re looking at.

Mehran Sahami: Exactly. Imagine spending eight hours a day looking at videos of people being beheaded, all kinds of horrible things that people are posting out there. In some cases, it's actually been reported that some of these workers have things like post-traumatic stress disorder from doing this job all day.

You get into this tension between what the platform's responsibility is and how citizens can potentially affect what the platforms are doing. In many cases, because of the voting structures of the shares of the platform, there's not actually a lot that individuals can do. But how do we think meaningfully about whether there should be legislation, whether there should be regulation, and at what point too much moves out of the public sphere into private decision making?

Russ Altman: So yeah, I'm struck by this problem, because there are so many facets to it. One of them is that Facebook, all of these platforms that you mentioned, are international platforms. Yes, there are some countries that might ban them. But even though they're, in many cases, sitting in the US, and in fact in the case of Facebook, a couple of miles from where you and I are sitting right now, they have an international audience where the laws are different. We might talk about data privacy later, but there are new laws in Europe that are quite different from the laws in the US.

How do you train the students to think about international-level issues that have to be adjudicated sometimes at a national or even sub-national level?

Mehran Sahami: Right. At one level, it's understanding the cultural reasons why some of these different norms exist.

And then, secondly, it's understanding that the platforms do need to abide by particular policies in particular countries, and those policies may be different.

For example, what Google can show in search results in Germany, where there are restrictions on showing information related to Nazis, is different than in the United States. And certainly, as we saw with Google eventually pulling out of China, the kinds of policies that they had to abide by if they wanted to continue to provide search there were outside of what they felt comfortable doing.

And that becomes a question for the company certainly to have to make, but it also becomes a decision for individuals who, say, wanna work at those companies or support those companies, that how do they make their voice heard with respect to the companies choosing to make particular policy decisions as to what they do in different locales.

Russ Altman: It's interesting. What I hear you saying is that even though Google is a global platform, it has different flavors in different countries. One of the choices is to pull out of a country, because the rules are just incompatible with something that they wanna do. As an engineer for these companies, you might be working on a product that will never be deployed in country X, but will be used a lot in country Y, and you have to think about the implications and your comfort level with building these technologies that may or may not wind up being used in different settings.

I can see that. How much do you empower the students to actually voice their opinions to the people who are signing their paychecks? It's tricky. You don't wanna train a bunch of folks who wind up coming back and saying, "Oh, by the way, I was fired because of all the great things that I learned in this class."

Mehran Sahami: Yeah, but at the same time you want students to have their own personal moral responsibility. You want them to make decisions that they feel are in line with their own personal ethics.

But at the same time, there are a lot of decisions that get made at the engineering level that have far-reaching consequences. So if you're the engineer who's working on how to filter results out of, say, the search engine in China or in Germany, there are decisions you're making deep down in the code, in terms of what algorithms you might be using, what techniques, what kind of data, that are gonna have a real impact on the information people see. And that's at a level that is affecting individuals, but is more granular than the decisions that are being made by the executives of the company.

And so, the engineers themselves need to be aware of that when they’re building these systems.

Russ Altman: This is The Future of Everything. I’m Russ Altman. I’m speaking with Professor Mehran Sahami now about these great scenarios that you’re using in your teaching. So, if you’ll allow me, I would just love to move to another one of your areas and find out what kind of cases you’re using, for example, in data privacy.

Mehran Sahami: Sure. In data privacy, one of the big things we're looking at is facial recognition. These days you see articles pretty much every day on the use of the technology, which localities have it, and which don't.

San Francisco, for example, recently banned the use of facial recognition, by the public sector at least, whereas many airlines are now moving to using facial recognition to check you in on a plane. As you can imagine, that creates these tensions between privacy and security. At one level, why would we use facial recognition to get people on the plane? It's only estimated to save a couple of minutes of time when you're boarding a 747.

Russ Altman: It’s the overhead luggage that they should work on, not the facial recognition.

Mehran Sahami: Exactly. But the real reason is the folks who are responsible for airline security wanna be able to detect if there’s someone getting on the plane who shouldn’t be getting on the plane.

The flip side is your personal privacy. To what extent do you take facial recognition to an extreme where everyone can be tracked?

London, for example, has half a million closed-circuit TV cameras around the city. You combine that with facial recognition, you can get a pretty good map of what most of the people in the city are doing at a given time, who are outdoors at least.

How do we trade off the privacy implication versus security? Different people will make that decision differently. And part of the public policy question, which is why we look at the multidisciplinary facet of this class, is how as a society do we decide what we wanna do?

Russ Altman: So you must've had… I'm presuming you had small group discussion sessions, because I know you had a big registration in this class, a couple of hundred people perhaps, and you can't have a meaningful discussion with 200 people, I would think. So, did you actually break them down into smaller discussion groups?

Mehran Sahami: Yeah. There were weekly small discussions. And then one of the things we would do, which actually worked better than you would think, is we took 250 students into this large room that had a bunch of tables that seat about eight or 10. And so we could get them in there, seated around these tables after they read the case study, and give them guiding questions for discussion. So then they could have the discussion, and we could have call-outs, so they could share their findings or their insights across the whole class.

Russ Altman: So when you have discussion sessions, who leads them? Are these computer scientists, or… You said there were a lot of political scientists involved in the class.

It's not clear to me at all who I would want leading that discussion, because what you need to know, and the frameworks of ethics, are quite diverse. So, who leads those discussions?

Mehran Sahami: Yeah, it's a great question. We had a large number of TAs that spanned a bunch of areas. So we had some computer scientists. We had some law students. We had students with backgrounds in philosophy and anthropology, a bunch of different areas. And in some cases, the sections were co-taught by, say, a computer scientist and someone from the law school. And so you would get these different perspectives to really bring out the richness in the conversation.

Russ Altman: And I would guess that because of the international nature of the students, people are bringing very different perspectives on issues of privacy, state power, individual human rights. I would guess there’s a huge diversity in the student body on those issues.

Mehran Sahami: Absolutely. And it’s also a nice way for students to be able to connect, because when you hear about the different kinds of issues in different countries, it’s easy to just think about them in an abstract sense and not understand why. But when you actually have someone sitting across the table saying, “I grew up here and I can tell you about why we believe the particular things we do,” it makes it much more meaningful in terms of that student engagement.

Russ Altman: Well, there you have it: the next generation of computer scientists, trained in the ethics of their technologies.

Thank you for listening to The Future of Everything. I'm Russ Altman. If you missed any of this episode, listen any time on-demand with the SiriusXM app.