
The future of computer-aided education

An expert in creating algorithms that help students learn says that the future of education almost certainly involves AI, but we must never lose the human element.
How can computers improve educational experiences for students? | iStock/ATHVisions

Chris Piech is a professor of computer science who studies how computers can help students learn.

In comparing human- and computer-aided education, he says humans are great one-on-one, but AI is more consistent at grading and feedback. He and colleagues have created several generative AI grading apps to take advantage of these relative strengths, as he tells host Russ Altman on this episode of Stanford Engineering’s The Future of Everything podcast.



[00:00:00] Chris Piech: How do you get joy in education? I think there's plenty of kids in K-12 education. There's plenty of adults who need to get retrained, uh, and they're not necessarily finding the experience to be that motivating or joyful. Um, and part of my job is to come up with tools and things that can bring that joy into education.

[00:00:17] Russ Altman: This is Stanford Engineering's The Future of Everything, and I'm your host Russ Altman. If you enjoy The Future of Everything, please hit follow in the app that you're listening in. This will guarantee that you never miss an episode. 

[00:00:34] Today, Professor Chris Piech from Stanford University will tell us how he's training computers to help students learn better and to help teachers teach better. It's the future of computer-aided education. 

[00:00:47] Before we get started, please remember to follow the show to ensure that you get alerted to all of our new episodes and never miss an episode on the future of anything. 

[00:00:56] Computers are everywhere in our lives. But one of the places we don't think about them as much is in the school room. We have teachers. We look up to our teachers. They teach us stuff. Then they give us grades. That's how we've always done it. Well, guess what? Things are changing. We're starting to see sophisticated computer programs that actually make a difference in education. What do I mean? They make students learn better. The students may even enjoy the experience more. And they help teachers think about how they're going to present information and with grading and assessment. 

[00:01:35] Well, Chris Piech is a professor of computer science at Stanford University and an expert on using AI and computers to teach. He focuses a little bit on teaching computer science, his chosen area, but he's had a wide array of applications that may surprise you. 

[00:01:50] Chris, you focus on teaching people how to program, how to do computer coding, uh, with computer assistance. For people who haven't ever coded, can you just define what that's like and why it's challenging?

[00:02:04] Chris Piech: Yeah. A good way to describe it could be, imagine you want to give instructions to a computer, maybe you want it to open a file, read through and then do some work for you, or maybe you want to make a beautiful game. All of that, everything you experience on your computer, on your phone. It's all been created by a person who's written some code.

[00:02:22] Now, the experience of writing code is maybe surprisingly mundane. It's a little bit like opening up a Google doc and writing down your instructions. You need to learn to speak the language of these instructions. And that's what I teach in my class. 

[00:02:35] Russ Altman: Okay. So that's what we're trying to get to. And now I know that you are, you're committed to trying to understand how we can kind of use computers to improve that experience. That will include AI and I'm sure we'll get to AI, but what is, what's the problem that we're trying to solve here? Are there not enough coders who know how to teach? Are we unable to create the environment for people to learn how to code? What's the problem? 

[00:02:59] Chris Piech: Yeah, so, um, you know, I'll actually broaden it a little bit in that I love teaching. Like, I think I was one of those people who was just born to teach. It just brings me so much joy. I would probably teach anything. I just happen to love coding as well. But I think a lot of the challenges are broader.

[00:03:13] It's like, what are the challenges in learning? And if I had to name one, I think motivation is a pretty big one. Like, how do you get joy in education? I think there's plenty of kids in K-12 education. There's plenty of adults who need to get retrained, uh, and they're not necessarily finding the experience to be that motivating or joyful. Um, and part of my job is to come up with tools and things that can bring that joy into education. 

[00:03:37] Russ Altman: Okay. So let me just ask, um, it is not my instinct, when I think about setting up a joyful learning experience, to do, to go to a computer screen and interact with a computer, right? I think about a group of people, maybe a teacher who is inspiring, who relates to me and also gets me thinking about how I can improve myself and learn stuff. So tell me how this turns into, we should make computers better at teaching. 

[00:04:05] Chris Piech: Yeah, so actually, um, the question is really well posed, and I feel like, uh, I approach it from a similar perspective. You know, I'm an old man now. I've done a lot of experimentation. 

[00:04:13] Russ Altman: You have no idea, but we'll get to that later.

[00:04:17] Chris Piech: Um, you know, the single thing that I've seen have the biggest impact on learners is that relationship building with a teacher. Um, and I think we can all appreciate and relate to that. Um, so, maybe not surprisingly, when the pandemic came and I had my chance to try and build my own version of joyful education, I had all the AI tools at my disposal.

[00:04:41] I sit in the AI lab at Stanford, um, but I actually turned to a rather simple idea that I think might really be important for education, uh, which is, people who have recently learned can actually be remarkable teachers if they're set up properly. So, you know, when I had my chance, I set up a classroom with ten thousand students, but a thousand teachers. Uh, the teachers were all just a little bit beyond the students and they're all leading a group of ten. I think learning's so relationship driven. 

[00:05:12] Russ Altman: I, you know, I love that because I tell, when I'm giving kind of mentorship to new teachers, and I've been teaching for a long time as well, I say to them, you don't have to know everything. You just have to be about an hour ahead of all the students that you're about to teach, right?

[00:05:27] Because the class will be an hour. And so if I can get them to everything that I know, they'll never know that the next minute I would have been clueless. 

[00:05:35] Chris Piech: Yeah. And, you know, to play it really safe, we get teachers who are about six weeks ahead of a student but. 

[00:05:40] Russ Altman: But so that's very powerful. And how does that work?

[00:05:43] So there's a whole bunch of things there. First of all, you have a whole range of students coming in at different levels of skill and knowledge, and these teachers, um, who are recent enthusiasts and recent acquirers of knowledge. It's easy to believe that they're enthusiastic, but how do you set it up for their success?

[00:06:04] Chris Piech: Yeah, you know, there's a program called Code in Place, it's a Stanford program, and folks are welcome to join if they find it interesting. Um, so the way we set it up is we make sure that the teachers have good training. Uh, we kind of have an application process and we select for who we think is ready to take that step.

[00:06:24] Uh, and here's one of the interesting things. By this point, Russ, we've had about four thousand teachers. Um, and in that experience, we've really gotten to learn what makes for a good teacher, and you would be surprised how great amateurs can be. Um, you know, one way of talking about it is, if they can learn to be humble and not overstate what they know, they have this real advantage, which is that they know the struggle. They know what it's like. 

[00:06:50] Russ Altman: And it's been recent. 

[00:06:52] Chris Piech: Yeah. Um, and so, you know, as I said, we're learning a lot about what makes for great teaching. And certainly one of the big findings is we were really underestimating how great amateurs could be. 

[00:07:02] Russ Altman: So you really gave a great answer because I was asking about how are we going to have those inspiring people in front of us? And even if it's through a screen or through a Zoom, it sounds like you've addressed that. And that's part of the plan. 

[00:07:14] Let's zoom out. I want to ask about the status of automatic coding and AI grading, and there's just so many topics that your work touches upon. I guess, I just want to, for people who don't, aren't aware of it, uh, computer programming has kind of been revolutionized even among professionals in the last couple of years because of AI.

[00:07:34] And could you just describe for us what the status is? 'Cause I'm sure that impacts the ways in which you think about how to get a new coder, uh, into the field. 

[00:07:43] Chris Piech: Yeah. So for those of you, um, who are interested, there's this big revolution that's happened in the last few years. Uh, you could maybe call it the generative AI revolution. Uh, basically we realized that if we throw an insane amount of computer power at a thing called a neural network, it can start to do crazy things like produce language. And you might have interfaced with something like ChatGPT or heard about it. What you might not know is that ChatGPT isn't just able to write text or poems about pirates, it can also write Python and computer code.

[00:08:16] Uh, Python is one of the languages that we program in. And I would almost argue, I would say it's probably better at coding than at language. So if you've experienced these things, uh, you know, maybe a good metaphor is, whatever you're thinking about its capabilities to produce fluent text, um, you know, I would say it's doing a pretty good job of coding as well. 

[00:08:34] Russ Altman: Okay. So that is now a thing. And if I understand it even professional coders are taking advantage of the, of these tools. Um, so, 

[00:08:42] Chris Piech: Yeah, I program all the time. I use them all the time. I mean, I don't know if you need to talk about this, but I've never had more fun programming.

[00:08:48] Russ Altman: Okay. So it has not taken the joy away from programming because I, as a youth, I did a lot of programming myself and it was super fun. It was literally the only thing that could keep me up all night was a thorny, exciting programming challenge. Uh, and, um, well, that's a whole different story. Uh, so how does that change how you want to teach people the principles of coding? 

[00:09:12] You know, I'm sure it's very analogous to the problem that English teachers are having right now with writing, which is that students have access to these powerful tools. It forces you to rethink, how am I going to teach them to write an essay? And what's the equivalent question in your world? 

[00:09:28] Chris Piech: Yeah. Well, okay. So let's start with how fun programming is. And let's talk about why it's so fun. And then I think that will help answer all your questions. So programming, it's a joy if done right. And one of the reasons for the joy is you're creating things out of just what is in your imagination.

[00:09:44] You're like, I dream of this particular game, and I can turn that dream into something I can share with my loved ones. And that creation is such a human experience, like you've just made something that you can give, and if a loved one says that what you made is valuable, uh, it's like the best feeling. Um, and that's one of the reasons that I'm having more fun now, is because actually these tools have elevated what I'm able to create.

[00:10:05] You know, there used to be limitations to how quickly I could read documentation and learn about new APIs and how quickly I could pick up new languages. Um, and now with these new tools, I'm just having the time of my life. And speaking on that, you brought up English teachers having to teach essays, right?

[00:10:24] As I said, I think a lot about motivation and education. Um, and I'll say that there's this opportunity for creation all over, and it's not just in coding. Funny story, I did the funniest thing with my three-year-old. Of course, I want her to be able to read and write, just like every other father.

[00:10:46] Uh, but also, I got bored one morning and you know what we did? We pulled up these large language models and we just wrote a book. It was producing the images, um, and it was helping us with the story when we got stuck. And then once we got that book written, I just spent thirty more minutes and I got it printed with Google Photos, uh, and then I had her holding a book.

[00:11:06] Russ Altman: Oh my goodness. 

[00:11:06] Chris Piech: It was the coolest thing. You can imagine my three-year-old, like, she likes books, but to hold a book that she had helped craft, it was just a neat experience for her. 

[00:11:13] Russ Altman: And she was fully aware of being part of this creative process, it sounds like. 

[00:11:17] Chris Piech: Oh, yeah. A little bit too aware. Now, if you come to my house, it's like the first thing she'll show you, like, we're trying to, like, tone that down, but that's a different story. But you know, like, that joy is cool. Hey, so Russ, what am I getting at? 

[00:11:30] This moment for education could really be different. And if you talk to a lot of smart people, you'll hear a lot of people say this moment will be different because AI will be a great tutor. I'm not saying that's wrong, but I'm saying one of the reasons this moment will be different is not because of what AI can do, it's what humans can do. It's using these tools to expand what we can create. Could be a way to unlock making learning more fun. You could go much faster from learning the intros to I'm creating something of value. And I'm excited about that. 

[00:11:57] Russ Altman: I can't tell you how reassuring it is to me, and I'm sure lots of people, to have someone like you, who's working on computer-aided education, and I know that's an old-fashioned term, say those things about joy and creation and the human aspect. It is so reassuring because as a child of the nineteen sixties and seventies, um, you probably have studied, as an archaeologist, the kinds of tools I was exposed to, where I remember as, like, a seventh grader saying, I will never sit in front of a computer and try to learn anything.

[00:12:29] And this is, you know, I became some, something of a computer professional, but they were such atrocious experiences that it probably delayed my open mindedness by at least thirty years. Okay. 

[00:12:41] Chris Piech: Do you want to hear a story on that? 

[00:12:45] Russ Altman: I do. 

[00:12:45] Chris Piech: Okay, so, you know, I'm a scientist as well. Um, and as you said, I'm a teacher, but I also, and I make tools, but I also like to do some science to see what's working, what's not working. Last year, as we mentioned, I had this big class, ten thousand students, a thousand teachers, and we actually ran two experiments. 

[00:13:04] In one experiment, we had early access to GPT-4. Half the students got early access, half the students did not. We do this A/B testing when we're not sure if a tool is going to help people. We ran a second experiment, going on at the same time. In this second experiment, we kind of had a breakthrough in how we could do one-on-one teaching. It's hard to do with ten thousand people. We don't need to get into the details, but we'd had a breakthrough. And it allowed us to do one-on-one teaching. We did a similar experiment where some people would get access to one-on-one teaching.

[00:13:31] And then some people would, eventually everyone gets access to everything. But at the beginning, we would like to learn what's working. Okay, I'll first tell you about the one-on-one teaching. In the one-on-one teaching, fifteen minutes with a near peer, so somebody who's, like, six weeks ahead of you, meant a ten percentage point improvement in your chance of completing the material of the class. Huge. 
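For a sense of the arithmetic behind a result like that, here is a minimal sketch of how one might compare completion rates between the two arms of such an A/B test. The student counts and rates are invented for illustration; the real analysis would be more careful.

```python
import math

def completion_gap(n_a, completed_a, n_b, completed_b):
    """Percentage-point gap in completion rate between arms A and B,
    plus a two-proportion z statistic for how convincing the gap is."""
    p_a, p_b = completed_a / n_a, completed_b / n_b
    pooled = (completed_a + completed_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return 100 * (p_a - p_b), (p_a - p_b) / se

# Invented numbers: 5,000 students per arm, 45% vs. 35% completion.
gap_pp, z = completion_gap(5000, 2250, 5000, 1750)
```

With samples that large, a ten percentage point gap corresponds to a very large z statistic, which is what makes the result so striking.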

[00:13:49] Russ Altman: Yes, yes.

[00:13:49] Chris Piech: I've never seen a result like it. That's the biggest result I've ever seen. So I'll invite you to think, what do you think the AI did? Did it achieve that human level of ten percentage point improvement? 

[00:14:02] Russ Altman: So this is a human who has just learned, who's enthusiastic. You already told us that you filtered them for a bunch of characteristics. I can't imagine the AI was as good. 

[00:14:15] Chris Piech: Not only was it not as good, you were four percentage points less likely to finish the class if you had access to GPT-4. So we looked at the conversations. They were healthy conversations, they were talking about concepts. I think you're not the only child of the sixties.

[00:14:32] There's a lot of us who, there's something about the human that I don't want us to lose. Hey, there's something about the AI that's really cool. 

[00:14:38] Russ Altman: Yes. 

[00:14:38] Chris Piech: We'll talk about how we can bring that into the future of learning, but we better not let the human part go. 

[00:14:43] Russ Altman: Good. Good. Okay. I feel validated and thank you very much. Okay. So let's go into, I know one of the areas, you've done a lot of stuff and I, you know, I Google stalked you last night. So I know about all your papers. One of the things you've looked at is assessment of students and that's getting serious, right? Because this has impacts on their future, on their, um, uh, on their job potential, on their ability to get into the next level of, uh, either education or job. So it's very sensitive and people are, I'm sure worried about it. 

[00:15:15] Tell me how you approach computational assessment of learners. How should we think about it? And I'm guessing you've done humanistic things in that direction, but I have no idea what they are.

[00:15:26] Chris Piech: No, no, it's such a good question. So like, you know, um, for a while there, one of my main quests was to help us understand humans based off their work. So you're learning physics, you're learning English, you're learning programming. While you're doing this learning, you're producing work. And, you know, I think it's a grand challenge in algorithmic education to understand you from the open-ended work. 

[00:15:48] Hey, if you're doing multiple choice, by the way, boy, can I really model what you know. It turns out I've got this algorithm that’s called Deep Knowledge Tracing. We use it in Duolingo, but, um, I don't dream of people doing multiple choice questions. I dream of people, uh, doing more complicated things. So that's the first bit. You know, no one likes assessment, but what we're able to grade is the assessments we can give.

[00:16:12] And if we can grade more complicated things, we'll be able to, as teachers, have more interesting learning experiences. So if the only thing we can grade is multiple choice, you're going to get a lot of multiple choice tests. If the only thing we can grade is you programming, you're getting a lot of programming.

[00:16:28] You know, the dream, Russ, though, is that you could be doing an open-ended project. Doing something that you really find motivating and enjoyable, and then we can give you feedback. Okay, so that's just a framework. From there we could talk a lot about the state of the art. 
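The Deep Knowledge Tracing idea Piech mentions, a recurrent network that reads a student's sequence of question-and-answer interactions and predicts what they know, can be sketched roughly like this. Everything here (the question bank size, the tiny hand-rolled RNN, its random untrained weights) is invented for illustration:

```python
import math
import random

random.seed(0)
n_questions = 5   # size of the question bank (invented)
hidden = 8        # recurrent state size (invented)

def rand_matrix(rows, cols):
    return [[random.gauss(0, 0.1) for _ in range(cols)] for _ in range(rows)]

W_x = rand_matrix(hidden, 2 * n_questions)   # input -> hidden
W_h = rand_matrix(hidden, hidden)            # hidden -> hidden
W_y = rand_matrix(n_questions, hidden)       # hidden -> per-question logits

def encode(question, correct):
    """One-hot encoding of a single (question, was-it-correct) interaction."""
    x = [0.0] * (2 * n_questions)
    x[question + (n_questions if correct else 0)] = 1.0
    return x

def predict_next(history):
    """Run the recurrent net over the student's history; return the
    estimated probability of answering each question correctly next."""
    h = [0.0] * hidden
    for question, correct in history:
        x = encode(question, correct)
        h = [math.tanh(sum(W_x[i][j] * x[j] for j in range(len(x))) +
                       sum(W_h[i][j] * h[j] for j in range(hidden)))
             for i in range(hidden)]
    logits = [sum(W_y[k][i] * h[i] for i in range(hidden))
              for k in range(n_questions)]
    return [1.0 / (1.0 + math.exp(-z)) for z in logits]

# A student who answered questions 0 and 1 correctly but missed question 2:
probs = predict_next([(0, True), (1, True), (2, False)])
```

In a trained system, those output probabilities would be fit to real student response data; the point of the sketch is only the shape of the model: interactions in, a per-question mastery estimate out.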

[00:16:42] Russ Altman: Well, yeah, so I would like to know, um, uh, where are we, you know, I, one of the things that comes to mind and it's terrible is, and I don't even want to go there yet because we haven't talked about grading and assessment, but there's also the issue of cheating, which is kind of intimately tied in with all this.

[00:16:58] And I don't even like saying that word, right? It's an ugly word. It's an ugly idea. So maybe we'll put that aside, but how do you approach grading fairly? Uh, and what do you do about the variety of backgrounds and cultural assumptions that students come in with? And then when you try to reduce all of that, either to a number between one and a hundred, or a letter from A to F, or whatever it is, or even a paragraph. You know, our friends at UC Santa Cruz, they write paragraphs describing how the student did. Um, how are you thinking about how to do that? And where is the state of the art? 

[00:17:33] Chris Piech: Yeah. 

[00:17:34] Russ Altman: How do you want your three-year-old to be graded when they get to first grade? 

[00:17:37] Chris Piech: Oh, I love the question. You know, um, and I'll bring the same sensitivity, like, who am I to judge anyone? You know, I'm a curious human. You're a curious human. Like, I don't presume that, I don't really like this position of power. And you mentioned how it could affect people's lives and that really makes me nervous. Um, but it's a big deal and there's a lot of nuance to it. You know, the cheating nuance is an interesting one. Uh, the effect on people's lives, interesting one, but maybe a safe place to start is actually the demographic fairness. 

[00:18:07] Russ Altman: Okay. 

[00:18:07] Chris Piech: So, uh, you know, Russ, I might be one of the few people in the world who's, like, spent a decade studying this. We've got a bunch of algorithms that you could call state of the art, and there's almost zero of those algorithms I've used for real assessment. Um, and you know why? For me, grading is a different thing. Grading is the proof that an algorithm understands students. So a lot of the algorithms I wanna make, they're gonna help students in some way. They're gonna help teachers in some way. 

[00:18:42] And the central piece is, can you understand a student? It just so happens that grading is one of the few numerical things that I've got to see if an algorithm is doing a good job understanding. Um, so I have done experiments, were we to use this grading for real assessment, would it be fair? As I said, we haven't actually used it for real assessments. Uh, and the answer was yes. Now, what was I grading? We took ten thousand students writing an exam for Code in Place. Now, right now the exam is, we call it a self-diagnostic. You know, there's no feedback. You just take the exam and then you're done.

[00:19:17] Just the experience is the experience. But then we thought, what if we ran an AI grader on this? And then we ran the AI grader, um, and we compared it to human graders. And you might not find this that surprising, but human graders are not all that accurate. 

[00:19:32] Russ Altman: And they're a little biased in their own special ways.

[00:19:35] Chris Piech: Yeah. Yeah. And, um, we could see that the AI was a little bit more accurate. We then actually gave it to the students and said, like, hey, this is not a grade, but here's some automatic feedback if you find it interesting. And then we spliced in the human feedback and the AI feedback, and the students in my class were actually preferring the AI feedback.

[00:19:50] So that just gives you an idea of what's possible. Uh, I will say that the answer probably would be different depending on the subject. I think coding, you know, we have people from a hundred and fifty countries, Russ. So, like, I got to find out, is it biased against Nigerians? And the answer is no. You know, coding is something like a universal language.

[00:20:09] Um, there's not too much about your demographics that leaks out when you're doing a little coding task. It could be really different if you're writing a personal essay. 

[00:20:17] Russ Altman: Yes. Yes. 

[00:20:18] Chris Piech: Um, it could be, and actually it could be really different if it's evaluating based off your resume, like all sorts of biases, you know, uh, if you're interested in these large language models that are trained off the internet, that is a wild, wild world.

[00:20:32] Russ Altman: This is The Future of Everything with Russ Altman. We'll have more with Chris Piech next.

[00:20:52] Welcome back to The Future of Everything. I'm Russ Altman, your host, and I'm speaking with Professor Chris Piech from Stanford University. 

[00:20:58] In the first segment, we talked about the promise and possibilities for using computers in education. In this next segment, we're going to talk about how can computers grade less structured information, things where there's a lot of creativity and spontaneity involved. Also, how can you use computers for other kinds of tests, not just academic tests and grading? And finally, what is this idea of generative grading? We've heard of generative AI. What's generative grading?

[00:21:26] I know that you've also done some work recently looking at feedback and assessment, and in a less structured environment. So tell us what's the state of that kind of work? 

[00:21:36] Chris Piech: Yeah. So, you know, if you're interested in giving somebody feedback, there's different degrees of difficulty. Multiple choice is the easiest. Uh, short answer is the next easiest. A step up from that is something like coding. Uh, but then there is this particular type of coding, which represents a goal we're shooting for, which is giving people feedback when they're just having fun and doing something unstructured.

[00:22:00] Um, and so this does express itself in programming and it expresses itself when people are making games or web apps, particularly ones where it's just like, hey, you go make a, a great application. Use the concepts we've learned in class. This is a really difficult grading task. We use a particularly neat idea. Have you ever seen a program learn how to play chess? 

[00:22:24] Russ Altman: Uh, I think so. Yes. 

[00:22:25] Chris Piech: Yeah. So it kind of, like, learns by playing. It just plays a whole bunch of games of chess and then it gets really good at it. So what we do when we're grading those open-ended things is really different than multiple choice. What we do is we make a program. Um, it's called DreamGrader, uh, made with a bunch of wonderful colleagues. And what DreamGrader is going to do is it's going to play your work. It's going to interact with it. It's going to, you know, if you made a game, it's going to move the paddle. If you made an app, it's going to try and press the buttons.

[00:22:54] Um, and through interacting with your work, that's how it's gaining its understanding of what you're doing. And it's really wonderful. See, it's like the power of these chess engines in the hands of graders. I'm excited about it, not just technically, okay, I'd say it's nerdy, very fun, but also as a teacher, 'cause I want to assign those things. I just can't, because grading them is so hard. 
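The core move Piech describes, a grader that probes an interactive student submission with actions and records the moment behavior breaks an expected rule, can be sketched in miniature. The game interface, the deliberately buggy "student" function, and the invariant below are all invented for illustration; the real DreamGrader system is far more sophisticated:

```python
import random

def buggy_paddle_step(paddle_x, action, width=10):
    """A hypothetical student submission: move a paddle left or right.
    Deliberate bug: no bounds check on the right edge of the screen."""
    if action == "left":
        return max(0, paddle_x - 1)
    if action == "right":
        return paddle_x + 1   # can walk off the right edge
    return paddle_x

def play_and_grade(step_fn, width=10, episodes=50, horizon=30, seed=0):
    """Probe the submission with random action sequences; return the
    first short trajectory that violates the 'paddle stays on screen'
    invariant, as a replay 'clip' to show the student."""
    rng = random.Random(seed)
    for _ in range(episodes):
        x, trace = width // 2, []
        for _ in range(horizon):
            action = rng.choice(["left", "right", "stay"])
            x = step_fn(x, action, width)
            trace.append((action, x))
            if not (0 <= x < width):
                return {"ok": False, "clip": trace[-5:]}
    return {"ok": True, "clip": []}

report = play_and_grade(buggy_paddle_step)
```

The returned clip plays the same role as the "little movies" described below: here is a moment when I was playing your game, and something happened that you probably didn't expect.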

[00:23:16] Russ Altman: So have you created kind of a rubric of what makes, so, like, give me an example of what the feedback would be. So I've played your game, like, dear Joan, Joan, I played Pong. Here's what I think of Pong. Like, tell me how that looks.

[00:23:32] Chris Piech: Yeah. Well, okay. So right now we actually do use it in the class I'm teaching. So maybe this is the one example where I'm actually using it. Um, here's what it looks like, we have something called Breakout. It's just like Pong. Uh, you have a pad. 

[00:23:45] Russ Altman: I just guessed Pong. 

[00:23:47] Chris Piech: Yeah. Yeah. It was a good guess, fantastic. And students like to go above and beyond, they like to change the colors. Sometimes they'll add like, you know, level ups. So what the algorithm is going to do is it's going to play it. And it's going to try and make little movies. It's like, here's a moment when I was playing your game, and it did something that I thought was wonderful. Or, I was playing your game, Joan, and when I was playing it, this happened, and I'm pretty sure you didn't expect for this to happen, and that was a mistake.

[00:24:14] Russ Altman: And so, and you've been able to figure out how to do that in a very general way, so that no matter what game they throw in front of you, or is this kind of a Pong specific feedback? 

[00:24:23] Chris Piech: No. Okay, it's somewhere in between. I don't want to overstate what's possible. It's, this is still research. It's still science. Um, it's not just about Pong. And it should be anything that's interactive, but we're at the early stages of seeing where it breaks down. So when you get to really complicated things, like, I don't know if somebody programmed World of Warcraft, I don't think this thing is going to be able to play World of Warcraft and say like, that was a pretty good job, Joan. 

[00:24:44] Russ Altman: Right. Well, I, you know, I have, I'm sure you know this, but I'm just going to make it explicit. When I do research with my graduate students, it's like a game. Yes, we're doing research on biology and medicine, but on the day-to-day basis, we're writing code and we're generating graphics and some of the graphics get us excited and even giggling with like how good they look.

[00:25:06] And some of the graphics are like profoundly depressing and like indistinguishable from noise. And so as you were describing this, I'm thinking to myself, this is not so far from being able to evaluate the fun and the productivity of a research project, because it's in many ways we think of it and it can be construed as a game.

[00:25:26] Chris Piech: I mean, I love it. I've not thought about it from that perspective. 

[00:25:30] Russ Altman: All right. So we'll have to have a meeting. We'll have to have a lunch discussion. I want to get to another topic, which is you even talking about tests like for students, but you've actually generalized your work to look at other kinds of tests. So tell me about that. 

[00:25:42] Chris Piech: Yeah, you know, earlier in this, uh, conversation, we talked about how tests can be a little bit depressing, like, who wants to be evaluated. But there's some tests where you just, like, really, really, really want to get the right answer. Um, and the class of tests I'm talking about are medical tests. Like, sometimes you go into a doctor's office and you really want to get an accurate evaluation of what's going on. Now, sometimes that involves taking a photo, but sometimes it involves a human response. Uh, so let me paint you a picture, an eye test. You walk into an ophthalmologist's office, something's wrong with your eyes, and we need to know how well you can see.

[00:26:13] The only way we can know how well you can see is by asking you questions. Uh, and it turns out through decades of studying how people learn and how we can give feedback to things like coding assignments, we've actually figured out how we can get much better at giving feedback to people who are doing medical tests, like the eye test, for example.
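One standard way to "ask questions well" in a test like this is adaptive Bayesian estimation: keep a posterior over the patient's unknown threshold, and after each answer ask the letter size whose answer is most informative. The grid, the psychometric curve, and the simulated patient below are all invented for illustration; this is the flavor of the idea, not the actual clinical algorithm:

```python
import math

acuities = [i / 10 for i in range(1, 21)]              # candidate thresholds
posterior = {a: 1 / len(acuities) for a in acuities}   # start uniform

def p_correct(letter_size, acuity, slope=8.0):
    """Chance of reading a letter of this size given a threshold."""
    return 1 / (1 + math.exp(-slope * (letter_size - acuity)))

def update(posterior, letter_size, correct):
    """Bayes update of the threshold posterior after one answer."""
    new = {}
    for a, p in posterior.items():
        like = p_correct(letter_size, a)
        new[a] = p * (like if correct else 1 - like)
    total = sum(new.values())
    return {a: p / total for a, p in new.items()}

def next_letter(posterior):
    """Ask the size whose expected answer is closest to 50/50 --
    the most informative next question."""
    def score(s):
        expected = sum(p * p_correct(s, a) for a, p in posterior.items())
        return abs(expected - 0.5)
    return min(acuities, key=score)

# Simulated patient whose true threshold is 1.2 (reads anything larger):
true_threshold = 1.2
for _ in range(15):
    size = next_letter(posterior)
    posterior = update(posterior, size, correct=(size > true_threshold))
estimate = max(posterior, key=posterior.get)
```

After a handful of well-chosen questions the posterior concentrates near the true threshold, which is why an adaptive test can be both shorter and more precise than reading a fixed chart top to bottom.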

[00:26:30] Russ Altman: Wow. So that really hits home. I recently was diagnosed with double vision and I was well aware that it was a qualitative description I was given of these symptoms and they were clearly struggling, I mean, in a good way, to understand exactly what the problem was. And so are you mostly helping the ophthalmologist or the optometrist, not so much the patient. I mean, of course it's helping the patient ultimately, but you're trying to help them make a more accurate assessment of the situation. 

[00:26:57] Chris Piech: Yeah. You know, we can get a more accurate reading of how well you can see in a shorter time. That's helpful both for the ophthalmologist and for the patient. It was designed for people with more serious eye diseases. Like if you're just trying to find glasses, you probably don't need to know your vision to incredible precision. Where it matters a little bit more is if you have, um, a chronic eye disease that you have to track every day. So, actually, this is not that important, but I happen to have a chronic eye disease.

[00:27:23] So every day I kind of want to know, has my vision gotten worse? Even subtle changes in my vision can be really important for me to treat quickly. Um, so that was a case where I really wanted that high fidelity measurement, but the problem was me, like I could only give this qualitative interpretation of how well I could see.

[00:27:42] Russ Altman: Yes. Yes. And it's stressful, because on any given day you don't want to miss the chance for the, uh, for the physician, for the clinician, to make an appropriate inference. So you want to give them as good information as possible, and anything they could have to help them understand and appreciate what you're saying would help. I totally am with you there.

[00:28:01] Okay, in the last few minutes, I want to ask you about this idea of generative grading. First of all, generative AI has been on lots of people's minds, and I don't know exactly what you mean, but it sounds like an exciting idea. So I just want to give you a chance to describe what is generative grading, and is it the future?

[00:28:19] Chris Piech: Yeah, okay, so I'm glad you asked. Generative grading is an algorithm that we made in the lab, and it's kind of near and dear to our hearts because, um, it's both had great impact, and we also like the ideas behind it. Uh, we all know that generative AI is impacting folks in lots of ways. And one of the hard tasks for a teacher is to grade.

[00:28:41] And there is this open question of how we can make this work useful to a teacher. Um, there is something really special about generative grading that's a little bit different from your classic large language model. A large language model, that's the neural network behind a lot of tools from companies like OpenAI.

[00:28:58] What's different about the way we do it came from this insight: if you want to grade open-ended work, it is much easier to generate an example of a student with a misconception than it is to take broken work and guess what the misconception was. So say you're a teacher, and I tell you, a student doesn't understand a for loop, and they tried to do this assignment. What could their work look like?

[00:29:25] It turns out, teachers find this much easier. Uh, whereas if instead I say, here's a broken program, what don't they understand? Then you have to guess through every single decision they could have made, every thought they could have had. That inference task is just insanely difficult. So we had this idea, kind of even before large language models came out, that generative thinking is much easier and could be really important for grading.

[00:29:48] So now in the modern world, we employ this in two ways. One, you can imagine artificial intelligence can be very helpful for generating that same sort of thought process. But you know what's great? It turns out humans are still amazing at it as well. We invite AI to think about generative stories of how students can go from misconceptions to their work, because that's the model we need to be able to do good grading.
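The generative direction Chris describes can be sketched in a toy form: go forward from each candidate misconception to an example of the buggy work it would produce, then label a submission with the misconception whose generated example it most resembles. The misconceptions, hand-written buggy snippets, and string-similarity matcher below are all illustrative stand-ins, not the lab's actual system; in practice the examples would come from teachers or a language model.

```python
import difflib

# Forward models: misconception -> example(s) of the buggy code it produces
# for a "sum the numbers 0..n" assignment. Illustrative stand-ins only.
MISCONCEPTION_GENERATORS = {
    "off-by-one loop bound": [
        "total = 0\nfor i in range(1, n):\n    total += i",
    ],
    "forgot to initialize accumulator": [
        "for i in range(n + 1):\n    total += i",
    ],
    "returns inside the loop": [
        "total = 0\nfor i in range(n + 1):\n    total += i\n    return total",
    ],
}

def diagnose(submission):
    """Compare a submission against every generated example and return the
    best-matching misconception with its similarity score (0.0 to 1.0)."""
    best = (None, 0.0)
    for misconception, examples in MISCONCEPTION_GENERATORS.items():
        for example in examples:
            score = difflib.SequenceMatcher(None, submission, example).ratio()
            if score > best[1]:
                best = (misconception, score)
    return best

# A student's broken attempt (same bug, different variable name).
student_work = "total = 0\nfor j in range(1, n):\n    total += j"
label, score = diagnose(student_work)
```

The point of the sketch is the direction of the arrow: the hard inference problem (broken work to misconception) is replaced by many easy generation problems (misconception to example work) plus a matching step.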

[00:30:13] It turns out teachers are still beating the state of the art. Like teachers are so good at this task that if you get a great teacher, uh, they can way outperform a neural network approach.

[00:30:24] Russ Altman: And when you say outperform, you mean you say to the teacher, let's assume this student doesn't understand concept X. 

[00:30:31] Chris Piech: Yeah.

[00:30:31] Russ Altman: What will their assignment look like? 

[00:30:33] Chris Piech: Yeah. 

[00:30:33] Russ Altman: They're good at that?

[00:30:34] Chris Piech: Oh, so good at it. Way better than any neural network at the moment. This could change, but at the moment. And I think the future is probably going to look like a hybrid because, you know, teachers will know their students in ways that AI probably never will. Uh, AI will be able to assist because it has, you know, all the power of all the knowledge of all the programs it's ever read. Um, and together, I think we're going to be a fantastic team at understanding students. 

[00:30:55] Russ Altman: It also strikes me that this could help new teachers. You know, I'm a physician. And everybody knows that you have to see a few thousand patients before you get really good, and of course it's stressful to be one of those patients early on, because you don't know if you should have confidence in the physician.

[00:31:12] And you can imagine with this generative capability, that teachers might go into a situation now, a new situation, a new job, and they can say, I've graded this a thousand times or a hundred times. And so I am not clueless about how to grade this assignment. 

[00:31:26] Chris Piech: Yeah. And you know, we can imagine that teamwork is both training the teacher, getting great feedback to students, um, and maybe this is the future of education. 

[00:31:37] Russ Altman: Thanks to Chris Piech. That was the future of computer aided education. Thanks for tuning into this episode. With over 250 episodes in our archive, you have instant access to a huge array of discussions on the future of pretty much everything. If you're enjoying the show, a reminder to please consider sharing it with your friends and colleagues.

[00:31:58] Personal recommendations are the best way for us to grow the show. You can connect with me on X or Twitter, @RBAltman. And you can connect with Stanford Engineering @StanfordENG.
