
The future of computational linguistics

An expert in understanding language using machine learning explains why even he was surprised by the linguistic capabilities of ChatGPT.
The intersection of linguistics and computer science has led to the rise of intelligent chatbots. | iStock/wildpixel

Our guest, Christopher Manning, is a computational linguist. He builds computer models that understand and generate language using math.

Words are a key component of human intelligence, he says, and a reason generative AI, like ChatGPT, has caused such a stir. Not long ago, language models could hardly produce a single coherent sentence; suddenly ChatGPT is composing five-paragraph stories and writing mathematical proofs in rhyming verse, Manning tells host Russ Altman in this episode of Stanford Engineering’s The Future of Everything podcast.

Listen and subscribe here


Subscribe to The Future of Everything podcast
