
Research & Ideas

Karen Liu: How robots perceive the physical world

A specialist in computer animation expounds upon her rapidly evolving specialty, known as physics-based simulation, and how it is helping robots become more physically aware of the world around them.


Researchers hope to develop robots to help patients dress and undress. How do we teach them skills that children must learn? | Stocksy/Amanda Worrall

Stanford’s Karen Liu is a computer scientist who works in robotics.

She hopes that someday machines might take on caregiving roles, like helping medical patients get dressed and undressed each day. That quest has given her special insight into what a monumental challenge such seemingly simple tasks pose. After all, she points out, it takes a human child several years to learn to dress themselves — now imagine what it takes to teach a robot to help a person who is frail or physically compromised.

Liu is among a growing coterie of scientists promoting “physics-based simulations” that speed up the learning process for robots. Rather than building physical robots and refining them as she goes, she uses computer simulations to improve how robots sense the physical world around them and make intelligent decisions despite real-world changes and perturbations, such as those involved in getting dressed for the day.
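The article doesn't describe any specific software, but the core idea — training in a simulator whose physical parameters are deliberately varied so behavior transfers to an unpredictable real world — can be sketched in miniature. Everything below is illustrative: the point-mass dynamics, function names, and parameter ranges are all assumptions, not Liu's actual methods.

```python
import random

def simulate_reach(mass, friction, steps=100, dt=0.01, force=1.0):
    """Crude point-mass simulation: apply a constant force against
    velocity-proportional friction and return the final position."""
    pos, vel = 0.0, 0.0
    for _ in range(steps):
        accel = (force - friction * vel) / mass  # Newton's second law
        vel += accel * dt
        pos += vel * dt
    return pos

def randomized_rollouts(n=5, seed=0):
    """Domain randomization (illustrative): vary physical parameters
    on each episode, so a policy trained purely in simulation learns
    to tolerate real-world perturbations it has never seen exactly."""
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        mass = rng.uniform(0.5, 2.0)       # hypothetical range (kg)
        friction = rng.uniform(0.1, 0.5)   # hypothetical damping range
        results.append(simulate_reach(mass, friction))
    return results
```

Because thousands of such randomized episodes can run faster than real time and without risking hardware or a patient, simulation lets researchers iterate on control strategies long before a physical robot touches a garment.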

To do that, a robot must understand the physical characteristics of human flesh and bone, as well as the person's movements and underlying intentions, so it can recognize when a garment is or is not going on as expected.

The stakes are high: a mistake could mean physical harm to the patient, as Liu tells Stanford Engineering’s The Future of Everything podcast, hosted by bioengineer Russ Altman. Listen and subscribe here.