
Researchers improve patient safety with bedside computer vision

What if clinician imperfection could be neutralized by a form of artificial intelligence that continuously detects, and prompts correction of, defects in bedside care?

The use of computer vision could offload low-value work better suited to machines, augmenting rather than replacing clinicians. | Unsplash/Kevin


Medical errors at the bedside continue to harm many patients across the U.S., even though nearly two decades have passed since the Institute of Medicine’s 1999 report on preventable patient harm, To Err Is Human, first raised the alarm.

Doctors and nurses are human after all: They strive for – but rarely achieve – perfect care.

But what if clinician imperfection could be neutralized by a form of artificial intelligence that continuously detects, and prompts correction of, defects in bedside care? That’s the proposition a Stanford research team from the engineering and medical schools lays out in a perspective piece published in the New England Journal of Medicine. Using imaging sensors mounted at hospital room doorways and neural network technology, the team has built an algorithm that detects whether hospital staff use hand sanitizer, a key driver of patient safety.

The work began at Lucile Packard Children’s Hospital Stanford and Intermountain LDS Hospital, through research teams launched by Stanford’s Arnold Milstein, professor of medicine, and Fei-Fei Li, associate professor of computer science, with support from clinicians and electrical engineering students, including PhD student and first author Serena Yeung.

To protect patient and staff privacy, the team used depth and thermal sensors to capture images of human shapes in motion without revealing identities. The sensors were mounted in the doorways of patient rooms, adjacent to wall-mounted alcohol gel dispensers. The researchers then trained a neural network on labeled images of people using, and failing to use, the dispensers. The initial algorithm distinguishes use from non-use of proper hand hygiene with greater than 95 percent accuracy.
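The pipeline described above is standard supervised classification: labeled sensor frames in, a binary use/non-use decision out. The sketch below is purely illustrative and is not the Stanford team’s method (they trained a deep neural network on real depth imagery); here a toy nearest-centroid classifier on synthetic, made-up “depth frames” stands in to show the shape of the train-then-evaluate loop. All names, the frame size, and the data generator are hypothetical.

```python
# Illustrative only: a toy stand-in for the labeled-frames -> binary
# classifier pipeline. Synthetic data and nearest-centroid classification
# are inventions for this sketch, not the published method.
import random

FRAME_SIZE = 16  # hypothetical low-resolution depth frame, 16 values


def synth_frame(used_dispenser: bool) -> list:
    """Generate a fake depth frame; 'dispenser used' frames are shifted
    upward to mimic a separable signal (pure invention for illustration)."""
    base = 1.0 if used_dispenser else 0.0
    return [base + random.gauss(0, 0.3) for _ in range(FRAME_SIZE)]


def centroid(frames: list) -> list:
    """Per-position mean of a list of frames."""
    return [sum(col) / len(frames) for col in zip(*frames)]


def train(labeled: list) -> tuple:
    """labeled: list of (frame, label) pairs with label in {0, 1}.
    'Training' here is just computing a centroid per class."""
    pos = centroid([f for f, y in labeled if y == 1])
    neg = centroid([f for f, y in labeled if y == 0])
    return pos, neg


def predict(model: tuple, frame: list) -> int:
    """Assign the class whose centroid is nearer (squared distance)."""
    pos, neg = model

    def dist(c):
        return sum((a - b) ** 2 for a, b in zip(frame, c))

    return 1 if dist(pos) < dist(neg) else 0


random.seed(0)
train_set = [(synth_frame(bool(y)), y) for y in [0, 1] * 200]
model = train(train_set)

test_set = [(synth_frame(bool(y)), y) for y in [0, 1] * 50]
acc = sum(predict(model, f) == y for f, y in test_set) / len(test_set)
print(f"toy accuracy: {acc:.2f}")
```

A real system would replace the synthetic frames with depth-sensor video and the centroid rule with a trained neural network, but the surrounding structure (labeled examples, a fitting step, held-out evaluation) is the same one the accuracy figure above comes from.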

“Essentially, these types of machine learning-based approaches offer us the potential to learn at scale, from large amounts of data,” Yeung said.

“We intend to detect actions such as hand hygiene and monitor them 24/7 across entire hospitals at very low cost.”

The researchers are gathering clinician advice on how to best convey real-time alerts. They write:

“Such systems could remind a doctor or nurse to perform hand hygiene if they begin to enter a patient room without doing so, alert a surgeon that an important step has been missed during a complex procedure, or notify a nurse that an agitated patient is dangerously close to pulling out an endotracheal tube. The use of computer vision to continuously monitor bedside behavior could offload low-value work better suited to machines, augmenting rather than replacing clinicians.”
