
Improving patient safety with bedside computer vision

Can computers carry out hospital safety-monitoring tasks better than humans? A Stanford research team has been testing the idea; so far, it's working well.

Medical errors at the bedside continue to harm many patients across the U.S., although nearly two decades have passed since the Institute of Medicine's 1999 report on preventable patient harm first raised the issue. Doctors and nurses are human, after all: they strive for, but rarely achieve, perfect care.

But what if clinician imperfection could be neutralized by a form of artificial intelligence that continuously detects, and prompts correction of, defects in bedside care? That's the proposition that a Stanford research team from the engineering and medical schools explains in a perspective piece published in the New England Journal of Medicine. They've been using imaging sensors at hospital room doorways and neural network technology to create an algorithm to detect hospital staff use and non-use of hand sanitizers, an important driver of patient safety.

The work began at Lucile Packard Children's Hospital and Intermountain's LDS Hospital, via research teams launched by Stanford's Arnold Milstein, MD, and Fei-Fei Li, PhD, and supported by clinicians and electrical engineering students, including PhD student and first author Serena Yeung.

To protect patient and staff privacy, the team used depth and thermal sensors to capture images of human shapes in motion without revealing identities. The sensors were mounted in the doorways of patient rooms, adjacent to wall-mounted alcohol gel dispensers. The researchers trained a neural network on labeled images of people using, and failing to use, the dispensers. The initial algorithm distinguishes between use and non-use of proper hand hygiene with greater than 95 percent accuracy.
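The core idea is a supervised binary classifier: labeled sensor images go in, a use/non-use prediction comes out. The sketch below illustrates that pipeline in miniature, with an ordinary logistic regression standing in for the team's neural network and two hypothetical hand-crafted features (hand height near the dispenser and dwell time) standing in for the depth images; none of these details come from the study itself.

```python
import math
import random

def sigmoid(z):
    """Logistic function mapping a score to a probability in (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=200):
    """Fit a logistic-regression classifier with plain stochastic gradient descent."""
    w = [0.0] * len(samples[0])
    b = 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            p = sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)
            err = p - y  # gradient of the log loss w.r.t. the score
            w = [wi - lr * err * xi for wi, xi in zip(w, x)]
            b -= lr * err
    return w, b

def predict(w, b, x):
    """Return 1 for predicted dispenser use, 0 for non-use."""
    return 1 if sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b) >= 0.5 else 0

# Synthetic labeled examples: [hand_height_m, dwell_seconds].
# "Use" events cluster at raised hands and longer dwell; "non-use" do not.
random.seed(0)
use = [[1.2 + random.random() * 0.2, 1.5 + random.random()] for _ in range(20)]
non_use = [[0.5 + random.random() * 0.3, 0.1 + random.random() * 0.3] for _ in range(20)]
X = use + non_use
y = [1] * 20 + [0] * 20

w, b = train(X, y)
accuracy = sum(predict(w, b, x) == yi for x, yi in zip(X, y)) / len(X)
```

In the actual system, a convolutional network learns its features directly from depth and thermal frames rather than from hand-picked measurements, but the training loop, labeled data, and accuracy evaluation follow the same shape.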

"Essentially, these types of machine learning-based approaches offer us the potential to learn at scale, from large amounts of data," Yeung told me. "We intend to detect actions such as hand hygiene and monitor them 24/7 across entire hospitals at very low cost."

The researchers are gathering clinician advice on how to best convey real-time alerts. They write:

Such systems could remind a doctor or nurse to perform hand hygiene if they begin to enter a patient room without doing so, alert a surgeon that an important step has been missed during a complex procedure, or notify a nurse that an agitated patient is dangerously close to pulling out an endotracheal tube. The use of computer vision to continuously monitor bedside behavior could offload low-value work better suited to machines, augmenting rather than replacing clinicians.

Milstein, a professor of medicine and senior author on the research, noted that "bedside computer vision will bring us much closer to clinicians' multi-century unfulfilled aspiration to 'do no harm.'"

