
When AI is watching patient care: Ethics to consider

Ethical and legal issues accompany the potential benefits of using computer vision-based ambient intelligence in health care.

The potential benefits of artificial intelligence to health care are enormous, but these emerging technologies also raise a number of ethical and legal considerations.

These questions are particularly relevant to a subset of AI known as computer vision-based ambient intelligence, which uses a video camera or sensors to monitor activity in a physical space, such as a patient room or hospital hallway. The technology analyzes -- in real time -- the resulting video data, which can appear as standard footage, depth or thermal data captured as silhouette-like moving images, or in other forms.

Such tools can potentially help nurses ensure that patients in an intensive care unit are moving around enough to aid their recovery. Computer vision technology also can be used to document whether nurses and doctors are following proper hand-washing protocol, considered the first line of defense against hospital-acquired infections.

Though these and similar applications have vast potential to improve patient care, they also prompt a meaningful discussion about privacy, including the rights of patients and hospital workers.

Serena Yeung, PhD, a Stanford biomedical data scientist and assistant professor, has spent the past five years exploring ways to use computer vision-based ambient intelligence in health care. In a Viewpoint published recently in JAMA, Yeung, of Stanford's Clinical Excellence Research Center, considered the ethical and legal issues with co-authors Sara Gerke and I. Glenn Cohen, JD, of Harvard Law School.

In the piece, the authors examined implications for privacy, consent and liability.

They wrote:

...what is distinct about ambient intelligence is that the technology not only captures video data as many surveillance systems do, but does so targeting the physical spaces where sensitive patient care activities take place, and furthermore interprets the video such that the behaviors of patients, health care workers, and visitors may be constantly analyzed.

This means that there could be a record of a patient's care activity that the patient, or the person providing their medical care, would prefer did not exist. One possible safeguard, according to Yeung and her co-authors, is to collect only the minimum data a research project or application needs, especially when a few specific data points are sufficient. For example, she and her colleagues chose to capture silhouettes, rather than full-color video, when training a computer-vision algorithm to detect patient mobility in an ICU setting.
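To make the data-minimization idea concrete, here is a minimal, purely illustrative Python sketch of one way a silhouette could be derived from a depth frame: compare each frame against a depth capture of the empty room and keep only the resulting binary mask. The function name, threshold value, and frame dimensions are assumptions for illustration, not details of the Stanford team's actual system.

    # Illustrative only: derive a binary silhouette from a depth frame by
    # comparing it against a background capture of the empty room, so that
    # only the mask, not identifiable video, needs to be retained.
    import numpy as np

    def depth_to_silhouette(depth_frame, background_depth, min_diff_mm=150.0):
        """Mark pixels that differ from the empty-room background by at
        least min_diff_mm millimeters (threshold chosen arbitrarily here)."""
        diff = np.abs(depth_frame.astype(np.float32)
                      - background_depth.astype(np.float32))
        return diff > min_diff_mm

    # Synthetic example: a flat background about 3 meters away, with one
    # region about 1 meter closer standing in for a person.
    background = np.full((240, 320), 3000.0)
    frame = background.copy()
    frame[60:180, 140:200] = 2000.0
    mask = depth_to_silhouette(frame, background)
    print(mask.sum(), "foreground pixels")  # only this mask would be stored

Because the stored mask contains no color or texture, identifying an individual from it is far harder, which is the point of the silhouette approach the authors describe.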

But no matter the safeguards, there's always a risk that someone will be able to figure out the identity of a patient or health care worker in footage or images. The authors urge hospitals to consider this, and to have clear, straightforward policies dictating when it's permissible to identify a person appearing in this visual data.

Hospitals should take the same upfront approach with consent forms for patients, the authors wrote, with a goal of building trust by being as transparent as possible. "Far from hiding the technology," they wrote, "hospitals should be eager to educate patients about it, explain the anticipated benefit to patient care, and demonstrate to them the safeguards that have been taken to protect privacy."

A particularly challenging question, however, is whether hospitals must obtain the explicit consent of health care workers or offer them the right to opt out.

Though it's generally legal to record employees at work if there's a strong business or public interest, the authors wrote, hospitals should, ethically speaking, try to meet certain conditions before collecting computer-vision data. The recording should take place in quasi-public spaces, such as hallways or hospital rooms. The goal should be to improve care. Precautions should be in place to protect individual privacy. And workers should be informed that recording will take place.

Then there is the liability question. Ambient intelligence systems may capture images or footage tied to an adverse patient outcome, or footage that reveals problems with a worker or care team. To be prepared, hospitals should have procedures for the retention and destruction of the data, as they do with other types of records, the authors wrote. They also should plan in advance how to respond to problems that may be uncovered, including information relevant to criminal court cases.

I asked Yeung why these issues are surfacing now. She told me,

It's still very early in the design and creation of ambient intelligence systems for hospitals. To be able to achieve an effective system, we have to consider all the ways it is going to be deployed, and the ethical and legal considerations are an important part of that.

