Is AI up to snuff? Cardiac clinical trial points to yes

Stanford Medicine researchers studied how AI can enhance evaluation of cardiac tests in the clinic and found it improved accuracy.

There's a lot of talk about the potential for artificial intelligence in medicine, but few researchers have shown through well-designed clinical trials that it could be a boon for doctors, health care providers and patients.

Now, researchers at Stanford Medicine have conducted one such trial, testing an artificial intelligence algorithm used to evaluate heart function. The algorithm, they found, improves evaluations of heart function from echocardiograms -- movies of the beating heart, filmed with ultrasound waves, that show how efficiently the heart pumps blood.

"This blinded, randomized clinical trial is, to our knowledge, one of the first to evaluate the performance of an artificial intelligence algorithm in medicine. We showed that AI can help improve accuracy and speed of echocardiogram readings," said James Zou, PhD, assistant professor of biomedical data science and co-senior author on the study. "This is important because heart disease is the leading cause of death in the world. There are over 10 million echocardiograms done each year in the U.S., and AI has the potential to add precision to how they are interpreted."

Echocardiograms are a crucial form of cardiac imaging, but their readings can depend on the interpretation of individual clinicians, according to David Ouyang, MD, a former Stanford Medicine postdoctoral scholar, now a cardiologist at Cedars-Sinai Medical Center and a co-senior author on the study. "A more precise measurement from AI can streamline the workflow and allow for detection of earlier, subtle changes in heart function. This is really exciting, as it will allow for better patient care."

Susan Cheng, MD, professor of cardiology at Cedars-Sinai Medical Center, is also a co-senior author on the study. Bryan He, a Stanford computer science graduate student, is the first author on the study.

Tracing heart health

The algorithm, Echonet, was developed in 2020 by Stanford Medicine researchers using more than 10,000 echocardiograms from Stanford Health Care. That study validated the algorithm's performance at assessing multiple measures of heart health from echocardiograms. But the researchers wanted to test the algorithm at a different site, so they conducted the newest study at Cedars-Sinai Medical Center.

The four chambers of the heart are never empty. With each beat, a healthy heart pumps 50% to 70% of the blood out of one of its chambers, the left ventricle -- a measurement known as the ejection fraction. Cardiologists use the left ventricle ejection fraction to diagnose heart failure and track responses to treatment because that chamber sends oxygenated blood to the body with each heartbeat. For that reason, the AI algorithm in this study was trained to calculate the left ventricle ejection fraction.
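
For readers who want the arithmetic spelled out, here is a minimal sketch of that calculation in Python; the volumes are illustrative, not patient measurements.

```python
# Ejection fraction: the share of blood pumped out of the left ventricle on
# each beat, computed from the end-diastolic (fullest) and end-systolic
# (most contracted) volumes. The numbers below are illustrative only.

def ejection_fraction(end_diastolic_volume_ml: float, end_systolic_volume_ml: float) -> float:
    """Return the left ventricle ejection fraction as a percentage."""
    stroke_volume_ml = end_diastolic_volume_ml - end_systolic_volume_ml
    return 100.0 * stroke_volume_ml / end_diastolic_volume_ml

# Example: a ventricle holding 120 mL that contracts down to 48 mL ejects
# 72 mL, an ejection fraction of 60% -- within the healthy 50%-70% range.
print(ejection_fraction(120.0, 48.0))  # 60.0
```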

Evaluating an echocardiogram is a two-step process: Sonographers make an initial estimate and then cardiologists review it. Calculating the ejection fraction requires finding the echocardiogram movie frame when the left ventricle is at its largest, most expanded size and the frame when it is at its smallest, most contracted size. Sonographers find these frames by eye and trace the left ventricle boundaries by hand. Software then calculates an initial ejection fraction based on the tracing. Cardiologists redraw boundaries to calculate a more accurate ejection fraction when they feel the preliminary estimate is imprecise.
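
As a rough sketch of where that two-step process lands before the cardiologist's review, the example below assumes per-frame left ventricle volume estimates are already available (in practice they come from the hand-traced or AI-traced boundaries) and simply picks the most expanded and most contracted frames; the numbers are illustrative.

```python
# Given per-frame left ventricle volume estimates from one echocardiogram
# clip, find the end-diastolic (largest) and end-systolic (smallest) frames
# and compute a preliminary ejection fraction -- the initial estimate that a
# cardiologist then reviews and, if needed, corrects by redrawing boundaries.

def preliminary_ejection_fraction(frame_volumes_ml: list[float]) -> float:
    end_diastolic_ml = max(frame_volumes_ml)  # largest, most expanded frame
    end_systolic_ml = min(frame_volumes_ml)   # smallest, most contracted frame
    return 100.0 * (end_diastolic_ml - end_systolic_ml) / end_diastolic_ml

# Illustrative volumes (mL) across one heartbeat, not real tracings.
volumes_ml = [118.0, 121.0, 110.0, 84.0, 55.0, 49.0, 62.0, 95.0]
print(round(preliminary_ejection_fraction(volumes_ml), 1))  # 59.5
```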

A total of 3,495 echocardiograms from real patients were used in the study. Roughly half were first assessed by sonographers, and the other half were first assessed using Echonet. All evaluations were then reviewed by cardiologists, who didn't know whether the assessment was generated by the algorithm or a sonographer.
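
A minimal sketch of that study design, using hypothetical scan identifiers: each echocardiogram is randomized 1:1 to an AI-first or sonographer-first initial assessment, and the reviewing cardiologist sees the preliminary result without any indication of which arm produced it.

```python
import random

# Randomize scans 1:1 between an AI-first and a sonographer-first initial
# assessment. Scan IDs are placeholders, not data from the trial.
random.seed(0)
scan_ids = [f"echo_{i:04d}" for i in range(3495)]
arm = {scan_id: random.choice(["ai_first", "sonographer_first"]) for scan_id in scan_ids}

def blinded_case(scan_id: str, preliminary_ef: float) -> dict:
    """What the reviewing cardiologist sees: no hint of which arm produced it."""
    return {"scan": scan_id, "preliminary_ejection_fraction": preliminary_ef}
```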

Faster and more accurate

Cardiologists updated about 17% of the ejection fractions generated by Echonet, adjusting them by 2.79% on average. They updated about 27% of the sonographer-estimated ejection fractions, adjusting them by 3.77%, on average. These findings show that the AI output was more consistent with cardiologists' assessments.
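
To make those two summary numbers concrete, here is a small sketch of how the share of changed readings and the average adjustment could be computed from paired preliminary and final values; the data are invented, and the trial applied its own criterion for what counted as a change.

```python
# For each echocardiogram there is a preliminary ejection fraction (from the
# AI or a sonographer) and the cardiologist's final value. Two summaries:
# the fraction of readings the cardiologist changed, and the mean absolute
# size of those adjustments (in percentage points of ejection fraction).

def adjustment_summary(preliminary_ef: list[float], final_ef: list[float]) -> tuple[float, float]:
    diffs = [abs(p - f) for p, f in zip(preliminary_ef, final_ef)]
    changed = [d for d in diffs if d > 0]  # here any nonzero change counts
    share_changed = len(changed) / len(diffs)
    mean_adjustment = sum(changed) / len(changed) if changed else 0.0
    return share_changed, mean_adjustment

# Invented example: 4 of 6 readings untouched, 2 adjusted by 2 and 4 points.
prelim = [55.0, 62.0, 48.0, 70.0, 58.0, 65.0]
final = [55.0, 62.0, 50.0, 66.0, 58.0, 65.0]
print(adjustment_summary(prelim, final))  # (0.333..., 3.0)
```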

Overall, the algorithm saved time during initial assessments and final evaluations by cardiologists. Echonet was, on average, 130 seconds faster per echocardiogram than sonographers, and cardiologists were 8 seconds faster on average when adjudicating ejection fractions from the algorithm than those from sonographers.

"This study represents the right implementation of AI in health care. From a patient's perspective, nothing changes while AI speeds up the tedious, manual parts of sonographer work. Similarly, the workflow for cardiologists is also unchanged while supervising the AI," Ouyang said.

Improving patient care

The American Society of Echocardiography recommends health care professionals calculate the ejection fraction from multiple heartbeats to address heartbeat variability. Because the initial tracing process is time intensive when done by hand, it's often calculated from a single heartbeat, Zou explained. (The algorithm can generate ejection fractions from multiple heartbeats in just milliseconds, but in this study, the algorithm was programmed to analyze one heartbeat to match the work of the sonographers.) Zou hopes that support from AI means more reliable estimates of the ejection fraction are on the horizon.
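
A minimal sketch of what multi-beat averaging looks like, assuming per-beat ejection fraction estimates are already in hand; in the trial itself the algorithm was limited to a single beat to match the sonographers' workflow.

```python
# The guideline recommendation above: average the ejection fraction over
# several heartbeats rather than relying on a single beat, which smooths out
# beat-to-beat variability. Per-beat values below are illustrative only.

def multi_beat_ejection_fraction(per_beat_ef: list[float]) -> float:
    """Average the ejection fraction across several heartbeats."""
    return sum(per_beat_ef) / len(per_beat_ef)

single_beat_ef = 58.0                          # a one-beat, hand-traced estimate
per_beat_ef = [58.0, 61.0, 55.5, 60.0, 57.5]   # estimates across several beats
print(multi_beat_ejection_fraction(per_beat_ef))  # 58.4 -- a steadier estimate
```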

The team plans to seek approval from the Food and Drug Administration for Echonet, and they hope this study demonstrates the importance of blinded, randomized clinical trials for AI in medicine, which are not currently required by regulatory agencies.
