AI may revolutionize rheumatic heart disease early diagnosis

Researchers at Children’s National Hospital have created a new artificial intelligence (AI) algorithm that promises to be as successful at detecting early signs of rheumatic heart disease (RHD) in color Doppler echocardiography clips as expert clinicians. Even better, the novel model diagnoses this deadly heart condition from echocardiography images of varying quality, including those acquired in low-resource settings, a huge challenge that has delayed efforts to automate RHD diagnosis for children in these areas.

Why it matters

Current estimates are that 40.5 million people worldwide live with rheumatic heart disease, and that it kills 306,000 people every year. Most of those affected are children, adolescents and young adults under age 25.

Though largely eradicated in nations such as the United States, rheumatic fever remains prevalent in developing countries, including those in sub-Saharan Africa. Recent studies have shown that, if RHD is detected early enough, regular doses of penicillin may slow its progression and limit the damage it causes. But it has to be detected first.

The hold-up in the field

Diagnosing RHD requires an ultrasound image of the heart, known as an echocardiogram. However, ultrasound is a highly variable imaging modality: the images are full of texture and noise, making them among the most challenging to interpret visually. Specialists undergo significant training to read them correctly, yet in areas where RHD is rampant, people who can reliably read these images are few and far between. Making matters worse, the devices used in these low-resource settings vary widely in quality themselves, especially compared with what is available in a well-resourced hospital elsewhere.

The research team hypothesized that a novel, automated deep learning-based method might successfully detect and diagnose RHD, which would allow for more diagnoses in areas where specialists are limited. To date, however, machine learning has struggled with noisy ultrasound images in much the same way the human eye does.
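
The article does not describe the model architecture in technical detail, so the following is only a generic illustration of what a deep learning classifier for color Doppler echocardiography clips can look like: a small per-frame convolutional network whose features are averaged across the clip before a normal-versus-suspected-RHD prediction. Every layer size and name here is an illustrative assumption, written as a minimal PyTorch sketch rather than a description of the team's algorithm.

# Minimal, illustrative sketch only; not the Children's National model.
# Scores a color Doppler echo clip (batch, frames, channels, height, width) for RHD.
import torch
import torch.nn as nn

class EchoClipClassifier(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Small per-frame 2D CNN feature extractor (a stand-in for a larger backbone).
        self.frame_encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # -> (batch * frames, 32, 1, 1)
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, clip: torch.Tensor) -> torch.Tensor:
        b, t, c, h, w = clip.shape
        feats = self.frame_encoder(clip.view(b * t, c, h, w)).view(b, t, 32)
        # Average frame features over time, then classify the whole clip.
        return self.classifier(feats.mean(dim=1))

# Example: one 16-frame clip of 112 x 112 color Doppler frames.
model = EchoClipClassifier()
logits = model(torch.randn(1, 16, 3, 112, 112))
print(logits.shape)  # torch.Size([1, 2]): normal vs. suspected RHD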

Children’s National leads the way

Using approaches that led to successful objective digital biometric analysis software for non-invasive screening of genetic disease, researchers at the Sheikh Zayed Institute for Pediatric Surgical Innovation, including medical imaging scientist Pooneh Roshanitabrizi, Ph.D., and principal investigator Marius Linguraru, D.Phil., M.A., M.Sc., partnered with clinicians from Children’s National Hospital. Those clinicians included Craig Sable, M.D., associate chief of Cardiology and director of Echocardiography, and cardiology fellow Kelsey Brown, M.D., both of whom are heavily involved in efforts to study, better treat and ultimately eliminate the deadly impact of RHD in children. The collaborators also included cardiac surgeons from the Uganda Heart Institute and cardiologists from Cincinnati Children’s Hospital Medical Center.

Dr. Linguraru’s team of AI and imaging scientists spent hours working with cardiologists, including Dr. Sable, to truly understand how they approach and assess RHD on echocardiograms. Building the tool around that clinical knowledge is what sets it apart from other efforts to apply machine learning to this purpose. Orienting the approach to the clinical steps of diagnosis led to the very first deep learning algorithm that diagnoses mild RHD with success comparable to that of the specialists themselves. Once the platform was built, 2,136 echocardiograms from 591 children treated at the Uganda Heart Institute were used to train the learning algorithm.
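
The article reports that the 2,136 echocardiograms came from 591 children, so many children contributed more than one study. A standard precaution in that situation, assumed here purely for illustration because the article does not describe the team's data handling, is to split the data at the patient level so that no child's clips appear in both the training and validation sets. A minimal sketch with placeholder IDs and labels:

# Illustrative only: patient-level train/validation split, a common precaution
# when several echocardiograms come from the same child (2,136 studies from
# 591 children in this work). All IDs and labels below are placeholders.
import numpy as np
from sklearn.model_selection import GroupShuffleSplit

rng = np.random.default_rng(0)
n_studies = 2136
patient_ids = rng.integers(0, 591, size=n_studies)  # hypothetical patient IDs
labels = rng.integers(0, 2, size=n_studies)         # hypothetical RHD labels
studies = np.arange(n_studies)                      # stand-ins for the echo clips

splitter = GroupShuffleSplit(n_splits=1, test_size=0.2, random_state=0)
train_idx, val_idx = next(splitter.split(studies, labels, groups=patient_ids))

# No patient appears on both sides of the split.
assert set(patient_ids[train_idx]).isdisjoint(patient_ids[val_idx])
print(len(train_idx), "training studies,", len(val_idx), "validation studies")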

What’s next

The team will continue to collect clinical imaging data to refine and validate the tool. Ultimately, researchers will look for a way for the algorithm to work directly with ultrasound/echocardiogram machines. For example, the program might run through an app that sits on top of an ultrasound device, operating on the same platform and communicating directly with it right in the clinic. By putting the two technologies together, care providers on the ground will be able to diagnose mild cases and prescribe prophylactic treatments like penicillin in a single visit.
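
That integration is still in the future, so any concrete interface is hypothetical. Assuming a trained clip classifier like the earlier sketch and a placeholder read_frame_from_device function standing in for whatever the scanner actually exposes, the point-of-care screening step might look roughly like this:

# Hypothetical sketch of the point-of-care idea described above: an app pulls
# frames from the ultrasound device, assembles a short clip, and runs the
# trained model locally so a screening score is available during the visit.
import torch

def screen_patient(model, read_frame_from_device, frames_per_clip: int = 16):
    model.eval()
    # Each frame is assumed to arrive as a (3, H, W) tensor from the device.
    frames = [read_frame_from_device() for _ in range(frames_per_clip)]
    clip = torch.stack(frames).unsqueeze(0)  # shape (1, frames, 3, H, W)
    with torch.no_grad():
        probs = torch.softmax(model(clip), dim=1)
    return {"rhd_probability": float(probs[0, 1])}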

The first outcomes from the program were showcased in a presentation by Dr. Roshanitabrizi at one of the biggest and most prestigious medical imaging and AI computing meetings — the 25th International Conference on Medical Image Computing and Computer Assisted Intervention (MICCAI).