MIT Researchers Create AI that Detects COVID-19 Infections Through Cellphone-Recorded Coughs

AI is being used to detect COVID-19 in asymptomatic people from the sound of forced coughs recorded on their smartphones.

MIT researchers have developed an algorithm that correctly identified asymptomatic people with COVID-19 from forced-cough recordings captured on their smartphones. The researchers found that asymptomatic people may cough differently from healthy people; these differences are not identifiable by the human ear, but the team’s AI can detect them.

The paper, recently published in the IEEE Open Journal of Engineering in Medicine and Biology, shows that the AI model distinguished healthy individuals from COVID-19-positive ones using forced-cough recordings submitted via web browsers and cellphones for the study.

The researchers trained the model on tens of thousands of sample recordings of coughs and spoken words.
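The article does not describe the model’s architecture, so the following is only a rough sketch of how a cough-screening pipeline of this general kind could be assembled, not the MIT team’s published method. It assumes the librosa and scikit-learn libraries, and every file path and label is a hypothetical placeholder.

# Minimal sketch of a cough-screening pipeline (not the MIT model).
# Assumes librosa and scikit-learn; paths and labels are hypothetical.
import numpy as np
import librosa
from sklearn.linear_model import LogisticRegression

def cough_features(path, sr=16000, n_mfcc=13):
    """Summarize a forced-cough recording as the mean and std of its MFCCs."""
    audio, _ = librosa.load(path, sr=sr, mono=True)
    mfcc = librosa.feature.mfcc(y=audio, sr=sr, n_mfcc=n_mfcc)
    return np.concatenate([mfcc.mean(axis=1), mfcc.std(axis=1)])

# Hypothetical labeled recordings: 1 = tested positive, 0 = tested negative.
labeled = [("coughs/subject_001.wav", 1),
           ("coughs/subject_002.wav", 0)]  # ...tens of thousands more in practice

X = np.stack([cough_features(path) for path, _ in labeled])
y = np.array([label for _, label in labeled])
classifier = LogisticRegression(max_iter=1000).fit(X, y)

# Screen a new, unseen recording; the output is a probability, not a diagnosis.
new_sample = cough_features("coughs/new_subject.wav").reshape(1, -1)
print("estimated probability of COVID-19:", classifier.predict_proba(new_sample)[0, 1])

A production model would replace the simple classifier with a deep network trained on far more data, but the overall flow of recording, feature extraction, and classification stays the same.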

The study found a 98.5 percent detection rate among people with a confirmed positive COVID-19 test, including 100 percent of those who tested positive but reported no symptoms.

Before the model can be offered as a user-friendly app, FDA approval must be obtained. Such an app could help a large number of people by providing a free, convenient, and non-invasive screening tool to identify COVID-19-positive individuals and aid in flattening the curve.

“The effective implementation of this group diagnostic tool could diminish the spread of the pandemic if everyone uses it before going to a classroom, a factory, or a restaurant,” said co-author Brian Subirana, a research scientist in MIT’s Auto-ID Laboratory, in a statement.

Before the pandemic, research groups were already training algorithms on cough recordings to diagnose conditions such as pneumonia and asthma. MIT researchers, for their part, were developing an AI to detect signs of Alzheimer’s from the weakened vocal cords associated with the disease.

When the pandemic began, the team wondered if they could use the same AI framework developed for their Alzheimer’s study and adapt it to diagnose COVID-19.

“The sounds of talking and coughing are both influenced by the vocal cords and surrounding organs. This means that when you talk, part of your talking is like coughing and vice versa. It also means that things we easily derive from fluent speech, AI can pick up simply from coughs, including things like the person’s gender, mother tongue, or even emotional state. There’s in fact sentiment embedded in how you cough,” said Subirana. “So we thought, why don’t we try these Alzheimer’s biomarkers [to see if they’re relevant] for COVID.”

To date, the researchers have collected more than 70,000 recordings, amounting to some 200,000 forced-cough audio samples, which Subirana says is the largest research cough dataset he knows of.


In analyzing the cough samples, the team found a “striking similarity between Alzheimer’s and COVID discrimination.” “We think this shows that the way you produce sound changes when you have COVID, even if you’re asymptomatic,” Subirana said.

The team is now working with a company to develop a free pre-screening app based on their AI model and is partnering with several hospitals around the world to gather a more diverse set of cough recordings to train and strengthen the model’s accuracy. The research was also supported by Takeda Pharmaceutical Company.

The research paper proposes that “pandemics could be a thing of the past if pre-screening tools are always on in the background and constantly improved.”

More AI Developments to Aid the COVID-19 Response

MIT isn’t the only institution using AI to solve problems during the pandemic. Rensselaer Polytechnic Institute has developed an AI tool that could help clinicians identify which COVID-19 patients are likely to need ICU intervention and tailor treatment plans to individual patients.

The method combines chest computed tomography (CT) images, which assess the severity of a patient’s lung infection, with non-imaging data such as demographic information, vital signs, and blood tests. From these combined inputs, the algorithm predicts the patient’s outcome, specifically whether the patient will need ICU intervention.
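The article does not specify how the Rensselaer model fuses these inputs. As a rough illustration only, one common approach is to concatenate imaging-derived features with tabular clinical data and train a single classifier on the combined vector; every feature name, value, and label below is an assumed placeholder.

# Rough illustration of a multimodal ICU-need predictor (not Rensselaer's model).
# CT-derived severity features are concatenated with non-imaging clinical data;
# all feature names, values, and labels are assumed placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Per-patient CT-derived features (hypothetical): [infected-lung fraction, lesion count]
ct_features = np.array([[0.35, 12.0],
                        [0.05,  1.0]])

# Per-patient non-imaging data (hypothetical): [age, heart rate, SpO2, CRP]
clinical_features = np.array([[67.0, 104.0, 0.88, 95.0],
                              [34.0,  78.0, 0.97,  4.0]])

# Outcome labels (hypothetical): 1 = needed ICU intervention, 0 = did not.
needed_icu = np.array([1, 0])

# Fuse the two modalities by concatenation and fit one classifier on the result.
X = np.hstack([ct_features, clinical_features])
model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
model.fit(X, needed_icu)

# Estimate ICU risk for a new patient (hypothetical values, same feature order).
new_patient = np.array([[0.22, 7.0, 59.0, 92.0, 0.91, 48.0]])
print("estimated ICU risk:", model.predict_proba(new_patient)[0, 1])

In a real system the imaging features would come from a deep network over the full CT volume rather than two hand-picked scores, but the fusion step itself is often this simple.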

“As a practitioner of AI, I do believe in its power,” said Pingkun Yan, who is a member of the Center for Biotechnology and Interdisciplinary Studies (CBIS) at Rensselaer, in a statement. “It really enables us to analyze a large quantity of data and also extract the features that may not be that obvious to the human eye.”

GE Healthcare has also launched a new algorithm that reads X-rays to help assess whether ventilator tubes are correctly placed in critical-care patients, a step toward improving care for critically ill COVID-19 patients.

“Today, clinicians are overwhelmed, experiencing mounting pressure as a result of an ever-increasing number of patients,” said Jan Makela, president and CEO of Imaging at GE Healthcare, in a statement. “The pandemic has proven what we already knew – that data, AI and connectivity are central to helping those on the front lines deliver intelligently efficient care. GE Healthcare is not only providing new tools to help hospital staff keep up with demand without compromising diagnostic precision, but also leading the way on COVID-era advancements that will have a long-lasting impact on the industry, long after the pandemic ends.”

Several more organizations, including Cambridge University and Carnegie Mellon University, are working on similar projects that use AI to help detect and limit the spread of COVID-19.