Thanks to Gordon G.
Based on a cellphone-recorded cough, machine learning models accurately detect coronavirus even in people with no symptoms
Again and again, experts have pleaded that we need more and faster testing to control the coronavirus pandemic—and many have suggested that artificial intelligence (AI) can help. Numerous COVID-19 diagnostics in development use AI to quickly analyze X-ray or CT scans, but these techniques require a chest scan at a medical facility.
Since the spring, research teams have been working toward anytime, anywhere apps that could detect coronavirus in the bark of a cough. In June, a team at the University of Oklahoma showed it was possible to distinguish a COVID-19 cough from coughs due to other infections, and now a paper out of MIT, using the largest cough dataset yet, identifies asymptomatic people with a remarkable 100 percent detection rate.
If approved by the FDA and other regulators, COVID-19 cough apps, in which a person records themselves coughing on command, could eventually be used for free, large-scale screening of the population.
With potential like that, the field is rapidly growing: Teams pursuing similar projects include Cough Against Covid, an initiative at the Wadhwani Institute for Artificial Intelligence in Mumbai funded by the Bill & Melinda Gates Foundation; the Coughvid project out of the Embedded Systems Laboratory of the École Polytechnique Fédérale de Lausanne in Switzerland; and the University of Cambridge’s COVID-19 Sounds project.
The fact that multiple models can detect COVID in a cough suggests that there is no such thing as truly asymptomatic coronavirus infection—physical changes always occur that alter the way a person produces sound. “There aren’t many conditions that don’t give you any symptoms,” says Brian Subirana, director of the MIT Auto-ID lab and co-author of the recent study, published in the IEEE Open Journal of Engineering in Medicine and Biology.
While human ears cannot distinguish those changes, AI can. Ali Imran, who led the earlier project at the University of Oklahoma’s AI4Networks Research Center, compares the concept to a guitar: put objects of different shapes or materials inside a guitar and play the same notes, and the sound will be subtly different. “The human ear is capable of distinguishing maybe five to ten different features of cough,” says Imran. “With signal processing and machine learning, we can extract up to 300 different distinct features.”
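To make the feature-extraction idea concrete, here is a minimal Python sketch of the general approach Imran describes: compute a few hundred acoustic features from a cough recording and hand them to an off-the-shelf classifier. The file names, feature choices, and classifier here are illustrative assumptions, not the actual pipelines built by the MIT or Oklahoma teams.

```python
# Minimal sketch of the general idea: turn a cough recording into a large
# vector of acoustic features and feed it to a binary classifier.
# NOT the MIT or Oklahoma model; file names and labels are placeholders.
import numpy as np
import librosa                      # audio loading and feature extraction
from sklearn.ensemble import RandomForestClassifier

def cough_features(path):
    """Extract a few hundred summary features from one cough recording."""
    y, sr = librosa.load(path, sr=22050)                       # mono waveform
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=40)         # timbre
    contrast = librosa.feature.spectral_contrast(y=y, sr=sr)   # spectral shape
    zcr = librosa.feature.zero_crossing_rate(y)                # noisiness
    feats = np.concatenate([mfcc, contrast, zcr])
    # Summarize each feature's trajectory by its mean and spread over time.
    return np.concatenate([feats.mean(axis=1), feats.std(axis=1)])

# Hypothetical labeled data: paths and 0/1 labels would come from a
# cough collection like the ones described in the article.
train_files = ["cough_positive.wav", "cough_negative.wav"]
train_labels = [1, 0]
X = np.stack([cough_features(f) for f in train_files])
clf = RandomForestClassifier(n_estimators=200).fit(X, train_labels)
# clf.predict(cough_features("new_cough.wav").reshape(1, -1)) would then
# score a new, unseen recording.
```

The point of the sketch is only that the feature vector is far richer than what a listener can track by ear; the published systems use much larger datasets and more sophisticated models.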
Wouldn’t it be great to have this ‘cough test’ device available at Skyline?
Thanks for sharing this, Gordon.