Dr. Ramsey Wehbe, a cardiologist and postdoctoral fellow in A.I.[1]
Wehbe is a postdoctoral fellow at the Bluhm Cardiovascular Institute at Northwestern Memorial Hospital.[2]
Wehbe, MD, in the midst of our Northwestern Medicine Fellowship in Artificial Intelligence in Cardiovascular Disease, recently published a paper with impressive results from a set of 5,853 patients, using machine learning to accurately identify COVID-19 from chest X-rays.[3]
Wehbe and his colleagues expect the system to save not just time but also the costs associated with employing radiologists to identify potential COVID-19 patients.[4]
Events - Primer's event detection algorithm clusters and summarizes multiple documents describing real-world events.
Mentions - Snippets of text that refer to the person.
Docs - The number of documents in Primer's corpus of news articles that mention the person.
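Primer's actual event detection algorithm is not described here, so as a rough illustration of the idea — grouping similar news snippets into event clusters and summarizing each cluster — the following is a minimal sketch using simple token-overlap (Jaccard) similarity. The function names, the threshold, and the longest-snippet "summary" heuristic are all illustrative assumptions, not Primer's method.

```python
# Hypothetical sketch of event detection: cluster similar news snippets by
# token overlap, then pick the longest snippet as each cluster's "summary".
# Threshold and heuristics are illustrative, not Primer's actual algorithm.

def tokens(text):
    """Lowercased word set for a snippet."""
    return set(text.lower().split())

def jaccard(a, b):
    """Jaccard similarity of two token sets."""
    return len(a & b) / len(a | b) if a | b else 0.0

def cluster_events(docs, threshold=0.3):
    """Greedily assign each doc to the first sufficiently similar cluster."""
    clusters = []  # each cluster is a list of snippet strings
    for doc in docs:
        for cluster in clusters:
            if jaccard(tokens(doc), tokens(cluster[0])) >= threshold:
                cluster.append(doc)
                break
        else:
            clusters.append([doc])
    return clusters

def summarize(cluster):
    """Crude summary: the longest snippet in the cluster."""
    return max(cluster, key=len)
```

For example, two snippets describing the same chest X-ray result would share enough tokens to land in one cluster, while an unrelated snippet would start a new one.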
Remember to check the sources and follow Wikipedia's guidelines.
Events: 4 | Mentions: 28 | Docs: 19
The breakthrough will likely be instrumental in preventing the spread of the virus, particularly among hospital patients and staff, by providing a rapid and accurate way to screen admitted patients, with or without COVID-19 symptoms, simply by scanning their chest X-rays. The technology will act as an early warning system, highlighting those who need to self-isolate even before symptoms develop and picking up those who might never have been aware of their need to isolate.[1]
11/24/2020
Faster, earlier detection of the highly contagious virus could protect health care workers and other patients by prompting positive patients to isolate sooner. The study's authors also believe the algorithm could flag patients for isolation and testing who are not otherwise under investigation for COVID-19. Katsaggelos' laboratory specializes in using A.I. for medical imaging. He and Wehbe had already been working together on cardiology imaging projects and wondered whether they could develop a new system to help fight the pandemic.[2]
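The screening workflow the authors describe — using a model's chest X-ray score to flag admitted patients for isolation and testing even when they are not under investigation — can be sketched as a simple triage rule. The function, its threshold, and the action labels are hypothetical; the published study does not specify this decision logic.

```python
# Hypothetical triage sketch: route a newly admitted patient based on a
# chest-X-ray model's COVID-19 probability. The 0.5 threshold and action
# strings are illustrative assumptions, not from the published study.

def triage(probability, under_investigation, threshold=0.5):
    """Return an action for a newly admitted patient."""
    if probability >= threshold:
        # Flag for isolation and testing even without symptoms or suspicion.
        return "isolate and test"
    if under_investigation:
        return "standard COVID-19 workup"
    return "routine admission"
```

The key design point, matching the text, is that the model score is checked first, so an asymptomatic patient with a high score is still flagged before any clinical suspicion arises.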
11/24/2020
The panel said a lack of transparency surrounding the datasets that train algorithms can lead to public mistrust in AI-powered medical tools, as these devices may not have been trained on patient data that represents the patients on whom they will be used.[3]
12/01/2020