What can artificial intelligence see in CT scans that humans don’t?
Welcome to Impact Factor, your weekly dose of commentary on a new medical study. I am Dr. F. Perry Wilson from the Yale School of Medicine in New Haven, USA.
If a picture is worth a thousand words, a chest CT scan might as well be Atlas Shrugged. When you consider the amount of information one of these scans contains, it’s immediately clear that our usual method of interpreting them must be leaving a lot on the table. At the end of the day, we can review all that information, simply say “normal,” and consider the matter resolved.
Of course, radiologists can get a lot out of a CT scan, but they’re trained to look for abnormalities. They can detect pneumonia, emboli, fractures, and pneumothorax, but the presence or absence of life-threatening abnormalities is only a fraction of the data found there.
Extracting more data from those images—data that may not indicate a disease per se, but that nevertheless tells us something important about a patient and their risks—may be a task best suited to entities built to take in vast amounts of data and interpret them in new ways: artificial intelligence.