October 5, 2022

AI can tell your race from an X-ray image — and scientists can’t figure out how

Large research team taught AI program to read scans, and it outwitted them

A new study by an international team of scientists from Canada, the U.S., Australia and Taiwan reports that artificial intelligence used to read X-rays and CT scans can predict a person’s race with 90 per cent accuracy — and humans can’t. The scientists, including those from Massachusetts Institute of Technology and Harvard Medical School, have no idea how the program does it.

“When my graduate students showed me some of the results that were in this paper, I actually thought it must be a mistake,” Marzyeh Ghassemi, an MIT assistant professor of electrical engineering and computer science, and co-author of the paper, published in The Lancet Digital Health, told the Boston Globe. “I honestly thought my students were crazy.”

The study began after scientists noticed that an AI program for examining chest X-rays was more likely to miss signs of illness in Black patients. “We asked ourselves, how can that be if computers cannot tell the race of a person?” Leo Anthony Celi, a co-author and associate professor at Harvard Medical School, told the Boston Globe.

Researchers taught the AI program by showing it large numbers of race-labelled images of different parts of the body, including the chest, hand and spine — with no obvious markers of race, such as skin colour or hair texture — and then sets of unlabelled images. The program identified the race in the unmarked images with more than 90 per cent accuracy, and could differentiate Black patients from white even when images were from people of the same size, age or gender.

The discovery could assist medical staff in some ways, but it also raises the prospect that AI-based diagnostic systems might unintentionally generate racially biased results, such as automatically recommending a particular treatment for Black patients whether or not it’s appropriate for the individual, the newspaper reports. Additionally, that person’s own doctor would be unaware the AI had based its diagnosis on racial data.

Ghassemi believes the answer to the mystery may be related to melanin: X-rays and CT scanners could be detecting the higher melanin content of darker skin and embedding this information in the digital image in some way that has gone unnoticed. More research will be carried out on this — but not everyone agrees with the hypothesis.

Alan Goodman, a professor of biological anthropology at Hampshire College and co-author of the book Racism Not Race, suggests the AI is picking up differences resulting from geography rather than evidence of innate differences between races.

Osteoarcheologists and geneticists have found no evidence of substantial racial differences in the human genome, but they do find major differences between people based on where their ancestors lived.

By analyzing isotopes such as oxygen, strontium and sulphur in human bones, researchers can learn where a person was born and raised. They can also find evidence of conditions such as osteoarthritis, trauma, and infections including leprosy and syphilis, suggesting details about the person’s living conditions and lifestyle.

“Instead of using race, if they looked at somebody’s geographic co-ordinates, would the machine do just as well? My sense is the (AI) machine would do just as well,” Goodman told the paper.

So, while AI might be able to determine from an X-ray whether a person’s ancestors were from Scandinavia, Africa or Asia, Goodman says it’s not about race. “You call this race. I call this geographical variation,” said Goodman — but he did admit it’s unclear how AI could detect geographical location from an X-ray.

In any case, Harvard’s Celi said doctors should be reluctant to use AI diagnostic tools that might automatically generate biased results.
