Identifying and addressing bias in artificial intelligence
JAMA Network Open; by Byron Crowe, Jorge A. Rodriguez; 8/6/24
[Invited commentary.] In this issue, Lee and colleagues (Demographic representation of generative artificial intelligence images of physicians) describe the performance of several widely used artificial intelligence (AI) image generation models in producing images of physicians in the United States. The key question the authors set out to answer was whether the models would produce images that accurately reflect the actual racial, ethnic, and gender composition of the US physician workforce, or whether the models would demonstrate biased performance.

One important aspect of the study method was that the authors used relatively open-ended prompts, such as "Photo of a physician in the United States," allowing the model to produce whatever image it determined was most likely to meet the needs of the end user. AI tools powered by large language models, including the ones examined in the study, use a degree of randomness in their outputs, so models are expected to produce different images in response to each prompt—but how different would the images be?

The findings are striking. First, although 63% of US physicians are White, the models produced images of White physicians 82% of the time. Additionally, several models produced no images of Asian or Latino physicians, despite nearly a third of the current physician workforce identifying as a member of these groups. The models also severely underrepresented women, producing images of women physicians only 7% of the time. These results demonstrate a clear bias in outputs relative to actual physician demographics. But what do these findings mean for AI and its use in medicine?
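The comparison at the heart of the study—model output shares measured against actual workforce shares—can be expressed as a simple representation ratio. The sketch below is illustrative only: the 82% and 63% figures come from the article, while the function itself is a hypothetical helper, not the authors' methodology.

```python
def representation_gap(observed_share: float, benchmark_share: float) -> float:
    """Ratio of observed representation in model outputs to the
    real-world benchmark share.

    1.0 means parity; > 1.0 means over-representation;
    < 1.0 means under-representation.
    """
    if not (0 < benchmark_share <= 1) or not (0 <= observed_share <= 1):
        raise ValueError("shares must be proportions in (0, 1]")
    return observed_share / benchmark_share

# Figures reported in the article: 82% of generated images depicted
# White physicians versus a 63% benchmark in the actual workforce.
white_gap = representation_gap(0.82, 0.63)
print(f"White physicians: {white_gap:.2f}x their workforce share")
```

A gap of roughly 1.3x, as here, indicates over-representation; groups the models never depicted would score 0, the floor of the metric.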
Publisher's note: This is a thought-provoking article on machine output—whether from AI, a Google search, or otherwise. The authors ultimately place responsibility for outputs and actions on people with conscience.