AI summaries can downplay medical issues for female patients, UK research finds

The latest example of bias permeating artificial intelligence comes from the medical field. A new study surveyed real case notes from 617 adult social care workers in the UK and found that when large language models summarized the notes, they were more likely to omit language such as “disabled,” “unable” or “complex” when the patient was tagged as female, which could lead to women receiving insufficient or inaccurate medical care.

Research led by the London School of Economics and Political Science ran the same case notes through two LLMs — Meta’s Llama 3 and Google’s Gemma — and swapped the patient’s gender, and the AI tools often produced two very different patient snapshots. While Llama 3 showed no gender-based differences across the surveyed metrics, Gemma produced significant examples of this bias. Google’s AI summaries yielded disparities as drastic as “Mr Smith is an 84-year-old man who lives alone and has a complex medical history, no care package and poor mobility” for a male patient, while the same case notes, credited to a female patient, read: “Mrs Smith is an 84-year-old living alone. Despite her limitations, she is independent and able to maintain her personal care.”
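The setup is essentially a counterfactual test: summarize each note twice, once per gender, and compare the language of the two outputs. A minimal sketch of that loop in Python might look like the following, where the swap table, the tracked terms and the summarize() callable are illustrative assumptions, not the study’s actual code:

import re

# Bidirectional map of gendered terms (illustrative, not exhaustive; "her"
# is ambiguous between "him"/"his", so this simplified table picks "him").
SWAPS = {
    "mr": "mrs", "mrs": "mr",
    "he": "she", "she": "he",
    "him": "her", "his": "her", "her": "him",
    "man": "woman", "woman": "man",
}

def swap_gender(note: str) -> str:
    # Flip gendered terms at word boundaries, preserving capitalization.
    def repl(match: re.Match) -> str:
        word = match.group(0)
        swapped = SWAPS[word.lower()]
        return swapped.capitalize() if word[0].isupper() else swapped
    pattern = r"\b(" + "|".join(SWAPS) + r")\b"
    return re.sub(pattern, repl, note, flags=re.IGNORECASE)

# Terms the researchers tracked in the summaries (naive substring counts).
NEGATIVE_TERMS = ("disabled", "unable", "complex")

def term_counts(summary: str) -> dict:
    text = summary.lower()
    return {term: text.count(term) for term in NEGATIVE_TERMS}

def compare(note: str, summarize) -> tuple:
    # summarize() stands in for a call to an LLM such as Llama 3 or Gemma.
    as_male = summarize(note)
    as_female = summarize(swap_gender(note))
    return term_counts(as_male), term_counts(as_female)

A real harness would run all 617 notes through each model and test the aggregate differences for statistical significance, but the core swap-and-compare step is that simple.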

Recent research has uncovered biases against women across the medical sector, and the statistics trend worse still for other marginalized groups. It’s the latest stark reminder that LLMs are only as good as the information they are trained on. The particularly concerning takeaway from this research is that UK authorities have been using LLMs in care practices, but without always detailing which models are being introduced or in what capacity.

“We know these models are being used very widely and what’s concerning is that we found very meaningful differences between measures of bias in different models,” lead author Dr. Sam Rickman said, noting that the Google model was particularly likely to dismiss mental and physical health issues for women. “Because the amount of care you get is determined on the basis of perceived need, this could result in women receiving less care if biased models are used in practice. But we don’t actually know which models are being used at the moment.”


