
‘Do you see any diversity?’: Man says AI is ableist after generating 100-plus images of ‘an autistic person’—and they’re all white men

‘Most were skinny, and weirdly enough, a disproportionate abundance had red hair.’


Tricia Crimmins

A three-panel image shows Davis speaking with the captions “Can you believe this person is claiming AI is racist?” and “It is. Ableist is too,” alongside AI-generated results for “an autistic person” captioned “And let’s do it again...”

A disability advocate says he asked AI to generate photos of “an autistic person” 148 times, and with the exception of two photos, all the images showed white men.


In a TikTok posted on Monday, Jeremy Davis (@jeremyandrewdavis) says that not only is artificial intelligence racist, it is ableist as well. He shows images that he asked AI to generate with the prompt “an autistic person” with the parameters “lifelike,” “photoreal,” and “photojournalism.”

All of the photos show white, thin, young men. Only twice did the software produce photos of women.

“Do you see any diversity? Say, race, gender, age?” Davis says in his TikTok. “Plus most were skinny, and weirdly enough, a disproportionate abundance had red hair.”


Davis says he ran the prompt “an autistic person” with those parameters 148 times. The images never showed people smiling and were “moody, melancholy, depressing.”

He also says that he used “lifelike,” “photoreal,” and “photojournalism” to avoid receiving cartoon-style images or ones with puzzle pieces, the symbol of Autism Speaks, a nonprofit that many autistic people feel only increases stigma against them.

“AI is racist, sexist, ageist, and not only ableist,” Davis says, “but uses harmful puzzle imagery from hate groups like Autism Speaks.”

The reason for all of those biases, Davis says, is that what we think of as AI is actually just machine learning, which works with patterns, looks for majorities and similarities, “and then excludes outliers,” thus amplifying stereotypes.
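That pattern-matching effect can be shown with a toy simulation: a generator that leans toward the most probable attribute combinations in a skewed dataset over-represents the majority even more than the data does. The sketch below uses invented numbers and a stand-in sampler, not Davis’ prompts or Midjourney’s actual model.

```python
# Toy illustration of bias amplification: a "generator" that prefers the most
# common attribute combinations in its training data over-represents the
# majority group even more than the data itself does.
# All numbers and labels below are invented for illustration.
from collections import Counter
import random

random.seed(0)

# Hypothetical skewed training data: (race, gender) labels on images tagged
# "autistic person" -- 70% white men in this made-up dataset.
training_data = (
    [("white", "man")] * 70
    + [("white", "woman")] * 15
    + [("person of color", "man")] * 10
    + [("person of color", "woman")] * 5
)
counts = Counter(training_data)
total = sum(counts.values())

def generate(temperature=0.5):
    """Sample an attribute pair, sharpened toward the majority.

    A temperature below 1 mimics a model that favors its "most likely"
    (i.e., most stereotypical) outputs instead of matching the data.
    """
    pairs = list(counts)
    weights = [(counts[p] / total) ** (1 / temperature) for p in pairs]
    return random.choices(pairs, weights=weights, k=1)[0]

outputs = Counter(generate() for _ in range(148))
print("share of white men in training data:", counts[("white", "man")] / total)   # 0.70
print("share of white men in 148 generations:", outputs[("white", "man")] / 148)  # roughly 0.9
```

With the skew sharpened this way, about nine in ten of the 148 samples come back as the majority group, echoing the near-uniform results Davis describes.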


“In order for AI to be an effective tool, you have to be smarter than the AI,” Davis says. “We must be aware of its limitations and pitfalls.”


It is relatively well known within the tech community that AI can perpetuate ableism. According to the MIT Technology Review, while many have focused on combating AI’s racism and sexism, there is still work to be done on its ableism.

And that ableism has consequences: A Pulitzer Center project on artificial intelligence and ableism posits that an AI tool used in hiring could discriminate against someone with a speech impediment simply because it cannot understand them.
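One way to picture that failure mode is a hypothetical screening script that folds speech-recognition confidence into a candidate’s score; the names, numbers, and scoring rule below are invented for illustration and are not drawn from any real hiring product.

```python
# Hypothetical sketch of the hiring failure mode described above: a screener
# that blends speech-recognition confidence into a candidate's score quietly
# penalizes people whose speech the model was not trained to understand.
# All names, numbers, and the scoring rule are invented for illustration.
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    interview_score: float  # 0-1, from a human or rubric evaluation
    asr_confidence: float   # 0-1, how confidently a speech model transcribed them

def screen(candidates, cutoff=0.7):
    """Rank candidates by a blended score and keep only those above the cutoff."""
    scored = []
    for c in candidates:
        # The blend looks neutral, but asr_confidence tracks how "typical" a
        # candidate's speech is, not how qualified they are.
        blended = 0.6 * c.interview_score + 0.4 * c.asr_confidence
        scored.append((blended, c))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for blended, c in scored if blended >= cutoff]

candidates = [
    Candidate("A", interview_score=0.9, asr_confidence=0.95),
    # Equally strong answers, but a speech impediment lowers the ASR confidence:
    Candidate("B", interview_score=0.9, asr_confidence=0.35),
]
print([c.name for c in screen(candidates)])  # ['A'] -- B is screened out
```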


“Because machine-learning systems—they learn norms,” Shari Trewin, a researcher on IBM’s accessibility leadership team, told MIT Technology Review. “They optimize for norms and don’t treat outliers in any special way. But oftentimes people with disabilities don’t fit the norm.”

Trewin says that a lack of diverse representation of people with disabilities is harder to correct in an AI data set than sexism. Echoing Davis’ point about being smarter than AI, she suggests combining artificial intelligence with rules for the machine that mandate diversity.
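Her suggestion of pairing the model with explicit rules can be sketched as a simple quota layer: the rules decide how a batch of outputs is split across groups, and the model only generates within each group. The group labels, target shares, and the generate_for_group hook below are hypothetical placeholders, not IBM’s or any image generator’s actual interface.

```python
# A rule layer in the spirit of Trewin's suggestion: explicit rules fix how a
# batch of outputs is split across groups, and the model only does the
# generative work inside each group (for example, via a conditioned prompt).
# The groups, target shares, and generate_for_group hook are hypothetical.
import math
from collections import Counter

TARGET_SHARES = {
    ("white", "man"): 0.25,
    ("white", "woman"): 0.25,
    ("person of color", "man"): 0.25,
    ("person of color", "woman"): 0.25,
}

def generate_with_rules(generate_for_group, batch_size=148):
    """Mandate diversity by allocating the batch across groups up front."""
    batch = []
    for group, share in TARGET_SHARES.items():
        quota = math.ceil(share * batch_size)
        batch.extend(generate_for_group(group) for _ in range(quota))
    return batch[:batch_size]

# Example: with a stand-in generator that just echoes its group, the 148
# outputs are split 37/37/37/37 instead of reflecting the skewed data.
if __name__ == "__main__":
    print(Counter(generate_with_rules(lambda group: group)))
```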

A commenter on Davis’ video also explained the importance of diverse researchers developing AI.

“All software encodes the [biases] of its engineers,” @draethdarkstar wrote. “[Machine learning] also encodes the biases of its dataset. Diverse researchers are essential.”
