[Featured image: Canva AI Magic Media results showing boys with ankle monitors (left); Canva open on a phone in front of a computer screen (center); woman speaking with the caption “@Canva your AI tool is racist” (right). photosince/Shutterstock, @bella_noche_/TikTok (Licensed); remix by Caterina Rose]

‘This is scary’: Woman says Canva’s AI is ‘racist’ after asking for photos of kids in ankle monitors (updated)

'The way my jaw dropped…'


Tiffanie Drayton

Posted on Apr 24, 2024   Updated on Apr 26, 2024, 9:20 am CDT

Canva’s AI is being accused of racism after one of its users asked it to generate images of kids in ankle monitors. Allegedly, the tool produced only images of Black boys.

TikToker Bellanoche (@bella_noche_) took to social media to call out the company in a now-viral video that has amassed over 728,100 views and 127,500 likes as of publication.

“@Canva your AI tool is racist,” the on-screen caption read.

The woman said she used the software to generate images of “an ankle monitor on a juvenile defendant.” However, she said that after trying several variations of the prompt, the AI only produced images of boys of one race.

“Every time, it’s giving me images of little Black boys,” she alleged.

The content creator went on to say that she believes AI is often racist, but she was still surprised by her experience with Canva’s AI.

“I’ve now been shot out 12 different images that are only Black kids, and I have not described or asked for a Black child,” she said.

Examples of the images were shown in the clip.

“You wonder then why little Black boys are f*cking demonized, why they are shot down by their white counterparts,” she continued. “It’s because they are villainized from a very young age.”

She concluded her video by asking Canva to explain its algorithm.

[Embedded TikTok from @bella_noche_, captioned: “@Canva your Magic Media AI image generator tool clearly has racial bias. Care to explain why you’ve got a racist AI system? #canva #ai”]

In the comments section, many shared their own concerns about racist tech and algorithms.

“Was actually just learning about how computer programming is discriminatory in my one college class,” user Tate wrote.

“There is a TED talk on something similar to this. They talk about the face scan for Snapchat for example,” another viewer added.

This is not the first time AI image-generating software has been accused of racism. Right-wingers accused Google’s Gemini AI of refusing to acknowledge the existence of white people. One disability advocate said AI was racist after he asked it to generate photos of “an autistic person” and the vast majority of the images it rendered were of white men.

AI is a powerful tool, but it is not without flaws: image generators reflect the biases and limitations of the data they are trained on, and they regularly make mistakes.

The Daily Dot reached out to Canva via email and to Bellanoche by TikTok comment for more information.

Update 9:20am CT, April 26: In an email to the Daily Dot, Canva shared the following statement:

“We take this matter very seriously and are working actively to fix this issue and prevent the possibility of it happening again.  

We became aware of this user’s experience yesterday and we’ve implemented a fix today for this specific query to do better in terms of sharing more balanced results. Unfortunately, the datasets AI models are trained on can magnify terrible real-world biases. We’re building ways of mitigating known biases into our tools, but it is a continuous work in progress for us, and the AI ecosystem as a whole. 

Representation and fairness are core to our values, and feedback from our community plays an important role in helping us deliver the best AI experience possible. The more we receive, the more we can refine our tools. Ultimately, it’s been our mission to empower the world to design, and to do so in a safe environment, especially when using AI. That’s been our commitment since day one, and we will not waver on this point.”

*First Published: Apr 24, 2024, 6:00 pm CDT