Internet Culture

‘This is so scary’: Woman says AI phone scam made her think brother was dead

‘This has been happening to A LOT of people.’


Audra Schroeder

person speaking in car with caption 'Our family almost got scammed by an A.I. machine' (l) hand holding phone with unknown caller on screen (c) person speaking on couch (r)

“He mimicked my brother’s voice and said, ‘Oh, I’m about to get in a wreck.’ And then the phone went off.”


Earlier this week, TikToker Brooke Bush (@babybushwhacked) recounted how a scammer reportedly targeted her grandfather with a new ruse: using AI tech that can mimic voices and make people believe they’re talking to someone they know. The TikTok has more than 1.4 million views.

“this has been happening to A LOT of people,” said one commenter.

@babybushwhacked “Sharing so this doesn’t happen to anyone else” ♬ original sound – Brooke Bush

Over the last few months, there have been more reports that scammers are using AI-assisted voice-cloning tech to fool victims into handing over private information. In March, the Federal Trade Commission released a statement warning that these calls tend to target family members, and that scammers only need a short clip of a person’s voice (from, say, TikTok or YouTube) to potentially fool someone. There have also been reports of scammers spoofing real phone numbers.

We reached out to Bush for comment via TikTok DM. It’s not clear if the scammer mimicked the voice of her brother Clayton Bush, a football player with a sizable Instagram following.

In a stitch with Bush’s video, user @babyybiird claimed something similar happened to him last year; his aunt told him that someone who sounded like him called his grandmother claiming he’d been injured in a car accident and gotten a DUI. The “caller” needed $3,000 to make bail.

“My grandma swears that it sounded exactly like me,” he says. “So my family and I came up with a safe word for situations like this.” Creating a “safe word” was suggested by others as well.

@babyybiird #stitch with @babybushwhacked scammers are the worst #scam #scammer #scammers #ai #fy #fyp ♬ original sound – Christopher Lee

NBC recently reported on a father who got a call from someone who sounded like his daughter saying she was in trouble; then a man got on the line demanding a ransom. Scammers used the same approach on an Arizona mother, who says they demanded $1 million.

Influencers have recounted similar stories, and platforms like TikTok, where people post clips of their own voices, could be making it easier for scammers to clone them.

@payton.bock AI VOICE CLONING SCAM PSA!!!!! #fyp #ai #scam #voicecloning ♬ original sound – PAYTON
@eddiecumberbatch Be safe 🙏 #ai #aivoice #aiscam #scam ♬ In Essence (Slowed) – Ka$tro