It turns out Waymo's self-driving taxis aren’t always acting alone.
During a congressional hearing this week, the company acknowledged that when its robotaxis get confused, they request help from human operators, including some based as far away as the Philippines.
Chief Safety Officer Mauricio Peña disclosed the detail at a hearing focused on a robotaxi that struck a child near an elementary school in California.
When Waymo robotaxis need human help
During Wednesday's hearing, U.S. Congress members grilled Peña over Waymo's use of foreign products and labor. After they finished discussing how the cars are made in China, the topic turned to who steps in when the robotaxis get stuck.
Peña insisted that Waymo operators do not drive the cars remotely most of the time. When the bots become confused, however, they will ping human workers for instructions. Some of these workers live in the U.S., where they're likely familiar with our traffic laws and customs.
Others live on the other side of the world, like in the Philippines.
Massachusetts Senator Ed Markey didn't enjoy hearing that.
"Having people overseas influencing American vehicles is a safety issue," he said.
Yesterday, I got Waymo to admit they are using people 8000 miles away in the Philippines to help guide their self-driving cars in the U.S.
— Ed Markey (@SenMarkey) February 5, 2026
This should scare us all. It must end. https://t.co/4Xf9SfwoAk
"The information the operators receive could be out of date. It could introduce tremendous cybersecurity vulnerabilities. We don’t know if these people have U.S. driver’s licenses.”
He also criticized the company for outsourcing these jobs overseas.
Speaking to Futurism, Waymo claimed that all its operators are "required to have a passenger car or van license, and are reviewed for records of traffic violations, infractions, and driving-related convictions."
It's unclear whether those licenses must be issued in the U.S., or whether overseas operators receive updated information on American traffic laws.
Critics say humans never left the loop
Waymo and similar companies love to tout their vehicles as "autonomous." Time and again, however, observers have questioned just how autonomous they actually are.
While Waymo has been open about its use of safety operators, critics pointed out that the bots still end up needing human help.

"Once again, it turns out 'fully autonomous' means 'a guy in the Philippines,'" said The Onion owner Ben Collins on Bluesky.
"The whole 'autonomous' thing is a fraud," @robbydobbymark2.bsky.social agreed. "Some diver [sic] from overseas who doesn’t know a US State’s driving laws or passed a test there, is controlling the vehicles. They’re not safe."

"AI means Actually Interns," joked @ilovepets420.bsky.social.
Waymo's vehicles don't rely on AI the way Tesla's robotaxis do, but we all remember how Tesla's Optimus robots turned out to have human operators behind them. The joke never gets old, right up until a robotaxi hits and injures a little kid.

As @trashmuppet.bsky.social pointed out, "having someone who is 8000 miles away and has a ping of like 300ms controlling a car that's going potentially 65 mph is not exactly what i'd call reassuring."