California state lawmakers are bringing forward legislation that would require social media companies, like Twitter and Facebook, to identify and deal with automated bot accounts on their platforms.
The move comes in the absence of action on the matter from the federal government and as pressure mounts on technology companies to tackle manipulative swarms of so-called bot accounts, engineered by actors to push misinformation and fake news or promote hyper-partisan propaganda.
On multiple occasions, bot swarms have exploited national tragedies, such as the recent shooting at Marjory Stoneman Douglas High School in Florida. As gun control became a point of debate, accounts on Twitter suspected of having links to Russia pushed hashtags such as #guncontrolnow and #gunreformnow.
During the 2016 presidential election, Russian-linked bots were suspected of having amplified President Donald Trump’s tweets and promoted divisive policy issues, like immigration.
“We need to know if we are having debates with real people or if we’re being manipulated,” said State Senator Bob Hertzberg (D-Los Angeles). “Right now we have no law and it’s just the Wild West.”
Sen. Hertzberg’s bill would require that bots, which are also often used for marketing purposes, disclose their status, and it would bar them from engaging a user with “the intention of misleading and without clearly and conspicuously disclosing that the bot is not a natural person.”
Platforms would be required to put reporting procedures in place so that bots in violation of the new rules could be flagged, and to submit statistical reports on their strategy to the state Attorney General every two weeks.
“California feels a bit guilty about how our hometown companies have had a negative impact on society as a whole,” Shum Preston, national director at Common Sense Media, told Bloomberg. “We are looking to regulate in the absence of the federal government. We don’t think anything is coming from Washington.”
The legislation is set to go before two California legislative committees this month.