U.K. advocacy group releases deepfakes of Corbyn, Johnson endorsing each other

Future Advocacy created a pair of political deepfakes to raise awareness of the threat.

Siobhan Ball

For those who follow U.K. politics closely, two very surprising videos appeared on the internet today.

One featured Prime Minister Boris Johnson endorsing opposition leader Jeremy Corbyn for the upcoming election.

The second video had Corbyn doing the same thing for the sitting prime minister.

This isn’t some strange kind of quid pro quo between the two party leaders, though you might be forgiven for thinking so due to the current state of U.K. politics.

Instead, they’re deepfakes: AI-generated videos that superimpose one person’s likeness onto another’s, released by the think tank Future Advocacy to warn about the dangers this technology poses to democracy.
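At their core, such videos rest on a compositing step: a generated face region is blended onto a target frame. Real deepfake pipelines use trained neural networks and careful masking to produce that region; as a minimal sketch of just the superimposition itself, assuming NumPy and toy arrays in place of real footage, the blend might look like:

```python
import numpy as np

def superimpose(frame: np.ndarray, patch: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Blend `patch` onto `frame` using a soft mask with values in [0, 1].

    Where the mask is 1 the patch fully replaces the frame; where it is 0
    the original frame shows through; fractional values feather the edges.
    """
    mask = mask[..., None]  # add a channel axis so the mask broadcasts over RGB
    return (mask * patch + (1.0 - mask) * frame).astype(frame.dtype)

# Toy data: a 4x4 gray "frame", a white "face patch", and a mask
# covering only the top-left quadrant.
frame = np.full((4, 4, 3), 100, dtype=np.uint8)
patch = np.full((4, 4, 3), 255, dtype=np.uint8)
mask = np.zeros((4, 4))
mask[:2, :2] = 1.0

out = superimpose(frame, patch, mask)
# Top-left pixels take the patch value; the rest keep the frame value.
```

The hard part of a deepfake is generating a convincing patch in the first place, not this blend, which is why the technology's quality keeps improving as the generative models do.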

Concerns about deepfakes being used to interfere with elections and otherwise spread misinformation are as old as the technology itself.

While the vast majority of deepfakes are actually porn right now, as the technology improves, it’s possible there will be attempts at using them for political ends.

At the same time, people are already blaming deepfakes when a politician they support says or does something they disagree with, even when the video footage is verifiably real.

Future Advocacy wants to raise awareness of the technology itself and its potential to spread disinformation, interfere with elections, and otherwise prevent people from knowing what’s really happening in the world.

They also want to promote dialogue about regulation.

Having managed to get the term “deepfake” trending on Twitter, with people advocating for better education on the subject, the group has successfully met at least part of that goal.

https://twitter.com/LeftWingKav/status/1194170714527584257

However, not everyone is impressed by the quality of Future Advocacy’s deepfakes.

https://twitter.com/ludditus/status/1194283657021788167

Rather than using technology to imitate Johnson’s and Corbyn’s voices, Future Advocacy hired impressionists to do the job, and many people found their work far from plausible.

https://twitter.com/BatmanInit/status/1194204662947307520

Others find the videos all too plausible and, while acknowledging that the voices are far from perfect, point out that this won’t actually stop people from being taken in by them.

Twitter user @elmyra, who studies deepfakes and the way they’re used to create pornography, explained how easy it would be to edit these videos to hide their deepfake origins.

No watermark appears in the first part of either video, while the fake politicians are still endorsing each other, presumably to make the clips more convincing before the reveal. That same choice shows how easy it would be to simply crop any such reveal out entirely.

While Future Advocacy raised an important issue for the future of politics, it’s entirely possible that in the short term they have created a little more confusion.

The Daily Dot