Those who follow U.K. politics closely saw two very surprising videos appear on the internet today.
One featured Prime Minister Boris Johnson endorsing opposition leader Jeremy Corbyn for the upcoming election.
Boris Johnson has a message for you.#GE2019 pic.twitter.com/ST4dXPbRYE
— Future Advocacy (@FutureAdvocacy) November 12, 2019
The second video had Corbyn doing the same thing for the sitting prime minister.
Jeremy Corbyn has a message for you.#GE2019 pic.twitter.com/mUlarnQmRW
— Future Advocacy (@FutureAdvocacy) November 12, 2019
This isn’t some strange quid pro quo between the two party leaders, though given the current state of U.K. politics, you might be forgiven for thinking so.
Instead, they’re deepfakes: AI-generated videos that superimpose one person’s likeness onto another, released by the think tank Future Advocacy to warn of the dangers this technology poses to democracy.
A video showing Boris Johnson endorsing Jeremy Corbyn for Prime Minister has just landed online, another shows Corbyn backing Johnson.
— Catrin Nye (@CatrinNye) November 12, 2019
Confused? Well they’re deep fakes created by @futureadvocacy & I’ve been behind the scenes for the making of them for @VictoriaLIVE > pic.twitter.com/N5uvwsZAFU
Concerns about deepfakes being used to interfere with elections and otherwise spread misinformation are as old as the technology itself.
While the vast majority of deepfakes right now are actually porn, it’s possible that, as the technology improves, there will be attempts to use it for political ends.
At the same time, people are already blaming deepfakes when their preferred politician says or does something they disagree with, even when the video footage is verifiably real.
This has to be a #deepfake, right?? https://t.co/DnrPV97e81
— Daniel Eglin (@danieleglin) November 11, 2019
Future Advocacy wants to raise awareness of the technology itself and its potential to spread disinformation, interfere with elections, and otherwise prevent people from knowing what’s really happening in the world.
They also want to promote dialogue about regulation.
Having managed to get the word “deepfake” trending on Twitter, with people advocating for better education on the subject, they have already met at least part of that goal.
An interesting example of a #deepFake video along which illustrates some of the potential implications. This is the kind of thing we need to be discussing in schools however the challenge is to find a space in the already busy curriculum. https://t.co/IpfqOpKKuJ
— gary henderson (@garyhenderson18) November 12, 2019
#Deepfake of UK prime minister Boris Johnson appearing to endorse Labour leader Jeremy Corbyn.
— Vivien Boidron (@VivienBoidron) November 12, 2019
Combined with Facebook not fact checking political ads this could lead to a surprising #BrexitVote.
Politicians have yet to address the issue of #disinformation online. https://t.co/zELCd6VdDy
https://twitter.com/LeftWingKav/status/1194170714527584257
However, not everyone is impressed by the quality of Future Advocacy’s deepfakes.
https://twitter.com/ludditus/status/1194283657021788167
I’ve seen better tbh, poor effort.
— Soapbox Orator (@SrlUndrchvr) November 12, 2019
Rather than using technology to imitate Johnson’s and Corbyn’s voices, the group hired impressionists to do the job, and many people found the results far from plausible.
The voices are obviously wonky.
— JonathanJK (@Jonathanjk) November 12, 2019
Voice accuracy is already here. Maybe they don’t want a version that would be more realistic going viral, thereby making their point for them.
https://twitter.com/BatmanInit/status/1194204662947307520
Others find the videos all too plausible and, while acknowledging that the voices are far from perfect, point out that this won’t stop people from being taken in by them.
Publishing deep fake videos of politicians to demonstrate the danger to #democracy is a bit like detonating a nuke to demonstrate the danger of nuclear proliferation #deepfakes https://t.co/1btbl0BbN0
— Amir Tocker (@amir_t) November 12, 2019
Twitter user @elmyra, who studies deepfakes and the way they’re used to create pornography, explained how easy it would be to edit these videos to hide their deepfake origins.
– They then move on to saying they are #deepfakes.
– It’s only at that point that a watermark with @FutureAdvocacy‘s logo appears in the top right-hand corner.
— dr elmyra (@elmyra) November 12, 2019
Except: Boris Johnson is our Prime Minister, and *he always sounds like that*. He is beyond parody, making the rest of politics beyond parody, and your Boomer mum sharing random videos on Facebook definitely can’t tell the difference.
— dr elmyra (@elmyra) November 12, 2019
The fact that no watermark appears during the first part of the videos, while the fake politicians are endorsing each other (presumably to heighten their impact by making them more plausible before the reveal), shows it would be all too easy to simply crop the reveal out.
And of course, even once @FutureAdvocacy‘s logo appears in the corner, who the heck knows who @FutureAdvocacy are? Their name is so generic, this might as well be some kind of legit political ad.
— dr elmyra (@elmyra) November 12, 2019
While Future Advocacy raised an important issue for the future of politics, it’s entirely possible that in the short term they have created a little more confusion.