Deepfake video of Nigel Farage playing Minecraft ‘of course’ not real, party says


The video is clear: Nigel Farage, appearing on screen as a gaming livestreamer, is commentating as he plays Minecraft. The Reform UK party leader explains that he has logged on to Rishi Sunak’s server, tracked down the prime minister’s virtual home in the video game, and intends to blow it up.

Farage’s distinctive voice can be heard as he explains what he’s about to do: “I filled it to the brim with TNT. And for everyone’s information there were absolutely no traces of Sky TV services in or around the house.”

A mildly exasperated spokesperson for Farage confirmed that the video was “of course” not real and the Reform party leader had not been spending the campaign livestreaming Minecraft commentary.

“Quite funny though,” the spokesperson added.

The spokesperson’s reaction sums up the role of deepfake videos during this general election, which have so far failed to cause the disruption that some had predicted before the campaign.

Instead, deepfakes – digital content that has been manipulated using artificial intelligence, often to purportedly show famous people in fictitious situations – have largely existed in the form of obviously fake memes, such as an edit of Rishi Sunak’s national service plan where the prime minister appears to be instructing schoolchildren on how to play Fortnite.

The Sunak clip, the deepfake Farage video, and footage featuring Keir Starmer were made and uploaded to TikTok by PodcastPilotPro, a subscription AI app that enables users to pretend to be on a podcast with famous individuals.

Most users seem simply impressed by the slickness of the AI-generated videos while recognising that they are fakes. Or as one much-liked comment on the Farage video puts it: “Old people are going to get fooled by ai.”

Yet so far, cruder manipulation of real clips has proved a more effective tactic. Tim Gatt, a digital campaign consultant, said: “I don’t think we should be celebrating yet – there’s still a long way to go in the election campaign. But it doesn’t necessarily have to be a very sophisticated deepfake in order to manipulate or trick the public.

“We’ve seen a lot of examples on Twitter, for example, of people engaging and sharing pretty simply-made misleading content that they want to believe is true or aligns with what they strongly believe in.”

A group of leftwing users opposed to Keir Starmer’s Labour party had used the social network X to spread poorly dubbed videos falsely suggesting, among other things, that the shadow health secretary, Wes Streeting, had criticised the Labour candidate Diane Abbott. After the BBC contacted X about the videos they were removed and the accounts were banned.


Ciaran Martin, the former chief executive of the National Cyber Security Centre, has said that with a handful of exceptions – such as the recent Slovakian elections – it “has proved remarkably hard to fool huge swathes of voters with deepfakes”.

What matters, he wrote in the Guardian last week, is how “swiftly and comprehensively” a deepfake is debunked – with the real risk existing at a local level in individual constituencies.

One of the most damaging fake viral videos in UK politics this year fits this description: a teacher in Dudley was falsely accused of racism while delivering leaflets on behalf of the Labour party. Legitimate footage from a doorbell camera was adjusted and overlaid with false subtitles alleging that the teacher had used a racial slur.

The video was distributed widely by Akhmed Yakoob, a local lawyer and social media personality who is standing against Labour in Birmingham Ladywood on a pro-Gaza platform. He later apologised, saying he did not know what had happened and had been sent the video “with captions already on it”.

Source: theguardian.com
