Warning to UK politicians over risk of audio deepfakes that could derail the general election | Politics News

As AI deepfakes cause havoc during other elections, experts warn the UK’s politicians should be prepared.

“Just tell me what you had for breakfast”, says Mike Narouei, of ControlAI, recording on his laptop. I speak for around 15 seconds, about my toast, coffee and journey to their offices.

Within seconds, I hear my own voice, saying something entirely different.

In this case, words I have written: “Deepfakes can be extremely realistic and have the ability to disrupt our politics and damage our trust in the democratic process.”

Tamara Cohen’s voice being turned into a deepfake

We used free software; it required no advanced technical skills, and the whole thing took next to no time at all.

This is an audio deepfake – video fakes take more effort to produce. As well as being deployed by scammers of all kinds, deepfakes are causing deep concern about their impact on elections, in a year when some two billion people go to the polls in the US, India and dozens of other countries including the UK.

Sir Keir Starmer fell victim to one at last year’s Labour Party conference, purportedly of him swearing at staff. It was quickly outed as a fake. The identity of who made it has never been uncovered.

London mayor Sadiq Khan was also targeted this year, with fake audio of him making inflammatory remarks about Remembrance weekend and calling for pro-Palestine marches going viral at a tense time for communities. He claimed new laws were needed to stop them.

Ciaran Martin, the former director of the UK’s National Cyber Security Centre, told Sky News that expensively made video fakes can be less effective and easier to debunk than audio.

“I’m particularly worried right now about audio, because audio deepfakes are spectacularly easy to make, disturbingly easy”, he said. “And if they’re cleverly deployed, they can have an impact.”

The most damaging so far, in his view, was an audio deepfake of President Biden, sent to voters during the New Hampshire primary in January this year.

A “robocall” with the president’s voice told voters to stay at home and “save” their votes for the presidential election in November. A political consultant later claimed responsibility and has been indicted and fined $6m (£4.7m).

Ciaran Martin, the former NCSC director

Mr Martin, now a professor at the Blavatnik School of Government at Oxford University, said: “It was a very credible imitation of his voice and anecdotal evidence suggests some people were tricked by that.

“Not least because it wasn’t an email they could forward to someone else to have a look at, or on TV where lots of people were watching. It was a call to their home which they more or less had to judge alone.

“Targeted audio, in particular, is probably the biggest threat right now, and there’s no blanket solution, there’s no button there that you can just press and make this problem go away if you are prepared to pay for it or pass the right laws.

“What you need, and the US did this very well in 2020, is a series of responsible and well-informed eyes and ears throughout different parts of the electoral system to limit and mitigate the damage.”

He says there is a risk in hyping up the threat of deepfakes when they have not yet caused mass electoral damage.

He said a Russian-made fake broadcast of Ukrainian TV, featuring a Ukrainian official taking responsibility for a terrorist attack in Moscow, was simply “not believed”, despite being expensively produced.

The UK government has passed a National Security Act with new offences of foreign interference in the country’s democratic processes.

The Online Safety Act requires tech companies to take such content down, and meetings are being regularly held with social media companies during the pre-election period.

Democracy campaigners are concerned that deepfakes could be used not just by hostile foreign actors, or lone individuals who want to disrupt the process – but political parties themselves.

Polly Curtis is chief executive of the thinktank Demos, which has called on the parties to agree to a set of guidelines for the use of AI.

Polly Curtis, the chief executive of Demos

She said: “The risk is that you’ll have foreign actors, you’ll have political parties, you’ll have ordinary people on the street creating content and just stirring the pot of what’s true and what’s not true.

“We want them to come together and agree how they’re going to use these tools at the election. We want them to agree not to create or amplify deceptive AI-generated content, and to label it when it is used.

“This technology is so new, and there are so many elections going on, there could be a big misinformation event in an election campaign that starts to affect people’s trust in the information they’ve got.”

Deepfakes have already been targeted at major elections.

Last year, hours before polls closed in Slovakia’s parliamentary election, fake audio of one of the candidates claiming to have rigged the vote went viral. He was heavily defeated and his pro-Russian opponent won.

The UK government established a Joint Election Security Preparations Unit earlier this year – with Whitehall officials working with police and security agencies – to respond to threats as they emerge.


A UK government spokesperson said: “Security is paramount and we are well-prepared to ensure the integrity of the election with robust systems in place to protect against any potential interference.

“The National Security Act contains tools to tackle deepfake election threats and social media platforms should also proactively take action against state-sponsored content aimed at interfering with the election.”

A Labour spokesperson said: “Our democracy is strong, and we cannot and will not allow any attempts to undermine the integrity of our elections.

“However, the rapid pace of AI technology means that government must now always be one step ahead of malign actors intent on using deepfakes and disinformation to undermine trust in our democratic system.

“Labour will be relentless in countering these threats.”

Nottingham Forest to privately hear VAR audio connected to three penalty claims | UK News

Nottingham Forest will be given the opportunity to privately hear the VAR audio connected to three penalty claims in their match against Everton last Sunday.

It comes as Forest have risked Football Association and Premier League sanctions over their extraordinary response to the three rejected penalty appeals.

The club said in a statement on Sunday they had “warned” Professional Game Match Officials Limited (PGMOL), the referees body, that VAR Stuart Attwell “was a Luton fan” but that they did not change the appointment.

The Premier League said it was “extremely disappointed” by the statement, adding it was “never appropriate to improperly question the integrity of match officials”.

The league said it was investigating the matter in relation to its rules, with regulations B.15 and B.16 governing the requirement on clubs and their officials to behave with utmost good faith.

Forest, who are in a relegation battle with Luton at the bottom of the Premier League table, went on to release a further statement on Monday evening calling for the rules around referees’ allegiances to be updated to account for “contextual rivalries in the league table”.

Meanwhile, three Forest staff – manager Nuno Espirito Santo, full-back Neco Williams and referee analyst Mark Clattenburg – have been asked by the FA for observations on the comments they made about the officiating at Goodison Park.

Nothing to hide for PGMOL

A day after posting its controversial statement, Forest called for the audio between Stuart Attwell and on-field official Anthony Taylor to be released publicly.

The club are unhappy that this appears unlikely to happen, the Press Association news agency understands.

Sources close to PGMOL insist it has nothing to hide and will give the club the opportunity to hear the audio in private, as it would any other club making a similar request.

It is also understood, however, that no decision has yet been taken on whether this audio will also feature in the next edition of “Match Officials Mic’d Up”, which airs next Tuesday evening. The series aims to explain refereeing decisions using match footage and previously unreleased audio.

Earlier this season, audio related to a wrongly disallowed Luis Diaz goal for Liverpool at Tottenham was released publicly, but in that instance there had been a serious communication error so the matter was treated differently.

Nottingham Forest manager Nuno Espirito Santo reacts during his side’s defeat at Goodison Park. Pic: PA

Are referees allowed to officiate games linked to clubs they support?

Referees declare their club allegiances and will not be assigned to that club’s matches, or to certain other fixtures such as those involving its direct local rivals. For instance, Michael Oliver has spoken in the past about how he cannot referee Newcastle games because he is a fan.

Other factors that determine appointments include which teams an official’s immediate family members support, as well as performance and the number of times they have officiated a particular team’s matches.

PGMOL takes all of that into account and endeavours to make the best appointments possible when allocating six officials to each Premier League fixture from a pool of 70 to 75, while also fulfilling Championship refereeing appointments.

Ultimately, it has confidence in the impartiality and professionalism of its officials.

No club is believed to have questioned the process in this manner in the Premier League’s 32-year history, nor have Forest raised any concerns on the previous occasions when Attwell has been the VAR at their matches this season.

Deepfake audio of Sadiq Khan suggesting Remembrance weekend ‘should be held next week instead’ under police investigation | UK News

Digitally generated audio of Sadiq Khan seemingly calling for Armistice Day to be delayed is being investigated by police.

Clips mimicking the London mayor’s voice and mannerisms have been circulating on social media, in which he can seemingly be heard playing down the importance of Remembrance weekend commemorations.

In one clip, a voice similar to Mr Khan can be heard saying: “I don’t give a flying s*** about the Remembrance weekend.”

The fake recording continues to say: “What’s important and paramount is the one million-man Palestinian march takes place on Saturday.”

It’s a reference to the Million Man March in 1995 – a civil rights demonstration in Washington DC attended overwhelmingly by people of colour.

A large pro-Palestinian demonstration in London calling for a ceasefire in Gaza has been planned for Saturday, with more than 2,000 police officers drafted in to help manage the event.

However, Home Secretary Suella Braverman has criticised the Metropolitan Police over its decision to allow the march to go ahead.

Another clip using Mr Khan’s voice says: “I know we have Armistice Day on Saturday but why should Londoners cancel the Palestinian march on Saturday? Why don’t they have Remembrance weekend next weekend? What’s happening in Gaza is much bigger than this weekend and it’s current.”

The Metropolitan Police said it was investigating the fake clips.

“We can confirm that we have been made aware of a video featuring artificial audio of the mayor, and that this is with specialist officers for assessment,” the force said in a statement.

Writing on social media, Mr Khan wrote: “While I hosted an interfaith Remembrance event with our armed forces at City Hall: the far-right were sharing deepfake audio about me.

“They may have new means, but their ends are the same – to divide our diverse communities. We must stand together – it’s what London does best.”

The mayor also used social media to point people to an article written in the Evening Standard about the importance of events this weekend.

In it, he writes: “It’s right that the organisers have said they will not protest near the Cenotaph. I urge everyone attending to co-operate with police and make sure to be respectful on Armistice Day.”

People commenting on the faked audio have asked whether it is real or made with artificial intelligence – an indication of how convincing the technology used to make these kinds of clips has become.

It comes after a deepfake clip of Mr Khan’s party leader, Sir Keir Starmer, was circulated as Labour’s annual conference got under way in October, heightening fears about the potential impact of the technology on democracy.