What can players and clubs do about 'AI slop'?

In these AI-generated images, Real Madrid striker Kylian Mbappe is shown on a skiing holiday with a turtle

Published March 2, 2026 • Source: https://www.bbc.com/sport/football/articles/cy8pdr55219o

By Dale Johnson (https://www.bbc.co.uk/sport/topics/cglgnp4394wt)

Football issues correspondent

You do not have to look far on social media to find images and videos of footballers in unlikely or bizarre situations.

Scroll through TikTok and you may soon stumble across Lionel Messi and Cristiano Ronaldo cutting each other's hair, or boarding the Titanic in Edwardian dress. You might even see Kylian Mbappe on a ski-lift with a turtle.

This is the result of the exponential growth of artificial intelligence (AI). Or, more precisely, AI 'slop'.

AI can be asked to deliver pretty much anything. By anyone. The tools are becoming ever more sophisticated and easily accessible.

It will become even harder to spot what is real and what is, in AI terms, deepfake.

It may seem, for the most part, like harmless fun. After all, who really thinks Messi and Ronaldo have been serving burgers?

But is there a point at which players and clubs will try to draw the line?

Options are limited for players to take action

As football has become a commercial juggernaut, players and clubs have had to learn how to look after their brands.

That could be by protecting the club crest or challenging the use of a player's name in unauthorised promotional material.

Take Chelsea midfielder Cole Palmer, who has trademarked the term 'Cold Palmer' with the UK government's Intellectual Property Office. The 23-year-old did the same with his name, autograph and signature 'shivering' celebration.

Creating protections is one thing. Being able to tackle this new AI world of relentless content is another.

In the UK there is limited legislation covering someone's likeness. Or, as it is called in football, image rights.

Jonty Cowan, legal director at law firm Wiggin LLP, told BBC Sport that AI was presenting "lots of novel challenges".

"Various governments around the world are trying to figure out... how do we react to AI?" said Cowan.

AI is being used to put players into real-life scenarios, as well as those more obviously fake.

Take the unveilings of Antoine Semenyo and Marc Guehi by Manchester City in January.

The club's official photographs show each player with director of football Hugo Viana. Yet before those pictures had even been taken, you could find AI images of Semenyo and Guehi signing a contract alongside manager Pep Guardiola.

There was another of Semenyo being greeted at the training centre by former player Yaya Toure, whose old squad number - 42 - he was expected to take.

None of these events happened, but it was impossible to tell the pictures were fake.

Last month, an image appeared of Manchester United head coach Michael Carrick with Frank Ilett - the supporter who won't cut his hair until the Red Devils win five games in a row.

Once again, the meeting never happened, yet the image looks entirely realistic.

And Cowan said it was difficult for there to be any recourse when content is presented "in a non-contentious manner".

Unless a person has suffered commercial or reputational damage, options are limited.

"It's always been quite challenging for an individual to enforce IP rights," Cowan said. "If it is a deepfake that is showing them in a compromising position, let's say, that's different."

The Data (Use and Access) Act came into force last month, making it a criminal offence to create, share or request a sexually explicit deepfake.

But then you have AI-generated videos such as Celtic's Luke McCowan punching an assistant referee. Could it damage his reputation, or is it simply not believable?

A more pressing concern for players might be 'passing off'. This is where someone unfairly associates their own products or services with the reputation and goodwill of an established brand or business - or player.

It is intended to mislead consumers into believing they are connected to it - to the detriment of the established brand.

Cowan explained that in December 2024, as part of an AI-related consultation, the UK government said it was considering "introducing some kind of personality right".

That would give a player more scope to take action.

Clubs, for their part, have a few more options open to them.

Social media accounts putting players in the shirts of their new team - or any team - are nothing new.

But what if a club wanted to take issue?

"Where you've got, for example, the Man City kit they could look at other IP rights," Cowan said.

"Have they infringed the trademark in their crest? Or design rights in their shirt? For that kind of image, that's what a club or an individual would likely be looking at."

BBC Sport understands City believe fans know official channels remain the only places to go for any genuine news, images or videos.

But as the lines blur further, will clubs keep that stance?

Tackling platforms more realistic than court action

While clubs and players might consider taking the creators of AI images to court, it is a long and costly fight.

Cowan says there is a quicker and cheaper route: challenge the platforms directly.

"The Online Safety Act has been introduced in the UK recently, and that is putting an obligation on platforms to tackle illegal content," he added.

"It may well be that we will see more mechanisms that platforms will introduce to have that content taken down. Often, that is the easiest and quickest way to tackle these images."

This could lead to a growth in companies looking after the digital rights of clubs and players.

Those that already exist scrape websites and apps - using AI, of course - to identify where a company's intellectual property or a person's image might have been used.

They can request takedowns, effectively tackling the use of AI without the affected parties getting directly involved.

Bad actors may use AI for nefarious means

AI presents opportunities as well as problems. Adverts and promotional material can be created without players even needing to leave their homes.

But alongside the genuine AI-generated adverts, it is easy for unauthorised parties to take a player's likeness and use it to promote their business.

Last year the oversight board that runs Meta's appeals process (https://www.oversightboard.com/our-work/) banned an advert for a gambling app on Facebook that was created using AI.

It featured a manipulated video of former Brazil striker Ronaldo which imitated his voice. It was not picked up by Meta's automated detection tools.

Meta was told to create "easily identifiable indicators that distinguish AI content" to prevent "significant amounts of scam content".

It was a prime example of a platform being challenged and forced to act.

The Football Association has had to tackle controversy, too.

England head coach Gareth Southgate was targeted during Euro 2024. Fake AI-generated interviews showed Southgate making derogatory remarks about his players.

The videos were reported and taken down. They were found to have breached TikTok's policy on AI-generated content, which forbids content that "falsely shows public figures in certain contexts".

But by that point, the videos had been viewed and shared by millions of people.

Should users be forced to say they have used AI?

Scrolling through apps today, it is rare for anyone to indicate AI has been used.

That is even with TikTok's community guidelines asking users to "label realistic AI-generated content" and banning content considered to "harmfully mislead or impersonate others".

Cowan believes there is unlikely to be any major change to legislation, but platforms could be given tougher rules.

"There are transparency requirements under the EU AI Act," Cowan explained, with the act not covering the UK.

"Under advertising regulations, influencers have to disclose where a video they produce has been sponsored.

"I suspect we may end up with similar transparency requirements. A little '#AI generated' or similar label in the corner."

The problem will be whether creators care, and how easy enforcement is for platforms.

Cowan added: "If you've got those egregious videos, where someone's putting out a hideous deepfake, they're not going to worry about adding that label."

For now, at least, clubs seem relatively unconcerned - treating AI slop as just something that happens on social media.

There might come a point when they decide more action is required.

Related Topics

#Chelsea #Real Madrid #Manchester City