Creators are finding that the copyright laws that protect their work do not stand up well to AI models. In this guide, we take a global view of where this fast-moving situation has got to and where it might be heading next.
With (some) AI models training on just about anything they can devour online, visual assets have found themselves on the menu.
Artworks, illustrations and even photographs have been picked over while the real owners – creative practitioners, agencies or brands – remain blissfully unaware until they’re confronted by something created in their style, yet not created by them.
The current legal framework around ownership and copyright
The borderless nature of AI models’ consumption and imitation makes data scraping a global problem, with the law lagging behind the reality faced by creative practitioners and progress varying from country to country.
Copyright automatically grants creators legal protection over the use of their original works, which are their intellectual property – an IP right that requires no registration, bolstered by various multilateral agreements between participating countries.
However, AI-generated output is not an original human work, so these protections don’t map neatly onto it.
The view from the US
In the US, copyright protects only works created by humans. If AI-generated content is produced without demonstrable, significant human involvement, there is no copyright at all.
When an AI model apes someone’s work, US law has largely allowed it to happen, because copyright doesn’t protect artistic style or technique.
In court, a claimant would need to show that an AI-created ‘artwork’ is substantially similar to their copyrighted work. That is hard to prove and is the battleground in live lawsuits such as Getty Images’ case against Stability AI and its Stable Diffusion model.
The common line of defense by AI model owners is that training AI on copyrighted works constitutes fair use as new work is being created rather than copied.
Meanwhile, opting out of AI platforms’ data scraping isn’t a perfect solution: even if a platform agreed to such a request, its AI might already have been trained on the work in question.
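In practice, the closest thing to a standardized opt-out today is a robots.txt file naming the AI crawlers’ user agents. Below is a minimal sketch using the tokens publicly documented by OpenAI, Google and Common Crawl – bearing in mind that compliance is entirely voluntary on the crawler’s part, and it does nothing about work already ingested.

```
# robots.txt – asks documented AI training crawlers not to fetch this site.
# Compliance is voluntary, and already-scraped work stays in the training data.

User-agent: GPTBot
Disallow: /

# Google-Extended is an opt-out token for AI training, not a separate crawler
User-agent: Google-Extended
Disallow: /

# CCBot is Common Crawl's crawler, whose corpus feeds many AI models
User-agent: CCBot
Disallow: /
```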
The view from the UK
It’s a similar picture in the UK, where I recently spoke with an expert on the topic, Anti Copying in Design (Acid) co-founder Dids Macdonald OBE. In the UK, AI companies are allowed to mine copyright works for training unless creators opt out.
“We believe the onus should be on AI developers to comply, not the other way around,” she said.
“The lack of standardized, accessible opt-out systems creates barriers for rightsholders and complicates enforcement, as many AI developers and bad actors use copyright without permission. This currently leaves little to no global protection for artists, designers, illustrators and photographers who have their style ripped off by AI.”
Macdonald also points towards potential violations of the Human Rights Act 1998, particularly Article 1 of Protocol 1, which protects creators’ ownership of IP, and Article 10, which safeguards freedom of expression.
“Arguably, these provisions demand respect for IP ownership and accountability, ensuring AI applications do not exploit creators’ works without consent,” according to Macdonald.
That isn’t enough for Macdonald, though. She contributed to the UK government’s public consultation on copyright and AI, which closed at the end of February; it remains to be seen how its findings will be acted on and whether they will inform future policy.
The Make It Fair campaign is also pushing for legislative change in the UK.
Are any regions making more progress?
The European Union takes a similar line to the UK and US, with an opt-out system for copyright holders – but again, their work could already have been gobbled up.
One country making progress is Japan, where a so-called ‘non-enjoyment purpose’ exception permits AI models to train on copyrighted work but not to recreate the work or its style.
Article 30-4 of its Copyright Act sets out guardrails: if an artist’s finances are affected or their reputation harmed, copyright has been infringed.
Owning your own AI model
A robust alternative might be owning your own AI model. In the course of speaking to Macdonald, I looked at one platform, Exactly.ai, which offers creators just that. It can either train a model on a creator’s own work for private use – the default setting, often used for personal or client-facing idea iteration – or make the model public so that the creator is appropriately remunerated every time someone makes something in their style. That’s right, they’re paid.
Certifying fair use
One organization policing the AI model landscape is Fairly Trained, a certification scheme that recognizes AI platforms training only on data obtained with creators’ consent, rewarding them fairly.