Elton John’s recent accusation that the government is “committing theft” has put a spotlight on how artificial intelligence (AI) companies use copyrighted content. He is one of many well-known artists demanding more transparency from AI companies about how they use creative works. In this blog, senior commercial lawyer Sarah Liddiard explores how the government’s latest proposal to head off a potential influx of AI copyright disputes could impact creative businesses…

The issue of AI and copyright goes beyond celebrity headlines. A parliamentary inquiry into generative AI has warned that failing to protect UK creatives could harm one of the country’s most valuable industries, which contributes £124 billion to the economy each year.

For businesses in publishing, design, marketing, fashion, technology, or media, AI is already creating complex challenges around copyright, licensing, and originality. Although formal legal reforms are still being developed, it is becoming clear that companies must understand how their content might be used to train AI systems — and how they can respond.

The government has backed away from its earlier plan to give AI developers broad access to copyrighted material. One of the key proposals in its recent consultation is a new “text and data mining” exception, which would allow AI models to use copyrighted material unless the creator has explicitly opted out. The government says this approach is intended to build trust, encourage transparency, and support innovation. However, many creators are concerned that it could lead to a loss of control over how their work is used. The government has also stated that further legislative proposals and consultation will follow to govern frontier AI.

This is especially important for businesses that produce digital content, such as images, branding, or written materials. AI tools can learn from publicly available data — including websites, product descriptions, and marketing materials — and use it to generate new content.

For creative businesses, the stakes are high. The government has previously suggested opt-out tools such as metadata tags or “do-not-train” lists, but these may be difficult to implement, especially for smaller companies.
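By way of illustration only, and as a simplified sketch rather than technical or legal advice, one opt-out mechanism already in use is a website’s robots.txt file asking named AI crawlers not to collect its pages (GPTBot and Google-Extended are examples of crawlers that currently state they honour such instructions):

User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /

Compliance with signals of this kind is, for now, voluntary, and they do not address material that has already been collected.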

There is also a risk that businesses could unknowingly use AI-generated content that is based on copyrighted material. Without clear licensing rules or transparency from AI developers, this could lead to legal problems or damage to a company’s reputation.

Now is the time for businesses to take action. Those that create original content should review how they protect and label their work. They should also stay informed about the implementation of any opt-out systems. Businesses using generative AI should be careful to document sources, use only approved tools, and create clear policies to avoid copyright infringement.

In this fast-changing landscape, businesses should not assume that doing nothing is safe. If your work involves creating, licensing, or using content that could be affected by AI, it is essential to prepare now.

Full details of the government’s proposals to give the creative industries and AI developers clarity over copyright law are available on gov.uk.

The contents of this article are intended for general information purposes only and shall not be deemed to be, or constitute legal advice. We cannot accept responsibility for any loss as a result of acts or omissions taken in respect of this article.