On a Monday morning, just a couple of hours after the UK government released its AI opportunities action plan, my phone lit up with messages from artists. One, clearly frustrated, said, “We are not going to continue drawing so that the founders of these AI companies can get rich. I am quitting this job now.” Another simply stated, “I have given up at this point.” These sentiments aren’t new to me—they echo what I’ve been hearing for months. Creatives are leaving their jobs because AI companies are using their work without consent, exploiting it to train models that directly compete with them.
There’s a lot in the action plan that I can get behind. It aims to position the UK as a global leader in artificial intelligence. Matt Clifford, a venture capital investor, drafted the plan with suggestions to ease the way for British AI companies to access the necessary computing resources, update visa regulations to attract AI experts, and other impactful measures—all sensible moves that could invigorate the economy.
However, it’s the copyright recommendation that has caused unrest among artists. Clifford suggests tweaking copyright laws to benefit AI companies. He appears to support the government’s suggestion that creators’ copyrighted works—like art, music, and literature—should be available for AI companies to use freely for training purposes unless creators specifically opt out. It’s a radical shift, turning copyright on its head, and effectively allowing AI companies to freely access the entirety of British creative history unless the original creators protest through some yet-to-be-defined process.
It’s not difficult to understand why creators have reacted negatively. If a large language model is taught with short stories, it can generate new ones; train AI with pop music, and it can produce new pop songs. This proposal essentially authorizes AI companies to commandeer existing works to develop scalable AI models that will inevitably outcompete human creators. We already see generative AI diminishing the demand for human-created art, and this plan will only accelerate that trend.
Imagine I launch a new venture, let’s call it Great British AI. The goal is straightforward: to replace the UK’s creators with AI. Achieving this wouldn’t be hard. We’d comb the internet for every piece of creative work by British artists. Sure, we’d acknowledge opt-outs, but realistically, how many people bother to opt out when given the opportunity? Using whatever work we gather, we’d train state-of-the-art AI models. With these, we could mass-produce and sell various works in the same styles at a fraction of the cost.
Running this kind of business is currently illegal in the UK, and rightly so—it’s exploitative. Yet, under these new proposals, it would be permitted. I don’t believe the government intends to decimate the creative industries by legalizing what amounts to theft. However, if implemented, these proposals will lead to exactly that.
The main issue with the suggested opt-out system is its impracticality and unfairness. Here’s an example: I compose choral music, which is distributed through a music publisher. A choir purchases the sheet music and records one of my pieces, which is then broadcast on the radio. Can I opt out of this recording being used to train AI? Absolutely not. Once published, my control over who uses the work and for what purposes vanishes. This scenario isn’t unique to music; it spans countless pieces across creative industries. Creators rarely maintain control over their work once it’s disseminated, severely limiting actual opt-out opportunities.
The US offers a more balanced approach with its “fair use” doctrine. Some unlicensed uses of copyrighted works are allowed, while others are not. Permissibility hinges on several factors, including how the use impacts the original work’s market. The UK’s proposal, however, lacks this nuance, opting for a broad copyright exception that could inflict significant damage.
Frustratingly, there’s no pressing need to overhaul copyright laws. The UK can lead in AI by adopting Clifford’s other forward-thinking proposals. Many AI developments—those relevant to fields like healthcare, science, and defense—are not built on creators’ works. Demis Hassabis’ Nobel-winning AlphaFold, for example, utilized a database of protein structures rather than copyrighted art. Overhauling copyright law would primarily benefit a few large, foreign AI companies eager to open branches in the UK—companies that have strongly lobbied for such changes and are enthusiastically endorsing the action plan.
Unfortunately, the government seems to have set its course to implement Clifford’s recommendation on copyright reform, in addition to his other suggestions. This premature decision undermines the ongoing consultation on AI and copyright, which still has six weeks left to run. What’s the point in engaging with a consultation if the conclusions are already drawn?
I believe Clifford genuinely wants what’s best for the country with his action plan. Yet, his voice shouldn’t be the only one the government listens to. They must also heed figures like Paul McCartney, Kate Mosse, Kate Bush, and countless other artists who reasonably object to AI companies using their work, unlicensed, to build technology that competes with them.
I hope the government rethinks its approach. Adopting 49 of the plan’s 50 recommendations, and dropping the one on copyright, would be no disaster. It’s the only path that ensures both the UK’s AI sector and its creative industries thrive together.
Ed Newton-Rex is the founder of Fairly Trained, a non-profit that certifies generative AI companies that honor creators’ rights, and a visiting scholar at Stanford University.