According to The Economist, on November 4th Britain's High Court ruled that Stability AI was not liable for copyright infringement despite having trained its image generator on Getty Images content. Judge Joanna Smith described her findings as "historic and extremely limited in scope," essentially admitting she had little choice given that Britain's copyright laws were written before AI training was conceivable.

The government launched a consultation in December 2024 aiming to establish a framework that rewards creativity while incentivizing innovation; it drew over 11,500 responses, compared with fewer than 100 for a similar consultation in 2021. Hundreds of writers, musicians and artists signed an open letter accusing Prime Minister Keir Starmer of "giving away" their work, with one controversial proposal allowing tech firms to train on copyrighted works unless rights holders opt out.
The investment dilemma
Here's the thing: Britain wants to become an AI hub, but this ruling creates massive uncertainty. Firms like Stability AI now know they're operating in a legal gray area. And without clear rules, why would any company commit serious investment? They're already contending with high commercial electricity prices that make running energy-hungry AI models expensive. Now add potential copyright lawsuits to the mix.
Basically, the government needs to decide whose side it's on. Does it protect the creative sector, home to global stars like J.K. Rowling and Ed Sheeran? Or does it create the conditions that attract tech investment? Arty Rajendra, head of disputes at law firm Osborne Clarke, put it well: "The government needs to decide which way to jump." The longer it waits, the more likely tech firms will simply set up shop in America or Japan, where the rules are clearer.
The creative backlash
You can't blame artists for being furious. Imagine spending years developing your style and technique, only to have AI companies scrape your work without permission or compensation. The opt-out proposal that many artists hate puts the burden on creators to constantly monitor and object to training uses, something practically impossible for individual artists to manage.
And the response numbers tell the story: from fewer than 100 responses in 2021 to over 11,500 now. That shows how much the creative community understands what's at stake. This isn't some niche legal issue anymore; it's about whether human creativity gets protected or becomes free training data.
What comes next?
So where does this leave us? The High Court passed the buck to lawmakers. Judge Smith's "limited in scope" language was a polite way of saying "fix your outdated laws, Parliament." The government consultation will now determine whether Britain becomes an AI-friendly jurisdiction or protects its world-class creative industries.
The timing couldn't be more critical. AI development is accelerating, and other countries are making their positions clear. Britain risks getting left behind if it can't provide legal certainty. But it also risks damaging one of its most valuable export sectors if it gets the balance wrong. Nobody's happy with the current situation, which probably means the court got it exactly right by forcing the issue into the political arena.
