NextFin News - The British government has formally abandoned its controversial plan to grant artificial intelligence companies a broad exemption allowing them to use copyrighted music, literature, and art for model training without explicit consent. Technology Secretary Liz Kendall confirmed on Wednesday that the proposed "text and data mining" exception, which would have let tech giants such as Google and OpenAI scrape creative works subject only to a limited opt-out provision for creators, is no longer the government's preferred path. The reversal follows a high-stakes lobbying campaign led by cultural icons including Sir Paul McCartney, Sir Elton John, and Dua Lipa, who argued that the previous policy amounted to state-sanctioned "thievery" of intellectual property.
The retreat marks a significant pivot for a government that has spent the last year attempting to position the United Kingdom as a global AI superpower. By ditching the exception, the Department for Science, Innovation and Technology has effectively hit the "reset button" on a debate that has pitted the country’s £126 billion creative economy against a domestic AI sector growing 23 times faster than the rest of the economy. While the move is being hailed as a "major victory" by UK Music CEO Tom Kiehl, it leaves the British regulatory landscape in a state of strategic ambiguity. The government now admits it has no "preferred option" for how to resolve the tension between protecting artists and fostering the data-hungry innovation required for large language models.
The economic stakes of this policy vacuum are immense. The UK music industry alone generates £8 billion for the economy and supports 220,000 jobs, many of which are threatened by generative AI tools capable of mimicking a songwriter’s voice or a novelist’s prose style. Conversely, tech advocates warn that without a clear framework for data access, British AI startups will struggle to compete with firms in jurisdictions with more permissive "fair use" doctrines. Vinous Ali of the Startup Coalition expressed disappointment that a concrete solution remains elusive, noting that international competitors are moving ahead while the UK remains mired in consultation. The government’s own impact assessment acknowledges this dilemma, describing the creative sector as a "world-leading national asset" while simultaneously emphasizing the need for AI developers to access "high-quality content."
This legislative U-turn also reflects a broader global shift in the AI copyright wars. As U.S. President Trump’s administration continues to navigate the intersection of tech dominance and intellectual property rights in Washington, the UK’s decision to prioritize "creator control" could set a precedent for other G7 nations. The previous British proposal was seen as one of the most tech-friendly in the world; its collapse suggests that the political cost of alienating the cultural sector has become too high. Industry bodies like the BPI and the Ivors Academy are now pushing for a licensing-led market, where AI firms must negotiate and pay for the data they consume, rather than relying on statutory exceptions.
The path forward remains fraught with technical and legal hurdles. Beyond the immediate question of training data, the creative industry is demanding new "personality rights" to protect against AI-generated digital replicas and stricter transparency requirements that would force tech companies to disclose exactly what copyrighted material was used to build their models. For now, the UK government is buying time, stating it will not introduce reforms until it is confident they meet broader economic objectives. This cautious approach may satisfy the immediate demands of disgruntled artists, but it leaves the British tech sector waiting for a rulebook that is still being written.

