We have recently seen unprecedented “ping pong” between the Lords and Commons over whether to incorporate provisions in the Data (Use and Access) Bill (now Act) that would require AI developers to be transparent about the copyright content used to train their models. Liberal Democrats in both the Lords and Commons consistently supported this change throughout. This is why.

As Co-chair of the All-Party Parliamentary Group on Artificial Intelligence and now Chair of the Authors’ Licensing and Collecting Society (ALCS), I find myself at the epicentre of one of the most significant intellectual property debates of our time.

The UK’s creative industries are economic powerhouses, contributing £126 billion annually while safeguarding our cultural identity. Yet they face an existential challenge: the wholesale scraping of copyrighted works from the web to train AI systems without permission or payment.

The statistics are stark. A recent ALCS survey revealed that 77% of writers don’t even know if their work has been used to train AI systems. Meanwhile, 91% believe their permission should be required, and 96% want compensation for use of their work. This isn’t anti-technology sentiment – it’s about basic fairness.

From Sir Paul McCartney to Sir Elton John, hundreds of prominent creatives have demanded action. They’re not opposing AI innovation; many already use AI in their work. They simply want their intellectual property rights respected so they can continue making a living.

December’s government consultation on Copyright and AI proposed a text and data mining exception with an opt-out mechanism for rights holders. This approach fundamentally misunderstands the problem. It places the burden on creators to police the internet to protect their own works – an impossible task given the scale and opacity of AI training.

The creative sector’s opposition has been overwhelming. The proposed framework would undermine existing copyright law while making enforcement practically impossible. As I’ve consistently argued, existing copyright law is sufficient if properly enforced – what we need is mandatory transparency.

During debates on the Data (Use and Access) Bill, Baroness Kidron championed amendments requiring AI developers to disclose copyrighted material used in training data. These amendments received consistent support from all Liberal Democrat MPs and peers, crossbench peers, and many Labour and Conservative backbench peers.

The government’s resistance has been remarkable. Despite inserting a requirement for an economic impact assessment and a report on copyright use in AI development, it has opposed mandatory transparency, leading to an unprecedented “ping-pong” debate between the Houses.

Transparency isn’t about stifling innovation – it’s about enabling legitimate licensing. How can creators license their work if they don’t know who’s using it? How can fair compensation mechanisms develop without basic disclosure of what’s being used?

The current system allows AI companies to harvest vast quantities of creative content while claiming ignorance about specific sources. This creates a fundamental power imbalance where billion-dollar tech companies benefit from the work of individual creators who remain entirely in the dark.

The solution isn’t complex. Mandatory transparency requirements would enable:

  • Creators to understand how their work is being used
  • Development of fair licensing mechanisms
  • Preservation of existing copyright frameworks
  • Continued AI innovation within legal boundaries

This debate reflects deeper concerns about AI innovation coming at the expense of human creativity. The government talks about supporting creative industries while simultaneously weakening the intellectual property protections that sustain them.

We need policies that recognise the symbiotic relationship between human creativity and technological advancement. AI systems trained on creative works should provide some return to those creators, just as streaming platforms pay royalties for music usage.

The government has so far failed to rise to this challenge. But with continued parliamentary pressure and overwhelming creative sector support, we can still achieve a framework that protects both innovation and creativity.

The question isn’t whether AI will transform creative industries – it’s whether that transformation will be fair, transparent, and sustainable for the human creators whose work makes it all possible.