This evening’s winter reception of the All Party Writers Group takes place at an important moment for authors. It is therefore especially appropriate that we are joined by Dr Clementine Collett, whose important new report, The Impact of Generative AI on the Novel, sets out in clear terms the risks and opportunities that generative technologies present for long‑form fiction.

Her work reinforces a message that writers, agents and publishers have been giving Parliament for some time: that generative AI must develop within a framework that protects the integrity of original work, the viability of creative careers and the trust of readers.
The starting point is the change of direction we have already seen. Following an overwhelming response to its consultation on copyright and AI, the Government has stepped back from its previously stated preferred option of a broad copyright exception for text and data mining. That proposal was regarded by authors and rightsholders as unfair, unworkable and difficult to reconcile with international norms. The decision to move away from it has been widely welcomed across the creative industries, and rightly so.
The Government has recognised that copyrighted creative content is not an input to be taken for granted, but an asset that needs clear, enforceable rights.
From the outset, rightsholders have been remarkably consistent in what they ask for. They want a regime based on transparency, licensing and choice. Transparency, so that authors know whether and how their works have been used in training AI systems, and so that their rights can be enforced.
Licensing, so that companies seeking to build powerful models on the back of that material do so on lawful terms.
And choice, so that individual creators can decide whether their work is used in this way and, if so, on what conditions and at what price. Dr Collett’s report underlines just how crucial these principles are for novelists, whose livelihoods depend on the distinctiveness of their voice and the long‑term value of their backlist.
In parliamentary terms, much of this came into sharp relief during the passage of the Data (Use and Access) Bill, where many of us in both Houses were proud to support the amendments brought forward by Baroness Beeban Kidron. Those amendments reflected the concerns of musicians, authors, journalists and visual artists that their works were already being used to train AI models without their permission and without remuneration. The amendments made clear that creators were not anti‑technology, but that innovation had to be grounded in respect for copyright and for the moral and economic rights that underpin creative work.
Those concerns are echoed in Dr Collett’s analysis of how unlicensed training can erode both the economic prospects of writers and the incentive to invest in new writing.
Since then, there have been some modest but important advances. We have seen a renewed emphasis from the Secretaries of State at DSIT and DCMS on supporting UK creatives and the wider creative industries. Preliminary and then technical working groups on copyright and AI have been convened, alongside new engagement forums on intellectual property for Members of both Houses.
The Creative Industries Sector Vision, and the announcement of a Freelance Champion, signal an acceptance that conditions for freelance writers must be improved if we want a sustainable pipeline of new work. For novelists in particular, whose incomes are often precarious and whose careers play out over decades, the policy choices made now in relation to AI will have lasting consequences.
In parallel, the international context has moved rapidly. High‑profile litigation in the United States, most notably the 1.5 billion dollar settlement between Anthropic and authors, has demonstrated that the boundary between lawful and unlawful use of works for training models is real and enforceable, with significant financial consequences when it is crossed. The European Union has moved ahead with guidelines for general‑purpose AI under the AI Act, designed in part to give practical effect to copyright‑related provisions.
Courts in the EU have begun to address the legality of training on protected works: the Regional Court of Munich, for example, has held OpenAI to account for reproducing protected song lyrics in training outputs. Other jurisdictions, including Australia and South Korea, are clarifying that there will be no blanket copyright exemptions for AI training and are setting out how AI‑generated material will sit within their systems.
Here in Parliament, the Lords Communications and Digital Committee has continued its inquiry into AI and copyright, taking evidence from leading legal experts. A number of points have emerged strongly from that work: that transparency is indispensable if rightsholders are to know when their works have been used; that purely voluntary undertakings in codes of practice are not sufficient; and that there is, as yet, no compelling evidence that the existing UK text and data mining exception in section 29A of the Copyright, Designs and Patents Act should be widened. Dr Collett’s report adds a vital literary dimension to this picture, examining how the widespread deployment of generative AI could reshape the market for fiction, the expectations of readers and the discovery of new voices if left unchecked.
Against this backdrop, the position of writers’ organisations has been clear. The Authors’ Licensing and Collecting Society, reflecting a survey of over 13,500 members, is firmly opposed to any new copyright exception that would weaken protection for works used in AI training. It argues instead for licensing models that give technology companies access to content while preserving genuine choice and control for creators.
Working with the Copyright Licensing Agency, ALCS is developing a specific licence for training generative AI systems, initially focused on professional, academic and business content, where licensing is already well embedded and where small language models can be tested in a controlled way. There is strong concern that, if left entirely to market forces, generative systems could flood the ecosystem with derivative material, making it harder for original voices to be heard and weakening the economic foundation of literary careers. That is why many in the sector argue that fiction should be approached with particular care, and that any licensing solutions must be robust, transparent and genuinely optional.
Looking ahead, several priorities suggest themselves. First, Government should make clear that it will not re‑open the door to a broad copyright exception for AI training.
Secondly, it should actively support the development of practical licensing routes, including those being taken forward by ALCS and CLA, while recognising that fiction may require distinct treatment.
Thirdly, transparency and record‑keeping obligations on AI developers should be strengthened so that rightsholders, including novelists, can identify when and how their works have been used.
Finally, Parliament should continue to scrutinise this area closely, informed by expert work such as Dr Collett’s and by the lived experience of writers represented through this All-Party Group.
The past year has shown what can be achieved when writers organise and speak with a united voice. The Government has shifted away from its most problematic proposals and has begun to engage more seriously with the issues.
But for authors the destination has not yet been reached. The aim must be a settlement in which creators can be confident that their rights will be respected, that they have meaningful choice over the use of their work in AI, and that they can share fairly in any new value created. This evening’s discussion, and the findings of Dr Collett’s report, are an important contribution to that task. This work must continue, but I believe we are now on the right path: one of balance, respect and creative confidence for our creators in the digital age.






