
New research from IFOW Research Fellow Dr Clementine Collett examines how literary creatives are using (or not using) GenAI, how their work and income are being affected, how this is shaping their hopes and fears for the future, and what measures they would like to see implemented by the Government to protect the UK’s precious literary scene.
Dr Clementine Collett, who is BRAID UK Fellow at the Minderoo Centre for Technology and Democracy (University of Cambridge), here lays out key findings and recommendations from her research, published in partnership with us at IFOW.
Novelists and those in the publishing industry are worried about the rise of generative AI (GenAI). There is widespread concern about the future of the novel, with 51% of novelists saying they think it is likely that AI will displace their work entirely. Beyond this, literary creatives report that they are already feeling the negative effects of this technology on their work: 39% of novelists say their income has already been negatively impacted by GenAI.
These are some of the major findings of the research which I’ve been conducting this year with hundreds of novelists, fiction publishers, and literary agents for fiction across the UK.
It is vital that the Government, technologists, and the public pay attention to these findings and take them seriously, given the immense importance of the literary arts for our culture, society, and economy. The creative industries are the beating heart of the UK: globally renowned cultural assets that generate immense soft power while contributing £126 billion in Gross Value Added (GVA) to the economy annually.
Novels are a bedrock of the creative industries. They are a core part of the publishing sector and form the basis for countless films, television series, plays, and musical compositions. Beyond this, they contribute more than we can measure to our wellbeing, connection, culture, entertainment, and identity. Yet with the rise of GenAI, novelists, publishers, and the novel itself are experiencing unprecedented uncertainty, change, and challenge. Below, I outline some of the key findings from this study.
Firstly, novelists and literary creatives expressed concern about a loss of creativity and the de-skilling of younger generations through increased use of GenAI. We already know that 1 in 4 children aged 8-12 are using AI, and of the children who use it, 4 in 10 use it for creative tasks. Literary creatives voiced fear about the use of AI within the creative process (where it is not required to meet access needs) and spoke about how this might affect the development of imagination, empathy, resilience, problem-solving, and critical thinking.
Secondly, and in light of this, it is perhaps unsurprising that most literary creatives reported not using GenAI themselves. 67% of novelists said they never use AI, partly due to the negative perception of using it for ‘creative’ tasks. Where literary creatives do use GenAI, they most commonly reported using it for what they deem to be ‘non-creative’ tasks such as information search. Many recognised that AI could be useful within the writing process for certain ‘non-creative’ tasks, but only if these systems are responsibly designed: trained on licensed data and less harmful to the environment.
Thirdly, on the impact of GenAI on the novel itself, literary creatives predicted a loss of originality in the content, style, and language of novels. Genre fiction, such as romance, thrillers, and crime, was perceived to be at the highest risk of displacement, while literary fiction was seen as the least at risk. Interestingly, some creators also anticipated a potential rise in experimental fiction to counteract ‘AI style’.
Fourthly, as millions of books are scraped from shadow libraries such as LibGen and used to train AI models without consent, novelists in the UK are already experiencing the harsh reality of the ensuing economic challenges. As mentioned earlier, over a third of novelists (39%) report that their income has already been negatively impacted by GenAI. This is for a range of reasons, including competition from AI-generated books and sales lost to rip-off AI-generated imitations appearing online under the names of real authors. The irony, of course, is that the work of these novelists has likely been used to train the very GenAI systems they feel are now competing with their own work. Nearly three in five novelists (59%) report that they know their work has already been used to train AI without permission or remuneration.
The Government must protect the UK’s creative industries. The message from novelists and literary creatives is clear: 86% of literary creatives support an opt-in model for AI training based on licensing structures, which would enable them to give informed consent and be fairly remunerated for the use of their work. There were also clear calls for granular transparency from AI companies about the data used to train their models, which would help to facilitate a licensing market and enable creatives to exercise their rights.
Beyond copyright and responsible AI design, we must also educate the next generation to use AI responsibly, supporting digital skills that enable children to engage critically with AI systems as well as to be digitally literate. To this end, the Department for Education should mandate AI-free creative writing programmes and reading in schools. Funding creative writing initiatives, particularly among vulnerable and underrepresented groups, and supporting the work of cutting-edge independent publishers will also be hugely important to ensuring that unique and diverse voices in literature are encouraged and nurtured, and that skills are shared and passed on.
Novels contribute substantially to our culture, education, identity, wellbeing, and entertainment. If implemented, the recommendations in this report should help to secure the novel’s continued, vital position at the heart of the thriving British creative sector, while also supporting the UK as a world leader in the design and implementation of responsible AI.