February 2, 2026
A major report has been published setting out the threats that AI poses to the creative industries in the UK, together with clear recommendations for how the Government can respond.
The report – Brave New World? Justice for Creators in the Age of AI – is the product of organisations that collectively represent over 80,000 individual creators and performers: The Society of Authors, The Association of Illustrators, the Independent Society of Musicians, The Association of Photographers, and Equity.
The tone of the report is set from the opening sentence of its foreword, written by Baroness Kidron, one of the leading members of the House of Lords who last year sought (ultimately unsuccessfully) to amend the Data (Use and Access) Bill to strengthen protections for copyright owners. She writes that “the UK government is presiding over one of the greatest acts of theft in modern history: the stripping of the UK’s creative industries of their rights, livelihoods, and control over their work”.
The effects of this activity, the report argues, are already being felt across the creative industries. It cites stark figures: musicians describing income cuts of up to 50% as generative AI replaces paid work, professional photographers reporting average losses of £14,000, and 86% of authors saying that AI has already reduced their earnings. Perhaps most strikingly, the report states that 99% of the roughly 10,000 creators it surveyed say their work has been scraped without their consent.
Much of the report is devoted to presenting its findings, alongside case studies of how generative AI is affecting those in various creative industries. It also points to the broader impacts on the environment, society, and the wider economy if a £124.6 billion industry is hollowed out.
Despite the bleak picture it paints, the report also offers a practical way forward through a framework that it suggests will strike a balance between protecting the rights of creators and allowing AI companies to continue to develop their models.
The so-called ‘CLEAR Framework’ demands the following:
- Consent. Creators must be given the option to decide whether their works are used to train generative AI models and, if they agree, they must be compensated.
- Licensing: a sector-specific approach. A one-size-fits-all licensing model will not work. Instead, sector-specific models should be developed alongside relevant trade unions and representative bodies. The report also suggests that statutory intervention may be needed to establish minimum enforceable standards on transparency, consent and remuneration, particularly to counter the risk of opaque licensing deals negotiated by powerful intermediaries that concentrate revenue upstream at the expense of individual creators.
- Ethical use. Generative AI datasets must be built from lawfully licensed works.
- Accountability and transparency. Minimum transparency standards should require AI developers to, among other things: disclose what data was used to train their models; provide simple tools to allow creators to check if their work is included in training datasets; and explain whether outputs rely on identifiable creative styles, voices, or performers.
- Remuneration. The report emphasises that creators are not opposed to AI, but want to “participate in its economy, not be erased by it”.
The report ends as it begins, arguing that the CLEAR framework “offers a roadmap for reform, one that restores balance, protects creators and supports the industries that make the UK a cultural world leader. Without it, GenAI remains a one-way pipeline for the theft of copyright materials extracted to reward multinational monopolies while riding roughshod over the rights, incomes and trust that keep our creative sector alive”.
To read the report in full, click here.