Reconfiguring the UK’s approach to cultural labour: how might policy makers approach the arts and AI?

Robyn Dennis, Client Executive

Worthy Farm awoke from its slumber once again last weekend, as 200,000 music-lovers descended on Glastonbury festival for another year of its cultural spectacular. Newcomers this year included the festival's first ever Indonesian band, thrash metal trio Voice of Baceprot, and the "fire-belching, mind-blowing" dragonfly that has usurped Arcadia's infamous spider.

Novelty is in the air, then. With the general election looming, headline acts such as Dua Lipa may well not be the only frontrunners making their debut this week (although musical talent may differ, despite Keir Starmer's childhood violin lessons alongside the DJ Fatboy Slim). Whilst the two may feel worlds apart, it is this sense of newness that has acted as a central binding force between politics and culture recently, particularly with the rise of artificial intelligence and its effects on the arts.

The next Parliamentary term will prove crucial for UK artists in determining how AI is regulated and utilised effectively. It was announced last month, for instance, that Arts Council England (ACE) has joined forces with Goldsmiths, University of London, to develop best practice guidance on using artificial intelligence (AI) in the cultural sector. Funded by the UK Government programme Bridging the Responsible AI Divide (BRAID), the team at Goldsmiths will work with ACE over the next 18 months to explore how AI might be ethically and responsibly integrated into its work and the work of the organisations it supports.

Against such a backdrop, all the major parties in the UK are beginning efforts to address the challenges of an exponential rise in AI use in the cultural sectors – industries previously hailed for their individuality and their innovation in the cultural unknown. Equally, the speed of AI advancement also offers policy makers an opportunity to redefine our view of creative work and how AI might challenge its value.

Whilst the Conservatives focused on growth and investment in the sector in their recent manifesto, Labour's present remedy to AI in the creative industries revolves around protecting workers through the UK copyright framework, which it argues is the central form of protection for artists' intellectual property. In a recent speech at the Creative Cities Convention, Shadow Culture Secretary Thangam Debbonaire concurred: "[Copyright and IP] is the way that we protect the raw materials of the creative industries – the creative output and imagination," she said. "Getting this right will be good for the screen sector as a whole, as well as individual creatives, to protect the films and shows you're invested in."

Recent lawsuits over copyrighted works indicate this focus on copyright is not unfounded, with the industry chiefly concerned that its creative output is being degraded into raw data for AI, and keen to slow AI's advancement. The New York Times, for instance, took OpenAI and Microsoft to court over the unauthorised use of its articles for AI datamining. The concern is echoed by an AI Open Letter (2023), signed by more than 2,400 journalists, writers and artists globally, who see AI datamining as 'effectively the greatest art heist in history'.

These concerns are anything but steeped in melodrama – in recent weeks, music generators such as Udio have developed to the point of producing an 'original' song in seconds from a simple prompt. Sora AI can generate thousands of music videos an hour, whilst enormous capital is being poured into Google's Gemini 2 and OpenAI's GPT5 to produce sophisticated, and rather elegant, prose for online commentaries.

The exponential rise of 'creative' generative AI tools such as these indicates a pressing demand to re-think how we formulate the value of our creative industries in the UK. Whilst all major parties focused on the value of our cultural industries to growth and international clout, framing these industries by their economic value alone is ill-informed against the backdrop of AI. Indeed, though the ability of these new technologies to churn out content is unparalleled, the threats they pose – job losses, intellectual property infringement and a skewed distribution of labour – must not form the only concern for policy makers.

Rather, the unknown creative potential of generative AI – and its capacity to lessen the centrality of human contribution – presents a key danger: the loss of humanity from our creative industries. The introduction of software such as Udio or Sora AI already reflects a shift in the creative industries that separates artist identity and human imagination from profitable culture for consumption. Whilst this has been countered by a host of benefits from these generative tools – including their versatility, affordability and technological innovation, like the epic drone show that opened Glastonbury on Wednesday – the cultural pull of live festivals such as Glastonbury equally demonstrates the enduring relevance of authentic, human-centred art.

Labour hopes to protect the human input in this work through the proposed Regulatory Innovation Office, which it pledges will 'set targets for tech regulators, end uncertainty for businesses, turbocharge output, and boost economic growth', although it remains unclear exactly how it plans to affirm the importance of human creativity and cultural ownership. Equally, through initiatives like a more flexible Growth and Skills Levy, its recent manifesto erred on the side of collaboration through education. Rather than arguing that it is the nuance of emotional intelligence and live performance that must be emphasised to counter AI's influence, Labour's most recent publication points in another direction, arguing that the creative industries require highly skilled technical workers as well as creators, perhaps with a nod to closer collaboration with AI frameworks.

Whilst policy makers continue to make loose commitments to 'binding regulation' for developing AI models, then, it is clear that a central mandate for growth will naturally mesh with artificial innovation in the creative industries. Given the argument for creative democratisation and innovation, this is not necessarily a bad thing – Glastonbury even hosted a David Hockney AI-generated exhibition on the Pyramid Stage last year. Nor must it be assumed that the arts are inherently incompatible with profitable growth. And yet, when AI now competes with the human ownership of creativity, it is crucial that policy makers identify areas for state support in upholding the value of this human contribution and recognise the importance of art for art's sake. Failure to do so will result in a diminished creative industry that favours profit over conceptual profundity, and production frequency over creative control.
