Bots Vs the Byline
Sanskriti Bacchu, Intern
03/04/2024
News media is in the throes of its most significant revolution since the introduction of the internet. The advent of artificial intelligence (AI) presents vast opportunities for the media industry to enhance news production and delivery. However, it also poses significant risks, including a further erosion of trust, increased misinformation and a second assault on the industry’s business models.
AI thrives on extensive text and data mining to train its models. Creative industries and news organisations, such as the New York Times, argue that popular AI models are being trained unlawfully on copyright-protected or paywalled content without appropriate consent or compensation. This training data directly shapes the capabilities and profitability of AI models, and the media industry contends that another entity is therefore profiting from its products and services. Though this has not yet broken news business models, there are already examples of AI-driven websites repackaging traditional news outlets' content and profiting from diverted traffic and platform subscriptions.
The UK is well regarded for its media and creative industries, which contributed a combined £109 billion to the UK economy in 2021. The UK copyright framework, established in the Copyright, Designs and Patents Act 1988, predates the rise of AI and arguably now needs updating to reflect rapid developments in the sector. Industry bodies such as the News Media Association have called on the Government to explore copyright legislation that protects these industries and safeguards their financial viability.
The Government’s current ‘hands-off’ approach to AI regulation stands in stark contrast to that of the EU, which agreed its AI Act in 2023. The legislation was lauded by the Federation of European Publishers (FEP), as it requires "general purpose AI companies" to respect copyright law. The UK’s Intellectual Property Office (IPO) intended to produce a code of practice on the matter after consulting rightsholders and AI companies, but the process disappointingly ended in stalemate.
The undertaking has now been passed to the Department for Culture, Media and Sport and the Department for Science, Innovation and Technology. Together, the departments are leading engagement with industry executives, including representatives from the BBC, the British Library, the Financial Times, Microsoft and DeepMind, to agree an approach that allows the AI and creative sectors to grow together in partnership.
The potential structure for such a partnership was demonstrated by OpenAI and the Associated Press in July 2023, when they signed a landmark deal giving the tech firm permission to train its models on the news organisation’s archive. However, for such co-operation to become commonplace, the autonomy of rightsholders must be secured so that they can be compensated for the value they provide.
The House of Lords Communications and Digital Committee’s recently launched ‘Future of News’ inquiry, which is set to examine issues such as impartiality, trust and the impact of generative AI on news media business models, has been critical of the Government’s inaction. The Committee has called on the Government to support copyright holders, saying it “cannot sit on its hands” while Large Language Model developers exploit the works of rightsholders, and has urged the Government to end the copyright dispute “definitively”, including through legislation if necessary.
The Labour Party has also called for the Government to require tech companies to pay newspapers for their content, with Shadow Culture Secretary Thangam Debbonaire MP recently writing that “we must maintain robust copyright protections for content creators, and they must be fairly rewarded for their work. If negotiations fail, we believe an independent arbitrator would set a fair price”. If the polls are to be believed, Labour is likely to form the next government, and if the current talks end in another impasse, it will fall to Labour to break the deadlock.
But until the next election, the industry will be watching to see whether the regulator and competition authority Ofcom, which, among other regulators, has been asked to evaluate AI risks and opportunities within its sector by the end of April, will acknowledge AI as a serious threat to the competitiveness of the media industry.
The state of a country’s media industry says a lot about the health of its democracy. It is therefore paramount that the UK Government secures the future of the industry and acknowledges that copyright protection is not a trade-off against AI innovation. The public depends on credible news organisations that deliver genuine value, and the Government must act now to preserve the integrity of the British press.