Has the Online Safety Bill overcome its criticisms?

By Megan MacDougall 
17/02/2022


It has been nearly three years since the Online Harms White Paper was published in April 2019. Following the publication of a draft bill and extensive pre-legislative scrutiny, with the Joint Committee on the Draft Online Safety Bill delivering its much-anticipated report in December, the Government has announced new clarifications to the legislation. And, though we are one step closer to seeing the Bill’s publication, it has been far from a quiet procession, with hefty scrutiny of the legislation and an at times heated debate surrounding it. With the Bill due to return to Parliament imminently, we explore whether it has managed to overcome its criticisms.

The Online Safety Bill is set to introduce a new regulatory framework to tackle harmful online content aimed at both children and adults. The Bill imposes a duty of care on providers of content-sharing platforms and search services, requiring them to prevent users’ exposure to harmful content. Compliance would be enforced by the Office of Communications (Ofcom), the UK’s communications regulator, which will be able to fine non-compliant companies up to 10% of their annual global turnover.

Upon the release of the draft Bill in May 2021, the then Digital Secretary Oliver Dowden said it would “usher in a new age of accountability for tech and bring fairness and accountability to the online world”. However, the Bill has received a mixed response. Frequent criticisms have argued that it is overly ambitious, too broad and, going to the very core of the legislation, unclear on what actually counts as “harmful content”. It has also sparked a debate on the fundamentals of free speech versus censorship – a particular concern of social media businesses and parliamentarians (see the House of Lords Communications and Digital Committee’s report ‘Free for all? Freedom of expression in the digital age’).

Nor is this concern limited to the regulated. The regulator itself, Ofcom, has been set what is, to be frank, a momentous task. As The Guardian reported last year, its chief executive, Dame Melanie Dawes, warned of being “overwhelmed” by complaints from social media users and of having to deal with the “sheer legal weight” of big tech’s response to the act once it becomes law. Sectors across the digital space, from big tech to search engines, will be waiting in trepidation for Ofcom’s codes of conduct for compliance. Indeed, with the Bill about to return to Parliament, there is still a great deal of ground to cover before the UK even considers implementation.

Of course, ahead of the Bill’s expected publication, attention has increased in recent weeks. Parliamentary committees have been publishing their recommendations en masse, on everything from freedom of speech and financial crime to the remit of the Bill itself.

In late January, the House of Commons Digital, Culture, Media and Sport Committee published its latest report on the bill. Its outlook was damning. It noted, “There are several areas where existing pre-legislative scrutiny has missed an opportunity and must go further” and “the Bill neither adequately protects freedom of expression nor is clear and robust enough to tackle the various types of illegal and harmful content on user-to-user and search services.” 

In early February, the House of Commons Treasury Committee released its report on financial crime, in which it reiterated the importance of one element of the bill – to protect people against user-generated financial fraud on social media and dating apps. And in the same month, the House of Commons Petitions Committee published its report on tackling online abuse, in which it outlined: “The lack of clarity in the draft version of the Bill on what content will be covered under this definition is unhelpful.”

Whilst the recommendations of each committee are meticulously researched and each is important to consider, their sheer breadth highlights the complexity and expansive implications of the Online Safety Bill. It also poses the question of whether one bill can do it all.

The most recent developments suggest the Government is taking some of these recommendations on board. Last week, Digital Secretary Nadine Dorries announced a new list of provisions to be included in the Online Safety Bill. Since taking over from Dowden, she has promised, as reported in Politico, a “much tougher and stronger” approach. The new list of expected offences includes revenge porn, hate crime and people smuggling. Dorries noted, “today’s changes mean we will be able to bring the full weight of the law against those who use the internet as a weapon to ruin people’s lives.” Critically, these additions mark a shift in how companies would handle content falling under these categories, from a reactive response to a proactive one. And only this week the Financial Times has reported that Home Secretary Priti Patel has written to colleagues, pushing for increased liability on Big Tech to monitor content deemed “legal but harmful.” Is the Government about to prescribe tools and moderation policies?

Whilst the progression of, and consultation surrounding, the draft Bill has quelled some concerns about the legislation, it has also brought additional complexities to light. As it stands, it is unlikely the new Bill will answer every outstanding question, which leaves a significant role for MPs and peers as the legislation progresses later this year.

As the legislation heads to Parliament following its expected publication between March and the Queen’s Speech, MPs and peers alike will have an unparalleled opportunity to correct the ills they perceive within the Bill, going as far as to strengthen, or even dilute, its intended consequences. With so much history in getting the Online Safety Bill to the point where it goes to Parliament, it is important to remember that where we are now is only the end of the beginning. Just as the lobbyists of Big Tech have descended on Brussels to counter the Digital Services Act and Digital Markets Act, so they will on Westminster. And that is before the Competition Bill is announced in May’s Queen’s Speech.


Meg MacDougall is an Account Manager at Atticus Communications. 

She works across corporate, technology and third sector clients, delivering campaigns covering government relations, public policy, and strategic communications. She can be contacted at mmacdougall@atticuscomms.com.







