Fighting the spread of harmful content online is an important goal, but effectively addressing the issue has grown more urgent and complicated as Internet users spend more time and post more content online. That’s why the Online Safety Bill was recently announced as the culmination of years-long discussions by the United Kingdom (UK) government and its regulatory agencies on how to make online services safer for users. In June, the legislation will face line-by-line scrutiny and potential amendments by the Public Bill Committee. Unfortunately, the Online Safety Bill does not sufficiently balance legal free expression, content moderation, and online privacy. Instead, it will open the door to government-sanctioned surveillance of user-created content, including private communications.
The Online Safety Bill requires search engines, social media platforms, and other services built on user-created content to follow duties of care—binding legal obligations designed to prevent harm to others—that compel them to seek out and remove a variety of online content. By allowing the government to use multiple duties of care to impose proactive monitoring obligations, the Online Safety Bill opens online content up to government snooping. From a civil liberties perspective, this sort of online policing is antithetical to users’ rights to legal free expression because it fosters a culture of businesses over-moderating legal but potentially harmful speech to avoid penalties. And from a privacy perspective, government-compelled monitoring will make UK citizens more vulnerable to bad actors online because it imposes a general monitoring obligation on user-created content.
For example, the Online Safety Bill recommends that online services use age assurance measures to meet the bill’s duties of care to protect children. In this sense, the Online Safety Bill allows the UK regulatory agency Ofcom to mandate that online services collect people’s personal data to prevent harmful content from reaching children. In practice, this requirement would enable the government to push private businesses to place any content arbitrarily deemed harmful by the Secretary of State, or considered significantly harmful to an “appreciable number of children in the United Kingdom,” behind an age assurance wall.
Or to take a related example, the bill also allows Ofcom to direct platforms to use “proactive technology” to scan for, restrict, and remove content considered harmful to adults. But the legislation does not clearly define what content falls into this category, and it even includes any content that presents a risk of significant harm to a hard-to-define “appreciable number of adults in the United Kingdom.” This vague definition will likely lead online services to over-moderate lawful speech that they, and their users, would prefer to remain online.
The bill is careful not to prescribe the exact processes behind the monitoring. Still, the provisions show a strong preference for age verification and other measures to distinguish which users are underage. According to the Online Safety Bill, Ofcom could require more robust age assurance measures than an easily circumvented birthdate form. Users’ passport numbers, driver’s license scans, or other forms of age verification could become their ticket of entry to online services the government deems unsuitable for children’s eyes. As a result, UK adults may have to disclose personally identifiable information to access online content, like legal online pornography, that they could previously access anonymously or semi-anonymously, which is a serious threat to online privacy. Bad actors could, for example, use this personal information to extort individuals. Marginalized communities like dissidents, human rights activists, and abuse survivors have historically relied on online anonymity to stay safe from the threats of persecution and domestic violence. For these communities, personal data collection at this scale will gate off entire sections of the Internet with identity checks and chill their rights to privacy and expression.
The Online Safety Bill also includes a de facto death sentence for end-to-end encryption by requiring online services to moderate messaging and other forms of online communication. Many of these services use encryption to prevent service providers or other third parties from reading private communications. Unfortunately, the duties of care to prevent users from accessing “priority illegal” content—such as terrorism content and child sexual abuse material—will require online services to scan for such content. These communications platforms would need to begin client-side scanning, create a backdoor that foreign adversaries and others could exploit in cyber attacks, or weaken and potentially remove the encryption these services use to secure communications. For example, services like WhatsApp, Signal, and Wire would need to create a way to read encrypted messages or risk fines for noncompliance.
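To see why client-side scanning undermines end-to-end encryption, consider a simplified, hypothetical sketch of hash-based scanning. Real deployments typically use perceptual hashing (for example, systems like PhotoDNA) rather than exact SHA-256 matches, and the function names and blocklist below are illustrative assumptions, not any platform’s actual implementation:

```python
import hashlib
from typing import Optional

# Hypothetical blocklist of hashes of known prohibited content.
# In practice, such lists are distributed by providers or authorities.
BLOCKED_HASHES = {
    hashlib.sha256(b"known prohibited content").hexdigest(),
}

def client_side_scan(plaintext: bytes) -> bool:
    """Return True if the message matches the blocklist.

    Crucially, this check runs on the user's own device *before*
    the message is encrypted, so the content is inspected in the
    clear, which is exactly what end-to-end encryption was meant
    to prevent third parties from doing.
    """
    return hashlib.sha256(plaintext).hexdigest() in BLOCKED_HASHES

def send_message(plaintext: bytes) -> Optional[bytes]:
    """Sketch of a messaging client's send path with scanning."""
    if client_side_scan(plaintext):
        return None  # blocked (or reported) before encryption ever happens
    # ... the message would be end-to-end encrypted and sent here ...
    return plaintext  # stand-in for the encrypted payload
```

In this sketch, an innocuous message passes through, while a blocklisted one is intercepted before encryption; the encryption itself remains mathematically intact, but its privacy guarantee no longer holds, because the inspection simply moves to the endpoint.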
Before Brexit, the UK was bound by the privacy safeguards of the European Union’s e-Commerce Directive—a law that established the basics of transparency requirements and intermediary liability for e-commerce. The e-Commerce Directive prevented European Union member states from forcing a general content monitoring obligation on online services. But in the post-Brexit world, UK Internet users will no longer have the Directive protecting them from government-mandated monitoring.
This is just a slice of the lengthy list of negative consequences the Online Safety Bill could impose on UK Internet users and the UK tech sector. Without serious amendments to the current proposal, Parliament should not move forward with a scheme that compels the monitoring of online content and discourages free expression. Instead, Members of Parliament should consider at length the harm the Online Safety Bill would do to innovation, consumer choice, and user welfare in the UK. If the government truly wants to foster a safer online environment, it should not exacerbate its concerns by undermining online encryption and anonymity.