UK policymakers are citing a new Irish content moderation law as justification for a provision in their own bill tackling online harms, the Online Safety Bill, that would hold senior tech employees criminally liable. But in fact, the Irish model provides restraint the UK version lacks.
In December, Ireland signed into law the Online Safety and Media Regulation Act 2022, which obliges online services to moderate specific harmful or illegal content. The law establishes a Media Commission to audit and investigate compliance with the new content moderation rules and, in extreme cases, to hold individual executives at online services criminally liable for continued noncompliance. UK policymakers have confirmed they will add a similar provision to their Online Safety Bill, making senior tech employees criminally liable if they fail to protect children on their user-to-user and search services.
But the provision in the UK's Online Safety Bill is very different from its Irish counterpart. Ireland's law treats criminal liability as a last-resort punishment for online services: only if services remain noncompliant after being issued notices by the Media Commission can their employees be held liable for offenses under the law. In contrast, the UK proposal treats criminal liability for tech executives as an early weapon in its arsenal for punishing online services that fail to comply with a child-safety duty. Criminal liability could follow a single instance of noncompliance with the child-safety duties in the Online Safety Bill.
This difference will have important consequences. By threatening criminal liability immediately upon noncompliance, the UK encourages online services to excessively remove content that falls under the child-safety duties but is legal to show adult users. Over-removal at this scale will push many types of content adults want to see offline. It could leave UK users with a sanitized version of online services, one that reflexively moderates anything that could be considered child-unfriendly under the Online Safety Bill.
The provision in Ireland's law mitigates many, but not all, of the over-moderation concerns facing the UK proposal. Rather than threatening jail time immediately for every failure to comply with the Irish law's regulations, it holds employees liable only for continued noncompliance after Media Commission orders. Online services are unlikely to over-remove in Ireland at the scale they will in the UK, because services have opportunities to fix moderation missteps before any employee can be held liable and risk going to jail.
Moreover, Ireland signed the Online Safety and Media Regulation Act into law only in December, so the law's actual consequences are yet to be seen. Because companies are likely still working out their responsibilities under the new Irish law, UK policymakers cannot yet predict whether criminal liability is a successful strategy for enforcing content moderation regulations on online services.
The UK's Online Safety Bill has other mechanisms to enforce its proposed regulations and duties for online services, such as financial penalties and service restriction orders. Adding broad criminal liability for senior tech managers does not advance policymakers' goals of protecting free expression and promoting online safety. For this reason, policymakers should nix these provisions from the Online Safety Bill before it further damages the online landscape for user-generated content.