Event Recap: Can GDPR’s ‘Automated Decision Opt-Out’ Be Improved Without Harming Users?

by Benjamin Mueller

Under the leadership of the Department for Digital, Culture, Media & Sport (DCMS), the UK government is currently running a public consultation on overhauling the country’s data protection regime. Following Britain’s exit from the European Union, the government is looking to address some of the flaws of the General Data Protection Regulation (GDPR). The UK’s data strategy seeks to remove barriers to responsible innovation and to cross-border data flows, reduce burdens on business, boost trade, and deliver better public services.

As part of this reform effort, DCMS is exploring the role and relevance of Article 22 of the GDPR: the right not to be subject to a decision based solely on automated processing that produces legal or similarly significant effects. In conjunction with DCMS, the Center for Data Innovation convened a roundtable discussion to examine whether Article 22 is fit for purpose.

Kristian Stout of the International Center for Law & Economics opened the conversation by observing that GDPR has fostered performative “compliance theatre” at the expense of valuable forms of technological innovation. For example, Article 22 makes it difficult to offer instantaneous “take it or leave it” contracts, the kind of frictionless, automated decision-making that powers the point-of-sale credit tools popular with consumers. A privacy framework built on the principle of data minimization, one that focuses on compliance for compliance’s sake rather than on maximizing consumer welfare, reduces the incentive for startups to explore new ways to use data.

Jonathan Kirsop, a partner at Pinsent Masons LLP and an expert in data privacy law, pointed out that Article 22’s distinction between decisions taken by a machine and those taken by a human is largely artificial, seemingly designed to appease our instinctive preference for decisions made by our peers over those made by computers. The underlying rationale, protecting us against bias and unexplainable decisions, is perfectly understandable. But since the intent behind Article 22 is to empower humans to challenge such decisions, it is entirely valid to ask whether the law can provide different routes to this goal with reduced costs to innovation. The purpose is not to rip up data protection principles, but to weigh the costs and benefits of the existing legal framework and scope out alternatives that provide similar protections at a lower cost.

On this point, Omer Tene, a partner at Goodwin Procter LLP and former vice president of the International Association of Privacy Professionals, opined that Article 22’s distinction between human- and machine-powered decision-making is crude and outdated. He stressed that the policy goal, giving humans the ability to challenge or respond to decision-making systems that affect their lives, is conceptually unrelated to the means of making the decision. What matter instead are the processes behind the decision and the routes of redress available to its subjects: ensuring that the outcome is intelligible, rests on due process, can be appealed, and that errors can be corrected. None of these conditions depends on whether the decision was made by a human or a machine. A legal right to challenge a decision based solely on who made it is thus strikingly hostile to technological progress at a time when algorithmic decision-making is rapidly becoming more capable. Humans are not inherently better or worse than algorithms at making decisions. In fact, the finance industry introduced the now-ubiquitous system of credit scoring to address widespread discrimination in lending, when decisions on awarding credit were taken exclusively by humans. The irony should not escape us that this very system, designed to make access to credit equitable and objectively driven by one’s capacity to repay a loan, has just been referred to the European Court of Justice by a German administrative court that questioned whether such credit scoring tools are compatible with Article 22.

Andrew Orlowski drew attention to the political sensitivities around loosening consumer protection provisions amidst a wider regulatory “tech-lash.” While it is important not to grant technology companies too much immunity from consumer harms created by data misuse, Orlowski lamented the loss of nuance and subtlety around data issues. Almost any use of data, regardless of context (e.g., sensible efforts to use data to fight Covid-19 and improve healthcare provision), is deemed suspect by a cottage industry of lawyers and academics with a vested interest in an ever-growing thicket of regulation.

Further to this point, Jonathan Kirsop pointed out that GDPR is a curious patchwork of principle-based legislation and prescriptive policies like Article 22. The principle-driven components of GDPR are much sounder than its prescriptive parts. For instance, establishing a “lawful basis” for data processing grounded in legitimate interests (requiring a prior conformity assessment, one that data subjects can challenge) provides a solid protective mantle for consumers. The prescriptive additions to GDPR create ambiguities of definition and implementation that deter innovators from trying out new tools. From a legal standpoint, removing Article 22 would leave the level of consumer protection untouched: the other parts of GDPR provide more than sufficient guardrails against the improper collection and use of data. That Article 22 has nonetheless attained an almost hallowed status as a critical safeguard against algorithmic discrimination is largely a matter of public perception and fear-mongering by academics.

In light of this, it seems crucial for the DCMS consultation to examine what other guardrails against unfair discrimination exist within the GDPR and in other laws. For instance, personal loans are regulated by the Financial Conduct Authority’s Consumer Credit sourcebook and policed by the Financial Ombudsman Service. Both borrowers and regulators care more about whether a lending decision was fair and affordable than about whether it was automated, so it is unclear what Article 22 adds to consumer protection in credit underwriting. Similarly, claims for unfair dismissal based on algorithmic decisions would stand regardless of whether Article 22 exists, thanks to well-established employment laws that, among other things, give workers the right to an impartial appeal against a decision, mandating human intervention.

This observation aligns neatly with Omer Tene’s point that insisting on keeping a “human in the loop” in all contexts is a quaint relic of 1980s data protection assumptions. The real questions are whether a decision can be challenged, what the most appropriate way to do so is, and who conducts the review. It may well be that automated challenges are more meaningful than human review, especially when a human cannot understand the reasoning behind a machine’s decision. The panel agreed that consumer protection law is a better avenue for exploring proper redress mechanisms than Article 22. GDPR’s purpose is to set out the rules for lawfully obtaining and processing data. Giving subjects the ability to challenge the outcomes of data processing becomes increasingly irrelevant as automated decision-making systems grow more powerful and the likelihood of changing an outcome via human review shrinks.

Kristian Stout also made the sensible point that regulators could give data processors the option of obtaining data subjects’ informed consent and offering contracts of adhesion. Such a solution would let companies experiment with innovative automation tools while giving subjects the choice to use such services or to seek out non-automated providers elsewhere. Additional regulatory qualifications could further protect consumers, for example in markets with few providers and thus low levels of competition.
