Unintentional Algorithmic Discrimination: How Artificial Intelligence Undermines Disparate Impact Jurisprudence

By: Vincent Calderon

Artificial intelligence holds the capacity to revolutionize the economy by capturing efficiencies. These benefits should, ostensibly, pass down to consumers, benefiting the general public. But the immense complexity of AI systems is bound to introduce legal hurdles for plaintiffs and frustrate our disparate impact jurisprudence. Specifically, demonstrating causation and proffering a less discriminatory alternative are herculean tasks for a plaintiff seeking to prove a disparate impact upon which legal relief may be granted. The courts have already begun to wrestle with these issues, primarily in the housing and employment sectors. With the rapid surge of AI systems, courts should expect further inquiry into how these programs interfere with our established antidiscrimination framework. This Note outlines how each step of a plaintiff’s successful disparate impact analysis is hindered by the opaque ways in which AI operates. This Note then proposes several policy reforms to mitigate these consequences.

Cite: 24 Duke L. & Tech. Rev. 28

Regulating Data as Property: A New Construct for Moving Forward

By: Jeffrey Ritter and Anna Mayer

The global community urgently needs precise, clear rules that define ownership of data and express the attendant rights to license, transfer, use, modify, and destroy digital information assets. In response, this article proposes a new approach for regulating data as an entirely new class of property. Recently, European and Asian public officials and industries have called for data ownership principles to be developed, above and beyond current privacy and data protection laws. In addition, official policy guidance and legal proposals have been published that aim to accelerate realization of a property rights structure for digital information. But how can ownership of digital information be achieved? How can those rights be transferred and enforced? Those calls for data ownership emphasize the impact of ownership on the automotive industry and the vast quantities of operational data that smart automobiles and self-driving vehicles will produce. We examined how, if at all, automakers are considering the issue in consumer-facing statements addressing the data collected by their vehicles. To formulate our proposal, we also considered continued advances in scientific research, quantum mechanics, and quantum computing which confirm that information in any digital or electronic medium is, and

Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For

By: Lilian Edwards & Michael Veale

Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic “black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive. However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even navigating this, the legal conception of explanations as “meaningful information about the logic of processing” may not be provided by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted both by the type of explanation sought, the dimensionality of the domain and the type of user

Damned Lies & Criminal Sentencing Using Evidence-Based Tools

By: John Lightbourne

The boom of big data and predictive analytics has revolutionized business. eHarmony matches customers based on shared likes and expectations for romance, and Target uses similar methods to strategically push its products on shoppers. Courts and Departments of Corrections have also sought to employ similar tools. However, the use of data analytics in sentencing raises a host of constitutional concerns. In State v. Loomis, the Wisconsin Supreme Court was faced with the question whether the use of an actuarial risk assessment tool based on a proprietary formula violates a defendant’s right to due process where the defendant could not review how the various inputs were weighed. The opinion attempts to save a constitutionally dubious technique and reads as a warning to lower courts on the proper use of predictive analytics. This article explores certain equal protection and due process arguments implicated by Loomis.

Cite: 15 Duke L. & Tech. Rev. 327