Big Brother is Listening to You: Digital Eavesdropping in the Advertising Industry

By: Dacia Green

In the Digital Age, information is more accessible than ever. Unfortunately, that accessibility has come at the expense of privacy. Now, more and more personal information is in the hands of corporations and governments, for uses not known to the average consumer. Although these entities have long been able to keep tabs on individuals, with the advent of virtual assistants and “always-listening” technologies, the ease with which a third party may extract information from a consumer has only increased.

The stark reality is that lawmakers have left the American public behind. While other countries have enacted consumer privacy protections, the United States has no satisfactory legal framework in place to curb data collection by greedy businesses or to regulate how those companies may use and protect consumer data. This Article contemplates one use of that data: digital advertising. Inspired by stories of suspiciously well-targeted advertisements appearing on social media websites, this Article additionally questions whether companies have been honest about their collection of audio data. To address the potential harms consumers may suffer as a result of this deficient privacy protection, this Article proposes a framework wherein companies must acquire users’ consent before collecting their audio data.

These Walls Can Talk! Securing Digital Privacy in the Smart Home Under the Fourth Amendment

By: Stefan Ducich

Privacy law in the United States has not kept pace with the realities of technological development, nor with the growing reliance on the Internet of Things (IoT). As of now, the law has not adequately secured the “smart” home from intrusion by the state, and the Supreme Court further eroded digital privacy by conflating the common law concepts of trespass and exclusion in United States v. Jones. This article argues that the Court must correct this misstep by explicitly recognizing the method by which the Founding Fathers sought to “secure” houses and effects under the Fourth Amendment. Namely, the Court must reject its overly narrow trespass approach in favor of the more appropriate right to exclude. This will better account for twenty-first century surveillance capabilities and properly constrain the state. Moreover, an exclusion framework will bolster the reasonable expectation of digital privacy by presuming an objective unreasonableness in any warrantless penetration by the state into the smart home.

Download Full Article (PDF)
Cite: 16 Duke L. & Tech. Rev. 278

Peeling Back the Student Privacy Pledge

By: Alexi Pfeffer-Gillett

Education software is a multi-billion dollar industry that is rapidly growing. The federal government has encouraged this growth through a series of initiatives that reward schools for tracking and aggregating student data. Amid this increasingly digitized education landscape, parents and educators have begun to raise concerns about the scope and security of student data collection.

Industry players, rather than policymakers, have so far led efforts to protect student data. Central to these efforts is the Student Privacy Pledge, a set of standards that providers of digital education services have voluntarily adopted. By many accounts, the Pledge has been a success. Since its introduction in 2014, over 300 companies have signed on, indicating widespread commitment to the Pledge’s seemingly broad protections for student privacy. This industry participation is encouraging, but the Pledge does not contain any meaningful oversight or enforcement provisions.

This Article analyzes whether signatory companies are actually complying with the Pledge rather than just paying lip service to its goals. By looking to the privacy policies and terms of service of a sample of the Pledge’s signatories, I conclude that noncompliance may be a significant and prevalent issue.

Slave to the Algorithm? Why a ‘Right to an Explanation’ Is Probably Not the Remedy You Are Looking For

By: Lilian Edwards & Michael Veale

Algorithms, particularly machine learning (ML) algorithms, are increasingly important to individuals’ lives, but have caused a range of concerns revolving mainly around unfairness, discrimination and opacity. Transparency in the form of a “right to an explanation” has emerged as a compellingly attractive remedy since it intuitively promises to open the algorithmic “black box” to promote challenge, redress, and hopefully heightened accountability. Amidst the general furore over algorithmic bias we describe, any remedy in a storm has looked attractive.

However, we argue that a right to an explanation in the EU General Data Protection Regulation (GDPR) is unlikely to present a complete remedy to algorithmic harms, particularly in some of the core “algorithmic war stories” that have shaped recent attitudes in this domain. Firstly, the law is restrictive, unclear, or even paradoxical concerning when any explanation-related right can be triggered. Secondly, even where that hurdle is cleared, the legal conception of explanations as “meaningful information about the logic of processing” may not be satisfied by the kind of ML “explanations” computer scientists have developed, partially in response. ML explanations are restricted by the type of explanation sought, the dimensionality of the domain, and the type of user seeking an explanation.

Collection of Cryptocurrency Customer-Information: Tax Enforcement Mechanism or Invasion of Privacy?

By: Austin Elliott

After granting permission to the Internal Revenue Service to serve a digital exchange company a summons for user information, the Federal District Court for the Northern District of California created some uncertainty regarding the privacy of cryptocurrencies. The IRS views this information gathering as necessary for monitoring compliance with Notice 2014-21, which classifies cryptocurrencies as property for tax purposes. Cryptocurrency users, however, view the request for information as an infringement on their privacy rights and are seeking legal protection.

This Issue Brief investigates the future tax implications of Notice 2014-21 and considers possible routes the cryptocurrency market can take to avoid the burden of capital gains taxes. Further, this Issue Brief attempts to uncover the validity of the privacy claims made against the customer information summons and will recommend alternative actions for the IRS to take regardless of whether it succeeds in obtaining the information.

Download Full Article (PDF)
Cite: 16 Duke L. & Tech. Rev. 1

Law Firm Cybersecurity: The State of Preventative and Remedial Regulation Governing Data Breaches in the Legal Profession

By: Madelyn Tarr

With the looming threat of the next hacking scandal, data protection efforts in law firms are becoming increasingly crucial in maintaining client confidentiality. This paper addresses ethical and legal issues arising with data storage and privacy in law firms. The American Bar Association’s Model Rules present an ethical standard for cybersecurity measures, which many states have adopted and interpreted. Other than state legislation mandating timely disclosure after a data breach, few legal standards govern law firm data breaches. As technology advances rapidly, the law must address preventative and remedial measures more effectively to protect clients from data breaches caused by outdated or ineffective cybersecurity procedures in law firms. These measures should include setting a minimum standard of care for data security protection and creating a private cause of action for individuals whose personal information has been improperly accessed because of a failure to comply with those standards.

Download Full Article (PDF)
Cite: 15 Duke L. & Tech. Rev. 235

Police Body Worn Cameras and Privacy: Retaining Benefits While Reducing Public Concerns

By: Richard Lin

Recent high-profile incidents of police misconduct have led to calls for increased police accountability. One proposed reform is to equip police officers with body worn cameras, which provide more reliable evidence than eyewitness accounts. However, such cameras may pose privacy concerns for individuals who are recorded, as the footage may fall under open records statutes that would require the footage to be released upon request. Furthermore, storage of video data is costly, and redaction of video for release is time-consuming. While exempting all body camera video from release would take care of privacy issues, it would also prevent the public from using body camera footage to uncover misconduct. Agencies and lawmakers can address privacy problems successfully by using data management techniques to identify and preserve critical video evidence, and allowing non-critical video to be deleted under data-retention policies. Furthermore, software redaction may be used to produce releasable video that does not threaten the privacy of recorded individuals.

Download Full Article (PDF)

Cite: 14 Duke L. & Tech. Rev. 346

Weathering the Nest: Privacy Implications of Home Monitoring for the Aging American Population

By: Jillisa Bronfman

The research in this paper will seek to ascertain the extent of personal data entry and collection required to enjoy at least the minimal promised benefits of distributed intelligence and monitoring in the home. Particular attention will be given to the abilities and sensitivities of the population most likely to need these devices, notably the elderly and disabled. The paper will then evaluate whether existing legal limitations on the collection, maintenance, and use of such data are applicable to devices currently in use in the home environment and whether such regulations effectively protect privacy. Finally, given appropriate policy parameters, the paper will offer proposals to effectuate reasonable and practical privacy-protective solutions for developers and consumers.

Download Full Article (PDF)

Cite: 14 Duke L. & Tech. Rev. 192

Riley v. California and the Stickiness Principle

By: Steven I. Friedland

In Fourth Amendment decisions, different concepts, facts and assumptions about reality are often tethered together by vocabulary and fact, creating a ‘Stickiness Principle.’ In particular, form and function historically were considered indistinguishable, not as separate factors. For example, “containers” carried things, “watches” told time, and “phones” were used to make voice calls. Advancing technology, though, began to fracture this identity and the broader Stickiness Principle.

In June 2014, Riley v. California and its companion case, United States v. Wurie, offered the Supreme Court an opportunity to begin untethering form and function and dismantling the Stickiness Principle. Riley presented the question of whether cell phone searches incident to a lawful arrest were constitutional. The Court, which had clung to pre-digital concepts such as physical trespass well into the twenty-first century, appeared ready to explore how technology is reshaping historically understood conceptions of privacy. From a broader perspective, the case offers an initial step in reconciling pre-digital rules based on outdated spatial conceptions of physical things with the changing realities of a technology-driven world.

Download Full Article (PDF)

Cite: 14 Duke L. &