Abstract: The rise of "Big Data" analytics in the private sector poses new challenges for privacy advocates. Through its reliance on existing data and predictive analysis to create detailed individual profiles, Big Data has exploded the scope of personally identifiable information ("PII"). It has also effectively marginalized regulatory schemas by evading current privacy protections with its novel methodology. Furthermore, poor execution of Big Data methodology may create additional harms by rendering inaccurate profiles that nonetheless impact an individual's life and livelihood. To respond to Big Data's evolving practices, this Article examines several existing privacy regimes and explains why these approaches inadequately address current Big Data challenges. This Article then proposes a new approach to mitigating predictive privacy harms: a right to procedural data due process. Although current privacy regimes offer limited, nominal due process-like mechanisms, a more rigorous framework is needed to address their shortcomings. By examining due process's role in the Anglo-American legal system and building on previous scholarship about due process for public administrative computer systems, this Article argues that individuals affected by Big Data should have rights similar to those in the legal system with respect to how their personal data is used in such adjudications. Using these principles, this Article analogizes a system of regulation that would provide such rights against private Big Data actors.
INTRODUCTION
Big Data analytics have been widely publicized in recent years, with many in the business and science worlds focusing on how large datasets can offer new insights into previously intractable problems.1 At the same time, Big Data poses new challenges for privacy advocates. Unlike previous computational models that exploited known sources of personally identifiable information ("PII") directly, such as behavioral targeting,2 Big Data has radically expanded the range of data that can be personally identifying.3 By primarily analyzing metadata, such as a set of predictive and aggregated findings, or by combining previously discrete data sets, Big Data approaches are not only able to manufacture novel PII, but often do so outside the purview of current privacy protections.4 Existing regulatory schemas appear incapable of keeping pace with these advancing business norms and practices.
Personal harms emerge from the inappropriate inclusion and predictive analysis of an individual's personal data without their knowledge or express consent....