The Hedgehog Review: Vol. 17 No. 1 (Spring 2015)
The Rise of the Cryptopticon
Consider two American films, twenty-four years apart, both starring Gene Hackman as a reclusive surveillance expert. The difference between the work done by Harry Caul, the naive, emotionally stunted private investigator played by Hackman in Francis Ford Coppola’s 1974 film The Conversation, and the work done by Edward Lyle, the disaffected, cynical former spy Hackman portrays in the 1998 Tony Scott film Enemy of the State, is more than a matter of the tools they use.1
Caul uses audio and video surveillance to investigate private citizens, while Lyle deftly deploys the digital tools and techniques that have come to characterize our era of total surveillance. We learn that before choosing to go “off the grid,” Lyle did high-level work for either a government organization like the National Security Agency or a private contractor working for the NSA. (The exact truth is never fully revealed.) Lyle seems to be Caul a quarter century later, with a new name, a deeper sense of nihilism, but the same aversion to sharing information with others.
Caul’s tools, analog and cumbersome, are remarkably effective at capturing the conversations and images of his targets. He snoops on specific human subjects and works for private firms and individuals alike. He focuses on personal matters, not criminal or national security ones.
Lyle, by contrast, introduces both Robert Clayton Dean (Will Smith) and filmgoers of the late 1990s to an invisible web sustained by the continuous mining and tracking of digital data. The team of geeky spies assigned to track Dean as he rushes through Washington has at its disposal credit records, mobile phone signals, and hundreds of surveillance cameras positioned throughout the city.
Caul lives in a completely different information ecosystem from the one inhabited by Lyle. It’s not that the government was more benign or restrained during the Nixon years—we need only think of Watergate—or that private firms had nobler motivations. And Caul certainly has the skill and equipment to track individuals and record their words and movements in intimate detail. Like Lyle, he has the power to ruin lives through surveillance and revelation. But Caul cannot imagine anything beyond the precisely targeted surveillance of specific individuals.
Lyle, however, lives at the dawn of the Big Data era. In Lyle’s information ecosystem, firms and states maintain massive databases that contain records not only of commercial transactions, but of people’s movements and even their characteristic facial expressions. There is a permeable membrane between data collected by private firms and data used by state security forces. And our electronic devices, as Dean learns the hard way, support this environment of continuous, near-total surveillance. Data collection is so cheap and easy that it’s unnecessary to make a priori judgments about which of its findings might be important. Firms and states collect first and ask questions later.
Caul’s downfall in The Conversation results from a moment of weakness. He reveals the wrong details to the wrong person at the wrong time. His own vulnerability awakens his moral sense. Concerned not just for his own privacy, he now feels culpable for the damage he has done to others by invading theirs.
Toward a Definition of Privacy
In Caul’s awakening, we glimpse what Georgetown law professor Julie Cohen referred to in her 2012 book Configuring the Networked Self as the shift in focus in privacy thinking from individual autonomy to the “social value of privacy.”2 But for Cohen, Caul’s concerns about even other people’s individual privacy are inadequate. Rather, she argues, theories founded on and bounded by liberal individualism consistently fail to account for how we actually live in a networked world, imbricated as our lives are (and always were, even before the rise of digital technologies) in social and cultural contexts. We make and remake ourselves dynamically as we move in time and among others, as our interests and allegiances change.
Cohen develops a complex theory of a networked self that helps us distill a better working definition of privacy than the worn and somewhat limited one Samuel Warren and Louis Brandeis dubbed “the right to be let alone” in their landmark 1890 article on privacy rights in the Harvard Law Review.3 Through Cohen, we can see that privacy does not consist merely of those aspects of our lives that we withhold from others. Privacy is more than the autonomy we exercise over our own information. It more accurately comprises the ways we manage our various reputations within and among various contexts. Those contexts might include school, church, the public sphere, a place of employment, or a family. Each of these contexts shifts and overlaps with others. Borders change, contexts blend. So configuring a “self” in the twenty-first century is a lot more work than it used to be. The fluidity can be liberating, especially for those who seek niches supportive of marginalized identities. But it can also be a terrifying and vertiginous liberty—sometimes exhausting and even potentially dangerous.
Building on Cohen, we can see the problem. Contexts in a digitally networked world—a world Lyle eschews in Enemy of the State—are constantly intersecting and overlapping. Our work sphere impinges upon our family sphere too easily, challenging our personal ability to manage our reputations and control the manners of disclosure. Our public contexts blend as commercial data firms collect and then sell our profiles to political parties and campaigns. Facebook brings all of our acquaintances together into one confusing collection of otherwise unrelated profiles that we are forced to deal with without the help of rank or distinction. Friends are just friends. So are lovers, bosses, acquaintances, and high school teachers.
In the current commercial, political, and regulatory environment, institutions have powerful incentives to collect, save, and analyze every trace of human activity. These incentives are not entirely new, of course. People have long been aware of the potential payoffs of tracing and tracking subjects (consumers, citizens, criminals, “users”). To explain the relatively recent turn to Big Data as a tool of choice, scholars and analysts have tended to emphasize the availability of appropriate technologies. Among these are huge server farms, algorithms designed to reveal patterns quickly within otherwise meaningless pools of data, and faster bandwidth and processing capacities. But this techno-centric analysis misses or downplays the role of significant changes in the global political economy and dominant ideologies since 1980. When securities markets and consultants praise “efficiency” above all other values, when states place “security” above all other public needs, and when mass-market advertising reaps, at best, murky returns for each dollar spent, the incentives to target, trace, and sift grow stronger.
Clearly, there is much in the current commercial, political, and even cultural environment that encourages the use of Big Data. Because it offers clear public benefits, such as quicker and broader epidemiological assessments, we would seem foolish to dispense with Big Data and its technological systems and practices. But we should understand the costs as well as the benefits—and not allow Big Data’s rapid rise and widespread adoption to blind us to the need for critical public and political discussions of its use and abuse.
From Panopticon to Cryptopticon
In his influential 1975 book Discipline and Punish, Michel Foucault adopted the concept of the Panopticon—Jeremy Bentham’s never-realized design for a circular prison with a central watchtower, from which the behavior of inmates could be observed at all times—to describe the programs and techniques used by the modern state to monitor, supervise, and, ultimately, modify the behavior of its citizens. To Foucault, the Panopticon was embedded in the practices, structures, and institutions of modern society, from government bureaucracies to schools to hospitals and asylums, and to the assorted regimes of health, well-being, and citizenship they variously pressed upon their subjects. This system of surveillance left “no need for arms, physical violence, material constraints,” as Foucault once said. All that was needed was “a gaze,” an endlessly inspecting observation, which each individual would end up internalizing, thus becoming his or her own constant supervisor. A superb formula: power exercised continuously and for what turns out to be a minimal cost.4
Those who write about privacy and surveillance often invoke the Panopticon to argue that the great harm of mass surveillance is social control. Yet the Panopticon does not suffice to describe our current predicament. First, mass surveillance does not necessarily inhibit behavior: People will act as they wish regardless of the number of cameras pointed at them. The thousands of surveillance cameras in London and New York City do not deter the eccentric and avant-garde. Today, the example of reality television suggests that there may even be a positive correlation between the number of cameras and observers watching subjects and their willingness to act strangely and relinquish all pretensions of dignity. There is no empirical reason to believe that awareness of surveillance limits the imagination or stifles creativity in a market economy in an open, non-totalitarian state.
Obviously, coercive state violence still exists, and at times metastasizes. In the Cold War era, the East German secret police, the Stasi, knew how to exploit widespread awareness of surveillance to heighten the fear and submissiveness of the general public. Florian Henckel von Donnersmarck’s brilliant 2007 film, The Lives of Others, demonstrates the corrosive power of constant state surveillance. The protagonist, a playwright loyal to the East German government, enjoys all the perks of stardom, naively trusting that his allegiance to the Party will continue to protect him. When a romantic entanglement places his girlfriend, and then him, under high-level surveillance, his confidence unravels and the depravity of the state becomes clear.5 The film concludes with a glimpse of the 1991 version of Big Data. The playwright, now trying to reconstruct his life in the wake of the unification of Germany, visits the new archive in Berlin that allows citizens to examine the files that the Stasi collected. This moment leaves viewers with a powerful sense of how detailed, destructive, and all-encompassing state surveillance could be even in an era of non-networked, analog media forms.
But the environment shaped by the Stasi is not the environment in which most of us now live. Unless the Panopticon is as visible, ubiquitous, and intentionally menacing as agencies such as the Stasi made it, it cannot influence behavior in the ways Bentham and Foucault assumed. And as British political writer Timothy Garton Ash shows in The File (1997), his brilliant account of the surveillance he underwent during his time as a graduate student in East Berlin, even the Stasi’s Panopticon was not enough to preserve the iron grip of the state.6
In Europe, North America, and much of the rest of the world, governments and businesses achieve their ends in almost the opposite way from that of the Panopticon: not through the subjection of the individual to the gaze of a single, centralized authority but through the surveillance of the individual by all (at least in theory, though by many in fact). Not a Panopticon, then, but a Cryptopticon, to use the name I have given to the information ecosystem of massive corporate and state surveillance.7
Unlike Bentham’s Panopticon, the Cryptopticon is not supposed to be intrusive or obvious.8 Its scale, its ubiquity, even its very existence, are supposed to go unnoticed. So while a closed-circuit television camera mounted over a counter at a convenience store openly warns would-be shoplifters or robbers to behave or risk being caught, the Cryptopticon relies on browser cookies, data streams retained by telecommunication firms, satellite imagery, global positioning system traces, covert voice surveillance, store discount cards, e-book readers, and mobile applications. Each of these things masks its real purpose: to gather or provide data and to track the behavior of millions of people with stunning precision. Beguilingly, though, most of these instrumentalities offer something valuable (convenience, security, connectivity, information, efficiency, lower costs) to those who engage with them—often “for free.”9
Unlike Bentham’s prisoners, we do not know all the ways in which we are being watched or profiled—we simply know that we are. And we do not regulate our behavior under the gaze of surveillance. Instead, we seem not to care. The workings of the Cryptopticon are cryptic, hidden, scrambled, and mysterious. One can never be sure who is watching whom and for what purpose. Surveillance is so pervasive, and much of it seemingly so benign (“for your safety and security”), that it is almost impossible for the object of surveillance to assess how he or she is manipulated or threatened by powerful institutions gathering and using the record of surveillance. The threat is not that expression and experimentation will be quashed or controlled, as they supposedly would have been under the Panopticon. The threat is that subjects will become so inured to and comfortable with the networked status quo that they will gladly sort themselves into “niches” that will enable more effective profiling and behavioral prediction.
The Cryptopticon, not surprisingly, is intimately linked to Big Data. And the dynamic relationship between the two has profound effects on the workings of commerce, the state, and society more generally.
Customize to Monetize
Facebook, Google, and Amazon want us to relax and be ourselves. They have an interest in exploiting niche markets that our consumer choices have generated. These companies are devoted to tracking our eccentricities because they understand that the things with which we set ourselves apart from others are the things about which we are most passionate. Not only our passions, but our predilections, fancies, and fetishes, drive and shape our discretionary spending; they are what make us easy targets for precise marketing. As former Wired editor Chris Anderson elaborates in The Long Tail (2006) and Joseph Turow explains in Niche Envy (2006), market segmentation is vital to today’s commerce. In order for marketers and vendors to target messages and products to us, they must know our eccentricities—what makes us distinctive, or, at least, to which small interest groups we belong. Forging a mass audience or market is a waste of time and money unless you are selling soap—and a very generic soap at that.10
The race to monitor, monetize, and manipulate the attention given by users in exchange for “free” services marks the current corporate moment. It also characterizes the mania that drives companies like Google, Facebook, Microsoft, and Apple to create more than the operating system of our computers or phones. They are racing to become the operating system of our lives.11
Know Your Info Flow
It’s not just Facebook, Google, and Amazon that want us to be ourselves. Modern liberal states want us to relax and reveal our allegiances, opinions, and affiliations. They even count on subversive and potentially dangerous people to reveal themselves through their habits and social connections. Contrast this subtle style of control with the Panopticon’s approach to suppressing dissent or quelling subversion. The Stasi, remember, lost control over the East German people despite the enormous scale of its operations and the long-lasting damage it inflicted on both the observers and the observed.
As James B. Rule explains in his 1974 book Private Lives and Public Surveillance: Social Control in the Computer Age, the rise of commercial and government databases and credit bureaus in the 1960s alarmed civil libertarians already shocked by the abuses of the Nixon administration. This led to a series of relatively strong federal privacy protections that were undermined or ignored in subsequent decades.12 But the level and complexity of Big Data collection have both become much greater. Since late 2001, the United States, the United Kingdom, and the People’s Republic of China, among others, have installed sophisticated and covert surveillance systems to track the words, images, movements, and social networks of their citizens.13 Companies such as Google and Facebook put Big Data collection and analysis at the heart of their revenue-generating functions, always described by company officials as enhancements to “the user experience.”14 The line between “state” and “commercial” surveillance hardly matters any more, as state security services regularly receive significant data sets on people’s movement and habits just by asking for them or by licensing the data on the open market. The same consumer data companies that sell your profile to Target or Visa are happy to sell it to the New York City Police Department or the FBI.15 Data firms also collect state records such as voter registrations, deeds, car titles, and liens in order to sell consumer profiles to direct-marketing firms.16
Given the many possible abuses of Big Data, including the long-term tarnishing of personal and professional reputations, citizens need to be fully aware of the flows of information between private firms, governments, and any other institutions that might have an interest in using such data. They also need to consider the desirability of policies that might limit those flows and the uses to which the data might be put, including the adoption and further elaboration of the European Union’s recently instituted ruling on “the right to be forgotten.” As Viktor Mayer-Schönberger argues in his 2009 book Delete: The Virtue of Forgetting in the Digital Age, this right can be implemented in such a way as to protect vulnerable citizens whose records might, over time, be taken out of context or falsely presented, while not unreasonably restricting people’s access to the open flow of information. Despite the cries of alarm that followed the EU high court ruling that upheld the existence of such a right, free speech has not disappeared or been noticeably diminished.17
Does Privacy End at the Threshold?
In Michelangelo Antonioni’s 1966 film Blow-Up,18 a work focused on perception and voyeurism, the main character, a photographer, secretly takes pictures of a couple embracing in a London park. The woman, furious when she notices what the photographer is up to, chases him down. “This is a public place!” she explodes. “Everyone has the right to be left in peace.”
It is an odd bit of dialogue, at least to American ears. The standard American assumptions about private and public spaces are that everyone has a right to be left in peace in private, but not in public. Because privacy law and theory in the United States have for so long depended on the distinction drawn in the Fourth Amendment’s prohibition on “unreasonable searches and seizures,” Americans assume that there are private spaces and public spaces and that our norms and expectations of what is appropriate to each must fall within those demarcations. Privacy ends at the threshold.
Almost fifty years after the release of Antonioni’s film, the sheer contingency of the American conception of privacy is a bit easier to see. We can no longer define and describe it so certainly in terms of “public space” and “private space.” We probably never should have. For too long, the rhetorical power of law had too much influence on how we Americans conceived of privacy. That’s one of the key insights driving The Conversation: One can invade another’s privacy, and even do that person harm, simply by recording or filming that person in a public place. The spatial distinctions between private and public are no longer relevant. We might once have had a stable notion of privacy when our thoughts and personal information were recorded in our “papers” that we stored at home. But now, so much essential data sits on servers far from our computers, in a place we nonchalantly and naively call “the cloud.” American law does not protect this information from the prying eyes of the state because we have placed it with “third parties.” In doing so, we have pushed beyond the walls of “private space.”
American law seems on the verge of realizing this. As Justice Sonia Sotomayor wrote in her concurring opinion in United States v. Jones in 2012, “I would ask whether people reasonably expect that their movements will be recorded and aggregated in a manner that enables the Government to ascertain, more or less at will, their political and religious beliefs, sexual habits, and so on.”19 The case was about the warrantless surveillance of a suspect who, unbeknownst to him, was driving around with a global positioning system sensor attached to the underside of his car. The police argued that they were merely tracking his movements in public. The Supreme Court, and Justice Sotomayor in particular, did not find that a persuasive response. The Court ruled in favor of Jones and against the police. The majority opinion, written by Justice Antonin Scalia, rested on the idea that the police trespassed on private property; Justice Scalia evidently refrained from pushing for a more radical and, in my view, more appropriate revision such as the one Sotomayor described in her concurring opinion. As more state surveillance cases come before the Supreme Court, we are likely to see fluidity in the notion that privacy is bound by space, and a recognition that the threshold of one’s door is not an unassailable barrier to impingements on Americans’ dignity and freedom of thought.
The scene I have described from Blow-Up and the Supreme Court decision in United States v. Jones both emphasize the very un-American notion that privacy is not necessarily a spatial matter, an idea that underscores Julie Cohen’s critique of the inadequacy of liberal theory to explain privacy and settle its questions in law and policy. But the scene from Antonioni’s film also leads us to recognize that social relations, as philosopher Helen Nissenbaum argues in her masterful Privacy in Context: Technology, Policy, and the Integrity of Social Life (2009), rely on a web of trust. Respecting privacy is high among those norms that facilitate social relations.20 Our laws, norms, and practices should foster a fuller sense of collective dignity and autonomy, Cohen argues. Nissenbaum, in a different and more traditionally liberal way, emphasizes that individuals need to have some control over how others learn about and see them in order to be full citizens and lead full social lives.
The photographer in Blow-Up does not work for the state. He does not work for a commercial firm. And it’s unclear from the testimony or action of his subject what she fears the photographer might do with the photograph.
As Daniel Solove explains in The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (2007), one of the greatest threats to personal dignity comes from neither large firms nor powerful governments. It comes from millions of individuals armed at all times and in all places with audio, video, and photographic recording devices. Fellow members of society have the means, if they are so inclined, to expose, harass, and vilify their neighbors either to satisfy a sense of vigilante justice or simply to amuse themselves.21 Soon, we may have access to “always-on” surveillance technologies such as Google Glass that will not only record all of our interactions, both public and private, but will share the images and sounds of these interactions—thus making them available to businesses and governments as well.22
When Blow-Up was released, the lone man with a camera in a park capturing images of strangers was an anomaly. Now such behavior is so common that it’s unremarkable, even the norm. But as Solove argues, the new normal deserves remark and reconsideration, not least because it is so ethically and legally fraught. We so precipitously entered the age of (potentially) near-total and continuous mutual surveillance that we failed to weigh our individual consumer desires and personal predilections against the necessity for certain norms to uphold the common good.
The need for informed debate about the norms, practices, and regulations that would govern what Nissenbaum has called “privacy in public”23 is clearly urgent. Many strong incentives (convenience, efficiency, connection, pleasure) militate in favor of tacitly accepting the status quo of maximum surveillance by as many people as possible. And the devices that make this new normal possible are so attractive in so many ways that to criticize them or their users is to encounter a powerful resistance to what legal scholar Anita Allen calls “foundational human goods.” In her 2011 book Unpopular Privacy: What Must We Hide?, Allen argues that we must accept paternalistic state protections for those privacy concerns that are central to a just and flourishing economy, polity, and society. The state must protect informational or data privacy by default, she says, because individuals have no incentive (or even an appreciation of the extent of the problem) sufficient for them to “opt out” of their subjection to massive commercial (and thus state) data surveillance via protective settings, technologies, or practices. The state therefore needs to mandate a default “opt-in” status that would require firms and governments to convince us we should be watched and tracked because there would be some clear reward.24
How young people manage their reputations within various contexts has been a subject of much poorly conducted debate in recent years. Privacy is as much a matter of norms as of laws. Should we, as the pundits warn, assume that “privacy is dead” because young people seem to share all sorts of details via social media without regard for traditions of reticence and modesty? Well, it turns out that we need not worry so much about young people abandoning privacy. Indeed, the rest of us might do better to emulate the sophisticated strategies many young Americans actively deploy to protect themselves as they engage socially. The studies that Microsoft Research social scientist danah boyd undertook for her essential book It’s Complicated: The Social Lives of Networked Teens (2014) demonstrate that young people learn early on how to mask the meanings of their social network engagements by developing codes that are impenetrable by parents and others in authority. Just as important, young people are far more likely to manipulate privacy settings on social network services than their older “friends” (relatives, teachers, coaches, etc.) are. While Cohen and Nissenbaum provide theoretical approaches to the challenges of understanding privacy, boyd brings empirical and practical considerations to the discussion.25
We are beginning to understand the ramifications of rapid change in our information ecosystem.26 Scholars in such disconnected areas as computer science, science and technology studies, library and information studies, communication, marketing, political science, media studies, and the philosophy of science have been picking away from different angles at the problems and opportunities Big Data presents. But we still lack a comprehensive history of Big Data that covers the major technological leaps, theoretical concepts, and public policies that have led us to this moment. Our thinking about “privacy” and “surveillance” is still overly determined by American legal history and by the long shadow of Michel Foucault. The good news is that we have some valuable tools—both in films and in books—that can take us toward the kind of synthetic and ecological understanding we require in order to resist the pernicious effects of massive state surveillance, the dehumanizing aspects of commercial tracking and sorting, and the dangers of social confusion and betrayal.
1. Francis Ford Coppola (Producer and Director), The Conversation [Motion picture] (United States: Paramount Pictures, 1974); Jerry Bruckheimer (Producer) and Tony Scott (Director), Enemy of the State [Motion picture] (United States: Touchstone Pictures, 1998).
2. Julie E. Cohen, Configuring the Networked Self: Law, Code, and the Play of Everyday Practice (New Haven, CT: Yale University Press, 2012).
3. Samuel D. Warren and Louis D. Brandeis, “The Right to Privacy,” Harvard Law Review 4, no. 5 (1890): 193–220, doi:10.2307/1321160. See also Robert Post, “Re-reading Warren and Brandeis: Privacy, Property, and Appropriation,” Faculty Scholarship Series, January 1, 1991; http://digitalcommons.law.yale.edu/fss_papers/206.
4. Jim Miller, The Passion of Michel Foucault (New York: Simon & Schuster, 1993), 222–23.
5. Florian Henckel von Donnersmarck (Director and Screenwriter), Max Wiedemann and Quirin Berg (Producers), The Lives of Others/Das Leben der Anderen [Motion picture] (Germany: Buena Vista International; United States: Sony Pictures Home Entertainment, 2007).
6. Timothy Garton Ash, The File: A Personal History (New York: Random House, 1997).
7. In my first effort to describe this phenomenon, I dubbed it the “Nonopticon.” See Siva Vaidhyanathan, “Naked in the ‘Nonopticon’: Surveillance and Marketing Combine to Strip Away Our Privacy,” Chronicle Review, February 15, 2008; http://chronicle.com/free/v54/i23/23b00701.htm. However, I soon realized that the term was clumsy and inaccurate. Later, my friend Bill Pugsley suggested that “Cryptopticon” accurately captured my intended meaning. I am grateful to him for the suggestion. I later employed “Cryptopticon” in Siva Vaidhyanathan, The Googlization of Everything (and Why We Should Worry) (Berkeley: University of California Press, 2011).
8. Michel Foucault, Discipline and Punish: The Birth of the Prison (New York: Vintage Books, 1995); David Lyon, Theorizing Surveillance: The Panopticon and Beyond (Cullompton, England: Willan, 2006); Oscar H. Gandy, The Panoptic Sort: A Political Economy of Personal Information (Boulder, CO: Westview Press, 1993); Jean-François Blanchette and Deborah G. Johnson, “Data Retention and the Panoptic Society: The Social Benefits of Forgetfulness,” Information Society 18, no. 1 (January 2002): 33–45.
9. Vaidhyanathan, The Googlization of Everything.
10. Chris Anderson, The Long Tail: Why the Future of Business Is Selling Less of More (New York: Hyperion, 2006); Joseph Turow, Niche Envy: Marketing Discrimination in the Digital Age (Cambridge, MA: MIT Press, 2006).
11. Siva Vaidhyanathan, “Fred Vogelstein’s ‘Dogfight,’” New York Times (Sunday Book Review), November 3, 2013; http://www.nytimes.com/2013/11/03/books/review/fred-vogelsteins-dogfight.html. See also Andrea Ballatore, “Googlized Capitalism, between Efficiency and Hegemony: Interview with Siva Vaidhyanathan,” academia.edu, February 13, 2013; http://www.academia.edu/6835982/Googlized_capitalism_between_efficiency_and_hegemony_Interview_with_Siva_Vaidhyanathan.
12. James B. Rule, Private Lives and Public Surveillance: Social Control in the Computer Age (New York: Schocken Books, 1974); James B. Rule, Privacy in Peril: How We Are Sacrificing a Fundamental Right in Exchange for Security and Convenience (New York: Oxford University Press, 2007).
13. Shane Harris, The Watchers: The Rise of America’s Surveillance State (New York: Penguin, 2010).
14. Vaidhyanathan, The Googlization of Everything; Fred Vogelstein, “Great Wall of Facebook: The Social Network’s Plan to Dominate the Internet,” Wired, July 2009; http://www.wired.com/techbiz/it/magazine/17-07/ff_facebookwall?currentPage=all; Michael Agger, “Google and Facebook Battle for Your Friends,” Slate, January 14, 2009; http://www.slate.com/id/2208676/pagenum/all/#p2.
15. Jack M. Balkin, “The Constitution in the National Surveillance State,” Minnesota Law Review 93, no. 1 (2008), http://ssrn.com/paper=1141524; James X. Dempsey and Lara M. Flint, “Commercial Data and National Security,” George Washington Law Review 72, no. 6 (2004): 1459–1502; https://www.cdt.org/files/publications/200408dempseyflint.pdf; Chris Jay Hoofnagle, “Big Brother’s Little Helpers: How ChoicePoint and Other Commercial Data Brokers Collect and Package Your Data for Law Enforcement,” North Carolina Journal of International Law and Commercial Regulation 29 (2003–2004): 595–638; http://scholarship.law.berkeley.edu/cgi/viewcontent.cgi?article=1677&context=facpubs.
16. Rule, Privacy in Peril.
17. Viktor Mayer-Schönberger, Delete: The Virtue of Forgetting in the Digital Age (Princeton, NJ: Princeton University Press, 2011).
18. Carlo Ponti (Producer) and Michelangelo Antonioni (Director), Blow-Up [Motion picture] (United Kingdom: MGM, 1966).
19. Dahlia Lithwick, “Alito vs. Scalia,” Slate, January 23, 2012; http://www.slate.com/articles/news_and_politics/jurisprudence/2012/01/u_s_v_jones_supreme_court_justices_alito_and_scalia_brawl_over_technology_and_privacy_.html.
20. Helen Nissenbaum, Privacy in Context: Technology, Policy, and the Integrity of Social Life (Stanford, CA: Stanford University Press, 2009). Disclosure: Nissenbaum was my colleague when we both worked for New York University.
21. Daniel Solove, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet (New Haven, CT: Yale University Press, 2007).
22. Charles Arthur, “Google Glass: Is It a Threat to Our Privacy?,” The Guardian, March 6, 2013; http://www.guardian.co.uk/technology/2013/mar/06/google-glass-threat-to-our-privacy.
23. Helen Nissenbaum, “Protecting Privacy in an Information Age: The Problem of Privacy in Public,” Law and Philosophy 17, no. 5/6 (1998): 559–96.
24. Anita L. Allen, Unpopular Privacy: What Must We Hide? (New York: Oxford University Press, 2011).
25. danah boyd, It’s Complicated: The Social Lives of Networked Teens (New Haven, CT: Yale University Press, 2014). Disclosure: I served as a visiting researcher at Microsoft Research in Cambridge, Massachusetts, in summer 2014 while boyd worked in the New York office. I also provided a promotional blurb for It’s Complicated.
26. Rule, Private Lives and Public Surveillance; Rule, Privacy in Peril; Fred Turner, From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism (Chicago: University of Chicago Press, 2006).