
Reclaiming the Right to Future Tense


Scott Skinner-Thompson (@juriscott) is an Associate Professor at the University of Colorado Law School.

By now, many of the societal, political, and distributive harms caused by large technology companies and so-called “social” media companies (Amazon, Facebook, Google, etc.) have been surfaced.  They invade our privacy, decrease market competition, and erode our sense of self and, despite their euphemistic label, our sense of community.  Shoshana Zuboff’s new book—The Age of Surveillance Capitalism—intervenes to weave together the seemingly balkanized practices of large, monopolistic data-harvesting companies, painting a more comprehensive picture of their decidedly anti-social strategies.  At the same time, she situates their tactics in comparative, historical context as a distinctively new market logic.  Zuboff labels the emerging economic regime created by these tech companies “surveillance capitalism” in order to capture the transformative shift they represent in how our society is being organized—organized by surveillance capitalist corporations, not by the people.  Put simply, surveillance capitalism is an economic ideology that deploys a diverse array of technologies as a means of cultivating and monetizing our identities.

As Zuboff underscores, surveillance capitalists treat the information generated by our online activity and our situated, physical activity (collected through the Internet of Things) as raw material available for extraction—a pool of resources that Julie Cohen has theorized and critiqued as the “biopolitical public domain.”  But even more troubling, once scythed and privatized by the surveillance capitalists, our information is sifted to predict and shape our future behavior.  The shaping of our behavior by surveillance capitalists threatens individual autonomy, yes, but also popular sovereignty and democracy itself.  This may sound hyperbolic, but Zuboff methodically explains how surveillance capitalism is undermining core democratic values and why the stakes are so high. 

Here is but one example of the extraction-to-manipulation supply chain Zuboff highlights.  Google and Facebook profit in large part through targeted advertisements.  The more those ads generate clicks and, in turn, purchases from advertisers, the more revenue flows (directly or indirectly) to the surveillance capitalists.  In other words, the more accurate the ads—as in predictive of an individual’s purchases—the more profitable they are.  To best ensure that ads are predictive, the surveillance capitalists gather and analyze, through proprietary algorithms, what Zuboff labels our behavioral surplus—the digital DNA we exfoliate when using (and not using) Google or Facebook.

The scope of this surplus is immense and the different tools used to harvest it are myriad.  Through search engines, email, cookies, digital personal assistants, and a host of other tracking and mining tools, companies like Google search us—not the other way around.  And because of the market incentive for accurate predictions, surveillance capitalists work to ensure that we are nudged, cajoled, and ultimately rendered so as to “like” what they want us to “like” and purchase what they want us to purchase.  To be clear, the surveillance capitalists themselves are relatively agnostic as to whether we select the chintz drapes over the blackout blinds; the fake news over the investigative journalism; the deep fake over the bona fide political advertisement—so long as we abide by the consumption imperative.   

Put differently, surveillance capitalists deploy their vast knowledge of us in order to automate us.  To not only predict our behavior, but make it certain.  They are the twenty-first century Calvinist god embodied—predestining each of us toward greater and greater consumerism. 

But surveillance capitalism is more than this information-to-consumption pipeline.  It also involves a range of discursive practices meant to insulate surveillance capitalists from themselves being surveilled, known, and regulated. 

For example, surveillance capitalists benefit from and advance tropes that this extraction and rendition process is an unavoidable and inevitable byproduct of technology itself.  This is false.  Technology is merely the means of data extraction and, subsequently, of the automation of our choices and identities.  There is nothing preordained about its being deployed in this way.  As an example, Zuboff contrasts a 2000 “smart-home” project called Aware Home, developed by scientists and engineers at Georgia Tech, with smart home devices like the Nest thermostat (now owned by Google).  According to Zuboff, Aware Home was envisioned and guided by the principle that it should operate as a closed loop: while the information system monitored the residents’ activities to improve service and functionality for the residents, the information was stored on the occupants’ own computers.  Not so with Nest, whose extensive contractual provisos provide only a limited measure of privacy—and, even then, only in exchange for a decrease in the product’s functionality and security.  In short, technology isn’t the problem.  We needn’t be Luddites.  Rather, the real culprit is the surveillance capitalism ethos that idealizes the extraction and subsequent monetization of information above all else.

Similarly, surveillance capitalism attempts to preempt criticisms of its invasive practices by shaping social norms, including norms around privacy.  By declaring that “privacy is dead,” as industry leaders have done on many occasions, surveillance capitalists are rhetorically priming us to embrace—or at the very least ignore—the scope of intrusions into our networked lives.  They similarly take advantage of narrow legal conceptions of privacy that emerged long before the law had an appreciation for how the internet operates.  Principally, surveillance capitalists champion the (faulty) notion that once information is exposed to anyone else, any legal protections over that information are gone.  Surveillance capitalists expand this principle to claim ownership over our activity—in effect, planting a flag and colonizing our information.  Lengthy and unreadable adhesion contracts that force us to surrender any remaining claim to privacy bolster this rhetorical priming, taking advantage of the limited resources we have to read, much less contest, these terms of service.

As a final example of the discursive tools deployed to buttress this new market logic, surveillance capitalists propagate expansive notions of freedom of expression to insulate themselves from any harm that occurs over their services.  When the internet was first ascendant, legal protections were created for interactive computer service providers, insulating them from liability for speech uttered by third parties on their platforms—Section 230 of the Communications Decency Act.  While this largely unfettered protection serves important expressive values and made most sense when the internet was nascent and in need of incubation, query whether that logic still holds true in all contexts.  Indeed, providing tech companies immunity over third-party content has in many ways outsourced governance of the public square to the surveillance capitalists, aggrandizing their power at the expense of more democratically accountable forms of regulation.  At the same time, as Zuboff notes, it provides protection for surveillance capitalists’ most valuable resource—our behavioral surplus.  Are other industries granted such ex ante blanket immunity from regulation for actions that they arguably facilitate, profit from, and encourage?  While the moderation of online speech admittedly raises difficult questions, with important values at stake on both sides, tricky questions require nuanced responses rather than absolutist ones.

While Zuboff’s descriptive account relies most heavily on social/behavioral psychology (her discipline) rather than political theory, her painstakingly detailed account of how surveillance capitalism operates is in many ways a concrete and terrifying illustration of how power operates in post-modernity.  Power is diffuse, we know.  But Zuboff does the work of isolating how certain economic, technological, and discursive practices are being deployed symbiotically to deprive us of agency; of identity.

And this is Zuboff’s major contribution—naming and framing the unprecedented.  She moves us beyond analogies to prior political/economic eras; analogies that may obscure more than they reveal.  Without a soup-to-nuts understanding of how surveillance capitalists deploy technology, discourse, and law, we have little chance of reclaiming some modicum of control over our digitally mediated lives, and developing our own sense of self. 

While Zuboff seems to recognize that she does not have a complete solution for how to replace the surveillance capitalist ideology, she does highlight some tactics that ought to be embraced.  She holds out hope that the European Union’s General Data Protection Regulation—if robustly implemented—may restore some measure of control to individuals over how their data is collected and used.  At the very least, the GDPR can help in the process of rhetorically constructing privacy and autonomy as valued social goods.  And she extols forms of surveillance resistance and obfuscation (such as ad-blockers and false fingertip prosthetics)—resistance that I have theorized as acts of performative privacy that raise the operating costs for surveillance capitalists and governments while simultaneously signaling, expressing, and shaping norms around privacy.  In sum, Zuboff calls on us to create friction—to reject the logic of surveillance capitalism, to maintain our agency, and to, in effect, launch a digital sabot in an exercise of freedom.  We need not be determined by Google or Facebook.  As Zuboff exhorts, we can reclaim the right to the future tense.