Artificial Sovereigns: A Quasi-Constitutional Moment for Tech?

K. Sabeel Rahman

Consider the following developments:

  • In recent weeks, the explosive revelations about Cambridge Analytica and its systematic data-mining of Facebook profiles have cast into relief the way in which our contemporary digitized public sphere is not a neutral system of communication but rather a privately built and operated system of mass surveillance and content manipulation.
  • Meanwhile, Alphabet has announced that its subsidiary, Sidewalk Labs, will take over management of a major redevelopment of part of Toronto’s waterfront, in an effort to build from the ground up a modern “smart city.”
  • These developments come amid longer-term technological transformations of our political economy, from Amazon’s rise as the modern infrastructure of the retail economy to the ways in which technology is transforming the nature of work and the social safety net.

There has been a growing sense of concern about twin crises: that of twenty-first-century democracy on the one hand, and that of deepening inequality and insecurity on the other. Technological change is at the heart of both of these transformations. It alters the distribution and dynamics of political and economic power, creating new forms of “functional sovereignty”—state-like powers concentrated in entities and systems that are not subject to the institutional and moral checks and balances that we associate with the exercise of public power. Such arbitrary power represents a kind of quasi-sovereignty that, left unchecked, poses a threat of domination.

The rich scholarly debate on law and technology has surfaced a range of approaches for addressing some of these concerns, from legal standards for privacy and data use to antitrust and public utility regulation, and more. These proposals and interventions can be reframed as part of a broader challenge of defusing the threat of domination created by these technological systems. Regulating and responding to new technologies and modern forms of economic and political power thus represent a variation on familiar questions of public law and constitutional design: how to structure the exercise of potentially arbitrary, state-like power, rendering it contestable, and therefore legitimate.


Friday Roundup

Happy Friday!

This week, the public continued to grapple with the revelation that Facebook disclosed more than 50 million users’ account information to the right-wing political consultancy Cambridge Analytica. LPE contributor Frank Pasquale has previously described how we should understand the big internet platforms as exercising a form of “functional sovereignty,” a description that seemed as apt this week as ever, as people realized how little government regulation restricted Facebook’s use of their private information. Here are three recent pieces that caught our attention with an LPE take on the issue:

  • Beware the Big Five – the U.S. military and intelligence sector’s venture capital funding has fostered the tech sector’s consolidation and permitted the growth of private empires on the back of publicly funded R&D.
  • Facebook Isn’t Just Violating Our Privacy – Facebook insists on seeing its failures as harming individuals, never society as a whole, but we must insist on asking collective questions to challenge Silicon Valley’s libertarian perspective.
  • The New Military-Industrial Complex of Big Data Psy-Ops – Silicon Valley’s behavioral science research has been critical to the military and intelligence apparatus.


Elsewhere on the web:


Nick Werle is a student at Yale Law School.

The Law and Political Economy of the “Future of Work”

Brishen Rogers

How will new advanced information technologies impact work? This is a major focus of public debate right now, driven by widespread fears that automation will soon leave tens of millions unemployed. But debate so far has tended to neglect the relationship among technological innovation, political economy, and the law of work. This is a major omission, since the automation of particular tasks doesn’t just happen. Rather, it takes place under laws that are subject to democratic oversight and revision – and with different laws, we could encourage a radically different path of technological development, one in which workers have a real voice, and in which they share consistently in technology-driven productivity gains.

Take two upcoming transformations that we’re all familiar with: the automation of some kinds of driving and some kinds of fast-food work. Within a few years, truckers, delivery drivers, and taxi drivers may be able to use an autonomous mode consistently on highways. Later on, they may be able to do so on major suburban and rural streets. But given the wide variation in road quality, humans will likely need to pilot vehicles in residential areas and on city streets for some time to come. And given the wide variation in building structures that delivery robots would need to navigate, humans will almost certainly need to complete deliveries in many instances.

Similarly, in fast food, ordering kiosks are already displacing cashiers, but not entirely. Some customers are unable to use the kiosks, including the 70% of McDonald’s customers who use the drive-through. Sometimes the kiosks will break down, and sometimes orders won’t be processed correctly, so workers will need to step in. Food preparation may also be automated in part, but given the fine motor control and tacit knowledge required for cooking, it has proven resistant to full automation. Like the transformation in driving, then, this change will likely be gradual and iterative. Technology will augment human capabilities rather than replacing humans wholesale, and workers, companies, and consumers will need to adapt over time.
