Scaling Trust and Other Fictions

NB: This post is part of the “Skepticism About Information Fiduciaries” symposium. Other contributions can be found here.

Julie E. Cohen

The Centenal Cycle, the Hugo-nominated trilogy by novelist Malka Older, describes a not-too-distant future in which the existing liberal world order has been replaced by a regime of mass-mediated micro-democracy. With some exceptions—a handful of so-called null states that opted out and a more intriguing smattering of territories that opted for self-rule without the mass mediation—nation-states and their subordinate governance units have been dissolved. The vast majority of people live in centenals—contiguous territories of no more than 100,000 citizens—administered by entities of various persuasions that compete for their affiliation. Governments range from powerful, globally distributed operations such as Liberty, Heritage, and PhilipMorris to the nerdy Policy1st to regional players like AfricaUnity and DarFur to small, quirky outfits like the generally libertarian and fun-loving Free2B.

The regime of micro-democracy relies on networked information and communication services provided by an entity called, simply, Information. When we encounter it, it has assumed the status of an independent, nongovernmental entity with an unambiguously public-regarding mandate to function as a neutral guarantor of information quality.

Of course, that is easier said than done. Governments, splinter groups, and null states have incentives to sow mis- and disinformation for their own purposes. Guaranteeing information quality requires both comprehensive surveillance and an impressive array of counter-espionage capabilities. There are intricate cat-and-mouse games between the watchers and those attempting to evade them. Technologically sophisticated separatists spoof surveillance cameras and disinformation-detection algorithms and devise means of lurking undetected within secure communications channels and data streams. Resistance and subversion also establish bases of operation within Information itself. The dream of a sustainable micro-democratic order mediated by a neutral corps of public-spirited technocrats ultimately proves untenable, and yet the dream is so compelling that as the narrative closes on the aftermath of a systemic breakdown, Older’s band of protagonists is hatching plans to rebuild infrastructures, redesign institutions, and try again.

What does any of this have to do with Khan and Pozen on Balkin? The monolithic, public-spirited Information, the multiple, capitalist information fiduciaries of the Balkin proposal (see here and here), and the regime of structural regulation of information intermediaries that Khan and Pozen appear to imagine would seem to have very little in common. But all three are imagined responses to the same problem: that of governing data-driven algorithmic processes that operate in real time, immanently, automatically, and at scale. More specifically, they are visions that engage with the problems of speed, immanence, automaticity, and scale in radically different ways.

The information fiduciaries proposal promises to subject (some) entities that operate data-driven, algorithmic processes to a more robust standard of care. To oversimplify only slightly, it would do so by the expedient of decreeing a duty into existence and allowing subsequent interactions among the parties and consumer protection authorities to realign incentives. The trenchant critique by Khan and Pozen effectively exposes the emptiness at the heart of any proposal that proclaims a “fiduciary” arrangement while leaving both the basic platform business model and the basic structure of the platform-consumer relationship undisturbed.

For my purposes here, what is interesting is a different kind of category error that lies at the heart of the information fiduciaries proposal. Classic fiduciaries—doctors, lawyers, priests—operated on small scales and at human rhythms for a reason. The fiduciary construct implies a mutual encounter predicated on the knowability of human beings as human beings, with mutually intelligible desires and needs. The information fiduciaries proposal abstracts speed, immanence, automaticity, and scale away from that encounter and then assumes they never mattered in the first place. In the process, it both sacrifices the fiduciary arrangement’s most essential characteristics and fails to reckon adequately with the characteristics of the platform-consumer relationship that are most problematic.

So far, however, proposals for structural regulation fare little better. In her other work (here and here), Khan has argued for a reinvigorated, neo-Progressive approach to antitrust regulation. The progressive antitrust playbook places scale—understood as unacceptable bigness—front and center, but its focus on bigness means it has little of direct significance to say about the personalization end of the scale or about the ways that speed, immanence, and automaticity intertwine with personalization to produce new kinds of information harms that are both personal and social. Others (for example, here and here) point to common carriage and public utility frameworks but for the most part confine themselves to reworking existing institutional playbooks. But twentieth-century regulatory toolkits emphasizing rate controls and access mandates do not automatically port to contexts that pose very different kinds of access and accountability problems. Nor do they address other large-scale institutional reconfigurations that the informational economy might require (see here and, soon, here).

Older’s fictional regime of micro-democracy tackles the problems of speed, immanence, automaticity, and scale differently, scaling governments down and data-driven algorithmic processes up while subjecting those processes to new mechanisms for accountability that sound in computer forensics and information science. To maintain election integrity, Information must make and enforce constant choices about information quality. To render governments accountable to the governed, it must ensure the transparency, security, and integrity of its own processes. To render itself accountable, it must perform these functions in a way that enables continual reconstruction and audit. For my purposes here, the most important point about all of this is that the structure and obligations of Information cannot be derived by analogy to familiar common law or industrial-era regulatory forms. To fulfill its functions, Information must make use of regulatory tools and constructs that we are only beginning to think about inventing.

My argument isn’t that we should be rushing to bring Older’s fictional creations to life. For one thing, they don’t work out especially well; for another, there are important legal-institutional implementation details that we never learn. But the story of the rise and fall of micro-democracy nonetheless illustrates larger points about both the horizon of possibility for institutional change and the intrinsic poverty of the legal imagination. Proposals for reform—Balkin’s, Khan’s, and many others—smack overwhelmingly of nostalgia. If we could just rewind the clock to the nineteenth century—fiduciaries!—or the Progressive Era—trust-busting! public utilities!—or the Civil Rights Era—antidiscrimination!—all would be well. Analogies to the past are instructive for other reasons, but they cannot teach us how to devise governance mechanisms optimized for data-driven algorithmic processes and oriented toward the problems of speed, scale, immanence, and automaticity that such processes present. Governing democracy requires information; preserving democracy requires governing Information. And governing Information, both in fiction and in reality, requires a different approach to institutional design. Institutional innovations are inevitably flawed—vulnerable to subversion and open to cooptation—but continuing institutional innovation may be democracy’s last, best hope.

Julie E. Cohen (@julie17usc) is the Mark Claster Mamolen Professor of Law & Technology at the Georgetown University Law Center.