Guiding Innovation’s Hand: Industrial Policy Against Inequality

This is the second post in our series discussing The Meritocracy Trap by Daniel Markovits. Click here to read all posts in the series. 

Jeff Gordon – 

Most of the critical attention directed at Daniel Markovits’s The Meritocracy Trap has focused on its claim that well-off parents launder inequality through schooling. While Markovits marshals comprehensive reams of data in support of the concept of the “meritocratic inheritance,” the most original and provocative part of the book comes later, when Markovits offers his explanation of why educational sorting has come to matter so much: elite schooling leads to top jobs, and “[t]he top jobs pay so well because a raft of new technologies has fundamentally transformed work to make exceptional skills enormously more productive than they were at mid-century and ordinary skills relatively less productive.” This is provocative because it contradicts the pervasive myth that technological change is natural, self-directing, or inevitable. Few reviewers have remarked on this part of the book or reflected on what it suggests: that industrial policy will be vital to building a more equal economy.

Continue reading

The Second Wave of Algorithmic Accountability

Frank Pasquale –

Over the past decade, algorithmic accountability has become an important concern for social scientists, computer scientists, journalists, and lawyers. Exposés have sparked vibrant debates about algorithmic sentencing. Researchers have exposed tech giants showing women ads for lower-paying jobs, discriminating against the aged, deploying deceptive dark patterns to trick consumers into buying things, and manipulating users toward rabbit holes of extremist content. Public-spirited regulators have begun to address algorithmic transparency and online fairness, building on the work of legal scholars who have called for technological due process, platform neutrality, and nondiscrimination principles.

This policy work is just beginning, as experts translate academic research and activist demands into statutes and regulations. Lawmakers are proposing bills requiring basic standards of algorithmic transparency and auditing. We are starting down a long road toward ensuring that AI-based hiring practices and financial underwriting are not used if they have a disparate impact on historically marginalized communities. And just as this “first wave” of algorithmic accountability research and activism has targeted existing systems, an emerging “second wave” of algorithmic accountability has begun to address more structural concerns. Both waves will be essential to ensure a fairer, and more genuinely emancipatory, political economy of technology.

Continue reading

Privacy Legislation, not Common Law Duties

NB: This post is part of the “Skepticism About Information Fiduciaries” symposium. Other contributions can be found here.

Harold Feld–

The United States has the distinction among developed nations of lacking a comprehensive consumer privacy protection law. To fill this gap, Professor Jack Balkin proposes the creation of a new class of common law fiduciaries subject to a heightened duty of care when entrusted with a party’s personal information. In addition to providing an answer to possible First Amendment problems that could arise from limiting the ability of businesses to collect personal information and use the collected information for targeted advertising, Balkin argues that courts may expand traditional fiduciary duties to this new class of “information fiduciaries” in accordance with traditional common law principles. This would overcome the current failure of Congress and nearly all state legislatures to address the increasingly urgent problem of personal privacy in the digital economy.

Balkin’s information fiduciary proposal, while attractive in addressing some businesses that rely on collection of personal information for targeted advertising, does not do nearly enough to protect personal privacy given the unavoidable size of our information footprint. Further, an examination of existing First Amendment case law shows no clear advantage for identification of a new common law fiduciary relationship over privacy legislation. Finally, the recent passage of the California Consumer Privacy Act (CCPA) has galvanized interest in passing comprehensive privacy legislation both at the federal level and among the other states – whereas no court has yet identified an “information fiduciary” under the common law.

The value of Balkin’s fiduciary framework, I argue, resides not in providing an enforceable legal relationship but in providing a framework for privacy legislation. The existing frameworks – the Privacy Principles adopted by the Organization for Economic Co-operation and Development (OECD) in 1980, which rely heavily on notice and consent, and the property framework introduced by Louis Brandeis in “The Right To Privacy” (both of which I discuss in this privacy white paper) – have significant limitations. Balkin’s proposed fiduciary framework provides a model for legislation that recognizes that the nature of the relationship between information collectors and aggregators requires imposing additional duties and restrictions to adequately protect consumers, while still enabling commerce and facilitating competition.

Continue reading

Innovation for Who? Reclaiming Public Purpose in the Urban Transportation Pilot

Arielle Fleisher and Chris Chou –

Hidden beneath the buzz about how technology is transforming urban transit is a quiet revolution in the way that cities approach the management of their streets. In the face of rapid change, cities and transit agencies are increasingly relying on pilot programs to manage the introduction of new modes of transportation and new uses of the right of way. Pilot approaches have spread through all regions of the country and are utilized by cities of all sizes and for numerous applications including dockless bike share, autonomous buses, micro-transit, delivery robots, and smart streetlights.

It’s important to appreciate how the pilot approach departs from how cities typically regulate and manage urban transportation problems. The public sector often makes slow decisions and avoids risks. But that stability can be a disadvantage if it ossifies and can’t accommodate changes to the system. Without compromising cities’ ability to use public funding, exercise regulatory authority, and pursue the public’s interest, pilots provide cities and transit agencies with flexibility. They give cities a safe space to try new approaches while managing the potential chaos of new technologies.

Yet pilots today are too often centered on technology alone. Continue reading

Scaling Trust and Other Fictions

NB: This post is part of the “Skepticism About Information Fiduciaries” symposium. Other contributions can be found here.

Julie E. Cohen – 

The Centenal Cycle, the Hugo-nominated trilogy by novelist Malka Older, describes a not-too-distant future in which the existing liberal world order has been replaced by a regime of mass-mediated micro-democracy. With some exceptions—a handful of so-called null states that opted out and a more intriguing smattering of territories that opted for self-rule without the mass mediation—nation-states and their subordinate governance units have been dissolved. The vast majority of people live in centenals—contiguous territories of no more than 100,000 citizens—administered by entities of various persuasions that compete for their affiliation. Governments range from powerful, globally distributed operations such as Liberty, Heritage, and PhilipMorris to the nerdy Policy1st to regional players like AfricaUnity and DarFur to small, quirky outfits like the generally libertarian and fun-loving Free2B.

The regime of micro-democracy relies on networked information and communication services provided by an entity called, simply, Information. When we encounter it, it has assumed the status of an independent, nongovernmental entity with an unambiguously public-regarding mandate to function as a neutral guarantor of information quality.

Of course, that is easier said than done. Governments, splinter groups, and null states have incentives to sow mis- and disinformation for their own purposes. Guaranteeing information quality requires both comprehensive surveillance and an impressive array of counter-espionage capabilities. There are intricate cat-and-mouse games between the watchers and those attempting to evade them. Technologically sophisticated separatists spoof surveillance cameras and disinformation-detection algorithms and devise means of lurking undetected within secure communications channels and data streams. Resistance and subversion also establish bases of operation within Information itself. The dream of a sustainable micro-democratic order mediated by a neutral corps of public-spirited technocrats ultimately proves untenable, and yet the dream is so compelling that as the narrative closes on the aftermath of a systemic breakdown, Older’s band of protagonists is hatching plans to rebuild infrastructures, redesign institutions, and try again.

What does any of this have to do with Khan and Pozen on Balkin? The monolithic, public-spirited Information, the multiple, capitalist information fiduciaries of the Balkin proposal (see here and here), and the regime of structural regulation of information intermediaries that Khan and Pozen appear to imagine would seem to have very little in common. But they are imagined responses to the same problem: that of governing data-driven algorithmic processes that operate in real time, immanently, automatically, and at scale. More specifically, they are visions that engage with the problems of speed, immanence, automaticity, and scale in radically different ways.

Continue reading

Symposium: A Skeptical View of Information Fiduciaries

Lina Khan & David Pozen –

In recent years, the concept of “information fiduciaries” has surged to the forefront of debates on platform regulation. Developed by Professor Jack Balkin, the information fiduciary proposal seeks to mitigate the asymmetry of power between a handful of dominant digital firms and the millions of people who depend on them. Just as doctors, lawyers, and accountants are assigned special legal duties of care, confidentiality, and loyalty toward their patients and clients, Balkin argues that Facebook, Google, and Twitter should owe analogous duties toward their end users. This argument has gained broad support. Last December, over a dozen Democratic Senators introduced legislation that would designate online service providers as fiduciaries for their users, effectively implementing Balkin’s proposal.

In a forthcoming essay, we question the wisdom of applying a fiduciary framework to dominant digital platforms. Focusing on the case of Facebook—Balkin’s central example of a purported information fiduciary—we identify a number of lurking tensions in the proposal. For instance:

Continue reading

Predatory Lending and the Predator State

NB: This post is part of the “Piercing the Monetary Veil” symposium. Other contributions can be found here.

Raul Carrillo–

Like most advocates of Modern Monetary Theory (MMT), I didn’t embrace the paradigm because I dig late-night chats about accounting identities. Rather, I found it while pursuing economic justice (following the lead of Angela Harris, Emma Coleman Jordan, and other allies). Today, I fight financial predators — banks, landlords, debt collectors, and agencies engaged in racialized wealth extraction — on the daily. And so, my MMT enthusiasm remains…practical.

Although the commentariat caricatures MMT as a rationalization for U.S. deficit spending, it’s something more powerful — a new interdisciplinary lens, shaped for eyes on the prize of justice. Most importantly, MMT is rooted in legal analysis; its neochartalist foundations help illuminate financial hierarchies — so we can better dismantle them around the world.

As elites literally claim human survival is “too expensive”, it’s crucial for movements to absorb this symposium’s chief insight: money itself, although not always starkly a creature of the “state”, is a creature of law, just like the institutions through which it flows.

When we analyze money as public software, rather than private hardware, we see political economy differently. For example, as Harris argues, any movement for economic justice must overcome the toxic trope of the “undeserving benefit recipient” (and the corollary trope of the put-upon khaki-clad patriarch). In my view, MMT helps us challenge this divide-and-conquer strategy, by undermining the technical premises of “taxpayer citizenship” — the racialized and gendered notion that rights should correspond to one’s nominal contributions to government coffers. When Stephanie Kelton reminds us that “money doesn’t grow on rich people”, she is making an inference LPE readers should appreciate: the wealthy do not get their money by generating it, but by mastering a system that routes tradeable legal claims on real resources that we collectively produce (i.e. “money”) to themselves. As I’ve emphasized, the coercion Robert Lee Hale described leads the rest of us not merely to work, but to work for legal tender, which can settle debts between individuals, but must satisfy debts to the state (most notably, taxes).

Continue reading

Artificial Sovereigns: A Quasi-Constitutional Moment for Tech?

K. Sabeel Rahman –

Consider the following developments:

  • In recent weeks, the explosive revelations about Cambridge Analytica and its systemic data-mining of Facebook profiles have cast into relief the way in which our contemporary digitized public sphere is not a neutral system of communication but rather a privately built and operated system of mass surveillance and content manipulation.
  • Meanwhile, Alphabet has announced that its subsidiary, Sidewalk Labs, will take over management of a major redevelopment of part of Toronto’s waterfront, in an effort to build from the ground up a modern “smart city.”
  • These developments come amidst longer-term technological transformations of our political economy, from the rise of Amazon to its position as the modern infrastructure for the retail economy, to the ways in which technology is transforming the nature of work and the social safety net.

There has been a growing sense of concern about the twin crises of twenty-first-century democracy on the one hand and growing inequality and insecurity on the other. Technological change is at the heart of both of these transformations. Technological change alters the distribution and dynamics of political and economic power, creating new forms of “functional sovereignty”—state-like powers concentrated in entities and systems that are not subject to the institutional and moral checks and balances that we associate with the exercise of public power. Such arbitrary power represents a kind of quasi-sovereignty that, left unchecked, poses a threat of domination.

The rich scholarly debate on law and technology has surfaced a range of approaches for addressing some of these concerns, from legal standards for privacy and data use to antitrust and public utility regulation, and more. These proposals and interventions can be reframed as part of a broader challenge of defusing the threat of domination created by these technological systems. Regulating and responding to new technologies and modern forms of economic and political power thus represent a variation on familiar questions of public law and constitutional design: how to structure the exercise of potentially arbitrary, state-like power, rendering it contestable, and therefore legitimate.

Continue reading

Friday Roundup

Happy Friday!

This week, the public continued to grapple with the revelation that Facebook disclosed more than 50 million users’ account information to the right-wing political consultancy Cambridge Analytica. LPE contributor Frank Pasquale has previously described how we should understand the big internet platforms as exercising a form of “functional sovereignty,” a description that seemed as apt this week as ever, when people realized how little government regulation restricted Facebook’s use of their private information. Here are three recent pieces that have caught our attention with an LPE take on the issue:

  • Beware the Big Five – the U.S. military and intelligence sector’s venture capital funding has fostered the tech sector’s consolidation and permitted the growth of private empires on the back of publicly funded R&D.
  • Facebook Isn’t Just Violating Our Privacy – Facebook is insistent on seeing its failures as harming individuals, never society as a whole, but we must insist on using collective questions to challenge Silicon Valley’s libertarian perspective.
  • The New Military-Industrial Complex of Big Data Psy-Ops – Silicon Valley’s behavioral science research has been critical to the military and intelligence apparatus.


Nick Werle is a student at Yale Law School.

The Law and Political Economy of the “Future of Work”

Brishen Rogers –

How will new advanced information technologies impact work? This is a major focus of public debate right now, driven by widespread fears that automation will soon leave tens of millions unemployed. But debate so far has tended to neglect the relationship among technological innovation, political economy, and the law of work. This is a major omission, since the automation of particular tasks doesn’t just happen. Rather, it takes place under laws that are subject to democratic oversight and revision – and with different laws, we could encourage a radically different path of technological development, one in which workers have a real voice, and in which they share consistently in technology-driven productivity gains.

Take two upcoming transformations that we’re all familiar with: the automation of some kinds of driving and some kinds of fast-food work. Within a few years, truckers, delivery drivers, and taxi drivers may be able to use an autonomous mode consistently on highways. Later on, they may be able to do so on major suburban and rural streets. But given the wide variation in road quality, humans will likely need to pilot vehicles in residential areas and on city streets for some time to come. And given the wide variation in building structures that delivery robots would need to navigate, humans will almost certainly need to complete deliveries in many instances.

Similarly, in fast food, ordering kiosks are already displacing cashiers, but not entirely. Some customers are unable to use the kiosks, including the 70% of McDonald’s customers who use the drive-through. Kiosks will sometimes break down, and orders will sometimes be processed incorrectly, so workers will need to step in. Food preparation may also be automated in part, but given the fine motor control and tacit knowledge required for cooking, it has proven resistant to full automation. Like the transformation in driving, then, this change will likely be gradual and iterative. Technology will augment human capabilities rather than replacing humans wholesale, and workers, companies, and consumers will need to adapt over time.

Continue reading