Information rules (Part 2: Capitalism, democracy and the path forward)
Phuah Eng Chye (21 November 2020)
Information rules and capitalism
Information rules, particularly those that codify rights, play a critical role in shaping social relations and reinforcing ideological beliefs. Katharina Pistor points out that coded assets “have characterized the evolution of capitalism over the past centuries: First came land, then firms and debt; since the late twentieth century, intangibles – intellectual property rights and financial assets – have dominated other capital assets…far from being a fixed feature, law is malleable. The contents and meaning of property rights and collateral, the rights and obligations of trustees, creditors and shareholders is altered at the margin all the time, and in this process, new capital assets are being created”.
“With the right legal coding any object, claim or idea can be turned into a wealth generating, or capital, asset. The key attributes of capital are priority, durability, universality, and convertibility…an asset has a much greater propensity to generate wealth than others. Priority means that asset holders enjoy stronger rights over others; these rights can be extended in time, lending them durability and allowing capital to grow; priority and durability can be extended and bind others, lending these legal privileges universality; and last but not least, asset holders may be given an option to convert their assets to lock in past gains”.
Katharina Pistor argues “capital is indeed a social relation, but it is mediated by the state’s coercive powers, which have been institutionalized as law. Critically, private parties can avail themselves of the state’s coercive powers by using the law for organizing their social and economic relations with one another and they enjoy considerable flexibility in doing so. Courts did play a critical role in the early evolution of capital…Today, though, private attorneys have taken center stage. They adapt the law to new assets and create new assets through legal coding. They have become the true masters of the code of capital”.
Amy Kapczynski draws on insights from Julie Cohen’s book to explain the “law of informational capitalism”. “To understand what technology signifies for the future of law, we must understand how the design of networked information technologies within business models reflects and reproduces economic and political power…Law, ideologies, and technical constraints work together to map and remap power”.
Julie Cohen’s account begins with “three overarching structural shifts that she argues enabled the transition to an informational economy: the enclosure of intangible resources, the datafication of the basic factors of industrial production, and the embedding of patterns of exchange within information platforms. As law evolved to create more property or property-like protection for information, information became an increasingly viable and valuable form of capital. Finance was datafied or dematerialized as technologies and law co-evolved to enable instantaneous transactions in global markets, exotic forms of securitization, and demonetization. Labor, another key factor of production, also became datafied via platforms like TaskRabbit and Uber that fragment the workplace, and by the use of digital networks to informationalize, deterritorialize, and fissure production. Even the value and workings of land as a resource were altered by informational dynamics. New information technologies have enabled the emergence of complex instruments such as credit-default swaps and derivatives that are increasingly important to the functioning of land as capital. Such changes turned not just on technology but on law”.
In particular, “it ushered in regulatory approaches that are procedurally informal, standards-based, mediated by expert professional and technical networks, and increasingly financialized. Intended to introduce flexibility and speed into the regulatory process, these changes also made processes more opaque and proliferated new points of entry for economic power”.
Amy Kapczynski notes “the increasing information intensity in industry and commerce today creates vast and undeniable challenges for regulators. How do you detect discrimination, manipulative marketing, or regulatory evasion when so much is buried in intricate decisions made by data-gatherers, software, and hardware? The paranoid style in regulatory reform has not helped matters: it has downplayed concerns about private power and focused obsessively on state coercion – failing to recognize the threats that private power can create and the central role that regulators have played in the emergence of robust modern informational industries”.
Amy Kapczynski notes legal discourse has been influenced by the “valorization of innovation”. “Innovation is also not risk-free or inevitably beneficial. But the fetish for innovation in contemporary policy and legal thought has had enormous power, and it has helped forestall regulatory adaptations to address systemic threats associated with informational infrastructures. It has also propelled stronger exclusionary rules”.
In addition, “the dream that open software could free us all…did not stand in the way of the development of troubling forms of private power”. “Simple arguments for freedom made possible by an unregulated internet seem naïve today, given the manipulation, extremism, and harassment that have flourished there…What was our relationship to markets – did we want spaces free from markets or free for markets, and could the same domain be both?”.
Amy Kapczynski points out “the law of intellectual property and trade secrets, of internet immunity and free speech, and of trade and contracts morphed to enable the capture of information and data as corporate capital, and to allow their deployment to extract surplus in new ways. Our legal order, intertwined with the architecture of digital networks, has enabled the creation of vast new firms that wield new forms of surveillance and algorithmic power, but it also has delivered us a form of neoliberal capitalism that is inclined toward monopoly, concentrated power, and inequality. Most troubling are the developments in takings law, free speech law, and free trade law that are working to insulate growing private economic and surveillance power from democratic control. Can public power sufficient to govern this private power be built? With what laws, ideas, and technologies? Questions of data and democracy, not just data and dignity, must be at the core of our concern today and are among the most important questions of our time…Cohen’s account suggests, importantly, that there will be no magic bullet. Just as there is no single law that constructs private power in the digital age, there will be no single law to democratize it. Data is not oil but a product of social and legal creation”.
Overall, capitalism relies on information rules – or the rule of law – to define and protect private interests. However, from a historical perspective, many of the concepts supporting capitalism, such as corporations, governance and disclosure, are modern notions whose legal and social meanings are still evolving. In recent years, law has increasingly been complemented by code. This affects the balance of interests among government, capital, labour and consumers in society. The concern is that the power derived from code has become concentrated in the hands of a few global platforms outside the jurisdiction of democratically elected rule-making bodies. The accumulation of private power is also of concern because of the dangers it poses to democracy.
Information rules and democracy
The information rules that code ownership rights are important for capitalist societies. But information rules also shape the type of democracy in that society. Amy Kapczynski highlights the need “to more deeply theorize the relationships we envision among freedom, markets, the state, and society; and about the importance of incorporating not only an analytic of power but of market society and capitalism, that is sensitive to how spaces that in one register appear free and neutral nonetheless can be primed to reproduce the hierarchies of old”. In this context, “legal ordering is being used not simply to help generate and sustain private power but to insulate it from democratic control”.
Samer Hassan and Primavera De Filippi note “we are spending increasing amounts of our lives interacting within platforms, whose user base belittle that of existing nation states…And yet, their governance is very far from the values of democratic countries. Instead, they are governed by software and algorithms that regulate our interactions and online communications through obscure rules embedded in source code, and elaborated by a handful of private actors”.
Lawrence Lessig points out “the basic code of the Internet implements a set of protocols called TCP/IP…enable the exchange of data…without the networks knowing the content of the data, or without any true idea of who in real life the sender of a given bit of data is. This code is neutral about the data, and ignorant about the user. These features… make regulating behavior difficult. To the extent that it is hard to identify who people are, it is harder to trace behavior back to a particular individual. And to the extent it is hard to identify what kind of data is being sent, it is harder to regulate the use of particular kinds of data. These architectural features of the Internet mean that governments are relatively disabled in their ability to regulate behavior on the Net”.
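Lessig’s point about the neutrality of the basic protocols can be illustrated with a toy model: an IP-style header carries only routing information (addresses and a port), while the payload is an opaque byte string that reveals neither the content nor the real-life identity of the sender. This is a deliberately simplified sketch with invented field names, not a real TCP/IP implementation.

```python
# Toy model of a content-neutral, sender-ignorant packet.
# The header holds only what routing needs; the payload stays opaque.

def make_packet(src_ip: str, dst_ip: str, dst_port: int, payload: bytes) -> dict:
    """Wrap an opaque payload with a minimal routing header."""
    return {"src": src_ip, "dst": dst_ip, "port": dst_port, "payload": payload}

def route(packet: dict) -> str:
    """A router forwards on header fields alone; it never inspects the payload."""
    return packet["dst"]  # the routing decision uses only the destination address

# The network moves these bytes without knowing what they mean or who sent them.
pkt = make_packet("203.0.113.5", "198.51.100.7", 443, b"\x8f\x02opaque-bytes")
next_hop = route(pkt)
```

The sketch makes the regulatory consequence concrete: nothing the router can see identifies the person behind `src` or the kind of data in `payload`, which is exactly why behaviour on such a network is hard to trace and regulate.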
Lawrence Lessig notes in some instances, “this unregulability is a virtue…protects free speech…makes it relatively hard for governments, or powerful institutions, to control who says what when…The Net makes it hard because its architecture makes it hard”. In other instances, “this unregulability is not a virtue…not just with Nazi speech and child porn…the architecture does not enable secure transactions; where it makes it very easy to hide the source of interference; where it facilitates the distribution of illegal copies of software and music”.
He argues “no thought is more dangerous to the future of liberty in cyberspace than this faith in freedom guaranteed by the code. For the code is not fixed. The architecture of cyberspace is not given. Unregulability is a function of code, but the code can change. Other architectures can be layered onto the basic TCP/IP protocols, and these other architectures can make behavior on the Net fundamentally regulable. Commerce is building these other architectures; the government can help; the two together can transform the character of the Net. They can and they are”.
In this regard, Lawrence Lessig notes “architectures are not binary…What the architecture enables, and how it limits its control, are choices. And depending upon these choices, much more than regulability will be at stake…The technology could make it possible to selectively certify facts about you, while withholding other facts about you. The technology could function under a least-revealing-means test in cyberspace even if it can’t in real space…The difference between these designs is that one enables privacy in a way that the other does not. One codes privacy into an identification architecture by giving the user a simple choice about how much is revealed; the other is oblivious to that value. Thus whether the certification architecture that emerges protects privacy depends upon the choices of those who code. Their choices depend upon the incentives they face. If protecting privacy is not an incentive – if the market has not sufficiently demanded it and if law has not, either – then this code will not provide it”.
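Lessig’s “least-revealing-means” certification can be sketched as an identity wallet that certifies a single derived fact (here, “over 18”) without disclosing the underlying attribute (the birth date). The wallet structure and function names are illustrative assumptions, not any real credential standard.

```python
# Sketch of selective certification: reveal a derived fact, withhold the raw data.
from datetime import date

WALLET = {"name": "A. User", "birth_date": date(1990, 5, 1), "country": "DE"}

def certify_over_18(wallet: dict, today: date) -> dict:
    """Return only the derived fact; the raw attribute never leaves the wallet."""
    bd = wallet["birth_date"]
    age = today.year - bd.year - ((today.month, today.day) < (bd.month, bd.day))
    return {"over_18": age >= 18}  # the relying party sees a single boolean

claim = certify_over_18(WALLET, date(2020, 11, 21))
```

Whether a deployed identification architecture works this way, rather than handing over the full record, is precisely the coding choice Lessig says depends on the incentives the coders face.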
He reasons “the choice about code and law will be a choice about values…Our choice is not between regulation and no regulation. The code regulates. It implements values, or not. It enables freedoms, or disables them. It protects privacy, or promotes monitoring. People choose how the code does these things. People write the code. Thus the choice is not whether people will decide how cyberspace regulates. People – coders – will. The only choice is whether we collectively will have a role in their choice – and thus in determining how these values regulate–or whether collectively we will allow the coders to select our values for us”.
Lawrence Lessig also suggests when governments step aside, “other interests take their place. Do we know what those interests are? And are we so certain they are anything better?” It is important to “interrogate the architecture of cyberspace” to reaffirm “our commitment to fundamental values” or “we will miss the threat that this age presents to the liberties and values that we have inherited. The law of cyberspace will be how cyberspace codes it, but we will have lost our role in setting that law”.
Tim O’Reilly cautions “while such a future no doubt raises many issues and might be seen by many as an assault on privacy and other basic freedoms, early versions of that future are already in place in countries like Singapore and can be expected to spread more widely…It’s important to understand that these manual interventions are only an essential first step…The use of algorithmic regulation increases the power of regulators, and in some cases, could lead to abuses, or to conditions that seem anathema to us in a free society. Mission creep is a real risk. Once data is collected for one purpose, it’s easy to imagine new uses for it…The answer to this risk is not to avoid collecting the data, but to put stringent safeguards in place to limit its use beyond the original purpose. As we have seen, oversight and transparency are particularly difficult to enforce when national security is at stake and secrecy can be claimed to hide misuse…Whenever possible, governments putting in place algorithmic regulations must put in place similar quality measurements, emphasizing not just compliance with the rules that have been codified so far but with the original, clearly-specified goal of the regulatory system. The data used to make determinations should be auditable, and whenever possible, open for public inspection…There are also huge privacy risks involved in the collection of the data needed to build true algorithmic regulatory systems”. “It is true that that government governs best that governs least. But the secret to governing least is to identify key outcomes that we care about as a society – safety, health, fairness, opportunity – encode those outcomes into our laws, and then create a constantly evolving set of regulatory mechanisms that keep us on course towards them”.
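O’Reilly’s contrast between compliance with codified rules and fidelity to the regulatory system’s original goal can be sketched as a feedback loop: the regulator encodes a measurable goal, monitors observed outcomes, and keeps an auditable log of deviations. The metric (a pollution cap) and the threshold below are invented purely for illustration.

```python
# Sketch of outcome-oriented algorithmic regulation: codify a goal,
# measure outcomes against it, and log every determination for audit.

def regulate(readings, goal: float, tolerance: float):
    """Compare observed outcomes against the codified goal; keep an audit log."""
    audit_log = []
    for period, value in enumerate(readings):
        compliant = abs(value - goal) <= tolerance
        audit_log.append({"period": period, "value": value, "compliant": compliant})
    return audit_log

# e.g. an emissions goal of 50 units with a tolerance of 5
log = regulate([48, 52, 61, 49], goal=50.0, tolerance=5.0)
violations = [entry for entry in log if not entry["compliant"]]
```

The design choice matters: the loop evaluates the outcome itself rather than a fixed rulebook, and the audit log is what makes the determinations open to the public inspection O’Reilly calls for.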
Amy Kapczynski argues there is more at stake “particularly for those interested in building a more democratic political economy – lessons, perhaps, about the limits of hacks to the property system in the absence of more transformative changes to our market society; about the need to more deeply theorize the relationships we envision among freedom, markets, the state, and society; and about the importance of incorporating not only an analytic of power but of market society and capitalism, that is sensitive to how spaces that in one register appear free and neutral nonetheless can be primed to reproduce the hierarchies of old”.
Jaron Lanier and Glen Weyl suggest that “ultimately, what people need in their digital lives is not maximized privacy per se, any more than what they need in their work lives is maximized leisure. In both cases, people need, in essence, the right to be left alone: a reasonable ability to construct what is seen and known about themselves by others, reasonable limitations on what efforts are demanded of them, an accessible means of self-determination, fair compensation for what they do give up, and an affirmative environment in society for seeking meaning and happiness”.
Overall, the boundaries of a democracy are set by the rules on transparency; i.e. society’s preferred mix of opacity and transparency. There are various trade-offs. Prioritising opacity (such as privacy) means creating rules that enable individuals to exercise significant control over personal data and that protect anonymity. Opacity can act as a protective buffer, but it also creates blind spots that hamper problem identification and remedial action, and it compromises efficiency, authenticity and accountability.
Prioritising transparency implies creating rules that favour data sharing, disclosure and the right-to-know. But the removal of opacity (privacy) buffers exposes individuals and firms to the harshness of a transparent environment – such as exploitation, discrimination (including tainting by stale data), harassment and other retaliatory actions. The loss of anonymity and the inclination to sanitise content can have chilling effects on free speech. In an intolerant and polarised society, debate can turn into haranguing and unruly behaviour that can impede coordination and paralyse an economy. Despite these flaws, democracy and freedoms are more often closely associated with transparency. It is doubtful a sanitised environment, where information is extensively moderated (censored) or hidden, can really function as a true democracy.
The path forward to the information society
In recent decades, societies have felt a growing sense of helplessness in their ability to deal with challenges. Information has disrupted the traditional modes of control and created a power vacuum that is a source of instability. In this regard, information rules are integral to efforts to rebalance power among different stakeholders and re-establish order in society. Information rules set the path towards achieving the type of democracy we want in an information society. A robust democracy will require information rules that promote greater democracy of information and transparency as the means of restoring order and making those who hold the reins of power accountable.
However, rule-making cannot move ahead of the curve and needs to be based on what is visible. The stakes are high in the context of the transition to an information society. Countries that pick the right set of information rules are likely to advance, while those that give in to reactionary forces will likely regress – paralysed by their inability to coordinate efficiently, hamstrung by their inability to manage the pressures of transparency and to resolve disputes efficiently.
Amy Kapczynski (March 2020) “The law of informational capitalism”. The Yale Law Journal. https://www.yalelawjournal.org/pdf/KapczynskiBookReview_b2hvici9.pdf
Gilles Deleuze (Winter 1992) “Postscript on the societies of control”. October, Vol. 59. https://cidadeinseguranca.files.wordpress.com/2012/02/deleuze_control.pdf
Jaron Lanier, Glen Weyl (26 September 2018) “A blueprint for a better digital society”. https://hbr.org/2018/09/a-blueprint-for-a-better-digital-society
Katharina Pistor (20 September 2019) “The secret code of capital and the origin of wealth inequality”. Promarket. https://promarket.org/the-secret-code-of-capital-and-the-origin-of-wealth-inequality/
Lawrence Lessig (January 2000) “Code is law: On liberty in cyberspace”. Harvard Magazine. https://www.harvardmagazine.com/2000/01/code-is-law-html
Phuah Eng Chye (2015) Policy paradigms for the anorexic and financialised economy: Managing the transition to an information society.
Phuah Eng Chye (29 February 2020) “The journey from privacy to transparency (and back again)”. http://economicsofinformationsociety.com/the-journey-from-privacy-to-transparency-and-back-again/
Phuah Eng Chye (14 March 2020) “Features of transparency”.
Phuah Eng Chye (28 March 2020) “The transparency paradigm”.
Phuah Eng Chye (11 April 2020) “Anonymity, opacity and zones”.
Phuah Eng Chye (7 November 2020) “Information rules (Part 1: Law, code and changing rules of the game)”. http://economicsofinformationsociety.com/information-rules-part-1-law-code-and-changing-rules-of-the-game/
Samer Hassan, Primavera De Filippi (2017) “The expansion of algorithmic governance: From code is law to law is code”. The Journal of Field Actions. https://journals.openedition.org/factsreports/4518
Tim O’Reilly (2013) “Open data and algorithmic regulation”. Chapter 22 Beyond Transparency. https://beyondtransparency.org/chapters/part-5/open-data-and-algorithmic-regulation/
 Between truth and power: The legal constructions of informational capitalism.
 Such as cashless payment systems and cryptocurrencies.
 See “The journey from privacy to transparency (and back again)”, “Features of transparency”, “The transparency paradigm” and “Anonymity, opacity and zones”.
 Gilles Deleuze’s narrative suggests technology is causing a shift from disciplinary societies, a concept proposed by Michel Foucault, to societies of control. “In the disciplinary societies one was always starting again (from school to the barracks, from the barracks to the factory), while in the societies of control, one is never finished with anything – the corporation, the education system, the armed services being metastable states coexisting in one and the same modulation, like a universal system of deformation…We no longer find ourselves dealing with the mass/individual pair. Individuals have become dividuals and masses, samples, data, markets, or banks”.