Information rules (Part 6: Disinformation, transparency and democracy)

Phuah Eng Chye (16 January 2021)

The politicisation of content is destabilising the order established by mainstream narratives. Rival groups are weaponising content, deploying overload and complexity-related strategies to bypass controls and attack incumbent narratives. There is a growing sense that no one is really in control of the public narrative; a trend aggravated by the generational gap between an aging leadership and a young, social media-savvy generation.

The consequences are significant because the information[1] sphere has expanded and power is increasingly defined in information rather than physical terms. The battle to control narratives has become more critical because it is becoming less feasible to exercise power through physical force. Content provides the ambiguous context that sets the tone for narratives that influence hearts and minds. Content-related misbehaviours[2] are increasingly deemed serious crimes[3] and are subject to severe penalties. Content missteps by individuals, firms and governments are increasingly subject to retaliatory actions such as boycotts, sanctions or cyber-attacks.

Disinformation in the information society

Scott Lash’s theory predicts disinformation and disorder would increase as the unintended consequence of an information society. Scott Lash reminds us this is “the paradox of the information society…such highly rational production result in the incredible irrationality of information overload, misinformation, disinformation and out-of-control information”. “At issue indeed is the desinformierte informationsgesellschaft (disinformed information society)”.

Scott Lash explains “the contradiction is that as the information leads ever more to a smartening up, it at the same time brings with it a certain inevitable dumbing down…Unlike narrative, information compresses beginning, middle and end into a present immediacy of a now-here. Unlike discourse, information does not need legitimating arguments. Does not take the form of proportional utterances, but works with an immediate communicational violence”.

The unintended consequence of knowledge and rationality as a factor of production is related to the (global) information culture. “It has to do with information overload…information becomes ubiquitous, spins out of control. Now informationalisation leads to an overload of communications. We need to go no further than the newspapers to understand the nature…written immediately, without reflection, for that day, under the pressure of a deadline, of no use tomorrow; of value for 24 hours and no longer. Such information loses meaning, loses significance very quickly. This might also be a clue to the way that value might be understood in the information society. This sort of information-value and its temporality is different from both use-value and exchange-value. Use-value and exchange-value comprise a past and a future. Information-value is ephemeral. It is immediate. Information-value has no past, no future: no space for reflection and reasoned argument…It is instead a mass of particulars without a universal…There is no logical or analytic ordering. The newspaper headlines are ordered perhaps only by what sells papers: telegraph and newspaper ordered by urgency. Newsprint’s power comes not through argument, but through a violently forceful facticity. Operating under the most restrictive of constraints – time deadlines, space considerations – its force and temporality is similar to the violence of the event…Newsprint, or information, has neither logical nor existential meaning…Its meaning is accidental, ephemeral and very often trivial…They have no meaning at all outside of real time. Outside the immediacy of real time, news and information are, literally, garbage. You throw out the newspaper with the disused food and the baby’s disposable diapers…the whole of the consumer capitalist city may be understood as information. In the heavily branded environment of the informational city, goods, lifestyle and design are ephemeral. Duration is short. Turnover is fast. 
Muzak is information; adverts in the cinema, TV, the internet are information, even when fully non-didactic and image-based…It is indeed branded products that are closest to being information…Fast-moving consumer goods, the branded products, we know lots about. We are all experts”.

Scott Lash describes disinformation as an externality or a form of pollution. “Garbage is a metaphor for the whole of the information society. It has to do with information surplus. Garbage is disposable…Information (and fast-moving consumer goods are information) is also disposable. It needs to be disposed of. The question is how do we regulate it? How do we govern it? How do we frame it…The side effects of the industrial society were material bads, physical pollutants. The side effects of the information society are symbolic bads, mental pollutants. We are polluted by the waste products of the information society: the sort of things that cultured people do not want in their living rooms”.

Content and transparency

“One major challenge of transparency is its tendency to polarise societies and increase confrontations. Obfuscation and confidentiality act as buffers to facilitate the bargaining and compromise that cushions society from political and social conflicts. Transparency removes these buffers. Visibility leads communities to vigorously defend their distinct social, cultural, political and economic identities. In addition, communication channels have shifted from centralised channels towards headlines, images, memes and fragmented conversations in social media. Short attention spans and a proclivity towards instantaneous reactions are removing the time to think, increasing the sensitivity to misunderstanding and reducing the tolerance of faults. Spontaneous actions can go viral, trigger a chain of adverse reactions and leave little room for manoeuvre. Communication risk rises as crowd-driven opinions, perceptions and fashion carry greater weight. Individuals, firms and governments react by scripting and sanitising canned responses. Transparency ends up promoting self-censorship, superficiality and a feel-good culture”. Phuah Eng Chye (2015) The anorexic and financialised economy: Transition to an information society.

Among the different elements[4] of the information society, transparency perhaps poses the greatest challenge to mitigating disinformation and restoring order. The order previously established through central control of channels has become ineffectual because the visibility of information can no longer be controlled: content is now rapidly transmitted by individual agents via multiple platforms.

In a transparency[5] paradigm, differences in beliefs, cultures, race and religion become visible, and actions are easily observable. This makes it difficult to compromise or to tailor responses to different communities. Transparency thus reinforces a culture of loyalty or team dynamics and heightens sensitivity to content, making issues harder to resolve. In addition, while transparency increases the power of content, it also makes content less thoughtful, with tweets and likes replacing informed debate. The rise of transparency will expand the number of fronts where content issues need to be managed.

Transparency fosters an unforgiving and harsh environment. Misspoken, offensive or politically incorrect content – the wrong phrases, symbols, t-shirt colours, cartoons and even maps – can have substantial economic and social repercussions. Content is turning into a minefield for global commerce, exposing firms to retaliatory government and public harassment, boycotts, enforcement actions, sanctions or cyber-attacks.

Generally, the preferred strategy is to avoid trouble by sanitising content. Sanitisation can be achieved through self-censorship, where individuals and corporations voluntarily refrain from publishing potentially “politically incorrect” content, or through content moderation. Governments can also intervene directly by legally defining some types of content as information crimes subject to punishment, establishing censorship requirements or boards, or imposing obligations on the private sector to moderate content in line with political or social norms.

However, if the sanitisation of content is taken too far, it can elevate superficiality. This can be construed as a form of misinformation (i.e. propaganda) in itself with unintended consequences for democracy and economic coordination. It also remains an open question whether there can be effective sanitisation in today’s highly transparent environment.

Content intolerance and democracy in an information society

On the surface, an information society should present ideal conditions for a content democracy. But we seem to be moving further away from the ideal. Transparency breeds polarisation, and polarisation, in turn, seems to be breeding content intolerance. Deepening political and social divides are widening the areas of disagreement and increasing the pressure to sanitise “politically incorrect” content.

Growing content intolerance poses a dilemma for free speech and democracy on both the domestic and international fronts. Prominent writers expressed their concern in a letter to Harper’s Magazine about “a new set of moral attitudes and political commitments that tend to weaken our norms of open debate and toleration of differences in favor of ideological conformity…The free exchange of information and ideas, the lifeblood of a liberal society, is daily becoming more constricted…censoriousness is also spreading more widely in our culture: an intolerance of opposing views, a vogue for public shaming and ostracism, and the tendency to dissolve complex policy issues in a blinding moral certainty. We uphold the value of robust and even caustic counter-speech from all quarters. But it is now all too common to hear calls for swift and severe retribution in response to perceived transgressions of speech and thought. More troubling still, institutional leaders, in a spirit of panicked damage control, are delivering hasty and disproportionate punishments instead of considered reforms. Editors are fired for running controversial pieces; books are withdrawn for alleged inauthenticity; journalists are barred from writing on certain topics; professors are investigated for quoting works of literature in class; a researcher is fired for circulating a peer-reviewed academic study; and the heads of organizations are ousted for what are sometimes just clumsy mistakes. Whatever the arguments around each particular incident, the result has been to steadily narrow the boundaries of what can be said without the threat of reprisal. We are already paying the price in greater risk aversion among writers, artists, and journalists who fear for their livelihoods if they depart from the consensus, or even lack sufficient zeal in agreement…This stifling atmosphere will ultimately harm the most vital causes of our time.
The restriction of debate, whether by a repressive government or an intolerant society, invariably hurts those who lack power and makes everyone less capable of democratic participation. The way to defeat bad ideas is by exposure, argument, and persuasion, not by trying to silence or wish them away. We refuse any false choice between justice and freedom, which cannot exist without each other. As writers we need a culture that leaves us room for experimentation, risk taking, and even mistakes. We need to preserve the possibility of good-faith disagreement without dire professional consequences. If we won’t defend the very thing on which our work depends, we shouldn’t expect the public or the state to defend it for us”.

The stakes seem higher on the international front, where disinformation is viewed as part of content campaigns to undermine political stability. Max Bergmann and Carolyn Kenney highlight “the irony is that concerns over national security have given illiberal and authoritarian-leaning governments around the world top cover to enact a range of censorship-enabling measures that are then used to crack down on dissent, target political opponents and instil a culture of self-censorship…fear of foreign speech could exacerbate ongoing tensions between states in a way that will likely hurt civil society and press freedom. Although influence operations have little (if any) actual impact on a state’s national security, governments may use the fear of foreign speech to expel, control and surveil foreign journalists and civil society…The ongoing rhetoric of fear surrounding foreign influence operations and espionage is now expanding to include foreign students and businesses”.

They are critical that “the West’s countermeasures in the fight against misinformation and disinformation has relied on repressive strategies – root out the networks, shut down the accounts and remove the content. However, this strategy is, as many have pointed out already, a never-ending game of whack-a-mole that (at best) provides short-term tactical gains”. They explain that “evidence of activity is not evidence of impact” and that “an overemphasis on bad actors and their supply of disinformation diverts our attention from the material problems that drive our demand for and receptivity to dubious content of suspicious origin”.

Max Bergmann and Carolyn Kenney suggest “we should be aware of such operations, bringing them to light and, when appropriate, removing them. However, if the free flow of ideas, freedom of expression and a better quality of democratic participation are the ultimate goals, relying on detection and deletion is not enough, and…the exaggeration of the threat of foreign influence operations may do more harm than good. Instead, we should invest in solutions that shore up trust and increase political participation, civil discourse and pluralism”. Hence, they suggest “redressive strategies should also be explored to regain and restore trust and legitimacy in our institutions, politicians and governing bodies. And, where possible, domestic policy should be directed at making democratic participation easier”.

There is a need to be wary of arguments advocating censorship. Censorship laws usually aim to prevent the dissemination of content relating to secrets, politics (e.g. overthrow of the government, protection of the monarchy), behaviours (e.g. hate crimes, pornography), sensitive issues endangering public order (e.g. race, religion) or libel (to protect reputations). However, censorship laws are often used to reinforce the narratives and positions of incumbents and to stifle dissent and criticism, thereby inducing a culture of conformity. Censorship laws thus inevitably have chilling effects on free speech, debate and a vibrant democracy, and protect corrupt and incompetent officials and politicians who may be abusing their powers.

Regardless of whether it is based on national security or social sensitivities, censorship represents a hard extension of power. Censorship reduces transparency and grants information privileges, usually at the expense of the majority of citizens. In this regard, “the threat of dystopia arises not because more information is captured but because power over information is concentrated and the transparency of information is one-sided”[6]. “Two-sided transparency puts information out in the open at everyone’s reach. It may increase disputes but it also increases accountability, and acts as a check and balance on behaviour. Transparency is the best form of governance”.

Overall, content intolerance may not represent an objection to disinformation or misinformation. Instead, it may represent the desire for pliant content that supports a certain group or ideology, in which case it seeks to suppress critical and opposing views, and unfavourable facts. As the power of content grows in the information society, the regulatory challenge is to shape content rules to be acceptable to a wide spectrum of society.


The growing power of content is destabilising the traditional narrative and order in the information society. New rules for managing content are needed to balance restoring information order with preserving a diversity of views in a democracy. But experience with managing content in a transparent and polarised information society is limited.

Reservations about the harmful effects of disinformation and misinformation should not distract from the larger goal: a functioning democracy is best served by a democracy of content that ensures citizens are well informed. Policies and rules should promote content that is truthful and thoughtful. They should promote disclosure and civil debate. Importantly, fences, such as whistle-blowing provisions, should be established to protect individuals from official, legal or social intimidation to support a regime of “transparency without fear or favour”. The inalienable right of citizens in the information society is not the right to bear arms but the right to bear mobile devices to record wrong-doings and to post unfavourable content. This will set us on the path towards achieving democracy in an information society.


Dan Sales (19 November 2020) “Covid anti-vaxxers should face action for spreading false information that could cost lives, Britain’s top counter-terror officer says”. Mailonline.

Harper’s Magazine (7 July 2020) “A letter on justice and open debate”.

Phuah Eng Chye (2015) Policy paradigms for the anorexic and financialised economy: Managing the transition to an information society.

Phuah Eng Chye (11 May 2019) “Critique of information”.

Phuah Eng Chye (29 February 2020) “The journey from privacy to transparency (and back again)”.

Phuah Eng Chye (14 March 2020) “Features of transparency”.

Phuah Eng Chye (28 March 2020) “The transparency paradigm”.

Phuah Eng Chye (11 April 2020) “Anonymity, opacity and zones”.

Phuah Eng Chye (7 November 2020) “Information rules (Part 1: Law, code and changing rules of the game)”.

Phuah Eng Chye (21 November 2020) “Information rules (Part 2: Capitalism, democracy and the path forward)”.

Phuah Eng Chye (5 December 2020) “Information rules (Part 3: Regulating platforms – Reviews, models and challenges)”.

Phuah Eng Chye (19 December 2020) “Information rules (Part 4: Regulating platforms – Paradigms for competition)”.

Phuah Eng Chye (2 January 2021) “Information rules (Part 5: The politicisation of content)”.

Scott Lash (2002) Critique of Information. Sage Publications.

[1] See “Information rules (Part 1: Law, code and changing rules of the game)”.

[2] Content-based crimes that may offend sensitivities or rules relating to religion, ideology, race, sex, community, politics or national security.

[3] In the UK, the government is contemplating new laws to tackle misinformation – such as anti-vaxxers spreading false news about the dangers of a coronavirus vaccine. See Dan Sales.

[4] The four elements are intangibility, speed, size and transparency. See The anorexic and financialised economy: Transition to an information society.

[5] See reference for articles on transparency.

[6] See “The transparency paradigm”.