Information rules (Part 5: The politicisation of content)

Phuah Eng Chye (2 January 2021)

TikTok was never supposed to be political. When it launched in the US in 2018, the video app was marketed as a fun place to discover goofy content and experiment with its sophisticated editing software and vast music library. Yet nearly two years and 165 million nationwide downloads later, TikTok has been a platform for teachers strikes, QAnon conspiracy theories, Black Lives Matter protests, and a teen-led campaign to sabotage a Trump rally in Tulsa, Oklahoma. The TikTok algorithm is perfectly suited to spread political content faster and to a wider audience than any social media app in history, whether the company wants to admit it or not…a TikTok ban would have serious effects on American youth culture, where hundreds of teenagers have now built massive followings and spread important political messaging on an app that allowed them to reach huge audiences. It’s changed not only the experience of being online but the experience of being a young person. Rebecca Jennings (2020) “The case for and against banning TikTok”.

Concerns with the internet have shifted over time. Initially, economists focused on its disruptive impact on businesses and labour. In recent years, their attention shifted to privacy, data and competition (antitrust) concerns. Now it is the turn of content to occupy centre stage.

The experience of TikTok encapsulates the evolution of the internet in society. Leonard Kleinrock notes “we enjoyed a wonderful culture of openness, collaboration, sharing, trust and ethics…adherence to netiquette persisted for the first two decades of the Internet”. He suggests the decline began “in the early 1990s when spam first appeared at the same time there was an intensifying drive to monetize the Internet as it reached deeply into the world of the consumer. This enabled many aspects of the dark side to emerge (fraud, invasion of privacy, fake news, denial of service, etc.). It also changed the nature of internet technical progress and innovations as risk aversion began to stifle the earlier culture of moon shots”. Kleinrock argues “today, almost no one would say that the internet was unequivocally wonderful, open, collaborative, trustworthy or ethical. How did a medium created for sharing data and information turn into such a mixed blessing of questionable information? How did we go from collaboration to competition, from consensus to dissension, from a reliable digital resource to an amplifier of questionable information?”

Linus Torvalds[1] remarks “I absolutely detest modern social media – Twitter, Facebook, Instagram. It’s a disease. It seems to encourage bad behavior…On the internet, nobody can hear you being subtle. When you’re not talking to somebody face to face, and you miss all the normal social cues, it’s easy to miss humor and sarcasm, but it’s also very easy to overlook the reaction of the recipient, so you get things like flame wars, etc., that might not happen as easily with face-to-face interaction…The whole liking and sharing model is just garbage. There is no effort and no quality control. In fact, it’s all geared to the reverse of quality control, with lowest common denominator targets, and click-bait, and things designed to generate an emotional response, often one of moral outrage. Add in anonymity, and it’s just disgusting. When you don’t even put your real name on your garbage (or the garbage you share or like), it really doesn’t help. I’m actually one of those people who thinks that anonymity is overrated. Some people confuse privacy and anonymity and think they go hand in hand, and that protecting privacy means that you need to protect anonymity. I think that’s wrong. Anonymity is important if you’re a whistle-blower, but if you cannot prove your identity, your crazy rant on some social-media platform shouldn’t be visible, and you shouldn’t be able to share it or like it”.

Policy-makers have so far reviewed and tackled these concerns as part of competition regulation. But competition and content are distinct issues and should be tackled separately. Orla Lynskey highlights “existing legal mechanisms, including competition law, may not capture the power over communications exercised by digital platforms…The first claim is that the term platform power fails to reflect the potentially problematic power at the heart of the information society…The second claim is that blind spots exist when the issue of platform power is viewed solely through an economic lens. As a result, competition law will not necessarily capture and sanction practices that negatively impact upon non-economic parameters, such as freedom of expression and privacy…gatekeepers control what content we access and the terms on which this content can be accessed. The terms on which this control is exercised are under-publicised and individuals therefore lack the knowledge and power to have a disciplining influence on gatekeepers. This control can have ramifications for fundamental rights that are not captured by competition law analysis due to their nature (intangible implications) and scale (while they may have significant consequences for some individuals, they may be deemed minimal in overall terms)”.

Orla Lynskey points out “even if a competition authority could establish that a gatekeeper had market power, competition analysis would not capture intangible harms, such as harms to fundamental rights. Rather, competition law seeks to protect specified parameters of competition – namely price, quality, choice and innovation – that enhance consumer welfare…Competition scholars will rightly argue that it is not the role of competition law to tackle such fundamental rights concerns. This being so, competition law should not be offered as a solution to all problems caused by gatekeeper practices. These examples also caution against a purely economic approach to the question of gatekeeper regulation, as encouraged by some stakeholders”. She cautions “nevertheless, it is necessary to acknowledge that the mere existence of regulatory gaps will not, of itself, justify the introduction of further measures to regulate platforms”.

Therefore, the regulatory frameworks for competition and content should be considered separately. Competition issues are commercial while content issues are political. For the purposes of this article, I use the term content loosely to refer to non-technical content (which excludes data and copyright) and mainly to context (such as opinions and narratives) in relation to politics, communities, national security, education, entertainment and advertising. Content here refers mainly to text and speech, but it also applies to images, recordings, searches, memes and preferences.

From context collapse to content collapse

It is important to understand how content has changed. Nicholas Carr explains that social media has moved from an initial phase of “context collapse” to “content collapse”. He notes “before social media, you spoke to different audiences in different ways. You modulated your tone of voice, your words, your behavior, and even your appearance to suit whatever social context you were in (workplace, home, school, nightclub, etc)…On a social network, the theory went, all those different contexts collapsed into a single context. Whenever you posted a message or a photograph or a video, it could be seen by your friends, your parents, your coworkers, your bosses, and your teachers, not to mention the amorphous mass known as the general public. And, because the post was recorded, it could be seen by future audiences as well as the immediate one. When people realized they could no longer present versions of themselves geared to different audiences – it was all one audience now – they had to grapple with a new sort of identity crisis…an infinite number of contexts collapsing upon one another into that single moment of recording…a crisis of self-presentation”.

Carr notes, however, that “the recent history of social media isn’t a story of context collapse. It’s a story of its opposite: context restoration. Young people led the way, moving much of their online conversation from the public platform of Facebook, where parents and teachers lurked, to the more intimate platform of Snapchat, where they could restrict their audience and where messages disappeared quickly…Context collapse remains an important conceptual lens, but what’s becoming clear now is that a very different kind of collapse – content collapse – will be the more consequential legacy of social media. Content collapse, as I define it, is the tendency of social media to blur traditional distinctions among once distinct types of information – distinctions of form, register, sense, and importance. As social media becomes the main conduit for information of all sorts – personal correspondence, news and opinion, entertainment, art, instruction, and on and on – it homogenizes that information as well as our responses to it”.

In this regard, content collapse was triggered by digitization, which made redundant “the formal standards and organizational hierarchies inherent to the old mediums” and converged information “through a single, universal medium”. “It wasn’t just that the headlines, free-floating, decontextualized motes of journalism ginned up to trigger reflexive mouse clicks, had displaced the stories. It was that the whole organizing structure of the newspaper, its epistemological architecture, had been junked. The news section (with its local, national, and international subsections), the sports section, the arts section, the living section, the opinion pages: they’d all been fed through a shredder, then thrown into a wind tunnel. What appeared on the screen was a jumble, high mixed with low, silly with smart, tragic with trivial. The cacophony of the RSS feed, it’s now clear, heralded a sea change in the distribution and consumption of information. The new order would be disorder”. The smartphone “completed the collapse of content…further compacted all forms of information. The instant notifications and infinite scrolls that became the phone’s default design standards required that all information be rendered in a way that could be taken in at a glance, further blurring the old distinctions between types of content. Now all information belongs to a single category, and it all pours through a single channel”.

Nicholas Carr points out “many of the qualities of social media that make people uneasy stem from content collapse. First, by leveling everything, social media also trivializes everything – freed of barriers, information, like water, pools at the lowest possible level…Second, as all information consolidates on social media, we respond to it using the same small set of tools the platforms provide for us. Our responses become homogenized, too. That’s true of both the form of the responses (repost, retweet, like, heart, hashtag, fire emoji) and their content (Love! Hate! Cringe!). The software’s formal constraints place tight limits on our expressiveness, no matter what we’re talking about. Third, content collapse puts all types of information into direct competition. The various producers and providers of content, from journalists to influencers to politicians to propagandists, all need to tailor their content and its presentation to the algorithms that determine what people see. The algorithms don’t make formal or qualitative distinctions; they judge everything by the same criteria. And those criteria tend to promote oversimplification, emotionalism, tendentiousness, tribalism – the qualities that make a piece of information stand out, at least momentarily, from the screen’s blur. Finally, content collapse consolidates power over information, and conversation, into the hands of the small number of companies that own the platforms and write the algorithms. The much maligned gatekeepers of the past could exert editorial control only over a particular type of content that flowed through a particular medium – a magazine, a radio station, a TV network. Our new gatekeepers control information of all kinds. When content collapses, there’s only one gate”.

The politicisation of content

In the past, content was manually produced and distributed. Governments and elites maintained order over public narratives through control over media channels (newspapers, radio and television). The situation has changed with the emergence of distributed networks, mobility, social media and the diffusion of knowledge on the use of content to influence the narrative.

As the level of informationalisation increases, the contest for control of political and social narratives intensifies[2]. Players are learning how to weaponise content to manipulate public opinion. Aaron Z. Lewis notes “marketing agencies and foreign entities are blurring the line between what’s real and what’s fake…the internet has democratized the ability to create new reality bubbles and distort old ones. We’re just beginning to grapple with the consequences of this seismic cultural shift”. In this regard, “online conversations are ripe for manipulation because they take place on an infrastructure that’s built for advertising. This infrastructure has three main components: audience consolidation, personalized targeting, and game-able algorithms. It’s a particularly powerful combo for those who wish to sow discord”.

He describes “the post-truth present…We’re now living through a gold rush for social capital accumulation: media influencers and extremists alike are competing with traditional religions to pull as many people as possible into their filter bubbles. And we’re becoming god-like in our ability to shape people’s realities. These alternate conceptions of reality are coming into conflict with one another…Never before have we had to deal with so many competing versions of the truth…All of these reality bubbles tell different stories about the past and paint vastly different visions for the future. They’re making us more fragmented than ever before in an age that requires more coordination than ever before. So, how can we move through and beyond these deep divisions?”.

Aaron Z. Lewis points out “inauthenticity is becoming the hallmark of our era: faked Facebook data, faked college admissions applications, faked resumes and bullshit jobs, faked news, faked mortgage-backed security ratings, AI pretending to be humans, humans pretending to be AI…Authenticity itself is a relatively new way of thinking about identity and human behavior. I trust we’ll create a new mode of understanding that’s better fit for the surreality we live in – one that demands transparency and acknowledges that people are beginning to see through all the manipulative corporate PR”.

Movements and counter-movements are emerging and colliding on platforms. Social movements were prominent in labour[3] and Hong Kong’s pro-democracy movement[4]. Darren Loucaides relates the rise to power of a social movement in Italy that became a model for populist movements in other countries. In Brazil[5], WhatsApp groups were infiltrated and propaganda videos circulated to destroy “the strike’s momentum by offering false promises, sowing confusion, re-directing the narrative to serve their interests, and making it difficult to separate fact from fiction. Like a virus, they infected the host cell and started controlling it from the inside”. Marielle Descalsota describes how photographs and images are manipulated to support political narratives.

Reed Berkowitz observes that the strategies QAnon[6] uses to “indoctrinate” followers are based on “many of the same gaming mechanisms and rewards” that keep gamers addicted. In particular, they rely on “guided apophenia” – apophenia being “the tendency to perceive a connection or meaningful pattern between unrelated or random things (such as objects or ideas)”. Hence, “QAnon grows on the wild misinterpretation of random data, presented in a suggestive fashion in a milieu designed to help the users come to the intended misunderstanding…pre-seeded the conclusions…creating a meaning for them that fits the propaganda message…Because you were convinced to connect the dots yourself you can see the absolute logic of it. This is the conclusion you arrived at…The difference is that these manufactured connections lead to the desired conclusions Q’s handlers have created. When players arrive at the correct answers they are showered with adoration, respect, and social credit”.

Reed Berkowitz points out “indoctrination is obviously not a game mechanic. There is going viral, and there is active training and recruitment. Advertising’s goal is to be so entertaining, useful, or uplifting that people pass on the information to their friends. Games seek to be so engaging that people want to play and they want to play with their friends. To some extent that is happening, there is a viral quality to Q, but there is also something else in addition that is highly troubling. Q is teaching QAnons how to proselytize”.

The outcome is a sense of a loss of control. Martin Gurri argues “elites have progressively lost the ability to mediate between events and the old shared stories. Elite omissions and evasions, falsehoods and failures, are now out in the open for all to see…No established authority remains to settle questions of fact. In that sense, the interpretation of reality is up for grabs…The mirror is broken, and the great narratives are fracturing into shards. What passes for authority is devolving to the political war-band and the online mob – that is, to the shock troops of populism left and right…What comes next? Maybe chaos…On the far end of the turbulence, the system will be reconstituted along somewhat different lines.  It is impossible from here to predict the character of the new organizing principles – but it’s safe to say that the radical egalitarianism favored by anti-establishment movements will not be among them. Authority will not devolve from the elites to the public…Stable interpretations of reality seldom arise from a free-for-all…A complex society can’t dispense with elites…The sociopolitical disorders that torment our moment in history, including the fragmentation of truth into post-truth, flow primarily from a failure of legitimacy, of the bond of trust between rulers and ruled…Those at the top have forsaken their function yet cling, illicitly, to their privileged perches…If my analysis is correct, the re-formation of the system, and the recovery of truth, must depend on the emergence of a legitimate elite class”.

The platform battles have spilled over into the geo-political arena. Max Bergmann and Carolyn Kenney warn “the use of disinformation, or dezinformatsiya, attempts to undermine public trust in the authenticity of information crucial to a healthy and lively democratic society. Russia uses disinformation in sophisticated and complex information operations that use multiple and mutually reinforcing lines of effort – through cyberhacking, the employment of cyber trolls, and overt propaganda outlets. Today’s online media environment is rich with increasing political polarization, growing distrust of traditional media sources, the hardening of echo chambers, online dialogue that is caustic in nature, and the ability to spread information easily – true or otherwise – through the body politic…Notably, this is not a media environment or online culture that Russia created, but it is an environment that Russia has aggressively sought to exploit. The disaggregated news and social media landscape has enabled Russia to intervene in elections, discredit governments, undermine public trust, and foster internal discord in ways that it could only have dreamed of during the Cold War”.

Conclusion

An internet that once thrived on anonymity and freedom has matured into the public mainstream. Platforms are displacing traditional media as the main channels for distributing public information and as the main forum for political and ideological debate. The power of content has been amplified as rival groups weaponise it, deploying overload and complexity-related strategies to bypass and attack incumbent narratives. The politicisation of content is destabilising traditional power structures, and there is a growing sense of loss of control over the public narrative; a trend aggravated by a generational gap with a younger, social media-savvy generation. The consequences of disinformation and distrust are significant and need to be carefully considered.

References

Aaron Z. Lewis (29 May 2019) “You can handle the post-truth: A pocket guide to the surreal internet”. AZL.BLOG. https://aaronzlewis.com/blog/2019/05/29/you-can-handle-the-post-truth/

Brandy Zadrozny, Ben Collins (14 August 2018) “How three conspiracy theorists took Q and sparked Qanon”. NBC News. https://www.nbcnews.com/news/amp/ncna900531

Darren Loucaides (14 February 2019) “What happens when techno-utopians actually run a country”. Wired. https://www.wired.com/story/italy-five-star-movement-techno-utopians/

Leonard Kleinrock (17 March 2019) “Fifty years of the internet: What we learned, and where will we go next”. Techcrunch. https://techcrunch.com/2019/03/18/fifty-years-of-the-internet/

Marielle Descalsota (23 November 2020) “The problem with Hunger Games photographs of the Thai protests”. SCMP. https://www.scmp.com/week-asia/opinion/article/3110742/problem-hunger-games-photographs-thai-protests

Martin Gurri (31 May 2017) “The revolt of the public and the age of post-truth”. The Fifth Wave. https://thefifthwave.wordpress.com/2017/05/31/the-revolt-of-the-public-and-the-age-of-post-truth/

Max Bergmann, Carolyn Kenney (6 June 2017) “War by other means: Russian active measures and the weaponization of information”. American Progress.

Nicholas Carr (13 January 2020) “From context collapse to content collapse.” Rough Type. http://www.roughtype.com/?p=8724

Orla Lynskey (21 February 2017) “Regulating platform power”. LSE Legal Studies. https://ssrn.com/abstract=2921021

Phuah Eng Chye (29 September 2018) “Future of work: The labour movement (Part 2: Labour as a social movement)”. http://economicsofinformationsociety.com/future-of-work-the-labour-movement-part-2-labour-as-a-social-movement/

Phuah Eng Chye (6 October 2018) “Future of work: The labour movement (Part 3: Assessing the social media-based model)”.

Phuah Eng Chye (22 June 2019) “Policy conversations and the language of information”. http://economicsofinformationsociety.com/policy-conversations-and-the-language-of-information/

Phuah Eng Chye (23 November 2019) “Information and organisation: China’s surveillance state growth model (Part 2: The clash of models)”.

Phuah Eng Chye (7 November 2020) “Information rules (Part 1: Law, code and changing rules of the game)”. http://economicsofinformationsociety.com/information-rules-part-1-law-code-and-changing-rules-of-the-game/

Phuah Eng Chye (21 November 2020) “Information rules (Part 2: Capitalism, democracy and the path forward)”.

Phuah Eng Chye (5 December 2020) “Information rules (Part 3: Regulating platforms – Reviews, models and challenges)”.

Phuah Eng Chye (19 December 2020) “Information rules (Part 4: Regulating platforms – Paradigms for competition)”. http://economicsofinformationsociety.com/900-2/

Rebecca Jennings (23 July 2020) “The case for and against banning TikTok”. Vox. https://www.vox.com/the-goods/2020/7/23/21334871/tiktok-ban-us-trump-china

Reed Berkowitz (1 October 2020) “A game designer’s analysis of QAnon playing with reality”. Curiouser Institute. Medium. https://medium.com/curiouserinstitute/a-game-designers-analysis-of-qanon-580972548be5

Robert Young (2 April 2019) “25 years later: Interview with Linus Torvalds”. https://www.linuxjournal.com/content/25-years-later-interview-linus-torvalds


[1] See Robert Young.

[2] See “Policy conversations and the language of information”.

[3] See “Future of work: The labour movement (Part 2: Labour as a social movement)”; “Future of work: The labour movement (Part 3: Assessing the social media-based model)”.

[4] See “Information and organisation: China’s surveillance state growth model (Part 2: The clash of models)”.

[5] See Aaron Z. Lewis.

[6] Brandy Zadrozny and Ben Collins relate how three conspiracy theorists took Q and sparked QAnon.