Anonymity, opacity and zones
Phuah Eng Chye (11 April 2020)
In a transparency paradigm, the default setting is for information to be visible. Therefore, the operating conditions are described in terms of exceptions to the rule; i.e. the circumstances under which data is non-transparent. My approach focuses on three aspects of non-transparency. The first is anonymity or the hiding of identities. This term is usually applied at the level of the individual. The second is opacity or the hiding of information. Its effects are usually assessed at the system level (markets, economy or society). The third is zones, which reflect how the rules of anonymity and opacity vary across platforms.
I define anonymity as de-identification. This can be achieved by separating identity from other data or by cloaking it using pseudonyms. However, the Justice Srikrishna Committee on Data Protection notes the lack of consensus on the meanings of de-identification, pseudonymisation and anonymisation. It explains “pseudonymised data and de-identified data are inflection points on the spectrum nearer to anonymisation”. It differentiates anonymisation – “the use of mathematical and technical methods to distort data to irreversibly ensure that identification is not possible” – as distinct from de-identification – “which involves the masking or removal of identifiers from data sets to make identification more difficult”. It also notes that “given the pace of technological advancement, it is desirable not to precisely define or prescribe standards which anonymisation must meet in the law”.
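The committee's distinction can be illustrated with a minimal sketch (the record, field names and salt below are hypothetical, purely for illustration): pseudonymisation replaces the identifier with a token that remains reversible for whoever holds the lookup table, while anonymisation irreversibly removes the identifier and coarsens the remaining data.

```python
import hashlib
import secrets

# Hypothetical record; the field names are illustrative only.
record = {"name": "A. Customer", "city": "Mumbai", "spend": 1520}

# Pseudonymisation: the identity is masked, but re-identification remains
# possible for whoever holds the lookup table (or the salt).
salt = secrets.token_hex(8)
pseudonym = hashlib.sha256((salt + record["name"]).encode()).hexdigest()[:12]
lookup = {pseudonym: record["name"]}            # kept separately, under control
pseudonymised = {**record, "name": pseudonym}

# Anonymisation: the identifier is dropped outright and the quasi-identifier
# ("spend") is coarsened into a band, so no key links the record to a person.
anonymised = {"city": record["city"], "spend_band": "1000-2000"}

assert lookup[pseudonym] == record["name"]      # reversible with the table
assert "name" not in anonymised                 # nothing left to reverse
```

The sketch also shows why the committee treats these as points on a spectrum: the pseudonymised record is only as anonymous as the controls around the lookup table and salt.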
The discussion on the meaning of anonymity is related to the legal debate over identity. The Justice Srikrishna Committee on Data Protection points out “since the 1980s, the standard for determining whether data is personal has been whether such data is related to an identified or identifiable individual…This standard of identifiability has served data protection very well over the years”. “However, developments in data science have considerably changed the understanding of identifiability. Data no longer exists in binary states of identifiable or non-identifiable. For instance, whether dynamic IP addresses constitute data about an identifiable individual depends on whether the person processing the data has additional data that enables the identification of the individual. The degree of identifiability of an IP address may also be contextual in a different sense as several persons could be using the same machine. With advancements in technology, more and more identifiers of this nature are expected to emerge…These concerns, however, do not necessarily lead to the conclusion that the standard of identifiability must be abandoned. In fact, despite the criticism, there is no alternative which provides a workable standard for demarcating data that must be protected under the law”.
In my view, there are several other points to be mindful of. First, anonymity is different from privacy: privacy centres on keeping personal data private and does not, for example, imply the right to make comments anonymously. Second, there are various shades of anonymity. Anonymity can be complete, in that the comments or activity cannot be traced to a person. It can also be partial, in that the identity is not visible but is traceable. Partial anonymity usually implies an element of authenticity.
Jesse Hirsh notes “the Chinese government has introduced a concept called controlled anonymity, which presents a method of protecting people’s privacy that could be adopted and modified for use in other countries… While controlled anonymity may seem like a contradiction, it does provide a valuable means of balancing privacy, trust and accountability”. In China, controlled anonymity is being applied to retain “the benefits of using cash (anonymous purchasing)” and to facilitate the use of “digital currency while also exercising the right to privacy and security”. He highlights there are similarities with Know your customer (KYC) laws which “are largely designed to prevent businesses from becoming complicit in criminal activities, while also mitigating fraud”. The concept of controlled anonymity “can also be applied to identity, expression and participation in online communities and networks”. Jesse Hirsh suggests the emergence of distributed ledger technologies, such as blockchain and hyperledger which can enable the independent verification of identity, will facilitate “distributed identity and trust services that make the concept of controlled anonymity possible”.
The benefits of anonymity are keenly debated. Daniel J. Solove explains “anonymity and pseudonymity protect people from bias based on their identities and enable people to vote, speak, and associate more freely by protecting them from the danger of reprisal. Anonymity can enhance the persuasiveness of one’s ideas, for identification can shade reception of ideas with readers’ biases and prejudices. This is why, in many universities and schools, exams are graded anonymously. Anonymity provides people with the ability to criticize the companies they work for and to blow the whistle. Anonymity also protects people who read or listen to certain unpopular ideas…identification and fear of reprisal might deter perfectly peaceful discussions of public matters of importance…chill free speech”. Hence, identification “creates architectural problems, for it increases the government’s power over individuals. Identification has been a critical tool for governments seeking to round up radicals or disfavored citizens. It is also an efficient tool for controlling people”.
Some privacy advocates are fiercely opposed to any form of imposed identification. Jesse Hirsh points to the “ongoing debate around the real names policies upheld by platforms such as Facebook… The policies aren’t all bad – they were written, at least in part, to encourage the use of legal names and reduce fraud, limit the influence of bots, and to encourage civil and accountable behaviour”. “However, in 2011, internet researcher Danah Boyd argued that insisting on real names is an abuse of power: The people who most heavily rely on pseudonyms in online spaces are those who are most marginalized by systems of power. Real names policies aren’t empowering; they’re an authoritarian assertion of power over vulnerable people. In 2015, the Nameless Coalition brought together more than 75 human rights, digital rights, LGBTQ and women’s rights organizations from around the world, all asking Facebook to fix its authentic identity or real name policies”.
It is difficult to come to a firm conclusion on the relative merits of identification and anonymity. Identification can engender trust because it validates authenticity but anonymity can also engender (personal) trust because you are assured your identity is protected. Identification can undermine trust if it results in reprisals but anonymity can undermine (public) trust if it is used to send spam, spread disinformation, commit crimes, or foment instability.
Opacity generally refers to the non-visibility of information rather than just of identity. Its effects are usually assessed at the system level (markets, economy or society). The level of opacity is set by the rules for disclosure established by an operator (platform) or a regulator. Participants will then react with their own competitive strategies to obtain or neutralise informational advantages in transactions and markets. Opacity strategies include omission (withholding information) and obfuscation (overloading information to make it difficult to see the truth). Opacity can also arise from the erasure or deletion of information.
Charles Goldfinger explains an “intangible economy brings about a momentous change in the relationship between suppliers and consumers – the end of information asymmetry. Today in many businesses, the customer knows as much about products and markets as the supplier…This two-way approach reflects a fundamental ambivalence of information: for any market participants, information is their biggest asset but also their largest liability. They can make huge profits from the information trading but also huge, potentially fatal losses. Thus, they constantly grapple with major dilemmas: how much information to disclose…If they do not disclose their views and information they hold, nobody will trade with them. If they disclose too much, the size of position they hold in a given instrument, for instance, other participants will take advantage of it, causing large losses”.
“In the financial market context, greater information transparency does not reduce the risks to the participants, to the contrary it may increase it. Thus market players often seek to limit transparency, to make market more viscous, by introducing zones and periods of opacity…Market regulators are continuously trying to strike a right balance between transparency and opacity but such balance appears elusive as it is quite difficult to set hard and fast rules, applicable all the time to all markets…The persistence of market diversity and fragmentation can be explained by the strategic behavior of market participants seeking to preserve their particular mix of transparency and opacity and the concerns of regulatory authorities, who want to maintain control of markets”.
Transparency is usually associated with the notion that information should be produced to reduce information asymmetry and to improve market efficiency. But the effect isn’t straightforward. “In some situations, greater information disclosure is deemed to have harmful effects on liquidity and efficiency. While disclosure is a core principle in securities regulation, anti-trust regulation views the exchange of information as aiding collusion in some situations…On the one hand, it is argued that increased information dissemination improves firm planning to the benefit of society (including buyers) and allows potential buyers to make correct decisions given their preferences. On the other hand, economic literature also shows that increased information dissemination can raise prices through tacit or explicit collusion to the benefit of firms but at the expense of society at large.”
Zones reflect, in practice, the substantial variations in the rules of anonymity and opacity. Generally, zones can be separated into those that are transparent and those that are non-transparent.
Transparent zones are usually operated by governments, financial institutions and some commercial operators. Generally, access to transparent zones is restricted by permission and requires genuine identities or membership. Transparent zones can provide some leeway for anonymity (e.g. to hide the identity of buyers) but some form of assurance is provided that the players are genuine. In recent decades, commercially-operated, lightly-regulated and relatively open transparent zones – e.g. social media and sharing platforms – have emerged. These zones require minimal identification details but are supported by an audit trail.
In non-transparent zones, participants tend to be anonymous or maintain pseudo identities. Opacity levels are high. Non-transparent zones are either lightly regulated (dark pools) or almost completely unregulated (bazaars, shadow or informal economy, shadow banking, offshore markets and crypto currencies). All kinds of products and services (counterfeit, contraband and unlicensed financial offerings) are available for discreet consumption. The products and services are not subject to oversight, the audit trail is almost non-existent and revenues and income are generally not reported to the tax authorities.
The existence of zones with diverse rules means individuals have considerable flexibility to control their personal space and achieve their privacy preferences. At the extreme, individuals can choose to stay off the grid as much as possible and minimise their presence in obligatory transparent zones. But there are social costs from non-participation. The costs of non-participation are expected to rise with increasing digitalisation.
Generally, individuals participate in a mix of zones and have leeway to control who can see their data and discussions. They can choose to exit zones which no longer meet their needs or which they are uncomfortable with. Children and teenagers migrate from platforms they feel are monitored by their parents. High-profile incidents relating to data misuse or security lapses can trigger an exodus. Hence, platforms and apps must carefully balance their offerings of anonymity and opacity in relation to their targeted communities. In particular, they may offer control of personal space through access controls, encryption or selective anonymity by allowing users to hide or delete photos, videos and messages.
In the future, it would not be realistic to expect full privacy as transparent zones continue to expand. Our faces, fingerprints, DNA, transactions, activities, locations and communications probably already exist on record. But there is more support for transparent zones than assumed. For example, there is a practice of disregarding unidentified phone calls and messages as a means of handling unwanted intrusions by scammers and sales centers. Hence, many individuals prefer the existence of zones where the identities of participants are visible and authenticated. Transparent zones are also consistent with a future of IoT and AI where it will become common for customers to be visually recognised; payment to be digital; and where appliances, houses, cars and cities will increasingly capture information from the environment around them.
Expanding transparency need not be at the expense of non-transparency given that information is non-rivalrous. Instead, non-transparent space can also expand simultaneously. Individuals will likely participate in both types of zones as this allows them to toggle transparency to their liking.
At the macroeconomic level, non-transparent zones are dominant in developing economies because activity is physical and informal and the level of information-based organisation is low. The expansion of transparent zones reflects a broadening of participants and increasing autonomous exchange on the back of rising levels of organisation, efficiency and governance. “There is little doubt improvements in information flows and transparency has generally been a positive force in building confidence in markets. The role of regulation in increasing transparency and accountability remain critical aspects of successful reforms in building trust and confidence”.
The expansion of the transparent zone diminishes the relative share of non-transparent zones but does not eliminate them. “Where there is sunshine, there are also shadows”. Shadow economies and markets are prevalent in developed economies as well. Non-transparent zones or shadow markets act as safety valves, providing relief from the harshness of regulations in transparent zones. They provide channels for desperate borrowers and greedy investors, for consumers of prohibited products and services, and for behaviours individuals prefer to go unnoticed. However, this creates vulnerabilities in the form of predatory lending, criminal activities, blackmail and extortion. Non-transparent zones could also turn into a dumping ground for dangerous and high-risk products.
In contrast, the transparent zone or blue-ribbon segment is characterised by rules on disclosure, governance and reliability. Transparent and non-transparent zones not only co-exist but are interlinked. Shadow banks and markets are able to undertake high-risk intermediation because of support from regulated institutions in the transparent zone. Products and services sold in a transparent zone command a large premium for safety while the discount in the non-transparent sector reflects the higher risks and uncertainties. Regulated institutions are able to use the shadow market to arbitrage the high returns and yet comply with regulatory requirements. These linkages mean a fall-out in the shadow banking system can trigger contagion in the regulated banking sector; as it did in the global financial crisis of 2007.
Hence, macroeconomic regulation of transparency can have several objectives. It can attempt to calibrate the level of stability risks through requiring disclosure or through changing the speed of information transmission (safety bumps). Transparency regulation can also be used to calibrate the level of opacity to ensure a level-playing field or fairness between different groups such as employees-employers or buyers-sellers.
Regulators pay a lot of attention to the effects of transparency on competition. In this regard, the different transparency rules among zones give rise to arbitrage. For example, dark pools are able to capture market liquidity from traditional exchanges as the latter are burdened by disclosure and other regulations. Globally, China is perceived as having a competitive advantage as its internet is secured by a firewall while it is able to compete in the open systems in the West. Many jurisdictions are thus evolving new regulatory frameworks to address the effect of different data protection regimes on geopolitical competition.
The non-transparency concepts of anonymity, opacity and zones bring us closer to the concept of an information order. Scott Lash explains “postmodernisation means the replacement of social structures by information and communication structures…no longer is social class determined by access to the mode of production but by access to the mode of information. Social inequality is then decisively now a question of access to global flows”.
Hence, “the global information culture depends on power for exclusion. This means mainly exclusion from the loop, from the means of information, from the global flows of information and communication. The principal actors in the national manufacturing society were nations, institutions and organisations. In the information order, relationships are less within a country than between global cities in different countries. The importance of relations of production internal to organisations is now paralleled by new relations of production and communication between smaller and more amorphous disorganisations”.
Overall, the varying levels of transparency among zones even within a single jurisdiction suggest privacy rules should not be applied uniformly across all zones; i.e. one-size-fits-all regulation is inappropriate. The variation in the rules of anonymity and opacity suggests users are able to control their personal space through their choice of zones. Hence, the focal point for regulating personal data should shift from an individual-centric perspective towards the regulation of zones. In this context, there is considerable room for self-regulation in instances where participants are able to exercise choice. In other instances, the rules and protocols for connectivity among zones will be a critical aspect of the sweeping overhaul of information rules.
Abacus News (28 August 2019) “Teens are shunning WeChat, showing shifting tastes in Chinese social media”. https://www.abacusnews.com/digital-life/teens-are-shunning-wechat-showing-shifting-tastes-chinese-social-media/article/3024368
Ben Kelmanson, Koralai Kirabaeva, Leandro Medina, Borislava Mircheva, Jason Weiss (November 2019) “Explaining the shadow economy in Europe: Size, causes and policy options”. IMF Working Paper. https://www.imf.org/en/Publications/WP/Issues/2019/12/13/Explaining-the-Shadow-Economy-in-Europe-Size-Causes-and-Policy-Options-48821
Charles Goldfinger (4th quarter 2000) “Intangible economy and financial markets”. Communication & Strategies No. 40. http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=C4D6E97EB7C9D254292B4261405F5EAE?doi=10.1.1.461.6988&rep=rep1&type=pdf
Daniel J. Solove (January 2006) “A taxonomy of privacy”. University of Pennsylvania Law Review, Vol. 154; GWU Law School Public Law Research Paper No. 129. https://ssrn.com/abstract=667622
Enrico Perotti (16 January 2014) “The roots of shadow banking”. https://voxeu.org/article/roots-shadow-banking
European Commission (19 February 2020) “Communication: A European strategy for data”. https://ec.europa.eu/info/sites/info/files/communication-european-strategy-data-19feb2020_en.pdf
Jesse Hirsh (5 February 2020) “The value of controlled anonymity”. Centre for International Governance Innovation (CIGI). https://www.cigionline.org/articles/value-controlled-anonymity
Justice Srikrishna Committee on Data Protection (2018) “A free and fair digital economy: Protecting privacy, empowering Indians”. http://meity.gov.in/writereaddata/files/Data_Protection_Committee_Report.pdf
Phuah Eng Chye (2015) Policy paradigms for the anorexic and financialised economy: Managing the transition to an information society.
Phuah Eng Chye (11 May 2019) “Critique of information”.
Phuah Eng Chye (12 October 2019) “Information and organisation: Shades of surveillance”. http://economicsofinformationsociety.com/information-and-organisation-shades-of-surveillance/
Phuah Eng Chye (7 December 2019) “Information and organisation: China’s surveillance state growth model (Part 3: The relationship between surveillance and growth)”. http://economicsofinformationsociety.com/information-and-organisation-chinas-surveillance-state-growth-model-part-3-the-relationship-between-surveillance-and-growth/
Phuah Eng Chye (21 December 2019) “The debate on regulating surveillance”.
Phuah Eng Chye (4 January 2020) “The economics and regulation of privacy”.
Phuah Eng Chye (18 January 2020) “Big data and the future for privacy”.
Phuah Eng Chye (15 February 2020) “The costs of privacy regulation”.
Phuah Eng Chye (29 February 2020) “The journey from privacy to transparency (and back again)”. http://economicsofinformationsociety.com/the-journey-from-privacy-to-transparency-and-back-again/
Phuah Eng Chye (14 March 2020) “Features of transparency”.
Phuah Eng Chye (28 March 2020) “The transparency paradigm”.
Masha Borak (20 November 2019) “Tech companies vying to be the next banks are facing trust issues”. Abacus News. https://www.abacusnews.com/tech/tech-companies-vying-be-next-banks-are-facing-trust-issues/article/3038209?_ga=2.229600670.368044388.1565659975-38484865.1565659975
Scott Lash (2002) Critique of Information. Sage Publications.
Thorsten Kaeseberg (12 December 2019) “Promoting competition in platform ecosystems”. Voxeu. https://voxeu.org/article/promoting-competition-platform-ecosystems
On financial markets, the identities of the buyers and sellers are not visible but the system provides assurance that the counter-party is genuine.
“While transparency ostensibly reduces information asymmetry risks, intermediaries typically respond by increasing complexity or routing parts of a transaction through unregulated institutions and jurisdictions to obfuscate risks and unethical conduct”. The anorexic and financialised economy: Transition to an information society.
 The anorexic and financialised economy: Transition to an information society.
 “Young people prefer niche communication tools and the apps that their parents do not use to seek privacy and similar interests…The backlash against oversharing is part of a global trend. In 2018, more than three million people under 25 either quit or stopped regular use of Facebook, which has 2 billion registered users worldwide”. See Abacus News.
 The anorexic and financialised economy: Transition to an information society.
 The anorexic and financialised economy: Transition to an information society.
 See Ben Kelmanson, Koralai Kirabaeva, Leandro Medina, Borislava Mircheva and Jason Weiss.
 See Enrico Perotti.
 See Thorsten Kaeseberg.