Features of transparency

Phuah Eng Chye (14 March 2020)

The destination of an information society can only be reached by following the path of transparency. In this regard, economic advancement has always been accompanied by rising levels of transparency. But as societies become more transparent, the harms from transparency become more evident. In recent years, society has turned to privacy regulation to reassure and protect individuals against these harms. The problem is that privacy is the converse of transparency: privacy reduces transparency, and this causes society to regress.

Rather than rely on privacy to address transparency harms, it is probably more effective to use transparency-based remedies. But transparency and its features have not been sufficiently researched. Most studies have a narrow scope, focusing on how disclosure and information asymmetry affect markets, bargaining or the distribution of value. What is missing is an economics of transparency that offers a holistic perspective on how transparency affects society and that can guide the design of coherent policies and rules.

I start by exploring the features of transparency. I define transparency as an environment where information is visible and is used by as many people as possible. Transparency is thus by definition a high-information environment. Its features can be discerned by contrasting a high-information, technology-driven environment with a low-information, manual environment.

The challenges and solutions for a low-information environment differ from those for a high-information environment. First, information processing is manual and requires human intervention. Hence, information is relatively scarce and inaccurate, and time lags are lengthy. Second, information is costly to distribute, knowledge is relatively difficult to acquire, and power over information tends to be centralised. Third, information asymmetry problems are acute. Low-information solutions therefore tend to contain many buffers or redundancies, such as queues, inventories, regulated pricing and lengthy settlement periods to cater for verification and error correction. In a low-information environment, the many information chokepoints act as a constraint on throughput.

This can be contrasted with a high-information environment. First, information is automated and requires minimal human intervention. Hence, information is relatively abundant, accurate and instantaneous. Second, information is cheap to distribute, knowledge is widely available, and power over information tends to be distributed. Third, the amount of information available to solve problems is enormous, transactions are instantaneous and cost-efficient, and errors are low. The nature of information asymmetry changes because transparency increases efficiency (speed, low cost, search) and trustworthiness (verification, price and reliability). The traditional buffers or redundancies are eliminated by information disruption and the emergence of new business models.

In a low-information environment, high levels of trust in relationships (family, community and institution) are needed to overcome information asymmetry and facilitate transactions. In a high-information environment, trust in data authenticity benefits from the scaled network[1] effects of transparency and digital audit trails. Hence, transparency is able to support autonomous transactions (e.g. stranger sharing) and self-organisation.

Sharing provides an example of how trust works in a transparency model. Shanu Athiparambath explains “when you list your mountaintop cabin on Airbnb, the property doesn’t change physically, but it suddenly becomes useful…Millions of people stay in Airbnb homes every night. It’s not trust which makes this possible…Airbnb puts hosts and guests in a position where behaving badly would ruin their reputations…Intellectuals miss this obvious distinction…The Airbnb review system is an extremely powerful third-party norm enforcement system…When the hosts try to deceive guests through evolutionarily familiar ways, the penalty comes in evolutionarily novel ways. A negative review can haunt you for very long, but it’s hard for many hosts to get their heads around this…You don’t need to be unconditionally trustworthy to fare well in short-term interactions. It is enough to have high social intelligence and be moderately trusting and trustworthy. Airbnb can’t produce genuine trust. No institution can”.
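
The reputation mechanism Athiparambath describes can be made concrete. Below is a minimal sketch in Python (hypothetical names and aggregation rule, not Airbnb's actual system) of how an append-only review ledger turns individual ratings into a lasting, publicly visible reputation signal: a single bad review permanently drags down the score.

```python
from dataclasses import dataclass, field
from statistics import mean

@dataclass
class Host:
    """A host whose reputation is the full, permanent review history."""
    name: str
    reviews: list[int] = field(default_factory=list)  # append-only: ratings never expire

    def add_review(self, rating: int) -> None:
        self.reviews.append(rating)  # a negative review is recorded forever

    def reputation(self) -> float:
        # The aggregate score visible to every future guest
        return mean(self.reviews) if self.reviews else 0.0

host = Host("mountaintop-cabin")
for rating in [5, 5, 5, 1, 5]:  # one act of bad behaviour (the 1)...
    host.add_review(rating)
print(round(host.reputation(), 2))  # ...permanently drags the score down: 4.2
```

This is the "evolutionarily novel" penalty in miniature: the host cannot delete the 1-star entry, so the cost of deception is borne long after the interaction ends.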

It is also worth noting that the data philosophies embedded in privacy-based regulation differ from those of transparency-based regulation. Privacy regulation controls are built on the implicit assumption that data is stored in traditional structures which are sequential, vertically integrated and built-to-purpose. In contrast, modern data structures are based on modularity, which implies non-sequentiality, flexibility and non-purpose[2]. In this regard, transparency supports modularity, where data (on activities and assets) is broken into small independent bits that can easily be recombined into multiple forms (like derivatives). Modularity and combinational flexibility facilitate calibrated and sophisticated strategies to enhance activities, liquidity and risk management. Under a modular approach, data is stored for its own sake and never destroyed because, so long as it is there, it will come in handy when needed (for any purpose).
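
To make the modularity point concrete, here is a minimal sketch in Python (hypothetical field names and data, purely illustrative) of records stored as small, purpose-free bits that can be recombined on demand into views serving purposes never envisaged at collection time:

```python
# Modular records: small, independent, purpose-free bits of data.
records = [
    {"user": "alice", "attr": "location", "value": "kuala-lumpur"},
    {"user": "alice", "attr": "purchase", "value": "bicycle"},
    {"user": "bob", "attr": "location", "value": "kuala-lumpur"},
    {"user": "bob", "attr": "purchase", "value": "helmet"},
]

def recombine(records: list[dict], attrs: set[str]) -> dict:
    """Join independent bits into a new, purpose-built view on demand."""
    view: dict = {}
    for r in records:
        if r["attr"] in attrs:
            view.setdefault(r["user"], {})[r["attr"]] = r["value"]
    return view

# The same bits can serve marketing today and credit scoring tomorrow:
print(recombine(records, {"location", "purchase"}))
print(recombine(records, {"location"}))
```

Because no purpose is baked into the records, deletion or purpose-limitation rules of the kind found in privacy regulation sit awkwardly with this storage philosophy.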

Transparency is also associated with openness, disclosure, well-defined rules and well-informed participants. “Transparency is an essential requirement for societal progress. Transparency is needed to enable collaboration, alignment, self-organising and sharing. Transparency is also vital to ensure that leaders and other elite are sensitive to public opinion and provides a means to discipline public administration and business conduct. Conversely, making information confidential hampers collaboration, learning, discipline and impedes progress. Good government, innovation and societal progress are thus highly dependent on the degree of openness in public dialogue and knowledge exchange”. Thus, transparency is an essential element in facilitating participation, monetisation, exchange, innovation and informed decision-making.

Nonetheless, “transparency is a double-edged sword”[3] with substantial harms. Transparency opens up new organisational possibilities and alters opportunities and risks. This, in turn, makes change inevitable and disrupts legacy business and regulatory arrangements. In this context, transparency shifts the information chokepoints and increases the speed of adjustments. The rapid adjustments can undermine market (overshooting, herding, contagion) and industry (low friction, law of one price) stability. Societal dislocation is amplified by transparency working with other information effects (intangibility, size and speed) to polarise and aggravate imbalances. This, in turn, increases the risk of social confrontations.

In particular, transparency has malevolent features. Everyone likes their own private space because it acts as a shield. No one is comfortable being subject to 24/7 monitoring, with every action and association, past and present, scrutinised by governments, firms or even family members. Minority groups can be persecuted, and work performance and individual habits can be tracked. Once their personal information is visible, individuals are exposed to intrusive threats from suppression, bullying, scams, burglary, extortion, blackmail, discrimination, exploitation, hacking, spam, identity theft and a range of retaliatory actions.

It does not take intrusive surveillance or malicious cybercrime to cause information harms. Daniel J. Solove points out that information dissemination constitutes “one of the broadest groupings of privacy harms. These harms consist of the revelation of personal data or the threat of spreading information”. Access to personal information could lead to contraventions of “laws protecting against asking questions about a person’s political views or associations” and of laws protecting information on rape victimisation, disabilities, diseases and convictions, as well as dilute the protections of privileged communications between attorneys and clients, priests and penitents, and doctors and patients.

At the personal level, privacy acts as a shield. At the level of society, privacy functions as a buffer. Differences in race, beliefs, opinions, opportunities, income and wealth can be accentuated by transparency. Social media typically reinforce team dynamics and amplify the anger, frustration and disrespect arising from differences. Transparency therefore removes the protective buffers that (1) hide problems and differences and (2) allow different beliefs and systems to co-exist peacefully in ignorance. Removing the buffer forces society to face up to issues and, unfortunately, this often leads to conflict. Below is an analysis of three systemic harms, namely harshness, discrimination and inequality.

Harshness

Harshness can result when misdeeds are allowed to permanently scar an individual’s life. Joshua A.T. Fairfield and Christoph Engel warn that “large pools of data accumulated over time and from many different sources can exert a corrosive effect on social welfare…The first is that data accumulates across time. Humans do not remember contributing the information and do not take precautions against misuse. The second feature is that data accumulates across sources. Again, humans do not adequately account for the fact that what they tell one counterparty will be communicated many times to many others. In both senses, the accumulated data is experienced as toxic: it can harm people in ways they did not foresee…Stale data can cause damage because of its privacy impact”.

They cite an example: “Decades ago, she could have moved on with her life with confidence that her prior conduct would not come back to haunt her, because the information was not concatenated with other datasets or stored in easily searchable fashion. Now, a conviction results in exclusion from the economy because the information is permanently recorded and spreads into background-check databases. Stale data damages citizens’ ability to reinvent themselves; it increases the risk of identity theft; it increases price discrimination; and, through filter bubbling (the practice of limiting search results based on the searching party’s data profile), it decreases the ability of citizens to make informed choices drawn from a range of data sources, among a number of other potential effects”.

Kate Eichhorn notes that the threat of being tainted by stale data is exacerbated by social media, starting with the uploading of photographs of “these young people’s earliest moments online” by family and friends. These images are “eventually sucked into other databases” and “many of these photographs are now available to audiences for which they were never intended”. The “digital natives are also the most intensively tracked generation at school”, where online tools monitor their skill progress and social interactions while surveillance tools are used to monitor “everything from students’ text messages, emails, and social-media posts to their viewing habits on YouTube”. “The data is also frequently shared with law enforcement when potential threats are identified…Students have little control over how their data is being used…[they] are often unaware of the amount and type of data being collected about them and who may have access to it…could be sold to a job recruitment agency years later”.

“In most communities, most people agree that children and teens should be able to make mistakes from time to time and have those mistakes both forgotten and forgiven. This is precisely why most jurisdictions treat young offenders differently from adults”.

Kate Eichhorn argues that “for digital natives, the constant recording of even their most minor mistakes and embarrassments means that this long-standing agreement now appears to be threatened. And this isn’t bad news only for them, but for society at large”. Her “research on youth and media practices indicates that as young people lose their ability to explore new ideas and identities and mess up without consequence, there are two critical dangers”. The first is that some teens become “so risk-averse that they may be missing out on at least some of the experimentation that has long defined adolescence…The risk is that this will produce generations of increasingly cautious individuals – people too worried about what others might find or think to ever engage in productive risks or innovative thinking”.

The second is that “in a world where the past haunts the present, young people may calcify their identities, perspectives, and political positions at an increasingly young age…The risk is that young people who hold extreme views as teenagers may feel there’s no use changing their minds if a negative perception of them sticks regardless…Identities and political perspectives will be hardened in place, not because people are resistant to change but because they won’t be allowed to shed their past. In a world where partisan politics and extremism continue to gain ground, this may be the most dangerous consequence of coming of age in an era when one has nothing left to hide”.

Rachel Botsman points out that in a future “where we will all be branded online and data-mined…an individual’s actions will be judged by standards they can’t control and where that judgement can’t be erased. The consequences are not only troubling; they’re permanent. Forget the right to delete or to be forgotten, to be young and foolish”. She adds that “where these systems really descend into nightmarish territory is that the trust algorithms used are unfairly reductive. They don’t take into account context. For instance, one person might miss paying a bill or a fine because they were in hospital; another may simply be a freeloader…If life-determining algorithms are here to stay, we need to figure out how they can embrace the nuances, inconsistencies and contradictions inherent in human beings and how they can reflect real life”.
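
Botsman's point about reductive scoring can be illustrated with a toy example. The sketch below in Python (hypothetical weights and penalty rule, not any real trust algorithm) shows how a context-blind score treats the hospitalised patient and the freeloader identically, while a context-aware variant separates them:

```python
def naive_trust_score(missed_payments: int) -> float:
    """Context-blind: every missed payment carries the same penalty."""
    return max(0.0, 1.0 - 0.2 * missed_payments)

def contextual_trust_score(missed_payments: int, excused: int) -> float:
    """Context-aware: payments missed for documented reasons
    (e.g. hospitalisation) are not penalised."""
    return max(0.0, 1.0 - 0.2 * (missed_payments - excused))

# Two people each miss two payments; one was in hospital both times.
print(naive_trust_score(2), naive_trust_score(2))  # 0.6 0.6
print(contextual_trust_score(2, excused=2), contextual_trust_score(2, excused=0))  # 1.0 0.6
```

The difficulty, of course, is that “excused” is exactly the kind of nuanced, verifiable context that life-determining algorithms currently lack.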

Discrimination

Transparency may foster systemic discrimination. Alessandro Acquisti, Curtis Taylor and Liad Wagman point out that studies of personal information exposure on Airbnb[4] found patterns of rental discrimination when racial identities were known. In addition, “evidence suggests that employers do use criminal records to screen candidates. Because of the stigmatizing effect associated with a criminal history, individuals with criminal records are more likely to experience job instability and wage decline”.

Inequality

Information visibility reduces frictions, magnifies the ability to exploit information and endowment advantages, and erodes regulatory barriers protecting inefficient and weaker players. Inequality is aggravated by winner-take-all effects, oligopolistic strategies, financialisation (markets, rising costs and worsening income distribution), merit- and price-based discrimination, and by the rising importance of signalling and status (conspicuous consumption and advertising) in economic competition.

The disadvantages of poverty are reinforced by discriminatory treatment. Ciara Byrne notes “low-income communities have historically been monitored by government and their privacy has been routinely invaded. In Colonial America, most towns had an overseer of the poor who tracked poor people and either chased them out of town or auctioned off their labor. Current public benefits programs ask applicants extremely detailed and personal questions and sometimes mandate home visits, drug tests, fingerprinting, and collection of biometric information”. “Yet higher-income recipients of government income transfers are not subjected to the same kind of aggressive data collection”. They “get valuable government benefits in my mortgage home deduction, childcare tax credits, my employer health benefits aren’t taxed…Those are income transfers just as much as food stamps or welfare”, but without intrusive questioning, verification requirements or home visits.

Ciara Byrne highlights that “once the welfare system collects an applicant’s data, that data is shared and compared across multiple government and commercial databases. These databases are plagued by outdated, inaccurate, and incomplete information”. Even “if the data has already been purchased by data brokers, that arrest record[5] can still limit someone’s access to housing, jobs, and other opportunities far into the future”. In some instances, decisions to deny benefits are opaque because they are based on non-transparent algorithms. “Personal data is used to deny low-income people access to resources or opportunities, but it’s also used to target them with predatory marketing for payday loans or even straight-up scams”.

Conclusion

The clamour for stringent privacy regulation is growing louder as the malevolent features of transparency become more evident. But it is on the wrong track. First, it is not clear that regulation can regain the private space ceded to transparency. Second, it is not clear that privacy regulation is an effective remedy for information harms. Lastly, it should be emphasised that transparency did not create these problems; it merely identified them. We should therefore not shy away from the task ahead, which is to figure out how to make transparency work in a high-information environment. Society needs to start figuring out how life should be organised in a fishbowl.

References

Alessandro Acquisti, Curtis Taylor, Liad Wagman (8 March 2016) “The economics of privacy”. Journal of Economic Literature, Vol. 54, No. 2; Sloan Foundation Economics Research Paper. https://ssrn.com/abstract=2580411

Ciara Byrne (19 March 2019) “Trading privacy for survival is another tax on the poor.” Fast Company. https://www.fastcompany.com/90317495/another-tax-on-the-poor-surrendering-privacy-for-survival

Joshua A.T. Fairfield, Christoph Engel (December 2015) “Privacy as a public good”. Duke Law Journal. https://scholarship.law.duke.edu/cgi/viewcontent.cgi?article=3824&context=dlj

Kate Eichhorn (27 December 2019) “Why an internet that never forgets is especially bad for young people”. MIT Technology Review.  https://www.technologyreview.com/s/614941/internet-that-never-forgets-bad-for-young-people-online-permanence/

Phuah Eng Chye (2015) Policy paradigms for the anorexic and financialised economy: Managing the transition to an information society.

Phuah Eng Chye (21 December 2019) “The debate on regulating surveillance”.

Phuah Eng Chye (4 January 2020) “The economics and regulation of privacy”. http://economicsofinformationsociety.com/the-economics-and-regulation-of-privacy/

Phuah Eng Chye (18 January 2020) “Big data and the future for privacy”.

Phuah Eng Chye (15 February 2020) “The costs of privacy regulation”.

Phuah Eng Chye (29 February 2020) “The journey from privacy to transparency (and back again)”. http://economicsofinformationsociety.com/the-journey-from-privacy-to-transparency-and-back-again/

Rachel Botsman (21 October 2017) “Big data meets big brother as China moves to rate its citizens”. Wired. https://www.wired.co.uk/article/chinese-government-social-credit-score-privacy-invasion

Shanu Athiparambath (2019) “How Airbnb is silently changing Himalayan villages”. Veridici. https://veridici.com/how-airbnb-is-silently-changing-himalayan-villages/


[1] https://www.investopedia.com/terms/n/network-effect.asp.

[2] Non-purpose means a decoupling between the data and its purpose.

[3] The anorexic and financialised economy: Transition to an information society.

[4] Study by Edelman and Luca (2014).

[5] Referring to those that did not lead to convictions.