The economics and regulation of privacy

Phuah Eng Chye (4 January 2020)

Privacy is a branch of information economics. Alessandro Acquisti, Curtis Taylor and Liad Wagman note “the value and regulation of information assets have been among the most interesting areas of economic research since Hayek’s 1945 treatise on the use of knowledge in society…Seminal studies have investigated the informative role of prices in market economies (Stigler, 1961); the creation of knowledge and the incentives to innovate (Arrow, 1962); the prevalence of asymmetric information and adverse selection (Akerlof, 1970); the transmission of private information through signaling activity (Spence, 1973); and voluntary disclosures (Grossman, 1981; Milgrom, 1981). It may be proper, however, to think of information economics not as a single field, but as an amalgam of many related sub-fields. One such sub-field now receiving growing attention…the study of privacy”.

Privacy is difficult to define. It means different things to different people. Alessandro Acquisti, Curtis Taylor and Liad Wagman (2016) note in “The economics of privacy” that it “has been described as the protection of someone’s personal space and their right to be left alone; as the control over and safeguard of personal information; and as an aspect of dignity, autonomy, and ultimately human freedom. While seemingly different, these definitions are related, because they pertain to the boundaries between the self and the others, between private and shared, or, in fact, public. As individuals and as consumers, we constantly navigate those boundaries, and the decisions we make about them determine tangible and intangible benefits and costs, for ourselves and for society. Thus, at its core, the economics of privacy concerns the trade-offs associated with the balancing of public and private spheres between individuals, organizations, and governments”.

As with many aspects of information economics, it is difficult to be definitive on privacy. Alessandro Acquisti, Curtis Taylor and Liad Wagman note “previous scholarship has distinguished different dimensions of privacy (such as seclusion, secrecy, solitude, anonymity[1], autonomy, freedom, and so forth)”. The literature also covers “the economic dimensions of spam or the do-not-call registry (which relate to intrusions of a person’s cyberspace made possible by knowledge of her information); or the burgeoning literature on the economics of information security (…data breaches or identity theft that involve personal data, but more often relates to the protection of information infrastructures and other types of informational assets)”. Overall, “the value of information protection and sharing are almost entirely context-dependent” and “whether privacy protection entails a net positive or negative change in purely economic terms: its impact is context specific”.

The theoretical exploration of privacy operates at two levels. The first explores privacy at the level of individuals, revolving around the microeconomic effects of disclosing information and their consequences for the value of privacy. Alessandro Acquisti, Curtis Taylor and Liad Wagman note the value of privacy is highly affected by context.

  • “Privacy sensitivities and attitudes are subjective and idiosyncratic…Different pieces of information will matter differently to different people[2]…Specifically, individuals differ in what they may experience if some private information were to be shared with others or made public, as well as in their beliefs that the information may in fact be released”.
  • “The value of information will change over time”[3]. “Privacy trade-offs are also inherently intertemporal: disclosing data often carries an immediate benefit, be it intangible (friends liking your online status updates) or tangible (a merchant offering you a discount). The costs of doing so are often uncertain, and are generally incurred at a more distant point in time”[4].
  • “The value and sensitivity of one piece of personal information will change depending on the other pieces of data with which it can be combined”.
  • “Disclosing data often causes a reversal of informational asymmetries: beforehand, the data subject may know something the data holder does not[5]; afterwards, the data subject may not know what the data holder will do with their data, and with what consequences”[6].
  • “Privacy trade-offs often mix the tangible (the discount I will receive from the merchant; the increase in premium I will pay to the insurer), with the intangible (the psychological discomfort I experience when something very personal is exposed without my consent), and the nearly incommensurable (the effect on society of surveillance; the loss of autonomy we endure when others know so much about us)”.
  • “Privacy has elements of both a final good (one valued for its own sake), and an intermediate good (one valued for instrumental purposes). Attitudes towards privacy mainly capture subjective preferences; that is, people’s valuations of privacy as a good in itself (privacy as a final good). But those valuations are separate and sometimes even disjoint from the actual trade-offs that arise following the protection or sharing of personal data (from price discrimination to identity theft; from coupons to personalized services) – that is, from the value of privacy as an intermediate good”[7].
  • “It is not always obvious how to properly value privacy and personal data. Should the reference point be the price one would accept to give away their data, or the amount they would pay to protect it? Or, should it be the expected cost the data subject may suffer if her data is exposed, or the expected profit the data holder can generate from acquiring her personal information?”
  • The “apparent dichotomy between privacy attitudes, privacy intentions, and actual privacy behaviors” creates a “privacy paradox” which asks whether “people actually care about privacy?” and “how much exactly do they value the protection of their personal data?” In this regard, “at the same time as they profess their need for privacy, most consumers remain avid users of information technologies that track and share their personal information with unknown third parties. If anything, the adoption of privacy-enhancing technologies lags vastly behind the adoption of sharing technologies”. Alessandro Acquisti, Curtis Taylor and Liad Wagman suggest the privacy paradox is “the result of many, coexisting, and not mutually exclusive different factors…various decision-making hurdles consumers face when dealing with privacy challenges, especially online, such as asymmetric information, bounded rationality, and various heuristics. For instance, some individuals may not be aware of the extent to which their personal information is collected and identified online[8]”.
  • “Market interactions involving personal data often take place in the absence of individuals’ fully informed consent”. In this regard, “there is yet no open, recognized market for personal data in which data subjects themselves can participate. Personal data is continuously bought, sold, and traded among firms, but consumers themselves do not have access to those markets: they cannot yet efficiently buy back their data, or offer their data for sale”.
  • Alessandro Acquisti, Curtis Taylor and Liad Wagman point out privacy can be viewed as a form of “control over sharing”. “Shared personal information can have, sometimes, characteristics of a public good, such as non-rivalry and non-excludability[9]”. Since the value of privacy is dependent on context, the “potential benefits of strategically sharing certain data” vis-à-vis the “potential costs of having too much information disclosed to the wrong parties”[10] is indeterminate. On the other hand, “individuals can directly benefit from sharing their data”[11]. “Those benefits turn into opportunity costs when the individual chooses not to reveal certain personal data”. In addition, “both positive and negative externalities arise through the complex interplay of data creation and transmission…the benefits arising from individuals sharing their information…may be enjoyed by society as a whole”.

Context thus complicates estimating the value of privacy. An individual produces information with the intention of making it public only to some users and at certain times. He may or may not mind if others find out. Since it is not possible to anticipate when and how the data will be used, it is difficult to know in advance which users and uses an individual would agree to and which he would object to. This is why it is difficult to be definitive on the cost-benefits of privacy regulation.

The second explores privacy at the level of society. Two propositions have garnered attention, namely the unravelling of privacy and privacy as a public good. These propositions highlight the paradoxes or inherent contradictions of privacy regulation.

Scott R. Peppet’s proposition on the unravelling of privacy is based on the observation that “when a few people have the ability and incentive to disclose, everyone may ultimately be forced to do so”. He attributes this to the shift “towards a signaling economy, as opposed to the sorting[12] economy in which we have lived since the late 1800s”, which “poses a very different threat to privacy than the threats of data mining, aggregation, and sorting that have preoccupied the burgeoning field of informational privacy for the last decade…In a world of verifiable information and low-cost signaling, the game-theoretic unraveling[13] effect kicks in, leading self-interested actors to fully disclose their personal information for economic gain. Although at first consumers may receive a discount for using a driving or health monitor, privacy may unravel as those who refuse to disclose are assumed to be withholding negative information and therefore stigmatized and penalized”.
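The unraveling logic Peppet describes can be made concrete with a minimal simulation. The sketch below is illustrative only (the agents, scores, and threshold rule are assumptions, not drawn from Peppet): agents hold a verifiable quality score, observers infer that silence means an average non-discloser, and any agent who beats that inference discloses, which in turn lowers the inference attached to silence until only the lowest type remains silent.

```python
# Illustrative sketch of the game-theoretic unraveling ("full disclosure")
# effect. Scores, agents, and the inference rule are hypothetical.

def unravel(scores):
    """Iterate disclosure decisions until only agents who cannot beat the
    inference attached to silence remain silent."""
    silent = set(range(len(scores)))
    while True:
        # Observers assume a silent agent has the average silent-agent score.
        inferred = sum(scores[i] for i in silent) / len(silent)
        # Any silent agent above that inference gains by disclosing.
        disclosers = {i for i in silent if scores[i] > inferred}
        if not disclosers:
            return silent  # fixed point: only the lowest types stay silent
        silent -= disclosers

scores = [10, 30, 50, 70, 90]
remaining = unravel(scores)
print(sorted(scores[i] for i in remaining))  # → [10]
```

Each round, the best remaining silent agent defects from silence, so disclosure cascades downward; in the end everyone except the very lowest type has "voluntarily" disclosed, which is the stigmatization of silence the passage describes.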

Scott R. Peppet notes “the literature is filled with calls to give individuals greater control over their personal information through the common law of property and tort and through stronger statutory privacy rights”. Yet, “in a signaling economy, even if individuals have control over their personal information, that control is itself the undoing of their privacy. Because individuals hold the keys, they can be asked – or forced – to unlock the door to their personal information. Those who refuse to share their private information will face new forms of economic discrimination”.

In this regard, “privacy law is grounded in classical liberal conceptions of autonomy and individualism. It assumes…control over information will reestablish that autonomy…Privacy is not violated, accordingly, if individuals freely consent to disclosure of information about themselves”. Hence, Scott R. Peppet notes “scholars have sought to correct these market failures, often by turning to Fair Information Practices[14] (or Principles) (FIPs) to protect information. FIPs generally require that data collection, aggregation, and storage be transparent, disclosed to individuals, secure, accurate, and limited in duration”, backed up by “legislating data handling and disclosure standards”. But FIPs “are meant to remedy the market failure of uninformed, false consent but not to stop disclosure completely. FIPs therefore open the door to the unraveling of privacy because, in a signaling economy, even fully informed consumers may find themselves disclosing to counter the negative assumptions attached to silence”.

Scott R. Peppet notes “control proposals have largely failed”. “There is no comprehensive federal privacy statute, and state statutes are erratic and incomplete. Despite many legislative proposals by scholars and privacy advocates, they have not garnered legislative support…One might therefore draw the conclusion that the privacy field’s inability to even enact comprehensive control reforms suggests that addressing the signaling economy will be next to impossible”.

Scott R. Peppet highlights that “incentives to signal raise exactly the questions to which informational privacy law must turn: questions of justice, fairness, paternalism, and power; questions about coercion and the limits of voluntary disclosure; questions, in short, about how to deal with the threat of unraveling privacy”. Yet he considers that “ultimately, however, unraveling may lead to full participation and full disclosure, even by those that might at first hesitate for fear that their checkered personal histories will be used to discriminate against them”. In this regard, the economy is changing as “the need for rationalizing information to price risk, sort consumers, and the like both drives invasive maneuvers to access information…and leads to attempts to induce its disclosure”. Scott R. Peppet argues that “if privacy advocates fear a future of full disclosure, they must articulate why. In a signaling economy, they will face even more organized opposition to restricting information disclosure than they have faced to date”.

The other paradox is based on the concept of “privacy as a public good”. Joshua A.T. Fairfield and Christoph Engel point out “individual control of data is a fundamentally flawed concept because individuals cannot know what the data they reveal means when aggregated with billions of other data points…No matter how healthy or creditworthy or committed to work a person may be, he might not receive a home loan, job offer, or affordable insurance, because of correlations ascertained from others’ data. If you believe in the effectiveness of incentivizing, informing, and empowering individual citizens to protect their own privacy, this is very bad news. As long as the immediate benefit from disclosing your data exceeds the ensuing long-term risk for your own privacy, you will give away your data”. Hence, “the data that a person produces concerns both herself and others…Individuals are vulnerable merely because others have been careless with their data. As a result, privacy protection requires group coordination. Failure of coordination means a failure of privacy. In short, privacy is a public good”.

Joshua A.T. Fairfield and Christoph Engel suggest the “manner in which law addresses privacy will and must undergo a sea change. Today’s social, legal, and self-regulatory tools focus on empowering individuals. They must equally be focused on empowering groups. Individual empowerment is not enough because an individual’s disclosure of information about herself impacts many other people”. “In the absence of public-policy attention to privacy’s group dimension, individual consumers have been left to negotiate, unsuccessfully, with companies over the use of their data. Private companies have accumulated deep and potentially toxic pools of consumer data, and have made this data available to governments with few legal safeguards”.

Joshua A.T. Fairfield and Christoph Engel highlight that “when a lawyer discusses the public good, and an economist explores a public good, they are likely talking about two quite different things”. In the legal version, “the public good refers to what is good for the public…The public good is a general assertion that the public will be better off …The public good does not necessarily suffer from a free-rider problem…Actions taken to promote the public good are not necessarily social dilemmas”.

In the economic version, “a public good refers to a good, a product, which is produced by groups under certain conditions that create a tension between selfishness and cooperation…A public good is a product, good, service, or other benefit that may not be produced, because everyone can share equally in it, whether they contribute to it or not…A public good is defined by a free-rider problem…A public good necessarily involves a social dilemma”.

Joshua A.T. Fairfield and Christoph Engel argue it is “quite clear that not everything that is in the public good is necessarily a public good”. “Much legal scholarship begins with the premise that individual privacy is good, and that because it is good, protecting privacy is socially beneficial, or in the public good. The problem is that…many legal analyses are confusing as to whether they truly address a public good…The assertion that something is good for each citizen individually does not mean that it is good for society as a whole – that is precisely the nature of a social dilemma. So assertions that privacy is an individual right which, when enjoyed by society as a whole, is beneficial, do not capture the tension at the heart of public goods”.

They explain that “in a social dilemma, defection – by free riding on a public good, or contributing to a public bad – is a dominant strategy. Cooperation, defined as contributing to a public good, or refraining from contributing to a public bad, is socially optimal, but an inferior strategy from the individual perspective…Privacy will fall prey to social dilemmas. In weighing important decisions about privacy, individual and group incentives diverge. And without measured intervention, individuals’ fully informed privacy decisions tend to reduce overall privacy, even if everyone cherishes privacy equally and intensely”. “Informing and empowering individual players does not resolve a social dilemma. It is precisely the fully informed, rational, and empowered individual who knows she is better off contributing fully to a public bad, and free riding on a public good, regardless of the actions of others…In short, if privacy is a social dilemma, the very education and empowerment that regulators rely on to ameliorate the dilemma may instead exacerbate it”.
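The social-dilemma structure described above can be illustrated with a standard two-player public-good payoff. The numbers and function below are assumptions for illustration, not from Fairfield and Engel: each player keeps whatever effort she does not contribute to collective privacy protection, while pooled contributions are multiplied and shared equally, so defection strictly dominates even though mutual cooperation leaves both players better off.

```python
# Illustrative two-player public-good game showing the social dilemma:
# defection dominates individually, yet mutual cooperation is socially
# optimal. All parameter values are hypothetical.

ENDOWMENT = 10      # private benefit retained by not contributing
MULTIPLIER = 1.5    # social return on pooled contributions (1 < m < n)

def payoff(my_contrib, other_contrib):
    """Keep the uncontributed endowment, plus an equal share of the
    multiplied contribution pool."""
    pool = MULTIPLIER * (my_contrib + other_contrib)
    return (ENDOWMENT - my_contrib) + pool / 2

# Defection (contribute 0) strictly dominates cooperation (contribute 10),
# whatever the other player does...
assert payoff(0, 10) > payoff(10, 10)   # 17.5 > 15.0
assert payoff(0, 0) > payoff(10, 0)     # 10.0 > 7.5
# ...yet mutual cooperation beats mutual defection for both players.
assert payoff(10, 10) > payoff(0, 0)    # 15.0 > 10.0
print(payoff(10, 10), payoff(0, 0))     # → 15.0 10.0
```

This is why, as the passage argues, fully informing and empowering each player does not resolve the dilemma: the rational individual can verify from the payoffs that free riding is her best response regardless of what others do.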

Joshua A.T. Fairfield and Christoph Engel suggest “the relevant legal tools therefore should be redesigned to…permit groups to sustain cooperation and protect privacy even without direct government intervention. We suggest a focus on empowering groups. We suggest leveraging inequity aversion, reciprocity, and normativity to lessen exploitation among group members. We suggest positive framing to promote altruism. We suggest that communication and (private) sanctions are key components of group coordination. With these tools, groups may be able to sustain privacy without governmental intervention and the challenges and distortions that flow therefrom”.

The social dilemmas confronting privacy as a type of public good can be compared to “a kind of commons” for which “clean air, safety, roads, and the common defense all share the same incentive structure”. In particular, surveillance (and spam) has been considered as a form of pollution and this “provides a strong set of analogies for how to craft and maintain political coalitions to resolve particularly harmful collective-action problems”.

Joshua A.T. Fairfield and Christoph Engel recognise the “tension between individual privacy and public need to know particularly influences modern discussions of the reach and role of the surveillance state”. They note however it is not “useful to pit public goods against one another in pairwise comparisons. Why should we pit security versus privacy, and not against public education, or clean air, or any of millions of other public goods…The existence of many public goods does not reduce the need to examine each, and to maximize social welfare from investment in that good”. They argue this “creates a false dichotomy between privacy and security, and does not adequately account for harms created by mass surveillance…we maintain that exploring how to maximize the social value of privacy is valuable regardless of any trade-off effects between privacy and security”.

Overall, the phenomena of unravelling and signalling are outcomes of society’s transition from a low to a high information environment. As the level of information needed for society to function efficiently rises, the ability of individuals to keep their information private is reduced. In this regard, privacy has been affected by the emergence of “big data”.

References

Alessandro Acquisti, Curtis Taylor, Liad Wagman (8 March 2016) “The economics of privacy”. Journal of Economic Literature; Sloan Foundation Economics Research Paper. https://ssrn.com/abstract=2580411

Daniel J. Solove (2016) “A brief history of information privacy law”. Proskauer on Privacy, PLI; GWU Law School Public Law Research Paper. https://ssrn.com/abstract=914271

Joshua A.T. Fairfield, Christoph Engel (December 2015) “Privacy as a public good”. Duke Law Journal. https://scholarship.law.duke.edu/cgi/viewcontent.cgi?article=3824&context=dlj

Phuah Eng Chye (21 December 2019) “The debate on regulating surveillance”. http://economicsofinformationsociety.com/the-debate-on-regulating-surveillance/

Scott R. Peppet (7 August 2010) “Unraveling privacy: The personal prospectus & the threat of a full disclosure future”. Northwestern University Law Review. https://scholarlycommons.law.northwestern.edu/cgi/viewcontent.cgi?article=1157&context=nulr

Woodrow Hartzog (2017) “The inadequate, invaluable fair information practices”. Maryland Law Review. https://digitalcommons.law.umaryland.edu/cgi/viewcontent.cgi?referer=https://www.google.com/&httpsredir=1&article=3759&context=mlr


[1] Anonymity is a form of privacy for identity information achieved by removing the link between a person and data items relating to that person.

[2] “Your piano teacher may not be as interested in the schools you attended, unlike your potential employer”.

[3] “An online advertiser may not be as interested in logs of your online activity from five years ago as in your activity right now”.

[4] “A future prospective employer may not like that risque photo you had uploaded from vacation as much as your friends did at the time; a merchant may collect information about you today, and use it for price discrimination the next time you visit its store”.

[5] “For instance, a customer’s willingness to pay for a good”.

[6] “For instance, how the merchant will use the customer’s information, including estimates of her reservation price, following a purchase”.

[7] “For instance, regardless of whether an individual thinks my life is an open book, I have nothing to hide, that individual will still suffer tangible harm if she is a victim of identity theft”.

[8] “Many Internet users are substantially unaware of the extent of behavioral targeting, and many believe that there is an implied duty of confidentiality and law that protects their data despite disclosure”.

[9] “A complex online advertising ecosystem engages in trades of Internet users’ personal information; in fact, it is hard to prevent released data from being duplicated and accessed by other parties, or to control its secondary uses”.

[10] “(from price discrimination to other more odious forms of discrimination; from social stigma to blackmailing; from intangible nuisances to identity theft)”.

[11] “personalized services and discounts” after joining a loyalty program; or “reduced search costs and increased accuracy of information retrieval”.

[12] Scott R. Peppet explains “economic actors do not always need to sort or screen each other based on publicly available information; instead, they can incentivize each other to signal their characteristics”.

[13] Economist Robert Frank coined the full disclosure principle to describe this phenomenon: “if some individuals stand to benefit by revealing a favorable value of some trait, others will be forced to disclose their less favorable values.” See Scott R. Peppet.

[14] There is further analysis of FIPs by Daniel J. Solove and by Woodrow Hartzog.