3806), the Data Protection Act of 2017 (H.R. 3904), the Market Data Protection Act of 2017 (H.R. 3973), the Cyber Breach Notification Act (H.R. 3975), the Data Broker Accountability and Transparency Act (S. 1815), and the Data Security and Breach Notification Act (S. 2179). They are under committee review and are likely to be consolidated.
41. The National Conference of State Legislatures collects information on these state laws. For data breach laws, see http://www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-laws.aspx. For privacy laws, see http://www.ncsl.org/research/telecommunications-and-information-technology/state-laws-related-to-internet-privacy.aspx.
research point of view, these variations are useful for studying the impact of data breach laws on identity theft (Romanosky, Acquisti, and Telang 2011)42 and data breach lawsuits (Romanosky, Hoffman, and Acquisti 2014), but they can be difficult to comply with if a firm operates in multiple states. It is also difficult for consumers to form an expectation of privacy protection, especially if they transact with both in-state and out-of-state firms.
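To fix ideas, a stylized version of the state-panel design behind such estimates (my notation; the papers' actual specifications differ) regresses identity theft on law adoption with state and period fixed effects:

\[ \mathrm{IDTheft}_{st} = \beta \, \mathrm{Law}_{st} + \gamma_s + \delta_t + \varepsilon_{st}, \]

where Law_{st} turns on once state s has a breach notification law in period t, and \beta captures the average effect of adopting the law (Romanosky, Acquisti, and Telang [2011] report a reduction of about 6.1 percent; see footnote 42).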
In short, the US system is piecemeal and multilayered, in contrast to the European Union's attempt to unify data protection via its General Data Protection Regulation (effective in 2018).43 Which approach is better for society is subject to an ongoing debate.
18.4 Future Challenges
To summarize, there are pressing issues in consumer privacy and data security, many of which are likely to be reshaped by AI and other data technologies.
A number of big questions arise: Shall we continue to let the market evolve under the current laws, or shall we be more aggressive in government regulation? How do firms choose data technology and data policy if consumers demand both convenience and privacy? How do we balance AI-powered innovations against the extra risk that the same technology brings to privacy and data security? If action is needed from policymakers, shall we let local governments use trial and error, or shall we push for federal legislation nationwide? Shall we wait for new legislation to address standing loopholes, or shall we rely on the court system to clarify existing laws case by case? These questions deserve attention from researchers in many disciplines, including economics, computer science, information science, statistics, marketing, and law.
In my opinion, the leading concern is that firms are not fully accountable for the risk they bring to consumer privacy and data security.44 To restore full accountability, one needs to overcome three obstacles, namely (a) the difficulty of observing firms' actual actions in data collection, data storage, and data use; (b) the difficulty of quantifying the consequences of data practice, especially before low-probability adverse events realize themselves; and (c) the difficulty of drawing a causal link between a firm's data practice and its consequences.
These difficulties exist not only because of technical limits, but also because of misaligned incentives. Even if blockchain can track every piece of data and AI can predict the likelihood of every adverse event, whether
42. Romanosky, Acquisti, and Telang (2011) explore differences among state data breach notification laws and link them to an FTC database of identity theft from 2002 to 2009. They find that adoption of data breach disclosure laws reduces identity theft caused by data breaches by an average of 6.1 percent.
43. An overview of GDPR is available at http://www.eugdpr.org/.
44. The same problem applies to nonprofit organizations and governments.
to develop and adopt such technology is up to firms. In the current setting, firms may still have incentives to hide real data practice from the public, to obfuscate information disclosed to consumers, or to blame other random factors for consumer harm.
There is a case for further changes to instill more transparency into the progression from data practice to harmful outcomes, and to translate outcomes (realized or probabilistic) into incentives that directly affect firms' choice of data practice. These changes should not aim to slow down data technology or to break up big firms just because they are big and on the verge of an AI breakthrough. Rather, the incentive correction should aim to help consumer-friendly data practice stand out from lemons, which in turn fosters innovations that respect consumer demand for privacy and data security.
There might be multiple ways to address misaligned incentives, including new legislation, industry self-regulation, court rulings, and consumer protection. Below I comment on the challenges of a few of them.
First, it is tempting to follow the steps of safety regulation. After all, the information problems we encounter in privacy and data security—as highlighted in section 18.1—are similar to those in food, drug, air, car, or nuclear safety. In those areas, the consequence of inadequate quality control is random and noisy, just as identity thefts and refund frauds are. In addition, firm input and process choices—like ingredients and plant maintenance—are often unobservable to final consumers. A common solution is direct regulation of the firm's actions: for example, restaurants must keep food at a certain temperature, nuclear plants must pass periodic inspections, and so forth. These regulations rest on the assumption that we know which actions are good and which are bad. Unfortunately, that assumption is hard to satisfy in data practice. With fast-evolving technology, are we sure that politicians in Washington, DC, are the best ones to judge whether multifactor authentication is better than a twenty-character password? How do we ensure that the regulation is updated with every round of technological advance?
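As a concrete illustration of why such judgments are hard, consider a back-of-envelope entropy comparison (a sketch under illustrative assumptions, not a security standard):

    import math

    # Brute-force entropy of a random 20-character password drawn from
    # 62 symbols (a-z, A-Z, 0-9):
    password_bits = 20 * math.log2(62)   # about 119 bits

    # A six-digit one-time code of the kind used in multifactor authentication:
    otp_bits = math.log2(10 ** 6)        # about 20 bits per attempt window

    print(f"20-char password: {password_bits:.0f} bits")
    print(f"6-digit OTP:      {otp_bits:.0f} bits")

The raw numbers favor the long password, yet multifactor authentication can still win in practice: entropy says nothing about phishing, password reuse, or database leaks, all of which an independent second factor mitigates. A fixed regulatory rule would have to anticipate exactly this kind of context dependence.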
The second approach relies on firm disclosure and consumer choice. "Notice and choice" is already the backbone of FTC enforcement (in privacy), and data breach notification laws follow a similar principle. For this approach to be effective, we must assume consumers can make the best choice for themselves as long as they have adequate information at hand. This assumption is unlikely to hold in privacy and data security: most consumers do not read privacy notices (McDonald and Cranor 2008), many data-intensive firms may not have a consumer interface, and consumers may find it difficult to choose because they lack the ability to evaluate different data practices and do not know what choices are available to mitigate the potential harm. Furthermore, firms' data practice may change frequently in light of technological advance, so delivering updated notices to consumers may be infeasible and overwhelming.
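The scale of the reading burden alone is easy to see with a back-of-envelope calculation in the spirit of McDonald and Cranor (2008); the numbers below are illustrative assumptions, not the paper's exact figures:

    # Rough cost of actually reading privacy notices (illustrative assumptions).
    words_per_policy = 2500      # assumed length of a typical privacy policy
    reading_speed_wpm = 250      # assumed adult reading speed, words per minute
    sites_per_year = 1400        # assumed distinct websites visited annually

    minutes_per_policy = words_per_policy / reading_speed_wpm      # 10 minutes
    hours_per_year = sites_per_year * minutes_per_policy / 60      # ~233 hours

    print(f"{minutes_per_policy:.0f} minutes per policy, "
          f"{hours_per_year:.0f} hours per year")

Under these assumptions, weeks of full-time work per consumer per year would go to reading notices alone, before any actual choice is made.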
The third approach is industry self-regulation. Firms know more about data technology and data practice, and therefore are better positioned to identify best practices. However, can we trust firms to impose and enforce regulations on themselves? History suggests that industry self-regulation may not occur without the threat of government regulation (Fung, Graham, and Weil 2007). This suggests that efforts pushing for government action may complement rather than substitute for industry attempts to self-regulate. Another challenge is technical: many organizations are trying to develop a rating system for data practices, but it is hard to find comprehensive and up-to-date information firm by firm. This is not surprising, given the information asymmetry between firms and consumers. Solving this problem is crucial for any rating system to work.
The fourth approach is defining and enforcing privacy and data use as "rights." Law scholars have long considered privacy the right to be left alone, and have debated whether privacy rights and property rights should be treated separately (Warren and Brandeis 1890). As summarized in Acquisti, Taylor, and Wagman (2016), when economists consider privacy and data use as rights, they tend to associate them with property rights. In practice, the European Union has followed the "human rights" approach, which curtails the transfer and contracting rights that are often assumed under a "property rights" approach. The European Union recognized individual rights of data access, data processing, data rectification, and data erasure in its new legislation (GDPR, effective in May 2018). The impact of GDPR remains to be seen, but two challenges are worth mentioning. First, for many data-intensive products (say, self-driving cars), data do not exist until the user interacts with the product, often with third-party support (say, GPS service and car insurance). Should the data belong to the user, the producer, or third parties? Second, even if property rights over data can be clearly defined, that does not imply perfect compliance; music piracy is a good example. Both challenges could deter data-driven innovations if the innovator has to obtain the rights to use data from multiple parties beforehand.
Clearly, no approach is free of challenges. Given the enormous impact that AI and big data may have on the economy, it is important to get the market environment right. This environment should respect consumer demand for privacy and data security, encourage responsible data practices, and foster consumer-friendly innovations.
References
Ablon, Lilian, Paul Heaton, Diana Lavery, and Sasha Romanosky. 2016. Consumer Attitudes Toward Data Breach Notifications and Loss of Personal Information. Santa Monica, CA: RAND Corporation.
Accenture. 2017. 2017 Insights on the Security Investments That Make a Difference.
https://www.accenture.com/t20170926T072837Z__w__/us-en/_acnmedia/PDF-61/Accenture-2017-CostCyberCrimeStudy.pdf.
Acquisti, Alessandro, Laura Brandimarte, and George Loewenstein. 2015. "Privacy and Human Behavior in the Age of Information." Science 347 (6221): 509–14.
Acquisti, Alessandro, Ralph Gross, and Fred Stutzman. 2014. "Face Recognition and Privacy in the Age of Augmented Reality." Journal of Privacy and Confidentiality 6 (2): Article 1. http://repository.cmu.edu/jpc/vol6/iss2/1.
Acquisti, Alessandro, Curtis Taylor, and Liad Wagman. 2016. "The Economics of Privacy." Journal of Economic Literature 54 (2): 442–92.
Athey, Susan, Christian Catalini, and Catherine E. Tucker. 2017. "The Digital Privacy Paradox: Small Money, Small Costs, Small Talk." MIT Sloan Research Paper no. 5196-17, Massachusetts Institute of Technology. https://ssrn.com/abstract=2916489.
Campbell, Katherine, Lawrence A. Gordon, Martin P. Loeb, and Lei Zhou. 2003. "The Economic Cost of Publicly Announced Information Security Breaches: Empirical Evidence from the Stock Market." Journal of Computer Security 11 (3): 431–48.
Catalini, Christian, and Joshua S. Gans. 2017. "Some Simple Economics of the Blockchain." Rotman School of Management Working Paper no. 2874598. https://ssrn.com/abstract=2874598.
Cavusoglu, Huseyin, Birendra Mishra, and Srinivasan Raghunathan. 2004. "The Effect of Internet Security Breach Announcements on Market Value: Capital Market Reactions for Breached Firms and Internet Security Developers." International Journal of Electronic Commerce 9 (1): 69–104.
Chiou, Lesley, and Catherine E. Tucker. 2014. “Search Engines and Data Retention:
Implications for Privacy and Antitrust.” MIT Sloan Research Paper no. 5094-14,
Massachusetts Institute of Technology.
Dwork, Cynthia, Frank McSherry, Kobbi Nissim, and Adam Smith. 2006. "Calibrating Noise to Sensitivity in Private Data Analysis." In Theory of Cryptography Conference, Lecture Notes in Computer Science, vol. 3876, edited by S. Halevi and T. Rabin, 265–84. Berlin: Springer.
Erlingsson, Úlfar, Vasyl Pihur, and Aleksandra Korolova. 2014. "RAPPOR: Randomized Aggregatable Privacy-Preserving Ordinal Response." In Proceedings of the ACM SIGSAC Conference on Computer and Communications Security (CCS): 1054–67.
Federal Trade Commission (FTC). 2012. Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers. https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-report-protecting-consumer-privacy-era-rapid-change-recommendations/120326privacyreport.pdf.
———. 2014. Consumer Sentinel Network Data Book from January–December 2014. https://www.ftc.gov/system/files/documents/reports/consumer-sentinel-network-data-book-january-december-2014/sentinel-cy2014-1.pdf.
———. 2015. Consumer Sentinel Network Data Book from January–December 2015. https://www.ftc.gov/system/files/documents/reports/consumer-sentinel-network-data-book-january-december-2015/160229csn-2015databook.pdf.
———. 2016. Consumer Sentinel Network Data Book from January–December 2016. https://www.ftc.gov/system/files/documents/reports/consumer-sentinel-network-data-book-january-december-2016/csn_cy-2016_data_book.pdf.
Florêncio, Dinei, and Cormac Herley. 2010. "Where Do Security Policies Come From?" Symposium on Usable Privacy and Security (SOUPS), July 14–16, Redmond, WA.
Fung, Archon, Mary Graham, and David Weil. 2007. Full Disclosure: The Perils and
Promise of Transparency. Cambridge: Cambridge University Press.
Goldfarb, Avi, and Catherine E. Tucker. 2012. "Shifts in Privacy Concerns." American Economic Review: Papers and Proceedings 102 (3): 349–53.
Government Accountability Office (GAO). 2015. Identity Theft and Tax Fraud: Enhanced Authentication Could Combat Refund Fraud, but IRS Lacks an Estimate of Costs, Benefits and Risks, GAO-15-119, January. https://www.gao.gov/products/GAO-15-119.
———. 2017. Identity Theft Services: Services Offer Some Benefits but Are Limited in Preventing Fraud, GAO-17-254, March. http://www.gao.gov/assets/690/683842.pdf.
Harrell, Erika. 2014. Victims of Identity Theft, 2014. Bureau of Justice Statistics. https://www.bjs.gov/content/pub/pdf/vit14.pdf.
Hoffman, Mitchell, Lisa B. Kahn, and Danielle Li. 2018. "Discretion in Hiring." Quarterly Journal of Economics 133 (2): 765–800.
Hsu, Justin, Marco Gaboardi, Andreas Haeberlen, Sanjeev Khanna, Arjun Narayan, Benjamin C. Pierce, and Aaron Roth. 2014. "Differential Privacy: An Economic Method for Choosing Epsilon." In 27th IEEE Computer Security Foundations Symposium (CSF): 398–410.
Jin, Ginger Zhe, and Andrew Stivers. 2017. "Protecting Consumers in Privacy and Data Security: A Perspective of Information Economics." Working paper. https://ssrn.com/abstract=3006172.
Ko, Myung, and Carlos Dorantes. 2006. "The Impact of Information Security Breaches on Financial Performance of the Breached Firms: An Empirical Investigation." Journal of Information Technology Management 17 (2): 13–22.
McDonald, Aleecia, and Lorrie Faith Cranor. 2008. "The Cost of Reading Privacy Policies." I/S: A Journal of Law and Policy for the Information Society 4 (3): 540–65.
Miller, Amalia, and Catherine E. Tucker. Forthcoming. "Privacy Protection, Personalized Medicine and Genetic Testing." Management Science.
Odlyzko, Andrew. 2003. "Privacy, Economics, and Price Discrimination on the Internet." In Economics of Information Security, edited by L. Jean Camp and Stephen Lewis, 187–212. Norwell, MA: Kluwer Academic Publishers.
Pew Research Center. 2016. The State of Privacy in Post-Snowden America. http://www.pewresearch.org/fact-tank/2016/09/21/the-state-of-privacy-in-america/.
Ponemon. 2017. 2017 Ponemon Cost of Data Breach Study. https://www.ibm.com/security/data-breach/index.html.
Posner, Richard A. 1981. “The Economics of Privacy.” American Economic Review