user profile pictures and likenesses without user consent).25
Similarly, a privacy scare prompted Samsung to change its privacy policy.
In February 2015, CNN quoted a paragraph of Samsung’s privacy policy,
22. https://www.washingtonpost.com/business/economy/2012/06/26/gJQATDUB5V_story.html?utm_term=.1ab4fedd7683, accessed October 19, 2017.
23. Matt McKeon gives a graphical account of how Facebook privacy evolved from 2005 to 2010, at http://mattmckeon.com/facebook-privacy/, accessed October 24, 2017.
24. http://60secondmarketer.com/blog/2014/09/21/facebook-tightens-privacy-controls-affect-marketing/, accessed October 24, 2017.
25. https://www.wired.com/2013/08/judge-approves-20-million-facebook-sponsored-stories-settlement/, accessed October 24, 2017.
which stated that words spoken in front of a Samsung Smart TV are captured and transmitted to a third party through use of voice recognition.26 In response to the intense fear that smart TVs “spy” in a private living room, Samsung later changed its privacy policy.27 Samsung also clarified that voice recognition can be disabled and that it uses industry-standard encryption to secure the data.
The privacy competition in the smartphone market is even more interesting. In 2015, Google launched Android 6.0 (Marshmallow),28 which prompts users to grant or deny individual permissions (e.g., access to the camera) to a mobile app when the permission is first needed, rather than automatically granting apps all of their specified permissions at installation. It also allows users to change the permissions at any time. Similar features were made available earlier in Apple iOS 8.29 Apple’s commitment to privacy protection was also highlighted when Apple refused to unlock the iPhone of one of the shooters in the December 2015 terrorist attack in San Bernardino, California.
As a pioneer in biometric authentication, Apple recently announced Face ID in its next smartphone launch (the iPhone X). Using infrared cameras, Face ID uniquely identifies a user’s face and utilizes that information to unlock the smartphone and authorize Apple Pay. Though it is meant to enhance convenience and security, Face ID has stirred a number of privacy concerns, including exposing consumer privacy to Apple employees and allowing the police to forcibly unlock a phone using the owner’s face. Whether this AI-powered technology will reduce or enhance privacy protection is an open question.
Note that market mechanisms can also work against consumer privacy
and data security. Dina Florêncio and Cormac Herley (2010) examined the password policies of seventy-five websites and found that password strength is weaker at some of the largest, most attacked sites, which should have greater incentives to protect their valuable databases. Compared to security demand, it seems that competition is more likely to drive websites to adopt weaker password requirements as they must compete for users, traffic, and advertising. The sample size of this study is too small to represent the whole market, but the message is concerning: consumer demand for privacy and data
26. According to CNN (http://money.cnn.com/2015/02/09/technology/security/samsung-smart-tv-privacy/index.html), Samsung’s privacy policy said “Please be aware that if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted to a third party through your use of Voice Recognition.” The article further points out that Samsung SmartTV has a set of pre-programmed commands that it recognizes even if you opt out of voice recognition.
27. https://www.cnet.com/news/samsung-changes-smarttv-privacy-policy-in-wake-of-spying-fears/, accessed October 24, 2017.
28. Android Marshmallow was first released as a beta on May 28, 2015, followed by the official release on October 5, 2015. Its new model of app permissions was received positively: https://fpf.org/2015/06/23/android-m-and-privacy-giving-users-control-over-app-permissions/.
29. https://fpf.org/2014/09/12/ios8privacy/, accessed October 24, 2017.
security may compete with the same consumers’ demand for convenience, usability, and other attributes (such as lower price). When these demands conflict with each other, firms may have a stronger incentive to accommodate the attributes that are more visible and easier to evaluate. The same reason probably explains why only a small fraction of firms adopt multifactor authentication,30 despite its ability to reduce data risk.
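To make the mechanics concrete, the sketch below implements the time-based one-time password (TOTP) scheme of RFC 6238, which underlies most app-based second factors in multifactor authentication; the shared secret and usage shown are made-up illustrations, not any particular firm’s implementation.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, at=None, digits=6, period=30):
    """RFC 6238 time-based one-time password. Possessing the shared
    secret is the second, independent credential beyond the password."""
    key = base64.b32decode(secret_b32)
    counter = int((at if at is not None else time.time()) // period)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Hypothetical enrollment secret; a login must then supply the password
# AND the code currently displayed on the user's device.
SECRET = "JBSWY3DPEHPK3PXP"
print(totp(SECRET))  # changes every 30 seconds
```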
So far, we have considered AI as an external factor that potentially increases the risk of privacy violation and data breach. It is important to recognize that AI could also serve as a tool to mitigate that risk. Recently, AI has demonstrated superhuman performance in games such as Go, even without the help of any human knowledge (Silver et al. 2017). Imagine what data risk would look like if the same AI power were used to grant data access to authorized personnel, to detect a data attack when (or even before) it materializes, and to precisely predict whether a user-generated posting is authentic or fake. In fact, the technology frontier is moving in this direction, though its net benefits remain to be seen.
Take differential privacy as an example. It was invented more than ten years ago (Dwork et al. 2006) and has been claimed by Apple as a key feature to protect consumer identity in some of its data collection since 2016. The basic logic goes as follows: the data-collecting firm adds random noise to an individual user’s information before uploading it to the cloud. That way, the firm can still use the collected data for meaningful analysis without knowing each user’s secret. The effectiveness of this technology depends on how much noise to add, a parameter under the control of the data-collecting firm.
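As an illustration of this logic, here is a minimal sketch of randomized response, a classic local differential privacy mechanism in the same family as the techniques Apple and Google deploy; the parameter names and numbers are illustrative assumptions, not either firm’s actual algorithm. The privacy parameter epsilon governs the noise: each individual report is almost uninformative about that user, yet the firm can still de-bias the aggregate.

```python
import math
import random

def randomized_response(bit, epsilon):
    """Report the user's true bit with probability e^eps / (e^eps + 1),
    and the flipped bit otherwise. Smaller epsilon = more noise = more privacy."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    return bit if random.random() < p_truth else not bit

def estimate_true_rate(reports, epsilon):
    """De-bias the aggregate: the firm learns the population rate without
    learning any individual's true answer."""
    p = math.exp(epsilon) / (math.exp(epsilon) + 1.0)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p)) / (2 * p - 1)

random.seed(0)
# 100,000 hypothetical users, 30% of whom truly have a sensitive attribute.
truth = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(b, epsilon=1.0) for b in truth]
print(round(estimate_true_rate(reports, epsilon=1.0), 3))  # close to 0.3
```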
To evaluate how Apple implements differential privacy in practice, Tang et al. (2017) reverse-engineered Apple’s MacOS and iOS operating systems. They find that the daily privacy loss permitted by Apple’s differential privacy algorithm exceeds the values considered acceptable by the theoretical community (Hsu et al. 2014), and that the overall privacy loss per device may be unbounded. Apple disputes the results and argues that its differential privacy feature is subject to user opt-in. Google is another user of differential privacy (in its web browser Chrome). The “noise” parameter that Google uses—as estimated by Erlingsson, Pihur, and Korolova (2014)—seems to be more privacy-protective than what Apple is claimed to use, but it still falls outside the most acceptable range.31 These debates cast doubt on the promise of differential privacy, especially on its real use relative to its theoretical potential.
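The “unbounded” concern follows from the basic composition property of differential privacy: privacy losses add up across repeated reports. If each daily submission satisfies $\varepsilon$-differential privacy, then in the worst case the guarantee after $T$ days degrades to

$\varepsilon_{\text{total}} = \sum_{t=1}^{T} \varepsilon_t = T\varepsilon \to \infty \quad \text{as } T \to \infty.$

For a hypothetical per-day budget of $\varepsilon = 1$ (an illustrative number, not Tang et al.’s measured value), a year of reporting leaves a cumulative guarantee no stronger than $\varepsilon_{\text{total}} = 365$, far above the single-digit values typically discussed in the theoretical literature.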
Another promising technology is blockchain. In plain English, a blockchain is an ever-growing list of records (blocks), each linked to its predecessor and carrying a timestamp and transaction data. Secured by cryptography, blockchain is designed to
30. Multifactor authentication is a security measure that requires two or more independent credentials to verify the identity of the user. https://twofactorauth.org/ allows one to search whether a firm uses multifactor authentication in various types of products or services.
31. https://www.wired.com/story/apple-differential-privacy-shortcomings/, accessed October 24, 2017.
be verifiable, permanent, and resistant to data modification. Its successful application in Bitcoin suggests that similar technology could trace identities in data trade and data use, thus reducing the risks to privacy and data security (Catalini and Gans 2017). Ironically, a ransomware attacker in May 2017 demanded Bitcoin instead of traditional money, probably for a similar security reason.
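The tamper-evidence property comes from hash chaining, sketched minimally below; the record contents and function names are made-up illustrations, and a real blockchain adds distributed consensus (e.g., Bitcoin’s proof of work), which this toy omits.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash every field except the stored hash itself."""
    body = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(records, prev_hash):
    """Bundle records with a timestamp and a link to the previous block."""
    block = {"timestamp": time.time(), "records": records, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    return block

def verify_chain(chain):
    """Recompute every hash; altering any past block breaks the chain."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# A toy audit trail for data trade and data use, as the text envisions.
genesis = make_block(["genesis"], prev_hash="0" * 64)
chain = [genesis, make_block(["user 42 data sold to broker X"], genesis["hash"])]
print(verify_chain(chain))          # True
chain[0]["records"] = ["(erased)"]  # attempt to rewrite history
print(verify_chain(chain))          # False: the tampering is detected
```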
18.3.3 Policy Landscape
Any market description is incomplete without a summary of the policy background. In the United States, there is no overarching legislation on consumer privacy or data security. So far, the policy landscape is a patchwork of federal and local regulations.
Only a few federal laws are explicit on privacy protection, and they all tend to be industry specific. For example, the Gramm-Leach-Bliley Act (GLBA) controls the ways that financial institutions deal with personal data, the Health Insurance Portability and Accountability Act of 1996 (HIPAA) provides data privacy and security provisions for medical records, and the Children’s Online Privacy Protection Act of 1998 (COPPA) governs online services directed to children under the age of thirteen. Accordingly, privacy is subject to federal regulation by sector: the Department of Health and Human Services (DHHS) enforces HIPAA in health care, the Federal Communications Commission (FCC) regulates telecommunication services, the Federal Reserve System monitors the financial sector, the Securities and Exchange Commission (SEC) focuses on public firms and financial exchanges, and the Department of Homeland Security (DHS) deals with terrorism and cybercrimes related to national security.
Two exceptions are worth mentioning. First, the Federal Trade Commission (FTC) can address privacy violations and inadequate data security as deceptive and unfair practices, following the 1914 FTC Act. This enforcement authority covers almost every industry and overlaps with many sector-specific regulators.
More specifically, the FTC’s privacy enforcement focuses on “notice and choice,” which emphasizes how firms’ actual data practices deviate from the privacy notices they disclose to the public. For industries not subject to GLBA, HIPAA, or COPPA, there is no legislation that mandates a privacy notice, but many firms provide one voluntarily and seek consumer consent before purchase or consumption. Some industries also adopt self-regulatory programs to encourage certain privacy practices.32 This background allows the FTC to obtain the privacy notice of a targeted firm and enforce it under the FTC Act.
32. For example, the Digital Advertising Alliance (DAA), a nonprofit organization led by advertising and marketing trade associations, establishes and enforces privacy practices for digital advertising.
The FTC has published a number of guidelines on privacy,33 but the best way to understand its enforcement is through cases. For example, the FTC alleged that Practice Fusion misled consumers by first soliciting reviews for their doctors and then publicly posting these reviews on the internet without adequate consumer notice. The case eventually settled in June 2016.34 In another case, against Vizio, the FTC alleged that Vizio captured second-by-second information about video displayed on its smart TVs, appended specific demographic information to the viewing data, and sold this information to third parties for targeted ads and other purposes. According to the complaint, Vizio touted its “Smart Interactivity” feature as one that “enables program offers and suggestions,” but failed to inform consumers that the settings also enabled the collection of consumers’ viewing data.35 The case, brought jointly with the New Jersey Attorney General, settled for $2.2 million in February 2017. The third case is against Turn, a digital advertising company that tracks consumers across web browsers and mobile devices and uses that information to target digital advertisements. The FTC alleged that Turn used unique identifiers to track millions of Verizon consumers even after they chose to block or delete cookies from websites, which was inconsistent with Turn’s privacy policy. Turn settled with the FTC in December 2016.36
While a privacy notice is something that consumers can access, read (whether they actually read it is another question), and consent to, most data security practices are not visible until someone exposes the data vulnerability (via a data breach or white-hat discovery). Accordingly, FTC enforcement on data security focuses on whether a firm has adequate data security, not on whether the firm has provided sufficient information to consumers. Following this logic, the FTC has settled with Ashley Madison, Uber, Wyndham Hotels and Resorts, Lenovo, and TaxSlayer, but is engaged in litigation with LabMD and D-Link.37
The second exception relates to government access to personal data. Arguably, the US Constitution, in particular the First and Fourth Amendments, already covers individual rights in free speech and limits the government’s ability to access and acquire personal belongings. However, exactly how the Constitution applies to electronic data is subject to legal debate (Solove 2013).
33. The most comprehensive FTC guideline is its 2012 privacy report (FTC 2012). A list of privacy-related press releases can be found at https://www.ftc.gov/news-events/media-resources/protecting-consumer-privacy/ftc-privacy-report.
34. https://www.ftc.gov/news-events/press-releases/2016/06/electronic-health-records-company-settles-ftc-charges-it-deceived, accessed October 24, 2017.
35. https://www.ftc.gov/news-events/press-releases/2017/02/vizio-pay-22-million-ftc-state-new-jersey-settle-charges-it, accessed October 25, 2017.
36. https://www.ftc.gov/news-events/press-releases/2016/12/digital-advertising-company-settles-ftc-charges-it-deceptively, accessed October 25, 2017.
37. For a list of FTC cases in data security, see https://www.ftc.gov/enforcement/cases-proceedings/terms/249.
Beyond the Constitution, a series of federal laws—the Electronic Communications Privacy Act of 1986 (ECPA), the Stored Communications Act (1986), the Pen Register Act (1986), and the 2001 USA Patriot Act—stipulate when and how the government can collect and process the electronic information of individuals. But many of these laws were enacted in the wake of the Watergate scandal, long before the use of the internet, email, search engines, and social media. It is unclear how they apply to real cases. The legal ambiguity is highlighted in three events. First, as exposed by Edward Snowden, the NSA secretly harvested vast amounts of personal information for its global surveillance programs. The exposure generated an outcry for privacy and a heated debate over the balance between individual privacy and national security. Second, the Microsoft email case, regarding whether the US government has the right to access emails stored by Microsoft overseas, reached the US Supreme Court. In March 2018, the CLOUD Act clarified how US law enforcement orders issued under the Stored Communications Act may reach data in other countries and how data-hosting companies may challenge such law enforcement requests.38 Third, Apple refused to unlock the iPhone of one of the shooters in the 2015 San Bernardino terrorist attack. Since the FBI was able to unlock the phone before the court hearing, it remains unknown whether Apple had a legal obligation to help the FBI.39
At the local level, all fifty states have enacted data breach notification laws, but no federal law has been passed on this topic.40 According to the National Conference of State Legislatures, at least seventeen states have also passed some law on privacy. For example, the California Consumer Privacy Act was enacted in June 2018 and is set to be effective on January 1, 2020. These local laws tend to vary greatly in content, coverage, and remedy.41 From the
38. http://techcrunch.com/2018/04/17/supreme-court-dismisses-warrant-case-against-microsoft-after-cloud-act-renders-it-moot/, accessed January 13, 2019. The CLOUD Act was enacted in March 2018 and stands for the Clarifying Lawful Overseas Use of Data Act.
39. http://www.latimes.com/local/lanow/la-me-ln-fbi-drops-fight-to-force-apple-to-unlock-san-bernardino-terrorist-iphone-20160328-story.html, accessed October 25, 2017.
40. There have been multiple efforts toward a federal data breach notification law. In 2012, Senator Jay Rockefeller advocated for cyber security legislation that would strengthen the requirement to report cybercrimes. In January 2014, the Senate Commerce, Science, and Transportation Committee (led by Senator Rockefeller) introduced a bill to create a federal requirement for data breach notification (S. 1976, Data Security and Breach Notification Act of 2014). In his 2015 State of the Union speech, President Obama proposed new legislation to create a national data breach standard with a thirty-day notification requirement for data breaches. A related bill was later introduced in the US House of Representatives (H.R. 1770, Data Security and Breach Notification Act of 2015). All of them failed. In the wake of the mega breaches in 2017, Congress has introduced the Personal Data Notification and Protection Act of 2017 (H.R.