6750-01-P
FEDERAL TRADE COMMISSION
16 CFR Part 464
Trade Regulation Rule on Commercial Surveillance and Data Security
AGENCY: Federal Trade Commission.
ACTION: Advance notice of proposed rulemaking; request for public comment; public forum.
SUMMARY: The Federal Trade Commission (“FTC” or “Commission”) is publishing this advance notice of
proposed rulemaking (“ANPR”) to request public comment on the prevalence of commercial
surveillance and data security practices that harm consumers. Specifically, the Commission
invites comment on whether it should implement new trade regulation rules or other regulatory
alternatives concerning the ways in which companies (1) collect, aggregate, protect, use, analyze,
and retain consumer data, as well as (2) transfer, share, sell, or otherwise monetize that data in
ways that are unfair or deceptive.
DATES: Comments must be received on or before [60 DAYS AFTER DATE OF
PUBLICATION IN THE FEDERAL REGISTER]. The Public Forum will be held virtually on
Thursday, September 8, 2022, from 2 p.m. until 7:30 p.m. Members of the public are invited to attend at the website https://www.ftc.gov/news-events/events/2022/09/commercial-surveillance-data-security-anpr-public-forum.
ADDRESSES: Interested parties may file a comment online or on paper by following the
instructions in the Comment Submissions part of the SUPPLEMENTARY INFORMATION
section below. Write “Commercial Surveillance ANPR, R111004” on your comment, and file
your comment online at https://www.regulations.gov. If you prefer to file your comment on
paper, mail your comment to the following address: Federal Trade Commission, Office of the
Secretary, 600 Pennsylvania Avenue, NW, Suite CC-5610 (Annex B), Washington, DC 20580.
FOR FURTHER INFORMATION CONTACT: James Trilling, 202-326-3497; Peder Magee,
202-326-3538; Olivier Sylvain, 202-326-3046; or commercialsurveillancerm@ftc.gov.
I. Overview
Whether they know it or not, most Americans today surrender their personal information
to engage in the most basic aspects of modern life. When they buy groceries, do homework, or
apply for car insurance, for example, consumers today likely give a wide range of personal
information about themselves to companies, including their movements,1 prayers,2 friends,3 menstrual cycles,4 web-browsing,5 and faces,6 among other basic aspects of their lives.
1 See, e.g., Press Release, Fed. Trade Comm’n, Mobile Advertising Network InMobi Settles FTC Charges It Tracked Hundreds of Millions of Consumers’ Locations Without Permission (June 22, 2016), https://www.ftc.gov/news-events/press-releases/2016/06/mobile-advertising-network-inmobi-settles-ftc-charges-it-tracked. See also Stuart A. Thompson & Charlie Warzel, Twelve Million Phones, One Dataset, Zero Privacy, N.Y. Times (Dec. 19, 2019), https://www.nytimes.com/interactive/2019/12/19/opinion/location-tracking-cell-phone.html; Jon Keegan & Alfred Ng, There’s a Multibillion-Dollar Market for Your Phone’s Location Data, The Markup (Sept. 30, 2021), https://themarkup.org/privacy/2021/09/30/theres-a-multibillion-dollar-market-for-your-phones-location-data; Ryan Nakashima, AP Exclusive: Google Tracks Your Movements, Like It or Not, Associated Press (Aug. 13, 2018), https://apnews.com/article/north-america-science-technology-business-ap-top-news-828aefab64d4411bac257a07c1af0ecb.
2 See, e.g., Joseph Cox, How the U.S. Military Buys Location Data from Ordinary Apps, Motherboard (Nov. 16, 2020), https://www.vice.com/en/article/jgqm5x/us-military-location-data-xmode-locate-x.
3 See, e.g., Press Release, Fed. Trade Comm’n, Path Social Networking App Settles FTC Charges It Deceived Consumers and Improperly Collected Personal Information from Users’ Mobile Address Books (Feb. 1, 2013), https://www.ftc.gov/news-events/press-releases/2013/02/path-social-networking-app-settles-ftc-charges-it-deceived.
4 See, e.g., Press Release, Fed. Trade Comm’n, FTC Finalizes Order with Flo Health, a Fertility-Tracking App that Shared Sensitive Health Data with Facebook, Google, and Others (June 22, 2021), https://www.ftc.gov/news-events/press-releases/2021/06/ftc-finalizes-order-flo-health-fertility-tracking-app-shared.
5 See, e.g., Fed. Trade Comm’n, A Look at What ISPs Know About You: Examining the Privacy Practices of Six Major Internet Service Providers: An FTC Staff Report (Oct. 21, 2021), https://www.ftc.gov/system/files/documents/reports/look-what-isps-know-about-you-examining-privacy-practices-six-major-internet-service-providers/p195402_isp_6b_staff_report.pdf.
6 See, e.g., Press Release, Fed. Trade Comm’n, FTC Finalizes Settlement with Photo App Developer Related to Misuse of Facial Recognition Technology (May 7, 2021), https://www.ftc.gov/news-events/press-releases/2021/05/ftc-finalizes-settlement-photo-app-developer-related-misuse. See also Tom Simonite, Face Recognition Is Being Banned—but It’s Still Everywhere, Wired (Dec. 22, 2021), https://www.wired.com/story/face-recognition-banned-but-everywhere/.
Companies, meanwhile, develop and market products and services to collect and
monetize this data. An elaborate and lucrative market for the collection, retention, aggregation,
analysis, and onward disclosure of consumer data incentivizes many of the services and products
on which people have come to rely. Businesses reportedly use this information to target
services—namely, to set prices,7 curate newsfeeds,8 serve advertisements,9 and conduct research
on people’s behavior,10 among other things. While, in theory, these personalization practices
have the potential to benefit consumers, reports note that they have facilitated consumer harms
that can be difficult if not impossible for any one person to avoid.11
7 See, e.g., Casey Bond, Target Is Tracking You and Changing Prices Based on Your Location, Huffington Post (Feb. 24, 2022), https://www.huffpost.com/entry/target-tracking-location-changing-prices_l_603fd12bc5b6ff75ac410a38; Maddy Varner & Aaron Sankin, Suckers List: How Allstate’s Secret Auto Insurance Algorithm Squeezes Big Spenders, The MarkUp (Feb. 25, 2020), https://themarkup.org/allstates-algorithm/2020/02/25/car-insurance-suckers-list. See generally Executive Office of the President of the United States, Big Data and Differential Pricing, at 2, 12-13 (Feb. 2015), https://obamawhitehouse.archives.gov/sites/default/files/whitehouse_files/docs/Big_Data_Report_Nonembargo_v2.pdf.
8 See, e.g., Will Oremus et al., Facebook under fire: How Facebook shapes your feed: The evolution of what posts get top billing on users’ news feeds, and what gets obscured, Wash. Post (Oct. 26, 2021), https://www.washingtonpost.com/technology/interactive/2021/how-facebook-algorithm-works/.
9 See, e.g., Nat Ives, Facebook Ad Campaign Promotes Personalized Advertising, Wall St. J. (Feb. 25, 2021), https://www.wsj.com/articles/facebook-ad-campaign-promotes-personalized-advertising-11614261617.
10 See, e.g., Elise Hu, Facebook Manipulates Our Moods for Science and Commerce: A Roundup, NPR (June 30, 2014), https://www.npr.org/sections/alltechconsidered/2014/06/30/326929138/facebook-manipulates-our-moods-for-science-and-commerce-a-roundup.
11 See, e.g., Matthew Hindman et al., Facebook Has a Superuser-Supremacy Problem, The Atlantic (Feb. 10, 2022), https://www.theatlantic.com/technology/archive/2022/02/facebook-hate-speech-misinformation-superusers/621617/; Consumer Protection Data Spotlight, Fed. Trade Comm’n, Social Media a Gold Mine for Scammers in 2021 (Jan. 25, 2022), https://www.ftc.gov/news-events/blogs/data-spotlight/2022/01/social-media-gold-mine-scammers-2021; Jonathan Stempel, Facebook Sued for Age, Gender Bias in Financial Services Ads, Reuters (Oct. 31, 2019), https://www.reuters.com/article/us-facebook-lawsuit-bias/facebook-sued-for-age-gender-bias-in-financial-services-ads-idUSKBN1XA2G8; Karen Hao, Facebook’s Ad Algorithms Are Still Excluding Women from Seeing Jobs, MIT Tech. Rev. (Apr. 9, 2021), https://www.technologyreview.com/2021/04/09/1022217/facebook-ad-algorithm-sex-discrimination; Corin Faife & Alfred Ng, Credit Card Ads Were Targeted by Age, Violating Facebook’s Anti-Discrimination Policy, The MarkUp (Apr. 29, 2021), https://themarkup.org/citizen-browser/2021/04/29/credit-card-ads-were-targeted-by-age-violating-facebooks-anti-discrimination-policy. Targeted behavioral advertising is not the only way in which internet companies automate advertising at scale. Researchers have found that contextual advertising may be as cost-effective as targeting, if not more so. See, e.g., Keach Hagey, Behavioral Ad Targeting Not Paying Off for Publishers, Study Suggests, Wall St. J. (May 29, 2019), https://www.wsj.com/articles/behavioral-ad-targeting-not-paying-off-for-publishers-study-suggests-11559167195 (discussing Veronica Marotta et al., Online Tracking and Publishers’ Revenues: An Empirical Analysis (2019), https://weis2019.econinfosec.org/wp-content/uploads/sites/6/2019/05/WEIS_2019_paper_38.pdf).
Some companies, moreover, reportedly claim to collect consumer data for one stated
purpose but then also use it for other purposes.12 Many such firms, for example, sell or otherwise
monetize such information or compilations of it in their dealings with advertisers, data brokers,
and other third parties.13 These practices also appear to exist outside of the retail consumer
setting. Some employers, for example, reportedly collect an assortment of worker data to
evaluate productivity, among other reasons14—a practice that has become far more pervasive
since the onset of the COVID-19 pandemic.15
Many companies engage in these practices pursuant to the ostensible consent that they
obtain from their consumers.16 But, as networked devices and online services become essential
to navigating daily life, consumers may have little choice but to accept the terms that firms
12 See, e.g., Drew Harwell, Is Your Pregnancy App Sharing Your Intimate Data with Your Boss?, Wash. Post (Apr. 10, 2019), https://www.washingtonpost.com/technology/2019/04/10/tracking-your-pregnancy-an-app-may-be-more-public-than-you-think/; Jon Keegan & Alfred Ng, The Popular Family Safety App Life360 Is Selling Precise Location Data on Its Tens of Millions of Users, The MarkUp (Dec. 6, 2021), https://themarkup.org/privacy/2021/12/06/the-popular-family-safety-app-life360-is-selling-precise-location-data-on-its-tens-of-millions-of-user.
13 See, e.g., Fed. Trade Comm’n, Data Brokers: A Call for Transparency and Accountability (May 2014), https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf. See also, e.g., Press Release, Fed. Trade Comm’n, FTC Puts an End to Data Broker Operation that Helped Scam More Than $7 Million from Consumers’ Accounts (Nov. 30, 2016), https://www.ftc.gov/news-events/press-releases/2016/11/ftc-puts-end-data-broker-operation-helped-scam-more-7-million; Press Release, Fed. Trade Comm’n, Data Broker Defendants Settle FTC Charges They Sold Sensitive Personal Information to Scammers (Feb. 18, 2016), https://www.ftc.gov/news-events/press-releases/2016/02/data-broker-defendants-settle-ftc-charges-they-sold-sensitive.
14 See, e.g., Drew Harwell, Contract Lawyers Face a Growing Invasion of Surveillance Programs That Monitor Their Work, Wash. Post (Nov. 11, 2021), https://www.washingtonpost.com/technology/2021/11/11/lawyer-facial-recognition-monitoring/; Annie Palmer, Amazon Is Rolling Out Cameras That Can Detect If Warehouse Workers Are Following Social Distancing Rules, CNBC (June 16, 2020), https://www.cnbc.com/2020/06/16/amazon-using-cameras-to-enforce-social-distancing-rules-at-warehouses.html; Sarah Krouse, How Google Spies on Its Employees, The Information (Sept. 23, 2021), https://www.theinformation.com/articles/how-google-spies-on-its-employees; Adam Satariano, How My Boss Monitors Me While I Work From Home, N.Y. Times (May 6, 2020), https://www.nytimes.com/2020/05/06/technology/employee-monitoring-work-from-home-virus.html.
15 See, e.g., Danielle Abril & Drew Harwell, Keystroke tracking, screenshots, and facial recognition: The boss may be watching long after the pandemic ends, Wash. Post (Sept. 24, 2021), https://www.washingtonpost.com/technology/2021/09/24/remote-work-from-home-surveillance/.
16 See Tr. of FTC Hr’g, The FTC’s Approach to Consumer Privacy (Apr. 9, 2019), at 50, https://www.ftc.gov/system/files/documents/public_events/1418273/ftc_hearings_session_12_transcript_day_1_4-9-19.pdf (remarks of Paul Ohm). See also Fed. Trade Comm’n, Privacy Online: Fair Information Practices in the Electronic Marketplace: A Report to Congress 26 (May 2000), https://www.ftc.gov/sites/default/files/documents/reports/privacy-online-fair-information-practices-electronic-marketplace-federal-trade-commission-report/privacy2000.pdf.
offer.17 Reports suggest that consumers have become resigned to the ways in which companies
collect and monetize their information, largely because consumers have little to no actual control
over what happens to their information once companies collect it.18
In any event, the permissions that consumers give may not always be meaningful or
informed. Studies have shown that most people do not generally understand the market for
consumer data that operates beyond their monitors and displays.19 Most consumers, for example,
know little about the data brokers and third parties who collect and trade consumer data or build
consumer profiles20 that can expose intimate details about their lives and, in the wrong hands,
could expose unsuspecting people to future harm.21 Many privacy notices that acknowledge such
risks are reportedly not readable to the average consumer.22 Many consumers do not have the
17 See Tr. of FTC Hr’g, The FTC’s Approach to Consumer Privacy (Apr. 10, 2019), at 129, https://www.ftc.gov/system/files/documents/public_events/1418273/ftc_hearings_session_12_transcript_day_2_4-10-19.pdf (remarks of FTC Commissioner Rebecca Kelly Slaughter, describing privacy consent as illusory because consumers often have no choice other than to consent in order to reach digital services that have become necessary for participation in contemporary society).
18 See Joe Nocera, How Cookie Banners Backfired, N.Y. Times (Jan. 29, 2022), https://www.nytimes.com/2022/01/29/business/dealbook/how-cookie-banners-backfired.html (discussing concept of “digital resignation” developed by Nora Draper and Joseph Turow). See also Nora A. Draper & Joseph Turow, The Corporate Cultivation of Digital Resignation, 21 New Media & Soc’y 1824-39 (2019).
19 See Neil Richards & Woodrow Hartzog, The Pathologies of Digital Consent, 96 Wash. U. L. Rev. 1461, 1477-78, 1498-1502 (2019); Daniel J. Solove, Introduction: Privacy Self-Management and the Consent Dilemma, 126 Harv. L. Rev. 1879, 1885-86 (2013) (“Solove Privacy Article”).
20 See generally Fed. Trade Comm’n, Data Brokers: A Call for Transparency and Accountability (May 2014), https://www.ftc.gov/system/files/documents/reports/data-brokers-call-transparency-accountability-report-federal-trade-commission-may-2014/140527databrokerreport.pdf.
21 See, e.g., Press Release, Fed. Trade Comm’n, FTC Puts an End to Data Broker Operation that Helped Scam More Than $7 Million from Consumers’ Accounts (Nov. 30, 2016), https://www.ftc.gov/news-events/press-releases/2016/11/ftc-puts-end-data-broker-operation-helped-scam-more-7-million; Press Release, Fed. Trade Comm’n, Data Broker Defendants Settle FTC Charges They Sold Sensitive Personal Information to Scammers (Feb. 18, 2016), https://www.ftc.gov/news-events/press-releases/2016/02/data-broker-defendants-settle-ftc-charges-they-sold-sensitive; FTC v. Accusearch, 570 F.3d 1187, 1199 (10th Cir. 2009). See also Molly Olmstead, A Prominent Priest Was Outed for Using Grindr. Experts Say It’s a Warning Sign, Slate (July 21, 2021), https://slate.com/technology/2021/07/catholic-priest-grindr-data-privacy.html.
22 See Brooke Auxier et al., Americans and Privacy: Concerned, Confused and Feeling Lack of Control Over Their Personal Information, Pew Res. Ctr. (Nov. 15, 2019), https://www.pewresearch.org/internet/2019/11/15/americans-and-privacy-concerned-confused-and-feeling-lack-of-control-over-their-personal-information/. See also Solove Privacy Article, 126 Harv. L. Rev. at 1885; Aleecia M. McDonald & Lorrie Faith Cranor, The Cost of Reading Privacy Policies, 4 I/S J. of L. & Pol’y for Info. Society 543 (2008); Irene Pollach, What’s Wrong with Online Privacy Policies?, 50 Comm’s ACM 103 (2007).
time to review lengthy privacy notices for each of their devices, applications, websites, or
services,23 let alone the periodic updates to them. If consumers do not have meaningful access to
this information, they cannot make informed decisions about the costs and benefits of using
different services.24
This information asymmetry between companies and consumers runs even deeper.
Companies can use the information that they collect to direct consumers’ online experiences in
ways that are rarely apparent—and in ways that go well beyond merely providing the products or
services for which consumers believe they sign up.25 The Commission’s enforcement actions
have targeted several pernicious dark pattern practices, including burying privacy settings behind
multiple layers of the user interface26 and making misleading representations to “trick or trap”
consumers into providing personal information.27 In other instances, firms may misrepresent or
fail to communicate clearly how they use and protect people’s data.28 Given the reported scale
and pervasiveness of such practices, individual consumer consent may be irrelevant.
23 Kevin Litman-Navarro, We Read 150 Privacy Policies. They Were an Incomprehensible Disaster, N.Y. Times (2019), https://www.nytimes.com/interactive/2019/06/12/opinion/facebook-google-privacy-policies.html; Alexis C. Madrigal, Reading the Privacy Policies You Encounter in a Year Would Take 76 Work Days, The Atlantic (Mar. 1, 2012), https://www.theatlantic.com/technology/archive/2012/03/reading-the-privacy-policies-you-encounter-in-a-year-would-take-76-work-days/253851/. See also FTC Comm’r Rebecca Kelly Slaughter, Wait But Why? Rethinking Assumptions About Surveillance Advertising: IAPP Privacy Security Risk Closing Keynote (“Slaughter Keynote”) (Oct. 22, 2021), at 4, https://www.ftc.gov/system/files/documents/public_statements/1597998/iapp_psr_2021_102221_final2.pdf.
24 See FTC Comm’r Christine S. Wilson, A Defining Moment for Privacy: The Time is Ripe for Federal Privacy Legislation, Remarks at the Future of Privacy Forum (Feb. 6, 2020), https://www.ftc.gov/news-events/news/speeches/remarks-commissioner-christine-s-wilson-future-privacy-forum.
25 See generally Ryan Calo & Alex Rosenblat, The Taking Economy: Uber, Information, and Power, 117 Colum. L. Rev. 1623 (2017); Ryan Calo, Digital Market Manipulation, 82 Geo. Wash. L. Rev. 995 (2014).
26 See Press Release, Fed. Trade Comm’n, Facebook Settles FTC Charges That It Deceived Consumers by Failing to Keep Privacy Promises (Nov. 29, 2011), https://www.ftc.gov/news-events/press-releases/2011/11/facebook-settles-ftc-charges-it-deceived-consumers-failing-keep.
27 See Press Release, Fed. Trade Comm’n, FTC Takes Action against the Operators of Copycat Military Websites (Sept. 6, 2018), https://www.ftc.gov/news-events/press-releases/2018/09/ftc-takes-action-against-operators-copycat-military-websites.
28 See generally infra Item III(a).
The material harms of these commercial surveillance practices may be substantial,
moreover, given that they may increase the risks of cyberattack by hackers, data thieves, and
other bad actors. Companies’ lax data security practices may impose enormous financial and
human costs. Fraud and identity theft cost both businesses and consumers billions of dollars, and
consumer complaints are on the rise.29 For some kinds of fraud, consumers have historically
spent an average of 60 hours per victim trying to resolve the issue.30 Even the nation’s critical
infrastructure is at stake, as evidenced by the recent attacks on the largest fuel pipeline,31
meatpacking plants,32 and water treatment facilities33 in the United States.
Companies’ collection and use of data have significant consequences for consumers’
wallets, safety, and mental health. Sophisticated digital advertising systems reportedly automate
the targeting of fraudulent products and services to the most vulnerable consumers.34 Stalking
apps continue to endanger people.35 Children and teenagers remain vulnerable to cyber bullying,
cyberstalking, and the distribution of child sexual abuse material.36 Peer-reviewed research has
linked social media use with depression, anxiety, eating disorders, and suicidal ideation among kids and teens.37
Finally, companies’ growing reliance on automated systems is creating new forms and mechanisms for discrimination based on statutorily protected categories,38 including in critical
29 Press Release, Fed. Trade Comm’n, New Data Shows FTC Received 2.8 Million Fraud Reports from Consumers in 2021 (Feb. 22, 2022), https://www.ftc.gov/news-events/news/press-releases/2022/02/new-data-shows-ftc-received-28-million-fraud-reports-consumers-2021-0.
30 Fed. Trade Comm’n, Identity Theft Survey Report (Sept. 2003), https://www.ftc.gov/sites/default/files/documents/reports/federal-trade-commission-identity-theft-program/synovatereport.pdf.
31 William Turton & Kartikay Mehrotra, Hackers Breached Colonial Pipeline Using Compromised Password, Bloomberg (June 4, 2021), https://www.bloomberg.com/news/articles/2021-06-04/hackers-breached-colonial-pipeline-using-compromised-password.
32 Dan Charles, The Food Industry May Be Finally Paying Attention To Its Weakness To Cyberattacks, NPR (July 5, 2021), https://www.npr.org/2021/07/05/1011700976/the-food-industry-may-be-finally-paying-attention-to-its-weakness-to-cyberattack.
33 Josh Margolin & Ivan Pereira, Outdated Computer System Exploited in Florida Water Treatment Plant Hack, ABC News (Feb. 11, 2021), https://abcnews.go.com/US/outdated-computer-system-exploited-florida-water-treatment-plant/story?id=75805550.
34 See, e.g., Zeke Faux, How Facebook Helps Shady Advertisers Pollute the Internet, Bloomberg (Mar. 27, 2019), https://www.bloomberg.com/news/features/2018-03-27/ad-scammers-need-suckers-and-facebook-helps-find-them (noting an affiliate marketer’s claim that Facebook’s ad system “find[s] the morons for me”).
35 See Consumer Advice, Fed. Trade Comm’n, Stalking Apps: What to Know (May 2021), https://consumer.ftc.gov/articles/stalking-apps-what-know.
36 See Ellen M. Selkie, Jessica L. Fales, & Megan A. Moreno, Cyberbullying Prevalence Among U.S. Middle and High School-Aged Adolescents: A Systematic Review and Quality Assessment, 58 J. Adolescent Health 125 (2016); Fed. Trade Comm’n, Parental Advisory: Dating Apps (May 6, 2019), https://consumer.ftc.gov/consumer-alerts/2019/05/parental-advisory-dating-apps; Subcommittee on Consumer Protection, Product Safety, and Data Security, U.S. Senate Comm. on Com., Sci. & Transp., Hearing, Protecting Kids Online: Internet Privacy and Manipulative Marketing (May 18, 2021), https://www.commerce.senate.gov/2021/5/protecting-kids-online-internet-privacy-and-manipulative-marketing; Aisha Counts, Child Sexual Abuse Is Exploding Online. Tech’s Best Defenses Are No Match., Protocol (Nov. 12, 2021), https://www.protocol.com/policy/csam-child-safety-online.
37 See, e.g., Elroy Boers et al., Association of Screen Time and Depression in Adolescence, 173 JAMA Pediatr. 9
(2019) at 857 (“We found that high mean levels of social media over 4 years and any further increase in social
media use in the same year were associated with increased depression.”); Hugues Sampasa-Kanyinga & Rosamund
F. Lewis, Frequent Use of Social Networking Sites Is Associated with Poor Psychological Functioning Among
Children and Adolescents, 18 Cyberpsychology, Behavior, and Social Networking 7 (2015) at 380 (“Daily [social
networking site] use of more than 2 hours was. . . independently associated with poor self-rating of mental health
and experiences of high levels of psychological distress and suicidal ideation.”); Jean M. Twenge et al., Increases in
Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links
to Increased New Media Screen Time, 6 Clinical Psychological Sci. 1 (2018) at 11 (“[A]dolescents using social
media sites every day were 13% more likely to report high levels of depressive symptoms than those using social
media less often.”); H.C. Woods & H. Scott, #Sleepyteens: Social Media Use in Adolescence is Associated with
Poor Sleep Quality, Anxiety, Depression, and Low Self-Esteem, 51 J. of Adolescence 41-9 (2016) at 1 (“Adolescents
who used social media more . . . experienced poorer sleep quality, lower self-esteem and higher levels of anxiety and
depression.”); Simon M. Wilksch et al., The relationship between social media use and disordered eating in young
adolescents, 53 Int’l J. of Eating Disorders 1 at 96 (“A clear pattern of association was found between [social media]
usage and [disordered eating] cognitions.”).
38 A few examples of where automated systems may have produced disparate outcomes include inaccuracies and
delays in the delivery of child welfare services for the needy; music streaming services that are more likely to
recommend men than women; gunshot detection software that mistakenly alerts local police when people light
fireworks in majority-minority neighborhoods; search engine results that demean black women; and face recognition
software that is more likely to misidentify dark-skinned women than light-skinned men. See Joy Buolamwini &
Timnit Gebru, Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification, 81 Proc.
of Mach. Learning Res. (2018); Latanya Sweeney, Discrimination in Online Ad Delivery: Google Ads, Black Names
and White Names, Racial Discrimination, and Click Advertising, 11 Queue 10, 29 (Mar. 2013); Muhammad Ali et
al., Discrimination Through Optimization: How Facebook’s Ad Delivery Can Lead to Skewed Outcomes, 3 Proc.
ACM on Hum.-Computer Interaction (2019); Virginia Eubanks, Automating Inequality: How High-Tech Tools
Profile, Police, and Punish the Poor (2018); Andres Ferraro, Xavier Serra, & Christine Bauer, Break the Loop:
Gender Imbalance in Music Recommenders, CHIIR ’21: Proceedings of the 2021 Conference on Human
Information Interaction and Retrieval, 249-254 (Mar. 2021), https://dl.acm.org/doi/proceedings/10.1145/3406522.
See generally Anita Allen, Dismantling the “Black Opticon”: Privacy, Race, Equity, and Online Data-Protection
Reform, 131 Yale L. J. Forum 907 (2022),
https://www.yalelawjournal.org/pdf/F7.AllenFinalDraftWEB_6f26iyu6.pdf; Safiya Umoja Noble, Algorithms of
Oppression: How Search Engines Reinforce Racism (2018); Danielle Citron, Hate Crimes in Cyberspace (2014).
areas such as housing,39 employment,40 and healthcare.41 For example, some employers’
automated systems have reportedly learned to prefer men over women.42 Meanwhile, a recent
investigation suggested that lenders’ use of educational attainment in credit underwriting might
disadvantage students who attended historically Black colleges and universities.43 And the
Department of Justice recently settled its first case challenging algorithmic discrimination under
the Fair Housing Act for a social media advertising delivery system that unlawfully
discriminated based on protected categories.44 Critically, these kinds of disparate outcomes may
arise even when automated systems consider only unprotected consumer traits.45
39 See Ny Magee, Airbnb Algorithm Linked to Racial Disparities in Pricing, The Grio (May 13, 2021), https://thegrio.com/2021/05/13/airbnb-racial-disparities-in-pricing/; Emmanuel Martinez & Lauren Kirchner, The Secret Bias Hidden in Mortgage-Approval Algorithms, ABC News & The MarkUp (Aug. 25, 2021), https://abcnews.go.com/Business/wireStory/secret-bias-hidden-mortgage-approval-algorithms-79633917. See generally Fed. Trade Comm’n, Accuracy in Consumer Reporting Workshop (Dec. 10, 2019), https://www.ftc.gov/news-events/events-calendar/accuracy-consumer-reporting-workshop. See also Alex P. Miller & Kartik Hosanagar, How Targeted Ads and Dynamic Pricing Can Perpetuate Bias, Harv. Bus. Rev. (Nov. 8, 2019), https://hbr.org/2019/11/how-targeted-ads-and-dynamic-pricing-can-perpetuate-bias.
40 See Ifeoma Ajunwa, The “Black Box” at Work, Big Data & Society (Oct. 19, 2020), https://journals.sagepub.com/doi/full/10.1177/2053951720938093.
41 See Donna M. Christensen et al., Medical Algorithms are Failing Communities of Color, Health Affs. (Sept. 9, 2021), https://www.healthaffairs.org/do/10.1377/hblog20210903.976632/full/; Heidi Ledford, Millions of Black People Affected by Racial Bias in Health-Care Algorithms, Nature (Oct. 24, 2019), https://www.nature.com/articles/d41586-019-03228-6/.
42 Jeffrey Dastin, Amazon scraps secret AI recruiting tool that showed bias against women, Reuters (Oct. 10, 2018), https://www.reuters.com/article/us-amazon-com-jobs-automation-insight/amazon-scraps-secret-ai-recruiting-tool-that-showed-bias-against-women-idUSKCN1MK08G; Dave Gershgorn, Companies are on the hook if their hiring algorithms are biased, Quartz (Oct. 22, 2018), https://qz.com/1427621/companies-are-on-the-hook-if-their-hiring-algorithms-are-biased/.
43 Katherine Welbeck & Ben Kaufman, Fintech Lenders’ Responses to Senate Probe Heighten Fears of Educational Redlining, Student Borrower Prot. Ctr. (July 31, 2020), https://protectborrowers.org/fintech-lenders-response-to-senate-probe-heightens-fears-of-educational-redlining/. This issue is currently being investigated by the company and outside parties. Relman Colfax, Fair Lending Monitorship of Upstart Network’s Lending Model, https://www.relmanlaw.com/cases-406.
44 Compl., United States v. Meta Platforms, Inc., No. 22-05187 (S.D.N.Y. filed June 21, 2022), https://www.justice.gov/usao-sdny/press-release/file/1514051/download; Settlement Agreement, United States v. Meta Platforms, Inc., No. 22-05187 (S.D.N.Y. filed June 21, 2022), https://www.justice.gov/crt/case-document/file/1514126/download.
45 Andrew Selbst, A New HUD Rule Would Effectively Encourage Discrimination by Algorithm, Slate (Aug. 19, 2019), https://slate.com/technology/2019/08/hud-disparate-impact-discrimination-algorithm.html. See also Rebecca Kelly Slaughter, Algorithms and Economic Justice, 23 Yale J. L. & Tech. 1, 11-14 (2021) (“Slaughter Algorithms Paper”); Anupam Chander, The Racist Algorithm?, 115 Mich. L. Rev. 1023, 1029-30, 1037-39 (2017); Solon Barocas & Andrew D. Selbst, Big Data’s Disparate Impact, 104 Calif. L. Rev. 671, 677-87 (2016).
The Commission is issuing this ANPR pursuant to Section 18 of the Federal Trade
Commission Act (“FTC Act”) and the Commission’s Rules of Practice46 because recent
Commission actions, news reporting, and public research suggest that harmful commercial
surveillance and lax data security practices may be prevalent and increasingly unavoidable.47
These developments suggest that trade regulation rules reflecting these current realities may be
needed to ensure Americans are protected from unfair or deceptive acts or practices. New rules
could also foster a greater sense of predictability for companies and consumers and minimize the
uncertainty that case-by-case enforcement may engender.
Countries around the world and states across the nation have been alert to these concerns.
Many accordingly have enacted laws and regulations that impose restrictions on companies’
collection, use, analysis, retention, transfer, sharing, and sale or other monetization of consumer
data. In recognition of the complexity and opacity of commercial surveillance practices today,
such laws have reduced the emphasis on providing notice and obtaining consent and have instead
46 15 U.S.C. 57a; 16 CFR parts 0 and 1. 47 In May 2022, three consumer advocacy groups urged the Commission to commence a rulemaking proceeding to
protect “privacy and civil rights.” See Letter of Free Press, Access Now, and UltraViolet to Chair Lina M. Khan
(May 12, 2022), https://act.freepress.net/sign/protect_privacy_civil_rights. Late in 2021, moreover, the Commission
received a petition that calls on it to promulgate rules pursuant to its authority to protect against unfair methods of
competition in the market for consumer data. See Press Release, Accountable Tech, Accountable Tech Petitions
FTC to Ban Surveillance Advertising as an ‘Unfair Method of Competition’ (Sept. 28, 2021),
https://accountabletech.org/media/accountable-tech-petitions-ftc-to-ban-surveillance-advertising-as-an-unfair-method-of-competition/. In accordance with the provision of its Rules of Practice concerning public petitions, 16
CFR 1.31, the Commission published a notice about the petition, 86 FR 73206 (Dec. 23, 2021), and accepted public
comments, which are compiled at https://www.regulations.gov/docket/FTC-2021-0070/comments. The petitioner
urges new rules that address the way in which certain dominant companies exploit their access to and control of
consumer data. Those unfair-competition concerns overlap with some of the concerns in this ANPR about unfair or
deceptive acts or practices, and several comments in support of the petition also urged the Commission to pursue a
rulemaking using its authority to regulate unfair or deceptive practices. See, e.g., Cmt. of Consumer Reports & Elec.
Privacy Info. Ctr., at 2 (Jan. 27, 2022), https://downloads.regulations.gov/FTC-2021-0070-0009/attachment_1.pdf.
Accordingly, Item IV, below, invites comment on the ways in which existing and emergent commercial surveillance
practices harm competition and on any new trade regulation rules that would address such practices. Such rules
could arise from the Commission’s authority to protect against unfair methods of competition, so they may be
proposed directly without first being the subject of an advance notice of proposed rulemaking. See 15 U.S.C. 57a(a)(2)
(Section 18’s procedural requirements, including an ANPR, apply to rules defining unfair or deceptive acts or
practices but expressly do not apply to rules “with respect to unfair methods of competition”).
stressed additional privacy “defaults” as well as increased accountability for businesses and
restrictions on certain practices.
For example, European Union (“EU”) member countries enforce the EU’s General Data
Protection Regulation (“GDPR”),48 which, among other things, limits the processing of personal
data to six lawful bases and provides consumers with certain rights to access, delete, correct, and
port such data. Canada’s Personal Information Protection and Electronic Documents Act49 and
Brazil’s General Law for the Protection of Personal Data50 contain some similar rights.51 Laws in
California,52 Virginia,53 Colorado,54 Utah,55 and Connecticut,56 moreover, include some
comparable rights, and numerous state legislatures are considering similar laws. Alabama,57
Colorado,58 and Illinois,59 meanwhile, have enacted laws related to the development and use of
48 See Data Protection in the EU, Eur. Comm’n, https://ec.europa.eu/info/law/law-topic/data-protection/data-protection-eu_en.
49 See Personal Information Protection and Electronic Documents Act (PIPEDA), Off. of the Privacy Comm’r of Can., https://www.priv.gc.ca/en/privacy-topics/privacy-laws-in-canada/the-personal-information-protection-and-electronic-documents-act-pipeda/ (last modified Dec. 8, 2021).
50 Brazilian General Data Protection Law (Law No. 13,709, of Aug. 14, 2018), https://iapp.org/resources/article/brazilian-data-protection-law-lgpd-english-translation/.
51 In 2021, the European Commission also announced proposed legislation to create additional rules for artificial
intelligence that would, among other things, impose particular documentation, transparency, data management,
recordkeeping, security, assessment, notification, and registration requirements for certain artificial intelligence
systems that pose high risks of causing consumer injury. See Proposal for a Regulation of the European Parliament
and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and
Amending Certain Union Legislative Acts, COM (2021) 206 final (Apr. 21, 2021), https://eur-lex.europa.eu/legal-content/EN/TXT/?uri=CELEX%3A52021PC0206.
52 See California Privacy Rights Act of 2020, Proposition 24 (Cal. 2020) (codified at Cal. Civ. Code 1798.100 - 199.100); State of Cal. Dep’t of Just., California Consumer Privacy Act (CCPA): Frequently Asked Questions (FAQs), https://oag.ca.gov/privacy/ccpa.
53 See Consumer Data Protection Act, S.B. 1392, 161st Gen. Assem. (Va. 2021) (codified at Va. Code Ann. 59.1-575 through 59.1-585 (2021)).
54 See Protect Personal Data Privacy Act, 21 S.B. 190, 73 Gen. Assem. (Colo. 2021).
55 See Utah Consumer Privacy Act, 2022 Utah Laws 462 (codified at Utah Code Ann. 13-61-1 through 13-61-4).
56 See An Act Concerning Personal Data Privacy and Online Monitoring, 2022 Conn. Acts P.A. 22-15 (Reg. Sess.).
57 See Act No. 2021-344, S.B. 78, 2021 Leg., Reg. Sess. (Ala. 2021).
58 See Restrict Insurers’ Use of External Consumer Data Act, 21 S.B. 169, 73rd Gen. Assem., 1st Reg. Sess. (Colo. 2021).
59 See Artificial Intelligence Video Interview Act, H.B. 53, 102nd Gen. Assem., Reg. Sess. (Ill. 2021) (codified at 820 Ill. Comp. Stat. Ann. 42/1 et seq.).
artificial intelligence. Other states, including Illinois,60 Texas,61 and Washington,62 have enacted
laws governing the use of biometric data. All fifty U.S. states have laws that require businesses
to notify consumers of certain breaches of consumers’ data.63 And numerous states require
businesses to take reasonable steps to secure consumers’ data.64
Through this ANPR, the Commission is beginning to consider the potential need for rules
and requirements regarding commercial surveillance and lax data security practices. Section 18
of the FTC Act authorizes the Commission to promulgate, modify, and repeal trade regulation
rules that define with specificity acts or practices that are unfair or deceptive in or affecting
commerce within the meaning of Section 5(a)(1) of the FTC Act.65 With this ANPR, the
Commission aims to generate a public record about prevalent commercial surveillance practices
or lax data security practices that are unfair or deceptive, as well as about efficient, effective, and
adaptive regulatory responses. These comments will help to sharpen the Commission’s
enforcement work and may inform reform by Congress or other policymakers, even if the
Commission does not ultimately promulgate new trade regulation rules.66
The term “data security” in this ANPR refers to breach risk mitigation, data management
and retention, data minimization, and breach notification and disclosure practices.
60 See Biometric Information Privacy Act, S.B. 2400, 2008 Gen. Assem., Reg. Sess. (Ill. 2021) (codified at 740 Ill. Comp. Stat. Ann. 14/1 et seq.).
61 See Tex. Bus. & Com. Code 503.001.
62 See Wash. Rev. Code Ann. 19.375.010 through 19.375.900.
63 See Nat’l Conf. of State Leg., Security Breach Notification Laws (Jan. 17, 2022), https://www.ncsl.org/research/telecommunications-and-information-technology/security-breach-notification-laws.aspx.
64 See Nat’l Conf. of State Leg., Data Security Laws, Private Sector (May 29, 2019), https://www.ncsl.org/research/telecommunications-and-information-technology/data-security-laws.aspx.
65 15 U.S.C. 45(a)(1).
66 Cf. Slaughter Keynote at 4; Oral Statement of Comm’r Christine S. Wilson, Strengthening the Federal Trade Commission’s Authority to Protect Consumers: Hearing before the Senate Comm. on Com., Sci. & Transp. (Apr. 20, 2021), https://www.ftc.gov/system/files/documents/public_statements/1589180/opening_statement_final_for_postingrevd.pdf.
For the purposes of this ANPR, “commercial surveillance” refers to the collection,
aggregation, analysis, retention, transfer, or monetization of consumer data and the direct
derivatives of that information. These data include both information that consumers actively
provide—say, when they affirmatively register for a service or make a purchase—and
personal identifiers and other information that companies collect, for example, when a consumer
casually browses the web or opens an app. This latter category is far broader than the first.
The term “consumer” as used in this ANPR includes businesses and workers, not just
individuals who buy or exchange data for retail goods and services. This approach is consistent
with the Commission’s longstanding practice of bringing enforcement actions against firms that
harm companies67 as well as workers of all kinds.68 The FTC has frequently used Section 5 of
the FTC Act to protect small businesses or individuals in contexts involving their employment or
independent contractor status.69
67 See, e.g., Press Release, Fed. Trade Comm’n, FTC Obtains Contempt Ruling Against ‘Yellow Pages’ Scam (Nov. 25, 2015), https://www.ftc.gov/news-events/press-releases/2015/11/ftc-obtains-contempt-ruling-against-yellow-pages-scam; Press Release, Fed. Trade Comm’n, FTC and Florida Halt Internet ‘Yellow Pages’ Scammers (July 17, 2014), https://www.ftc.gov/news-events/press-releases/2014/07/ftc-florida-halt-internet-yellow-pages-scammers; In re Spiegel, Inc., 86 F.T.C. 425, 439 (1975). See also FTC v. Sperry & Hutchinson Co., 405 U.S. 233, 244 (1972); FTC v. Bunte Bros., Inc., 312 U.S. 349, 353 (1941); In re Orkin Exterminating Co., Inc., 108 F.T.C. 263 (1986), aff’d, Orkin Exterminating Co., Inc. v. FTC, 849 F.2d 1354 (11th Cir. 1988); FTC v. Datacom Mktg., Inc., No. 06-c-2574, 2006 WL 1472644, at *2 (N.D. Ill. May 24, 2006). Previously, the Commission included “businessmen” among those Congress charged it to protect under the statute. See Fed. Trade Comm’n, FTC Policy Statement on Unfairness (Dec. 17, 1980), appended to In re Int’l Harvester Co., 104 F.T.C. 949, 1072 n.8 (1984), https://www.ftc.gov/public-statements/1980/12/ftc-policy-statement-unfairness.
68 See, e.g., Press Release, Fed. Trade Comm’n, FTC Settles Charges Against Two Companies That Allegedly Failed to Protect Sensitive Employee Data (May 3, 2011), https://www.ftc.gov/news-events/press-releases/2011/05/ftc-settles-charges-against-two-companies-allegedly-failed; Press Release, Fed. Trade Comm’n, Rite Aid Settles FTC Charges That It Failed to Protect Medical and Financial Privacy of Customers and Employees (July 27, 2010), https://www.ftc.gov/news-events/press-releases/2010/07/rite-aid-settles-ftc-charges-it-failed-protect-medical-financial; Press Release, Fed. Trade Comm’n, CVS Caremark Settles FTC Charges: Failed to Protect Medical and Financial Privacy of Customers and Employees; CVS Pharmacy Also Pays $2.25 Million to Settle Allegations of HIPAA Violations (Feb. 18, 2009), https://www.ftc.gov/news-events/press-releases/2009/02/cvs-caremark-settles-ftc-charges-failed-protect-medical-financial. See also Press Release, Fed. Trade Comm’n, Amazon To Pay $61.7 Million to Settle FTC Charges It Withheld Some Customer Tips from Amazon Flex Drivers (Feb. 2, 2021), https://www.ftc.gov/news-events/press-releases/2021/02/amazon-pay-617-million-settle-ftc-charges-it-withheld-some.
69 See, e.g., FTC v. IFC Credit Corp., 543 F. Supp. 2d 925, 934-41 (N.D. Ill. 2008) (holding that the FTC’s construction of the term “consumer” to include businesses as well as individuals is reasonable and is supported by the text and history of the FTC Act).
This ANPR proceeds as follows. Item II outlines the Commission’s existing authority to
bring enforcement actions and promulgate trade regulation rules under the FTC Act. Item III sets
out the wide range of actions against commercial surveillance and data security acts or practices
that the Commission has pursued in recent years as well as the benefits and shortcomings of this
case-by-case approach. Item IV sets out the questions on which the Commission seeks public
comment. Finally, Item V provides instructions on the comment submission process, and Item VI
describes a public forum that is scheduled to take place to facilitate public involvement in this
rulemaking proceeding.
II. The Commission’s Authority
Congress authorized the Commission to propose a rule defining unfair or deceptive acts
or practices with specificity when the Commission “has reason to believe that the unfair or
deceptive acts or practices which are the subject of the proposed rulemaking are prevalent.”70 A
determination about prevalence can be made either on the basis of “cease-and-desist” orders
regarding such acts or practices that the Commission has previously issued, or when it has “any
other information” that “indicates a widespread pattern of unfair or deceptive acts or practices.”71
Generally, a practice is unfair under Section 5 if (1) it causes or is likely to cause
substantial injury, (2) the injury is not reasonably avoidable by consumers, and (3) the injury is
not outweighed by benefits to consumers or competition.72 A representation, omission, or
practice is deceptive under Section 5 if it is likely to mislead consumers acting reasonably under
the circumstances and is material to consumers—that is, it would likely affect the consumer’s
70 15 U.S.C. 57a(b)(3).
71 Id.
72 15 U.S.C. 45(n).
conduct or decision with regard to a product or service.73 Under the statute, this broad language
is applied to specific commercial practices through Commission enforcement actions and the
promulgation of trade regulation rules.
In addition to the FTC Act, the Commission enforces a number of sector-specific laws
that relate to commercial surveillance practices, including: the Fair Credit Reporting Act,74
which protects the privacy of consumer information collected by consumer reporting agencies;
the Children’s Online Privacy Protection Act (“COPPA”),75 which protects information collected
online from children under the age of 13; the Gramm-Leach-Bliley Act (“GLBA”),76 which
protects the privacy of customer information collected by financial institutions; the Controlling
the Assault of Non-Solicited Pornography and Marketing (“CAN-SPAM”) Act,77 which allows
consumers to opt out of receiving commercial email messages; the Fair Debt Collection Practices
Act,78 which protects individuals from harassment by debt collectors and imposes disclosure
requirements on related third parties; the Telemarketing and Consumer Fraud and Abuse
Prevention Act,79 under which the Commission implemented the Do Not Call Registry80; the
Health Breach Notification Rule,81 which applies to certain health information; and the Equal
Credit Opportunity Act,82 which protects individuals from discrimination on the basis of race,
color, religion, national origin, sex, marital status, receipt of public assistance, or good faith
73 See FTC Policy Statement on Deception (Oct. 14, 1983), appended to In re Cliffdale Assocs., Inc., 103 F.T.C. 110, 174 (1984), https://www.ftc.gov/system/files/documents/public_statements/410531/831014deceptionstmt.pdf.
74 15 U.S.C. 1681 through 1681x.
75 15 U.S.C. 6501 through 6506.
76 Pub. L. No. 106-102, 113 Stat. 1338 (1999) (codified as amended in scattered sections of 12 and 15 U.S.C.).
77 15 U.S.C. 7701 through 7713.
78 15 U.S.C. 1692 through 1692p.
79 15 U.S.C. 6101 through 6108.
80 16 CFR part 310.
81 16 CFR part 318.
82 15 U.S.C. 1691 through 1691f.
exercise of rights under the Consumer Credit Protection Act and requires creditors to provide to
applicants, upon request, the reasons underlying decisions to deny credit.
III. The Commission’s Current Approach to Privacy and Data Security
a. Case-By-Case Enforcement and General Policy Work
For more than two decades, the Commission has been the nation’s privacy agency,
engaging in policy work and bringing scores of enforcement actions concerning data privacy and
security.83 These actions have alleged that certain practices violate Section 5 of the FTC Act or
other statutes to the extent they pose risks to physical security, cause economic or reputational
injury, or involve unwanted intrusions into consumers’ daily lives.84 For example, the
Commission has brought actions for:
• the surreptitious collection and sale of consumer phone records obtained through false
pretenses85;
• the public posting of private health-related data online86;
83 “Since 1995, the Commission has been at the forefront of the public debate on online privacy.” Fed. Trade Comm’n, Privacy Online: Fair Information Practices in the Electronic Marketplace—A Report to Congress 3 (2000), http://www.ftc.gov/reports/privacy2000/privacy2000.pdf (third consecutive annual report to Congress after it urged the Commission to take on a greater role in policing privacy practices using Section 5 as the internet grew from a niche service to a mainstream utility). The first online privacy enforcement action came in 1998 against GeoCities, “one of the most popular sites on the World Wide Web.” Press Release, Fed. Trade Comm’n, Internet Site Agrees to Settle FTC Charges of Deceptively Collecting Personal Information in Agency’s First Internet Privacy Case (Aug. 13, 1998), http://www.ftc.gov/news-events/press-releases/1998/08/internet-site-agrees-settle-ftc-charges-deceptively-collecting.
84 See Fed. Trade Comm’n, Comment to the National Telecommunications & Information Administration on Developing the Administration’s Approach to Consumer Privacy, No. 180821780-8780-01, 8-9 (Nov. 9, 2018), https://www.ftc.gov/system/files/documents/advocacy_documents/ftc-staff-comment-ntia-developing-administrations-approach-consumer-privacy/p195400_ftc_comment_to_ntia_112018.pdf; FTC Comm’r Christine S. Wilson, A Defining Moment for Privacy: The Time Is Ripe for Federal Privacy Legislation: Remarks at the Future of Privacy Forum 11, n.39 (Feb. 6, 2020), https://www.ftc.gov/system/files/documents/public_statements/1566337/commissioner_wilson_privacy_forum_speech_02-06-2020.pdf.
85 See, e.g., Compl. for Injunctive and Other Equitable Relief, United States v. Accusearch, Inc., No. 06-cv-105 (D. Wyo. filed May 1, 2006), https://www.ftc.gov/sites/default/files/documents/cases/2006/05/060501accusearchcomplaint.pdf.
86 See, e.g., Compl., In re Practice Fusion, Inc., F.T.C. File No. 142-3039 (Aug. 16, 2016), https://www.ftc.gov/system/files/documents/cases/160816practicefusioncmpt.pdf.
• the sharing of private health-related data with third parties87;
• inaccurate tenant screening88;
• public disclosure of consumers’ financial information in responses to consumers’ critical
online reviews of the publisher’s services89;
• pre-installation of ad-injecting software that acted as a man-in-the-middle between
consumers and all websites with which they communicated, and that collected consumers’
internet browsing data and transmitted it to the software developer90;
• solicitation and online publication of “revenge porn”—intimate pictures and videos of
ex-partners, along with their personal information—and the collection of fees to take
down such information91;
• development and marketing of “stalkerware” that purchasers surreptitiously installed on
others’ phones or computers in order to monitor them92;
87 See, e.g., Decision and Order, In re Flo Health, Inc., FTC File No. 1923133 (June 22, 2021), www.ftc.gov/system/files/documents/cases/192_3133_flo_health_decision_and_order.pdf.
88 See, e.g., Compl. for Civ. Penalties, Permanent Injunction, and Other Equitable Relief, United States v. AppFolio, Inc., No. 1:20-cv-03563 (D.D.C. filed Dec. 8, 2020), https://www.ftc.gov/system/files/documents/cases/ecf_1_-_us_v_appfolio_complaint.pdf.
89 See, e.g., Compl., United States v. Mortg. Sols. FCS, Inc., No. 4:20-cv-00110 (N.D. Cal. filed Jan. 6, 2020), https://www.ftc.gov/system/files/documents/cases/mortgage_solutions_complaint.pdf.
90 See, e.g., Decision and Order, In re Lenovo (United States) Inc., FTC File No. 152 3134 (Dec. 20, 2017), https://www.ftc.gov/system/files/documents/cases/152_3134_c4636_lenovo_united_states_decision_and_order.pdf.
91 See, e.g., Compl. for Permanent Injunction and Other Equitable Relief, FTC and State of Nevada v. EMP Media, Inc., No. 2:18-cv-00035 (D. Nev. filed Jan. 9, 2018), https://www.ftc.gov/system/files/documents/cases/1623052_myex_complaint_1-9-18.pdf; Compl., In re Craig Brittain, F.T.C. File No. 132-3120 (Dec. 28, 2015), https://www.ftc.gov/system/files/documents/cases/160108craigbrittaincmpt.pdf.
92 See, e.g., Compl., In re Support King, LLC, F.T.C. File No. 192-3003 (Dec. 20, 2021), https://www.ftc.gov/system/files/documents/cases/1923003c4756spyfonecomplaint_0.pdf; Compl., In re Retina-X Studios, LLC, F.T.C. File No. 172-3118 (Mar. 26, 2020), https://www.ftc.gov/system/files/documents/cases/172_3118_retina-x_studios_complaint_0.pdf; Compl. for Permanent Injunction and Other Equitable Relief, FTC v. CyberSpy Software, LLC, No. 6:08-cv-01872 (M.D. Fla. filed Nov. 5, 2008), https://www.ftc.gov/sites/default/files/documents/cases/2008/11/081105cyberspycmplt.pdf.
• retroactive application of material privacy policy changes to personal information that
businesses previously collected from users93;
• distribution of software that caused or was likely to cause consumers to unwittingly
share their files publicly94;
• surreptitious activation of webcams in leased computers placed in consumers’ homes95;
• sale of sensitive data such as Social Security numbers to third parties who did not have a
legitimate business need for the information,96 including known fraudsters97;
• collection and sharing of sensitive television-viewing information to target advertising
contrary to reasonable expectations98;
• collection of phone numbers and email addresses to improve social media account
security, and the deceptive use of that data to allow companies to target advertisements,
in violation of an existing consent order99;