FTC Seeks Information Regarding Companies’ Data Collection, Use, and Advertising Practices

Roberto J. Gonzalez and Jeannie S. Rhee are partners, and Steven C. Herzog is counsel, at Paul, Weiss, Rifkind, Wharton & Garrison LLP. This post is based on a Paul, Weiss memorandum by Mr. Gonzalez, Ms. Rhee, Mr. Herzog, Carly Lagrotteria, Julie L. Rooney, and Cole A. Rabinowitz.

On December 14, 2020, the Federal Trade Commission (“FTC”) announced that it was issuing orders under section 6(b) of the FTC Act to nine social media and video streaming companies. The orders require the companies to produce sweeping information about each company’s worldwide customer base; how it collects, uses, and presents personal information; its advertising and user engagement practices; and how those practices affect children and teens. The nine companies—Amazon.com, Inc., ByteDance Ltd., Discord Inc., Facebook, Inc., Reddit, Inc., Snap Inc., Twitter, Inc., WhatsApp Inc., and YouTube LLC—were given 45 days to respond. The FTC voted 4-1 to issue the orders, with Commissioner Noah Joshua Phillips filing a dissenting statement.

Background

Section 6(b) authorizes the FTC to conduct broad-based studies, or “special reports,” about certain aspects of a company’s business or industry sector. The FTC can conduct these studies even without a law enforcement purpose. The FTC seldom relies on section 6(b) to issue orders, but its reliance is not unprecedented. For example, the FTC has issued such orders requiring alcoholic beverage advertisers to provide data about their past marketing practices. In February 2020, the FTC issued section 6(b) orders to a similar set of tech companies (including Alphabet, Amazon, Apple, Facebook, and Microsoft) seeking information related to their prior acquisitions.

In November 2018, FTC Chairman Joe Simons told the U.S. Senate Commerce Committee that he believed section 6(b) empowered the FTC to collect information “about the data practices of large technology companies” and said that the FTC was “developing plans to issue 6(b) orders in the technology area.” The following March, Chairman Simons told attendees at the Association of National Advertisers conference that there were “serious privacy concerns” from what he described as the lack of transparency in the online behavioral advertising context and “the fact that many of the companies at the heart of this ecosystem operate behind the scenes and without much consumer awareness.”

The Information Sought by the Orders

The FTC’s announcement linked to a sample cover letter and order, which sets forth fifty-three document and data requests on topics ranging from each company’s data collection, storage, and deletion practices and its advertising services to its use of algorithms and its relationships with other social media or technology companies. The orders are addressed to the operators of the largest social media platforms in the United States, including one non-U.S. company and at least one company geared towards a video game audience. Each of the companies gathers and processes large amounts of information on users, including through services offered to children and teens. The information sought includes:

Descriptions of the social media and video streaming services provided:

  • Information on the number and characteristics of active, registered users worldwide, including a breakdown by “selected user groups,” which are categories of users by age bracket, race/ethnicity, country location, and other FTC-identified attributes.
  • Interactions between users and the platform, including comments, views, posts, and interactions with ads.
  • Dollar-valuations of users.
  • Breakdowns of revenue per individual social media and video streaming service.
  • All documents pertaining to the business and marketing strategies, research & development efforts, board and management presentations, and budgets and financial projections relating to each social media and video streaming service.

Data collection, use, storage, disclosure and deletion practices:

  • Descriptions of the top 1,000 user attributes that are tracked or derived by each product or service, including for digital advertising purposes.
  • Metrics for measuring the accuracy of user-attribute information, and procedures for identifying, reporting, and remedying harm from inaccurate information (including all oversight provided by senior leadership over this process).
  • Identification of inaccurate user attributes, including measures of the most prevalent types of inaccuracies, the cost of digital ads based on these data types, and the resulting revenue to the company.
  • Descriptions of practices and policies for minimizing, deleting, sharing, and protecting personal information gathered on users and non-users.
  • Identification of procedures related to the assembly and purchase of information on consumer shopping behavior, including at offline and online retail outlets.
  • Procedures for researchers to request access to personal information, and what type of information may be accessed for academic purposes.

Digital advertising services and supporting practices:

  • Descriptions of each company’s ad pricing models, the objective of advertising products, and ad targeting capabilities, including a description of all data points that can be used to target advertising.
  • Types of digital advertising services, including revenue, ad bidding data, number of advertisers, impressions, click-through rate, cost per impression, and advertiser return on investment.
  • Disclosure of top advertisers by revenue in each ad category both within the U.S. and worldwide, with breakdowns by geography, ad format, pricing model, and purchase channel.
  • Documentation of consumer activity tracked on and off of company products or services, the effectiveness of advertising on consumer behavior, and the quality or accuracy of each company’s ad targeting capabilities.

The algorithms or data analytics applied in product offerings or advertising practices:

  • Identification of all algorithms and data analytics used and their role in advertising.
  • Identification of the sources of personal information used for algorithms and data analytics, as well as the procedures and techniques used to analyze data.
  • Disclosures of the purposes of algorithms and data analytics as applied to personal information, and the measures each company takes to address the privacy, security, and ethics concerns raised by such practices.
  • Information on the processes used to monitor and test the accuracy or impact of algorithms and data analytics, and the persons responsible for such monitoring and testing.
  • Whether the company examines or tests data sets and algorithms for bias, and whether third-party testing is used.

How each company measures, promotes, and researches user engagement:

  • Information on the tools and practices (including but not limited to algorithms and data analytics) used to study and increase user engagement.
  • Information on each company’s measurement of negative interactions (e.g., blocking or unsubscribing from content) and their impact on user engagement.
  • The policies for content moderation and techniques for promotion of content to users, including metrics used in content promotion and strategies or tools used to improve functionality of products or increase user engagement.
  • Information on how user-created content presentation is influenced by the company’s advertising goals.

Demographic information collected about users and non-users:

  • Disclosures of how demographic information, such as ethnicity, familial status and relationships, is identified and incorporated into data analysis to draw inferences, inform digital advertising offerings (including for ad targeting or exclusions), and personalize content.
  • Information on all mechanisms through which users and non-users may inquire about and request deletion of demographic information.

Services or products directed to children and teens:

  • Information on whether the company has indicated to any third party (including app stores, platforms, or advertising networks) that its service or portions of its content are directed to children and teens.
  • Description of each company’s policies, procedures, and practices regarding users who indicate that they are children or teens, including regarding collection of personal information and strategies for increasing usage.
  • Disclosure of what measures each company takes to protect children’s privacy, including participation in relevant safe harbor programs, mechanisms for securing parental consent, and providing for parental control over children’s data.
  • Information on systems in place to automatically or algorithmically identify children and teens.

Relationships with other social media or video streaming services:

  • Information measuring consumer willingness to switch services and efforts by each company to compete to attract users and increase user engagement through improvement of product offerings or expansion of privacy practices.
  • Data on lock-in effects of company practices, including measures of switching costs for users due to lack of access to data specific to any company product.
  • Identification of areas of competition between digital advertising services and other forms of advertising, including the cumulative effect of advertising on consumer perceptions or behavior.


Compliance with the EU’s General Data Protection Regulation (“GDPR”):

  • Detail on all material changes made by each company to comply with the GDPR, including whether those changes applied only to EU users.

Commissioners Rohit Chopra, Rebecca Kelly Slaughter, and Christine S. Wilson issued a joint statement in support of the orders, citing the ubiquity of digital products and the uncertainty regarding the data processing activities of social media and video streaming services. The Commissioners said the study will “lift the hood” on the impact of tech companies on Americans’ privacy and behavior. Further, they noted that the orders highlight the importance of “ascertaining the full scale and scope of social media and video streaming companies’ data collection” in order to understand how children and families are targeted, whether Americans are “being subjected to social engineering experiments,” and the financial incentives driving these companies.

Commissioner Phillips’ dissenting statement argues that the 50-plus specifications posed by the orders—and the diverse industries of the companies to whom they were issued—are indicative of “invasive government overreach” and constitute “an undisciplined foray into a wide variety of topics, some only tangentially related to the stated focus of th[e] investigation.”

Implications

The orders reflect increasing efforts by the FTC and other authorities to gain transparency into the business models and technologies of companies that collect significant amounts of consumer data and engage in targeted digital advertising. The specific information sought in these orders provides additional insight into which products, data activities, and risks (including impacts on privacy, bias, and children and teens) are of regulatory interest, from the perspective of both potential increased privacy regulation and enforcement.

Boards and senior management of companies that collect large amounts of consumer data and provide digital advertising services may wish to review the FTC’s requests and map their companies’ own activities against these areas of regulatory interest. Forming an up-to-date and comprehensive understanding of a company’s data practices across all business lines can itself be a challenge for a rapidly innovating business. Companies should then take steps to ensure that adequate governance, policies, procedures, and controls exist around these data practices, and that the risks of such practices remain within the company’s risk appetite in light of growing regulatory scrutiny.

The complete publication, including footnotes, is available here.
