YouTube has reached a settlement with the Federal Trade Commission and New York attorney general over alleged violations of the Children’s Online Privacy Protection Act (COPPA), according to CNBC. The agreement includes a combined $170 million fine — $136 million to the FTC and $34 million to the New York attorney general — and requires YouTube to change how it collects data on child-directed content on the platform.

“This latest violation is extremely serious. The company baited children using nursery rhymes, cartoons and other kid-directed content on curated YouTube channels to feed its massively profitable behavioral advertising business,” said FTC Commissioner Rohit Chopra in a dissenting statement on the settlement.

The charges against YouTube. The FTC and New York attorney general charged YouTube with violating COPPA by collecting data on minors without parental consent. YouTube was also accused of touting itself as a leader in reaching children ages 6 to 11 when marketing to toy companies like Mattel and Hasbro — while simultaneously telling an advertising company it did not have to comply with COPPA because it didn’t have users younger than 13 years old, according to the CNBC report.

“YouTube touted its popularity with children to prospective corporate clients. Yet when it came to complying with COPPA, the company refused to acknowledge that portions of its platform were clearly directed to kids,” wrote FTC Chairman Joe Simons.

YouTube to modify data collection policies. In addition to the $170 million it has agreed to pay (the largest amount ever paid to the FTC for COPPA violations), YouTube is also changing its data collection and ad targeting policies connected to children’s content.

“Starting in about four months, we will treat data from anyone watching children’s content on YouTube as coming from a child, regardless of the age of the user,” wrote YouTube CEO Susan Wojcicki on the Official YouTube blog. “This means we’ll limit data collection and use on videos made for kids only to what is needed to support the operation of the service.”

No more personalized ads on kid videos. Wojcicki said YouTube will also stop serving personalized ads on children’s content “entirely” — meaning children’s content will still be monetized, but ad targeting will be limited to ads that have not been personalized to the viewer based on their specific activity and behavior on the platform.

We asked YouTube for clarification on how ads will be targeted to children’s content going forward. In response, the company sent a link to (and a full copy of) the blog post stating it would stop serving personalized ads, but did not offer any further explanation.

How YouTube will identify children’s content. YouTube is putting the onus on creators to notify the platform if their videos fall into the category of children’s content. The company said it will also use machine learning to identify videos that target children, “For example, those that have an emphasis on kids characters, themes, toys or games.”
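YouTube has not disclosed how its machine learning system works, but as a rough illustration, a naive first-pass filter over video metadata might look something like the sketch below. The keyword list and `is_kids_content` function are invented for illustration and are not YouTube's actual system, which would rely on trained models rather than keyword matching.

```python
# Illustrative sketch only: a naive keyword-based first pass for flagging
# likely child-directed videos from title/description metadata.
# A production system would use trained models, not a hand-built keyword list.

KID_SIGNALS = {
    "nursery rhyme", "cartoon", "kids", "toys",
    "learn colors", "abc song", "finger family",
}

def is_kids_content(title: str, description: str) -> bool:
    """Return True if the metadata contains obvious kid-directed signals."""
    text = f"{title} {description}".lower()
    return any(signal in text for signal in KID_SIGNALS)

print(is_kids_content("ABC Song | Nursery Rhymes for Toddlers", ""))  # True
print(is_kids_content("Quarterly earnings call, Q3 2019", ""))        # False
```

Even this toy version shows why YouTube also wants creators to self-declare: metadata alone misses kid-directed videos that don't use obvious keywords.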

Wojcicki said the company is giving creators impacted by the platform updates four months to adjust before the changes take effect. “We recognize this won’t be easy for some creators and are committed to working with them through this transition and providing resources to help them better understand these changes,” wrote Wojcicki.

Why we should care. YouTube’s platform changes resulting from this settlement will impact advertisers in two ways. First, limiting the data collected on anyone watching children’s content means YouTube will have less information with which to target ads.

Also, any marketers or advertisers aiming to reach users viewing children’s content are going to have a harder time implementing highly targeted campaigns as this type of content will no longer include personalized ads. Marketers will have to rely on “non-personalized” signals for ads targeting children’s content, and could potentially see a drop in ad performance and engagement.
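In practice, “non-personalized” generally means ads selected from contextual signals — the content’s topic or category — rather than from a profile of the individual viewer. The toy sketch below contrasts the two approaches; the inventory, category names, and `select_contextual_ad` function are invented for illustration, not any real ad system’s API.

```python
# Hypothetical sketch of contextual (non-personalized) ad selection.
# Inventory, categories, and function names are invented for illustration.

CONTEXTUAL_INVENTORY = {
    "kids": ["board game ad", "juice box ad"],
    "sports": ["sneaker ad", "ticket ad"],
}

def select_contextual_ad(video_category: str) -> str:
    """Pick an ad using only the content's own category -- no user history,
    no behavioral profile, no cross-site activity."""
    ads = CONTEXTUAL_INVENTORY.get(video_category, ["house ad"])
    return ads[0]

# On children's content, only the video's category is consulted:
print(select_contextual_ad("kids"))  # board game ad
print(select_contextual_ad("news"))  # house ad (no match -> fallback)
```

The practical consequence for marketers is visible in the signature: the function never receives a user identifier, which is why campaign precision (and often performance) drops relative to behavioral targeting.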

Facebook critics were grousing that $5 billion was too little to pay for the company’s alleged repeated violations of user privacy, in contravention of an earlier FTC consent decree. Indeed, the financial penalties could have been a great deal stronger. But we now know the settlement with the FTC comes with a range of strict new privacy requirements that impose substantial new compliance burdens on Facebook.

Some critics are still complaining that even the new privacy rules don’t go far enough to place “meaningful limits” on the collection of personal data.

Changing the privacy culture of Facebook. Mindful of criticism of the monetary settlement, FTC Chairman Joe Simons said in a press release, “The magnitude of the $5 billion penalty and sweeping conduct relief are unprecedented in the history of the FTC. The relief is designed not only to punish future violations but, more importantly, to change Facebook’s entire privacy culture to decrease the likelihood of continued violations. The Commission takes consumer privacy seriously, and will enforce FTC orders to the fullest extent of the law.”

So what must Facebook now do? A lot.

Independent board privacy committee. There will be a new independent privacy committee at the board level, “removing unfettered control by Facebook’s CEO Mark Zuckerberg over decisions affecting user privacy.” Members of the committee cannot be fired by Zuckerberg but only by a supermajority of the board.

In addition, Facebook will be required to appoint privacy compliance officers, who must certify on a quarterly basis that Facebook is in compliance with the FTC-mandated program and will be personally subject to civil and criminal liability for any false representations. These compliance officers can only be hired and fired by the board’s privacy committee, not by any executive at Facebook, including Zuckerberg.

Personal liability for Mark. Mark Zuckerberg must also sign off on the quarterly FTC privacy reports, and he faces potential personal liability for any false statements or misrepresentations. (One question going forward: how “material” must a misrepresentation be to trigger liability?)

An independent assessor, accountable to the FTC and the board’s privacy committee, will be tapped to review the state of Facebook’s privacy program every two years — for 20 years. That assessment cannot rely “primarily” on Facebook management’s compliance statements. It also appears that the assessor and FTC can use what amounts to legal civil discovery tools to gain information to assess compliance during that biennial review process.

These rules extend equally to Instagram and WhatsApp.

New product review and third-party oversight. Facebook will also be required to conduct a compliance review of “every new or modified product, service, or practice before it is implemented, and document its decisions about user privacy.” And when privacy events that compromise the data of more than 500 users occur, Facebook must document and submit them to the FTC and its privacy assessor within 30 days.

Additional new requirements include:

  • Facebook must exercise greater oversight over third-party apps, including by terminating app developers that fail to certify that they are in compliance with Facebook’s platform policies or fail to justify their need for specific user data;
  • Facebook is prohibited from using telephone numbers obtained to enable a security feature (e.g., two-factor authentication) for advertising;
  • Facebook must provide clear and conspicuous notice of its use of facial recognition technology, and obtain affirmative express user consent prior to any use that materially exceeds its prior disclosures to users;
  • Facebook must establish, implement, and maintain a comprehensive data security program;
  • Facebook must encrypt user passwords and regularly scan to detect whether any passwords are stored in plaintext; and
  • Facebook is prohibited from asking for email passwords to other services when consumers sign up for its services.
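The password requirement in the list above combines two standard practices: storing only salted hashes (never the password itself) and periodically auditing password stores for anything that looks like plaintext. Here is a minimal sketch using Python’s standard library; the audit heuristic and function names are assumptions for illustration, and a real compliance scan would be far more involved than checking field formats.

```python
import hashlib
import os

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Store a random salt plus a PBKDF2 digest -- never the raw password."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def looks_like_plaintext(stored_value: str) -> bool:
    """Crude audit heuristic: a SHA-256-based PBKDF2 hex digest is exactly
    64 hex characters; anything else in the password field is suspicious."""
    return not (len(stored_value) == 64
                and all(c in "0123456789abcdef" for c in stored_value))

salt, digest = hash_password("hunter2")
print(looks_like_plaintext(digest.hex()))  # False -- looks properly hashed
print(looks_like_plaintext("hunter2"))     # True  -- plaintext found
```

The scan side is the novel part of the FTC order: it assumes mistakes will happen (as they did when Facebook was found logging some passwords in plaintext in 2019) and mandates ongoing detection rather than one-time fixes.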

Speaking of third parties, Facebook today acknowledged that, despite shutting down sharing of Facebook-friends data last year, some partners still had access due to a bug in Facebook’s codebase. Microsoft and Sony were able to continue accessing Facebook friends’ data, but that has now been corrected, according to the company.

Zuckerberg says he supports the new rules. Mark Zuckerberg issued a statement in which he said, “I believe they will reduce the number of mistakes we make and help us deliver stronger privacy protections for everyone.” He added that the company’s next focus “is to build privacy protections as strong as the best services we provide. I’m committed to doing this well and delivering the best private social platform for our community.”

Why we should care. Say what you want about the $5 billion penalty, but the new privacy regimen that Facebook must comply with appears very strict. That’s reflected most obviously in the personal liability that Mark Zuckerberg and the company’s new privacy officers will face for false statements or misrepresentations to the FTC. And the third-party app policing rules are designed to deter and prevent future Cambridge Analytica-style data harvesting.

There are also some provisions of the new rules that could affect Facebook’s access to data for ad purposes, including limitations around the use of phone numbers and third party passwords.

About The Author

He writes Screenwerk, a blog about connecting the dots between digital media and real-world consumer behavior, and is VP of Strategy and Insights for the Local Search Association.