Published On: March 4th, 2020 / Categories: The GDPR Guy / 15.5 min read

YouTube Meets COPPA

Episode 13 of The GDPR Guy is Carl explaining YouTube’s recent COPPA settlement with the FTC and the new changes they have implemented.

Audio

PUBLISHED AS A PODCAST

Useful Resources

LINKS AND FURTHER READING

$170 million FTC-NY YouTube settlement offers COPPA compliance tips for platforms and providers

YouTube channel owners: Is your content directed to children? 

Upcoming changes to children’s content on YouTube.com

Transcript

Some of you may be aware that recently YouTube has been having fun with America’s Federal Trade Commission, the FTC. This all started with a complaint that YouTube was in violation of the Children’s Online Privacy Protection Act, COPPA, and ultimately ended in September 2019 with a vast settlement of $170 million – $136 million to the FTC and $34 million to the New York Attorney General.

But what is unique about this case is how news of its ramifications was most notable not in the privacy sphere, but amongst YouTubers themselves – the content creators that upload their videos onto YouTube. If you want to know about COPPA, or at the very least get some vitriolic opinions on it, just ask a YouTuber.

The reason for this is that as part of the settlement with the FTC, YouTube made some major changes to the way videos are tagged and in turn the functionality surrounding those videos, with many a YouTuber expecting this to be the end of their livelihood.

To explore these changes and why they’re necessary, we need to look at where this case came from and why YouTube had a problem in the first place.

YouTube is like many other websites in the sense that its business model is to gather a huge volume of visitors and monetise them by displaying ads. And of course those ads are going to be personalised ads, based on your behaviour all around the Internet to get you clicking on relevant stuff. And that’s how YouTubers, the video creators, make their money. They specify that ads should be displayed around their videos, and so they get more revenue the more their videos are watched.

And like most websites, personalised ads are a lot more effective than non-personalised ads, so that’s the method of choice to get everyone the most revenue. That’s why YouTube is one big hotbed of tracking, with most interactions tracked and used to ultimately drive revenue. Of course, that relies on personal data, or let’s use the more relevant term here, Personal Information, since we’re going to be talking about COPPA.

At a high level, COPPA has two sides to it.

  1. Are you collecting personal information? and
  2. Are you doing this knowingly from a child?

So here you’ve got two get-out-of-jail-free cards you can potentially use for COPPA.

The first being to ensure your service doesn’t collect personal information beyond the bare minimum you need to operate the service. That way it doesn’t matter what age your users are.

The second applies when your service, like most services out there, does indeed collect personal information in excess of what you truly need – for instance, using cookies and tracking to display personalised ads next to videos. But if you can show that your users are not children, then COPPA can’t apply. This was the YouTube approach.

And so just like most other large tech company websites, the terms of service for YouTube stipulate that you have to be at least 13 to register, since that is the minimum age from which YouTube can collect your Personal Information without parental consent under COPPA.

This is where the game starts – trying to stay out of scope of COPPA by stipulating you have to be 13 or over to use the service. COPPA is all about collecting Personal Information from under 13s, so if you can show you don’t knowingly have any under-13 users, then all good, right? Well, this approach has some merits, but if you’re going to take this line, you have to stick to it rigidly. And allegedly YouTube didn’t.
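The age-gate strategy described above boils down to a simple check at registration time. Here is a minimal sketch of that idea – a hypothetical illustration with invented names, not YouTube’s actual implementation:

```python
from datetime import date

# COPPA: personal information may not be collected from under-13s
# without verifiable parental consent.
COPPA_MIN_AGE = 13

def age_on(today: date, birth_date: date) -> int:
    """Return a person's age in whole years on the given date."""
    had_birthday = (today.month, today.day) >= (birth_date.month, birth_date.day)
    return today.year - birth_date.year - (0 if had_birthday else 1)

def may_register(birth_date: date, today: date) -> bool:
    """Age gate: reject registrations where the self-declared
    date of birth makes the user under 13."""
    return age_on(today, birth_date) >= COPPA_MIN_AGE
```

The weakness, of course, is exactly the one the FTC pointed at: the check only sees the *declared* date of birth, not the real age of whoever is actually watching.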

YouTube fell down by allegedly knowing full well that its users were under 13. This was seemingly evident in two places, the first being YouTube’s promotional pitches to companies such as Hasbro and Mattel where they made comments such as, “YouTube is unanimously voted as the favourite website of kids 2-12” and “In fact, it’s the #1 website regularly visited by kids.” Clearly some people at YouTube were very proud of their appeal to kids. Secondly, a huge amount of video content within YouTube is undeniably targeted at young children, from Mickey Mouse cartoons, to nursery rhymes to basic counting songs.

Whilst YouTube may have had an age gate on their registration flow, preventing users from registering when they self-selected an age under 13, they allegedly knew that millions of young children still did access their content, whether by lying about their age or through unsupervised use of an older person’s shared device.

The FTC settlement required YouTube to make some fundamental changes to get compliant with COPPA, and these changes were rolled out in January.

Back to those two get-out-of-jail-free cards on offer, clearly YouTube could no longer rely on the “we don’t have child users” card. And this left a fundamental problem for them – how do we stay in compliance if our registered users aren’t our actual users? The logged in user might be known to us as 40 years old, but the actual user might only be 4.

The alternative card on the table was to stop collecting personal information from users, and that way it wouldn’t matter what age the real user was. But this would be hugely damaging to revenue, requiring all ads to be non-personalised.

It’s pretty rare you get a service where the operator doesn’t control its content and has literally no idea of the age of its users.

So with no simple solution on the table, YouTube was forced into a “zero trust” compliance model.

The logic goes something like this:

  1. We don’t want to collect personal information from children.
  2. We don’t know which users are children.
  3. But we do know which videos are directed at children.
  4. Let’s assume that all videos for children are being watched by children.
  5. Stop collecting personal information around those videos only.
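The zero-trust logic above can be sketched as a simple gate on feature availability. This is a hypothetical illustration with invented names, not YouTube’s actual code:

```python
# Features that rely on collecting personal information from the viewer.
PERSONAL_DATA_FEATURES = {
    "comments", "donations", "live_chat",
    "notification_bell", "miniplayer", "personalised_ads",
}

def enabled_features(all_features: set, made_for_kids: bool) -> set:
    """Zero-trust rule: if the *video* is made for kids, assume the
    *viewer* is a child and disable everything that collects personal
    information, regardless of the logged-in account's stated age."""
    if made_for_kids:
        return all_features - PERSONAL_DATA_FEATURES
    return all_features
```

The key design point is that the decision hangs off the video’s label, not the user’s account, because the account age is exactly the signal YouTube could no longer trust.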

This change requires all videos or their channels to be labelled as “made for kids” or “not made for kids”. And YouTube double-checks this: their algorithm proactively checks for labelling “errors or abuse” and amends the videos accordingly.

And with the “made for kids” label in place, a whole host of features are disabled, because they all rely on collecting personal information from the user who is assumed to be a child.

These disabled features include comments, donations, live chat, the notification bell, the miniplayer and, most importantly, personalised advertising.

Disabling personalised ads is the big one. Instead, any ads will be context-based, which get clicked on less and ultimately produce less revenue for the content creator.

Unsurprisingly, many YouTubers were extremely upset by this, as well as by the uncertainty surrounding it. One of the biggest questions was how to decide whether a video is made for kids or not. Is a Minecraft video made for kids? Is an unboxing video of a Mickey Mouse DVD made for kids? Both the FTC and YouTube have since issued guidance in the form of videos and extensive Q&A articles to help clarify. But crucially, the focus from YouTube is on helping you, the content creator, comply with COPPA, NOT on YouTube complying with COPPA. YouTube has pushed the compliance focus outside of their four walls, saying they’re doing their bit, and opening the door for the FTC to go after content creators for breaking the rules.

We’re only a month into these changes, so we’re yet to see many metrics of the real negative impact on revenue. Generally, commentary from YouTubers since the change has been minimal, so it’s hard to know what is going on behind the scenes. With that in mind I reached out to one channel owner that produces their own animated videos for small children to see what impact they had seen. In terms of their size, this channel is pretty big, with 280,000 subscribers and 220 million views.

I’ll be honest, I didn’t expect a reply, but I got one, and it was a long one, explaining that they had seen a 75% drop in revenue. Their exact words were, “YouTube’s changes are catastrophic for us.”

Being perfectly honest, I’m not overly sympathetic to those YouTubers that produce trashy content such as box openings or episodes of Mickey Mouse played in reverse. But when I hear that some of the very best children’s educational content is being put in jeopardy, I am truly sad. Most YouTubers won’t have understood the COPPA rule or how YouTube was working within it. They would have assumed that YouTube is a massive organisation that would obviously be fully compliant with the law, and in turn, so would they. At the same time though, these YouTubers were profiting off the collection and profiling of children, in direct contravention of COPPA, which is clearly wrong and needed to stop.

YouTube is still the only game in town for monetising your videos, so people have no choice but to accept the changes and make less money. YouTube has promised to bring back some of the disabled features in the future but in a format that doesn’t rely on collecting personal information. Ultimately though, personalised ads will continue to be unavailable when the content is child directed.

YouTube has taken much criticism for agreeing this settlement and the changes they’ve implemented. But personally I don’t see what alternative they had. As a parent I’ve found the changes to be positive in some respects but I would like to see YouTube do more to support the high quality YouTube creators that produce rich content that is safe for kids. The YouTube Kids app is a good alternative, but it lacks many of the very best children’s channels and is very clunky to use.

For those of us in privacy roles looking at the YouTube example and its relevance for the GDPR and the UK’s Age Appropriate Design Code, the big takeaway is the need to be honest about how your service is being used and to design a compliance solution that reflects those real world situations. We have to think beyond simple age gating and compliance legalese buried in Terms & Conditions. And ultimately, just ask a parent. If what you’re doing with child data makes sense to them, then you’re probably on the right track.


About the Author: Carl Gottlieb
I'm the trusted privacy advisor to leading tech companies, helping them gain maximum advantage through the right privacy strategy. My consultancy company Cognition provides a range of privacy and security services including Data Protection Officers, in-depth assessments and virtual security engineers. Get in touch if you'd like to learn more.
