
Are you posting videos to YouTube that are made for kids? Are you uploading them to channels designated for adults? If you’re uploading child-directed videos to adult-designated channels, you might have a COPPA problem (or two). That’s one of the key takeaways from today’s proposed settlement with Disney, filed by the Department of Justice on behalf of the FTC, in which the FTC alleges that Disney Worldwide Services, Inc. and Disney Entertainment Operations LLC failed to properly designate their YouTube videos as directed to children. The FTC charges that Disney’s practices violated the Children’s Online Privacy Protection Act (COPPA) and the Commission’s COPPA Rule.

According to the complaint, when Disney uploaded videos to YouTube, its policy was to set the audience designation at the channel level rather than reviewing the audience for each video. As a result, some child-directed videos were incorrectly designated as “not made for kids.” Because of this faulty label, personal information of children viewing these videos was collected and used for targeted advertising without the parental notice and consent required under COPPA. And, according to the complaint, kids viewing these mis-designated videos were also exposed to YouTube features not meant for kids: autoplay of other “not made for kids” videos and access to unrestricted public comments.

The FTC’s settlement with Disney imposes a $10 million penalty and requires a sea change in how the company uploads its kids’ videos. Going forward, Disney must implement a program to review each video it publishes to the YouTube platform to determine whether the video is child-directed. If YouTube implements age assurance technologies that can determine the age, age range, or age category of all YouTube users in a way that ensures COPPA compliance, Disney can rely on that signal instead. This forward-looking provision reflects and anticipates the growing use of age assurance technologies to protect kids online, and it creates an incentive for their broader adoption.

Here are key takeaways from this action:

If you are a content creator, know how YouTube works. If you run a YouTube channel and set the audience at the channel level, that audience designation applies by default to all videos on that particular channel. So, if you designate your channel as “not made for kids,” all the videos on the channel are designated “not made for kids” by default. If the channel contains any videos that are intended for an audience under 13 years old, you must change the audience designation for those videos to “made for kids.”
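For channel operators who automate their uploads, the per-video designation can also be set programmatically. Here is a minimal sketch, assuming the YouTube Data API v3, where the `status.selfDeclaredMadeForKids` field on the `videos.update` endpoint overrides the channel-level default for a single video. This only builds the request body; an actual call requires an authenticated API client (e.g., `google-api-python-client`) and channel-owner credentials.

```python
# Hypothetical sketch: overriding a channel-level audience default for one
# video via the YouTube Data API v3 (videos.update, part="status").
# Request-body construction only; no network call is made here.

def build_made_for_kids_update(video_id: str, made_for_kids: bool) -> dict:
    """Return a videos.update request body that sets the per-video
    'made for kids' designation, overriding the channel default."""
    return {
        "id": video_id,
        "status": {"selfDeclaredMadeForKids": made_for_kids},
    }

# With an authenticated client, the call would look like (not run here):
# youtube.videos().update(
#     part="status",
#     body=build_made_for_kids_update("VIDEO_ID", True),
# ).execute()

body = build_made_for_kids_update("VIDEO_ID", True)
print(body["status"]["selfDeclaredMadeForKids"])  # prints True
```

Note that `selfDeclaredMadeForKids` is the owner's own declaration; YouTube may separately set its own `madeForKids` determination on the video.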

The FTC is protecting kids online. Children’s privacy is a big priority at the FTC. The Commission is taking on the biggest and most powerful companies to protect kids’ privacy by vigorously enforcing COPPA. Content creators that cut corners to gain a competitive (and illegal) advantage risk FTC law enforcement and civil penalties.

Think about how your choices affect kids. The core problem here is that by mis-designating videos, Disney kept parents from knowing what personal information was being collected from their kids. But that COPPA violation also led to other downstream harms: kids were exposed to other “not made for kids” videos via autoplay and to unrestricted comments. We all know the rabbit holes autoplay and comments can lead us down, and we don’t want our kids going there. Any company interacting with children online should be aware that the FTC is closely watching how its data practices affect kids.

Make way for age assurance technologies. Age assurance is a broad term for methods to determine the age or age range of an individual (adults, teens, or kids) online. Effective age assurance technologies that reliably identify users’ ages can ease the burden on parents, allow kids to have an age-appropriate experience online, and protect kids from harmful content online.

To learn more, check out the FTC’s Children’s Privacy page.
