Two separate settlements with Epic Games, owner of the massively popular online game Fortnite, send an unmistakable message: the FTC means business when it comes to enforcing online protections for kids and fighting back against dark patterns designed to rack up charges without consumers’ express consent. If that isn’t enough to make companies take notice, perhaps these numbers will. Epic will pay a record-shattering $275 million civil penalty for alleged violations of the Children’s Online Privacy Protection Act. The company will turn over an additional $245 million for allegedly using dark patterns to dupe millions of Fortnite players into making unintentional purchases – the largest FTC administrative settlement ever. This post will focus on the FTC’s allegation of COPPA violations and on Epic’s choice of default settings, which allowed strangers to communicate with children and teens under 18. Subscribers to the Business Blog can expect a second post shortly that will take a deep dive into how the FTC says Epic used design tricks to zap Fortnite players with unauthorized charges. You definitely don’t want to miss Part 2.
First, a refresher about what the COPPA Rule requires. Section 312.3 makes it clear that the Rule covers operators of child-directed sites and online services – a determination made by evaluating the subject matter, visual content, use of animated characters or child-oriented activities and incentives, and other factors – and operators of sites and online services who have actual knowledge they’re collecting or maintaining personal information from a child under 13. If a company is covered by COPPA, it must (among other things) get verifiable parental consent before collecting, using, or disclosing personal information from children under 13.
According to the FTC, a substantial number of the 400 million people who play Fortnite are kids under 13, and through its registration process, Epic collected kids’ personal information – including their full names, email addresses, and usernames – without getting their parents’ consent. The complaint cites a number of factors to establish that Fortnite is a “child-directed” service. First, there’s a 2019 survey reporting that 53% of U.S. children aged 10-12 played Fortnite weekly, compared to 33% of teens between 13 and 17, and 19% of those between 18 and 24. The style of game play is relevant, too, including Fortnite’s cartoon-like graphics and colorful animation. Indeed, according to the complaint, Fortnite has proven so popular with children that Epic Games has approved licensing deals – and pocketed millions of dollars – for Fortnite-branded merchandise aimed at kids, including children’s clothing, Halloween costumes, school supplies, and toys.
Other persuasive evidence came from Epic’s own employees. The complaint quotes statements like “We want to be living room safe, but barely. We don’t want your mom to love the game – just accept it compared to alternatives,” “Agree with the idea that, generally, all theming should be relevant to a 8-14 y.o., as a litmus test,” and “We are NOT adult: experience must allow for parental comfort for ages 10+.”
But according to the FTC, Epic’s law violations didn’t end there. In designing Fortnite to match users for group play, Epic configured the game so that, by default, players could engage in direct, real-time voice chat with other players. Given the number of Fortnite players who were young kids or teenagers, the inevitable result was that children and teens were often matched with strangers.
Epic’s then-Director of User Experience spotted the problem early on. Noting that “surely a lot of kids” are playing Fortnite, the Director urged Epic leadership to institute “basic toxicity prevention” mechanisms to “avoid voice chat or have it opt-in at the very least.” An Epic employee raised a similar concern after a high-profile gamer verbally harassed a young player while publicly streaming to an audience of thousands. As the employee acknowledged: “. . . we honestly should have seen this coming or [at least] expected this with an on-by-default voice chat system. Situations like this are bound to happen . . .”
Another employee summed up the problem this way:
I think you both know this, but our voice and chat controls are total crap as far as kids and parents go. It’s not a good thing. It was on my list a year ago, but never bubbled to the surface. This is one of those things that the company generally has weak will to pursue, but really impacts our overall system and perception. I’ve made a COPPA compliant game and we are far from it, but we don’t need to be that far . . .
How did Epic respond to its own employees’ concerns? According to the FTC, with lip service – followed by crickets. Despite entreaties from its staff, Epic chose to maintain its on-by-default in-game communications that allowed personal interactions between kids and strangers. When the company introduced a toggle switch allowing Fortnite players to turn voice chat off, the FTC says the control was buried on a hard-to-find settings page. Furthermore, even after Epic ultimately implemented an age gate, the FTC says the company continued to enable direct communication by default for all players, including those who identified themselves as under 13 or teens.
The complaint outlines disturbing allegations of how Epic’s choice of default settings resulted in harm to kids and teens, including threats, bullying, and sexual harassment. Numerous news stories reported that predators had coerced youngsters they met through Fortnite into sharing explicit images or meeting offline for sexual activity. In addition, some kids and teens were exposed to traumatizing encounters involving self-harm, suicide, and suggestions by others that a player “kill themselves.” As one parent reported to Epic, “This morning, while on Fortnite, my 9 year old son had a ‘friend’ (someone he doesn’t know in real life, but has been playing with for months) tell him that he was going to kill himself tonight. It shook him to the core.”
In addition to the $275 million civil penalty, which by law goes to the U.S. Treasury, the proposed court order prohibits Epic from enabling voice and text communications unless the parents of users under 13, or teenage users themselves (or their parents), give affirmative consent through a privacy setting. Epic also must delete personal information previously collected from Fortnite users in violation of COPPA’s parental notice and consent requirements unless the company obtains parental consent to retain that data or the user identifies as 13 or older through a neutral age gate. To protect kids and other users in the future, Epic must establish a comprehensive privacy program that addresses the issues challenged in the FTC complaint.
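To make the order’s mechanics concrete, here is a minimal, purely illustrative sketch of consent-gated chat defaults of the kind the order requires: chat stays off until the right person affirmatively opts in. The `Player` model, field names, and `apply_chat_defaults` function are hypothetical examples, not Epic’s actual code or the order’s literal text.

```python
# Hypothetical sketch: chat defaults that stay OFF absent affirmative consent.
# All names here are illustrative; this is not Epic's implementation.
from dataclasses import dataclass, field

@dataclass
class Player:
    age: int                      # self-reported through a neutral age gate
    parental_consent: bool = False
    player_consent: bool = False  # a teen or adult opting in for themselves
    voice_chat: bool = field(init=False, default=False)  # off by default
    text_chat: bool = field(init=False, default=False)   # off by default

def apply_chat_defaults(p: Player) -> Player:
    """Enable chat only with the consent the order contemplates:
    under 13 -> a parent must opt in; 13-17 -> the teen (or a parent)
    must opt in; 18+ -> the user may opt in themselves."""
    if p.age < 13:
        allowed = p.parental_consent
    elif p.age < 18:
        allowed = p.parental_consent or p.player_consent
    else:
        allowed = p.player_consent
    p.voice_chat = p.text_chat = allowed
    return p
```

The design point is the one Epic’s own employees raised: communications are opt-in, so a 9-year-old with no parental consent is never droppped into voice chat with strangers by default.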
What can other companies take from the record-setting settlement?
Companies can’t disclaim their way out of COPPA coverage. Simply saying your business isn’t covered by COPPA doesn’t absolve you of your legal obligations. The COPPA Rule includes detailed definitions of the sites and online services subject to the law’s protective provisions. If there is any doubt in your mind about whether COPPA applies to your business, now is the time to clear up that ambiguity.
Listen to what your employees are telling you. When a knowledgeable staffer says, “Houston, we have a problem,” take their concerns seriously. One of a company’s best tools for reducing the risk of legal quicksand is a staff that feels empowered to call management’s attention to potential difficulties.
Default settings that harm consumers can be unfair under the FTC Act. As the complaint alleges, Epic’s choice to configure its system for on-by-default voice and text chat injured kids in the under-13 COPPA age group as well as teens. Think through the potential for harm your default settings could have for users of all age groups.
Looking for COPPA compliance resources? Visit the FTC’s Children’s Privacy page. And be sure to read the follow-up Business Blog post about the FTC’s challenge to Epic’s use of digital dark patterns.