Let’s say someone walks into an old-fashioned record store looking for the Bright Eyes song “False Advertising.” Upon finding and buying the album, she’d have little reason to fear that store employees might sneak into her house later and take it back from her. She’d also have no cause to think that the album was counterfeit and not by the band at all. Now let’s say instead that the same song inspires an artist to create a mural depicting the FTC’s greatest false ad cases, and the mural gets displayed in a local gallery. The artist might be surprised if the gallery later shuts its doors and refuses to return the mural . . . or if some other company secretly reuses bits of it to make something else.
When people buy or make digital products, though, it’s not always clear what they really own or control. Such clarity may often depend on intellectual property rights, which are generally beyond the FTC’s consumer protection jurisdiction. But we take note – and can take action – if companies aren’t upfront about what consumers are buying, who made it, how it was made, or what rights people have in their own creations.
What do people think they’re buying?
Companies that offer digital products – such as books, music, movies, and games – will often say that consumers can “buy” those products when they’re really getting only a limited, revocable license to enjoy them. Yes, some people may appreciate this distinction, but others have been surprised when their access to such products suddenly disappears. Companies are always obliged to ensure that customers understand what they’re getting for their money – a basic point we’ve made many times. In 2007, the FTC settled a case on this theme after Sony BMG misled CD buyers by failing to disclose software on its discs that limited how buyers could use them. Around the same time, FTC staff resolved a similar matter after buyers who were told that they’d “own” Major League Baseball videos ran into unexpected use restrictions.
What do people think they can do with it?
Owner expectations for digital and Internet-connected products can also be subverted when companies impose limits on the right to repair, remotely exercise power to switch off hardware, use novel subscription models for normal product features, or otherwise unfairly change terms or restrict access post-purchase. Another unexpected limitation can arise when a family member passes on; survivors may be surprised to encounter access restrictions on digital products owned by the deceased. Similar or novel ownership issues may arise if and when the “metaverse” becomes more of a thing, and we’ll be watching those virtual spaces.
What about creative control of one’s own work?
In the record store example above, at least the buyer is reasonably assured that the album is the genuine article. But these days, digital music or text can be generated by AI tools and passed off – with increasing ease and quality – as the work of real artists or writers. We’ve already seen examples of fake new songs supposedly from recording artists, as well as new books sold as if authored by humans but in fact reflecting the output of large language models. Companies deceptively selling such content to consumers are violating the FTC Act. This conduct obviously injures artists and writers, too.
Some creators may develop content specifically for the digital environment, and they may reasonably expect to have some control over what they’ve made and how it’s used or presented. When platforms hosting that content fold up or change their terms, creators can suddenly lose access to what they spent time and effort to build. We may take a close look if such a platform isn’t living up to promises made to creators when they signed up to use it.
What else do artists, writers, and other creators have to worry about these days? Speaking again of generative AI, many models are trained on data that includes people’s creative work, which the models can then spit out in bits and pieces in response to varied inputs. These AI models can also ingest people’s likenesses and other aspects of their identities, in which case the people effectively become digital products themselves. These troubling facts involve complicated issues extending beyond consumer protection law, and they’re playing out now in courts and on picket lines.
Generative AI tools that produce output based on copyrighted or otherwise protected material may, nonetheless, raise issues of consumer deception or unfairness. That’s especially true if companies offering the tools don’t come clean about the extent to which outputs may reflect the use of such material. This information could be relevant to people’s decisions to use one tool or another. It’s not unusual for the FTC to sue when sellers deceive consumers about how products were made, such as with cases involving environmental claims. The information could also be relevant to business decisions to use such a tool for commercial purposes, given that the businesses could be liable if their use of the output infringes protected works.
Companies should keep in mind the following:
- When offering digital products, you must ensure that customers understand the material terms and conditions, including whether they’re buying an item or just getting a license to use it. Unilaterally changing those terms or undermining reasonable ownership expectations can get you in trouble, too.
- Selling digital items created via AI tools is obviously not okay if you’re trying to fool people into thinking that the items are the work of particular human creators.
- When offering a platform for creators to develop and display their work, be clear and upfront about their rights to access and take this work with them, as well as how the work will be used and presented. Again, don’t change the terms later.
- When offering a generative AI product, you may need to tell customers whether, and the extent to which, the training data includes copyrighted or otherwise protected material.
In the 1960s, four American musicians toured South America, pretending to be the Beatles, a scheme that worked until people saw their faces and heard them play. That kind of scam wouldn’t likely get off the ground today, but through a mixture of deepfakes, voice synthesis, and text generation, one could now create some fake, “long lost” Beatles music or footage and put it out in the world. At least Sir Paul McCartney, who has been using AI himself for artistic ends, would have the resources to deal with it. But many other artists would not be so fortunate if their work gets digitally faked or misused. In any event, if you sell ersatz Beatles music, suggesting it’s the lads from Liverpool when it’s really the Fabricated Four, it would surely be no defense that consumers weren’t actually “buying” anything at all.
Read more posts in the FTC’s AI and Your Business blog series.
Many small business owners use Software as a Service because cloud computing seems so attractive and easy. We can perform business operations and record-keeping without having to finance a server system or a professional to install, run, and oversee it. Mostly we understand that we have purchased a license to access and use programs, that licenses expire, and that tech support may expire before the license does. A concern for me is: who owns the data residing on the cloud as a result of my input and business activity? What happens to it if the license is revoked? In all fairness, we should be able to recover 100% of our data and records in a usable format, not a format that can be converted only by resubscribing to the same cloud service.
In reply to Many small business owners… by Small Town
I share this concern. Legal protection for business data would make this country safer for businesses, something like the EU's GDPR but extended to cover business entities. I wonder how that could work when a business has massive amounts of data that costs actual dollars to move or recover.
This was a totally confusing article. It led you to think it was viable information, but in the end you told us nothing at all except that AI creates data that belongs to someone else.
In reply to This was a totally confusing… by mary jastrzemski
I thought the four bullet points at the end succinctly outlined the FTC's position on some hot topics for intellectual property, AI, and digital rights.