A California-based developer of a photo app has settled Federal Trade Commission allegations that it deceived consumers about its use of facial recognition technology and its retention of the photos and videos of users who deactivated their accounts. 

As part of the proposed settlement, Everalbum, Inc. must obtain consumers’ express consent before using facial recognition technology on their photos and videos. The proposed order also requires the company to delete models and algorithms it developed by using the photos and videos uploaded by its users.

“Using facial recognition, companies can turn photos of your loved ones into sensitive biometric data,” Andrew Smith, Director of the FTC’s Bureau of Consumer Protection, said. “Ensuring that companies keep their promises to customers about how they use and handle biometric data will continue to be a high priority for the FTC.”

Everalbum offered an app called “Ever” that allowed users to upload photos and videos from their mobile devices, computers, or social media accounts to be stored and organized using the company’s cloud-based storage service. In its complaint, the FTC alleges that, in February 2017, Everalbum launched a new feature in the Ever app, called “Friends,” that used facial recognition technology to group users’ photos by the faces of the people who appear in them and allowed users to “tag” people by name. Everalbum allegedly enabled facial recognition by default for all mobile app users when it launched the Friends feature.

Between July 2018 and April 2019, Everalbum allegedly represented that it would not apply facial recognition technology to users’ content unless users affirmatively chose to activate the feature. Beginning in May 2018, the company allowed some Ever app users—those located in Illinois, Texas, Washington, and the European Union—to choose whether to turn on the face recognition feature; for all other users, however, the feature remained automatically active and could not be turned off until April 2019.

The FTC’s complaint alleges that Everalbum’s application of facial recognition to Ever app users’ photos was not limited to providing the Friends feature. Between September 2017 and August 2019, Everalbum combined millions of facial images that it extracted from Ever users’ photos with facial images that Everalbum obtained from publicly available datasets to create four datasets for use in the development of its facial recognition technology. The complaint alleges that Everalbum used the facial recognition technology resulting from one of those datasets to provide the Ever app’s Friends feature and also to develop the facial recognition services sold to its enterprise customers; however, the company did not share Ever users’ photos, videos, or personal information with those customers.

According to the complaint, Everalbum also promised users that the company would delete the photos and videos of Ever users who deactivated their accounts. The FTC alleges, however, that until at least October 2019, Everalbum failed to delete the photos or videos of any users who had deactivated their accounts and instead retained them indefinitely.

The proposed settlement requires Everalbum to delete:

  • the photos and videos of Ever app users who deactivated their accounts;
  • all face embeddings—data reflecting facial features that can be used for facial recognition purposes—the company derived from the photos of Ever users who did not give their express consent to their use; and
  • any facial recognition models or algorithms developed with Ever users’ photos or videos.

In addition, the proposed settlement prohibits Everalbum from misrepresenting how it collects, uses, discloses, maintains, or deletes personal information, including face embeddings created with the use of facial recognition technology, as well as the extent to which it protects the privacy and security of personal information it collects. Under the proposed settlement, if the company markets software to consumers for personal use, it must obtain a user’s express consent before using biometric information it collected from the user through that software to create face embeddings or develop facial recognition technology.

The Commission voted 5-0 to issue the proposed administrative complaint and to accept the consent agreement with the company. Commissioner Rohit Chopra issued a separate statement.

The FTC published a description of the consent agreement package in the Federal Register. The agreement will be subject to public comment until February 24, 2021, after which the Commission will decide whether to make the proposed consent order final. Instructions for filing comments appear in the published notice. Once processed, comments will be posted on Regulations.gov.

NOTE: The Commission issues an administrative complaint when it has “reason to believe” that the law has been or is being violated, and it appears to the Commission that a proceeding is in the public interest. When the Commission issues a consent order on a final basis, it carries the force of law with respect to future actions. Each violation of such an order may result in a civil penalty of up to $43,280.

The Federal Trade Commission works to promote competition and protect and educate consumers. The FTC will never demand money, make threats, tell you to transfer money, or promise you a prize. Learn more about consumer topics at consumer.ftc.gov, or report fraud, scams, and bad business practices at ReportFraud.ftc.gov. Follow the FTC on social media, read consumer alerts and the business blog, and sign up to get the latest FTC news and alerts.

Contact Information

Staff Contacts

James Trilling
Bureau of Consumer Protection
Robin Wetherill
Bureau of Consumer Protection