Clarifai says it deleted 3 million OkCupid user photos, and the AI models trained on them, after an FTC settlement over data-sharing and privacy claims.
In short: Clarifai says it deleted 3 million photos it received from OkCupid in 2014 and also deleted any AI systems trained on that data.
Clarifai, a company that builds computer vision tools, says it deleted 3 million photos that came from OkCupid users. The photos were used to train facial recognition AI, software that tries to identify or describe faces in pictures.
According to Reuters, the photo deletion happened after scrutiny from the US Federal Trade Commission, or FTC, the government agency that enforces consumer protection laws. Clarifai also says it deleted any “models,” meaning the finished AI systems it built using those photos (like throwing away a recipe and every dish made from it).
Court documents reviewed by Reuters say Clarifai asked OkCupid to share data in 2014. Reports say OkCupid provided user-uploaded photos, plus other information like demographic and location data. The FTC investigation reportedly began in 2019 after a New York Times article mentioned Clarifai using OkCupid images to build a tool that could estimate a person’s age, sex, and race from their face.
Last month, the FTC settled a lawsuit with OkCupid and its parent company, Match Group. OkCupid and Match Group did not admit wrongdoing, but under the settlement they are permanently prohibited from misleading people about what data they collect and share. TechCrunch reported that OkCupid and Clarifai did not immediately respond to requests for comment.
This story is a reminder that personal photos can be used for purposes people never expected, like training face analysis tools. Even if it happened years ago, regulators can still step in later, and companies may be forced to delete data and stop certain practices.
Source: TechCrunch AI