A lawsuit in Arizona says men used women’s social media photos to create sexual AI images, sell them on Fanvue, and teach others how to do it.
In short: Three women in Arizona filed a lawsuit claiming men used their social media photos to make sexual AI images and sell both the content and lessons on how to create it.
Three Arizona women have filed a lawsuit against three Phoenix men (Jackson Webb, Lucas Webb, and Beau Schultz) and 50 unnamed defendants. The complaint says the men took photos of real women from the internet and used AI to generate images and videos of fake models that closely resembled them.
One plaintiff, identified as “MG” in court papers, says she discovered Instagram videos that appeared to show her face placed onto a sexualized body with tattoos in the same places as hers. She says the images were also used to promote AI ModelForge, which the lawsuit describes as a business that taught men how to create “AI influencers” based on real women.
According to the suit, the defendants sold sexual content on Fanvue, a paid subscription site. It also claims they sold online courses for $24.95 per month through Whop, including step-by-step instructions for collecting women's photos and training a model with CreatorCore, software that can learn a person's look from photos, much like teaching a copier to mimic one specific face. The lawsuit claims this activity generated large view counts and more than $50,000 in income in a single month.
This case highlights how easy it can be to turn ordinary social media photos into convincing sexual images without permission. Even if platforms remove one post, copies can reappear elsewhere, like a game of whack-a-mole.
Source: Wired