A WIRED report describes a small but growing number of asexual people using AI companion chatbots, and the concerns this raises among asexual advocates.
In short: Some people on the asexual spectrum are using AI chatbots for romantic or erotic role-play, but asexual advocates say the practice is still rare and easily misunderstood.
A WIRED report describes a small number of asexual people using “AI companions,” which are chatbots designed to talk like a partner and keep a conversation going. For some users, the appeal is intimacy on their own terms, without the expectation of real-life sex.
One artist, Kor, said they spent hours a day using SpicyChat, a role-playing platform where the chatbot helps build long storylines. Kor identifies as aegosexual, meaning they can feel aroused by sexual fantasy or erotica, but usually do not want sex in real life. They later cut back after the experience started to feel consuming.
Another asexual woman told WIRED she used ChatGPT as an “emotional laboratory,” and developed strong feelings for a conversational pattern she named “Mac.” She said it helped her reconnect with sensual feelings during a major life transition. A third person, Ari, said an AI chatbot initially felt comforting after a breakup, but later made her feel lonelier when it became inconsistent and confusing.
Companies are also testing targeted marketing. Eva AI, another role-playing app, offered free access for a month during Asexual Awareness Week in October 2025 to people on the asexual spectrum.
Asexual community members and advocates caution against treating this as a defining asexual experience. Some also worry that companies are marketing to people who may be lonely, offering an "always available" relationship that can be hard to put down, much like a video game designed to keep you playing. How these apps handle privacy, data collection, and emotionally intense conversations is likely to face more scrutiny.
Source: WIRED