Onix is rolling out a service that lets you subscribe to chatbots modeled on real health and wellness experts, with privacy claims and clear limits: it is not medical care.
In short: Onix is launching a subscription service where people can chat with AI versions of health and wellness experts, but the bots can still make mistakes and may promote products.
Onix, a new company based in Canada, is rolling out a platform that it describes as a “Substack for chatbots.” Substack is a site where you pay to subscribe to writers. Onix applies that idea to paid chats with AI versions of individual experts.
Each expert creates a chatbot “double” called an Onix. The company says these bots are trained on the expert’s own content and are meant to stay focused on that person’s area, like stress, nutrition, or therapy-style guidance.
Onix says it has built privacy protections, including storing user information on the user’s device in encrypted form (like locking data in a safe). It also says the bots have limits meant to reduce false answers. In testing described by WIRED, some bots still went off topic and made up details, which is a known issue with large language models (AI that predicts the next words, like an advanced autocomplete).
The product is in beta with invited testers and is expected to open up more broadly later. Onix started with 17 vetted experts, some of whom are also influencers with products to sell.
Many people already use tools like ChatGPT for personal advice, especially when real care is expensive or hard to access. Onix is trying to turn that behavior into paid subscriptions, priced around $100 to $300 per year for some experts. The big questions are whether this kind of guidance is reliable and how clearly people understand that it is not medical treatment.
Source: Wired