Meta is testing a new AI assistant designed to handle everyday tasks. The plan raises questions about trust and sharing sensitive data.
In short: Meta is developing a more hands-on AI assistant that can carry out everyday tasks for its users.
Meta is building a highly personalised AI assistant for its consumer apps, according to people familiar with the project. The assistant is being tested internally by a group of staff.
The goal is to create what some describe as an "agentic" assistant: an AI that can take actions for you, not just answer questions. Think of it as a helpful assistant that can actually run errands online rather than simply offering advice.
The report says Meta wants this assistant to be similar to OpenClaw, a tool that lets users create "agents" (small AI bots that handle specific jobs) to complete tasks on their behalf. The assistant is expected to be powered by Meta's new "Muse Spark" AI model, the system behind the assistant's responses and actions.
Meta is also considering letting people share very sensitive information with the assistant, including health and financial details, if they choose. One person involved in the project questioned whether users will be comfortable doing that, citing a “trust deficit.”
This work comes as investors pay close attention to Meta's rising AI spending. Meta recently said it could increase capital spending by $10 billion, to as much as $145 billion this year, and it plans to cut 10 percent of its workforce later this month.
If Meta puts this kind of assistant into apps used by billions of people, it could change how people book appointments, manage plans, or handle other routine tasks. But it also raises everyday questions about privacy and trust, especially if the assistant needs access to personal data to be truly useful.
Source: Financial Times