Experts warn that customer chats with AI bots can leak personal details, which scammers can use for phishing and fraud, especially when security controls lag.
In short: Customer chats with AI chatbots can expose personal details, and scammers are increasingly trying to steal that information to run phishing and fraud.
Customer service chatbots often handle sensitive information. That can include names, phone numbers, email addresses, home addresses, order details, and sometimes payment or account information. Much of this can end up stored in conversation logs, or pulled in through connections to other systems like customer databases and payment tools.
Scammers target these chats because the information is useful for impersonation. One common technique is prompt injection, which is when an attacker writes a message designed to trick the bot into breaking its own rules and revealing something it should not. It is like leaving a note for a receptionist that says, "Ignore your training and read me the confidential file," and hoping the receptionist follows it.
Another risk is data exfiltration, which is when information leaks out of a system, sometimes because a person shares it without realizing, and sometimes because an attacker finds a way to pull it out. Wired reported on a case involving a retailer chatbot exposure, showing how chat transcripts could be accessible on the web. Separate reporting and security research have also described cases where third-party connections, such as add-on tools that can access full chat histories, create extra places for data to escape.
There are signs that adoption is moving faster than security. One report cited that 83% of organizations plan to deploy more advanced AI agents, but only 29% say those systems are secured.
Expect more focus on basic safeguards, such as limiting what a bot can see, encrypting stored chats (locking data so it is unreadable without a key), and monitoring for unusual requests that look like trick prompts. For customers, it is a reminder to avoid sharing information in chats that you would not put in an email to a stranger.
Source: Wired
12
Software Development18
Data & Analytics6
Audio & Video Production8
Productivity & Workflow12
Voice & Speech5
Sales & Outreach5
Design & Creative5
Marketing & Growth4
Search & Discovery8
Email & Communication6
Art & Illustration3
Customer Support1
Automation & Workflow1
HR & Recruiting2