ChatGPT Privacy Concerns Raised by OpenAI CEO Sam Altman
By Global Leaders Insights Team | Jul 28, 2025

Sam Altman, CEO of OpenAI, has warned ChatGPT users about a major privacy issue, especially for those using the AI chatbot as a therapist or confidant. Speaking on the podcast This Past Weekend, Altman explained that conversations with ChatGPT don’t carry the same legal protections as talks with licensed professionals such as therapists, doctors, or lawyers. This means personal details shared with the AI could potentially be exposed in legal cases, raising serious concerns for users.
Altman pointed out that many people, especially younger users, turn to ChatGPT for emotional support, treating it like a therapist or life coach.
“People talk about the most personal stuff in their lives to ChatGPT,” Sam Altman said. Unlike therapy sessions, which are protected by confidentiality laws, AI chats aren’t covered by similar rules. If OpenAI is sued, the company might have to hand over user conversations, even ones users thought were deleted, as they could be stored for legal or safety reasons.
This isn’t just a theoretical worry. OpenAI is currently in a legal battle with The New York Times, which has demanded that the company retain all user chat logs, including deleted ones, indefinitely. OpenAI has called the request excessive and is fighting the court order, but the dispute highlights how vulnerable user data can be. Unlike end-to-end encrypted platforms such as WhatsApp, ChatGPT conversations can be accessed by OpenAI staff to improve the AI or monitor misuse, adding to privacy concerns.
Altman called for new laws to protect AI conversations the same way therapy sessions are protected. “I think we should have the same concept of privacy for your conversations with AI that we do with a therapist,” he told host Theo Von, admitting that the issue wasn’t even on OpenAI’s radar a year ago. Until such laws exist, experts suggest users be cautious and assume their AI chats could become public. As more people rely on AI tools like ChatGPT, this privacy gap could erode trust and slow their widespread adoption.