ChatGPT CEO Warns: Personal Chats May Be Used as Court Evidence


OpenAI CEO Sam Altman has issued a stark warning to users who rely on ChatGPT for therapy or advice. He revealed that conversations with the AI are not legally protected and could be submitted as evidence in court cases. As more people turn to chatbots to discuss personal issues, this lack of protection raises serious privacy concerns.

Legal Risks and Data Retention Requirements

During a recent appearance on Theo Von’s This Past Weekend podcast, Altman explained that OpenAI is legally required to keep chat records. “So, if you go talk to ChatGPT about your most sensitive stuff and then there’s like a lawsuit or whatever, like we could be required to produce that. And I think that’s very screwed up,” he said. Moreover, due to an ongoing lawsuit filed by The New York Times, OpenAI must also retain deleted conversations, which further complicates user privacy.

Call for Legal Framework and Privacy Protections

Altman emphasized the urgent need for clearer legal and policy guidelines surrounding AI interactions. He compared conversations with ChatGPT to those people have with doctors, lawyers, or therapists, who enjoy legal privilege and confidentiality. “Right now, if you talk to a therapist or a lawyer or a doctor about those problems, there’s legal privilege for it. There’s doctor-patient confidentiality, there’s legal confidentiality, whatever. And we haven’t figured that out yet for when you talk to ChatGPT,” Altman said. He added, “I think we should have, like, the same concept of privacy for your conversations with AI that we do with a therapist or whatever.” Until then, he advised users to seek privacy and legal clarity before sharing sensitive information with AI systems.

