January 23, 2026
Giving your healthcare info to a chatbot is, unsurprisingly, a terrible idea

TL;DR
- ChatGPT Health encourages users to share private medical data and promises confidentiality and security, but OpenAI is not bound by the legal obligations that apply to healthcare providers.
- Companies like OpenAI and Anthropic are pushing into the health AI market with products like ChatGPT Health and Claude for Healthcare.
- Users have limited recourse against data breaches or misuse; their only protections are the company's terms of use and privacy policies, which the company can change at any time.
- Despite disclaimers, the AI's perceived authority can lead users to trust it for medical advice, even for diagnosis and treatment.
- Because these chatbots are not classified as medical devices, they operate with far less regulatory oversight, even when users rely on them for medical decision-making.