21/08/2025
⚠️ A man developed psychosis after following ChatGPT’s advice about replacing salt.
After three months on the diet, he developed hallucinations, paranoia, and confusion, ultimately requiring a three-week involuntary psychiatric hold.
The 60-year-old man had no prior psychiatric history.
The patient had replaced sodium chloride (table salt) with sodium bromide, a compound the FDA phased out of over-the-counter medications between 1975 and 1989 because of its toxicity.
Doctors diagnosed him with bromism, a toxidrome caused by chronic bromide exposure that was common in the early twentieth century, before the compound was regulated.
According to a case report in Annals of Internal Medicine: Clinical Cases, the patient's symptoms improved with antipsychotics and electrolyte repletion.

While the original ChatGPT exchange could not be retrieved, the medical team queried ChatGPT 3.5 themselves and found that it did suggest bromide as a possible chloride substitute, without issuing a health warning or asking why the substitution was wanted.

The case underscores the dangers of relying on AI tools for medical decisions, particularly when those tools fail to probe the user's intent or context. The patient recovered fully, but the incident is a stark reminder that artificial intelligence is no substitute for professional medical advice.
Eichenberger, A., Thielke, S., & Van Buskirk, A. (2025). A case of bromism influenced by use of artificial intelligence. Annals of Internal Medicine: Clinical Cases.