A recent report from the United States has attracted widespread attention: a 60-year-old man changed his diet after consulting the chatbot ChatGPT, and the change eventually led to serious health problems. After following the new diet for three months, he developed paranoia and hallucinations and had to go to the emergency room.

Image source note: The image (ChatGPT) is AI-generated; the image licensing service provider is Midjourney.

After medical examination, the man was diagnosed with bromide poisoning (bromism), a condition caused by long-term exposure to bromides. He had been taking sodium bromide purchased online, originally hoping to improve his health by using it as a substitute for table salt. Doctors found that, after consulting ChatGPT, he had completely replaced the sodium chloride in his diet with sodium bromide.

The patient had read an article about reducing salt intake and decided to try cutting chloride from his diet. When he consulted ChatGPT, it suggested that chloride could be replaced with bromide, so he adjusted his diet accordingly. After several months of this substitution, he began to feel abnormal, even telling emergency room staff that he suspected his neighbors were poisoning him. Laboratory tests showed abnormally high levels of carbon dioxide and chloride in his blood, but normal sodium levels.

Through further examinations and a review of the literature, doctors ultimately concluded that the patient's symptoms were caused by bromide poisoning. During hospitalization, the patient reported feeling thirsty but was afraid to drink water. Within the first day of treatment, his hallucinations and delusions worsened, and he was transferred to a psychiatric facility for antipsychotic treatment. During his recovery, he mentioned his use of ChatGPT, as well as facial acne and allergic skin reactions that may also have been related to bromide.

This case is a reminder that relying on non-professional artificial intelligence tools for health advice carries real risks. Although AI can help provide information, it cannot replace the advice of medical professionals. In matters of health especially, patients should seek guidance from qualified doctors rather than relying solely on answers from a chatbot.

Key points:

🌟 The man mistakenly replaced sodium chloride with sodium bromide based on ChatGPT's advice, leading to bromide poisoning.

🧪 Doctors found abnormally high levels of bromide in the patient's blood and ultimately diagnosed him with bromide poisoning.

💡 This case reminds everyone to rely on professional doctors for health advice, not artificial intelligence tools.