Man hospitalized after following ChatGPT advice to swap table salt with chemical

A 60-year-old man spent three weeks in the hospital after swapping table salt for a chemical once used in sedatives. According to a case published in the Annals of Internal Medicine, the man made the switch after seeking medical advice from the artificial intelligence chatbot ChatGPT.

AI’s role in a rare medical case

The study’s authors say the case raises questions about how artificial intelligence can influence real-world health choices. Investigators believe the man likely turned to ChatGPT for diet advice, but his conversation history was not preserved, and the chatbot’s answers can vary from one exchange to the next.

The man arrived at the emergency room convinced his neighbors were trying to poison him. Doctors later found he had been using sodium bromide instead of sodium chloride. The compound, common in the early 20th century, is no longer widely used due to its health risks.

Sodium bromide can irritate the eyes, cause drowsiness, damage organs and may affect fertility or pose risks to pregnant women, according to the National Library of Medicine. The bromide ion can also impact the nervous system, potentially leading to coma, loss of reflexes, delirium, psychosis, tremors and seizures.

A chemical from the past

According to the study, bromide poisoning, once a common medical problem in the early 1900s, has reappeared in rare modern cases. Back then, bromide salts were widely used in over-the-counter remedies for insomnia, anxiety and other conditions and were linked to up to 8% of psychiatric hospital admissions.

The problem faded after the FDA phased out over-the-counter bromide products between 1975 and 1989, but the compound still surfaces occasionally through dietary supplements, sedatives or online purchases.

The hospitalized man’s initial tests showed abnormal electrolyte levels, which led doctors to suspect a rare condition called bromism. Within a day, his paranoia worsened, and he started having hallucinations, prompting an involuntary psychiatric hold.

Chatbot gave no warning

After further questioning, the man revealed he had been following a strict vegetarian diet and avoided table salt entirely. He said he turned to ChatGPT for advice and came away believing sodium bromide could replace sodium chloride in his diet.

When the researchers posed a similar question, asking what chloride could be replaced with, the chatbot also suggested bromide. While it noted that context matters, it gave no health warning and did not ask why the substitution was needed, as a doctor normally would in that situation.

Specialists say the case illustrates the gap between AI-generated advice and professional medical care. Here, that gap may have steered someone toward a dangerous substitute for table salt.

The man had used the chemical for three months. Tests later confirmed his bromide levels were hundreds of times higher than normal.

Doctors believe this caused his psychiatric symptoms, fatigue and other health problems. Over three weeks in the hospital, his electrolytes normalized, his psychosis faded and he was eventually discharged.

Ella Rae Greene, Editor In Chief
