Elderly man hospitalised after following ChatGPT's wrong dietary advice

Following ChatGPT's suggestion for a salt alternative led the elderly man to consume a toxic compound
An undated image. — Depositphotos

Even as artificial intelligence (AI) drives automation at an immense scale, the technology recently gave an elderly man wrong dietary advice, resulting in the 60-year-old being hospitalised.

The person in question asked ChatGPT to suggest an alternative to sodium chloride as he was concerned about his salt intake. The AI in response recommended sodium bromide, a compound once used in early 20th-century medications but now considered harmful in large doses.

Following ChatGPT's salt advice without consulting a healthcare professional led him to replace table salt with a toxic compound.

The man purchased sodium bromide and used it in his cooking for three months. He soon experienced symptoms such as hallucinations, anxiety, excessive thirst, and skin lesions. He was later diagnosed with bromism, a rare condition caused by excessive bromide levels in the body.

It was also reported that the person had no prior psychiatric or physical health issues, but his condition deteriorated to the point of requiring a psychiatric hold and intensive treatment with fluids and electrolytes.

In light of the development, experts cautioned that while AI tools like ChatGPT can provide useful information, they are not substitutes for professional medical advice.

OpenAI's Terms of Use clearly state that its services are not intended for the treatment of health conditions, and OpenAI CEO Sam Altman has also emphasised that ChatGPT's emotional support features are not a replacement for professional counselling.

While the person affected by the AI bot's wrong medical advice has recovered following a three-week hospitalisation, the case serves as a warning.