News

After the escape attempt, the man was placed on an involuntary psychiatric hold and given an antipsychotic drug. He was administered ...
The man had been using sodium bromide for three months, which he had sourced online after seeking advice from ChatGPT.
After following ChatGPT's advice to remove salt from his diet, a man developed bromide toxicity, raising alarms about AI's ...
A man developed rare, life-threatening bromide poisoning after following ChatGPT diet advice, in what doctors say could be ...
A 60-year-old man gave himself an uncommon psychiatric disorder after asking ChatGPT for diet advice in a case published ...
As for the man himself, he did slowly recover from his ordeal. He was eventually taken off antipsychotic medication and ...
The man had been swapping sodium chloride, or table salt, for sodium bromide for three months after consulting ChatGPT ...
A New York man was hospitalized with dangerously low sodium levels after following a strict, AI-generated diet plan from ...
A case report describes an incident in which a man seeking to make a dietary change consulted ChatGPT and later developed ...
ChatGPT will tell 13-year-olds how to get drunk and high, instruct them on how to conceal eating disorders and even compose a heartbreaking suicide letter to their parents if asked, according to new ...
People are increasingly seeking health advice from AI chatbots such as ChatGPT and Gemini. However, critical risks remain ...