News
After the escape attempt, the man was placed on an involuntary psychiatric hold and given an antipsychotic drug. He was administered ...
The man had been using sodium bromide for three months, which he had sourced online after seeking advice from ChatGPT.
1d on MSN
After following ChatGPT's advice to remove salt from his diet, a man developed bromide toxicity, raising alarms about AI's ...
A man developed rare, life-threatening bromide poisoning after following ChatGPT diet advice, in what doctors say could be ...
1d on MSN
A 60-year-old man gave himself an uncommon psychiatric disorder after asking ChatGPT for diet advice in a case published ...
As for the man himself, he did slowly recover from his ordeal. He was eventually taken off antipsychotic medication and ...
The Independent on MSN · 2d
A man asked ChatGPT how to remove sodium chloride from his diet. It landed him in the hospital
The man had been swapping sodium chloride, or table salt, for sodium bromide for three months after consulting ChatGPT ...
A New York man was hospitalized with dangerously low sodium levels after following a strict, AI-generated diet plan from ...
Live Science on MSN · 1d
Man sought diet advice from ChatGPT and ended up with 'bromide intoxication,' which caused hallucinations and paranoia
A case report describes an incident in which a man seeking to make a dietary change consulted ChatGPT and later developed ...
An advocacy organization is calling on OpenAI to develop more guardrails, while the company is forming an advisory group of ...
ChatGPT will tell 13-year-olds how to get drunk and high, instruct them on how to conceal eating disorders and even compose a heartbreaking suicide letter to their parents if asked, according to new ...