The truth behind a medical condition that never existed

On April 26, 2024, researcher Lazljiv Izgubljenovic published a paper about bixonimania, a disease that causes dark patches on patients’ eyelids. Several weeks later, the same researcher from “Asteria Horizon University” in “Nova City,” California published a second paper on the disease. 

Almost immediately after the first mention of bixonimania, artificial intelligence tools around the world began alerting users to the disease. One tool reported that it affected one in 90,000 people. Another advised people with the condition to seek medical attention. At least one other researcher cited Izgubljenovic’s study in their own work on periorbital melanosis, a condition that also causes dark circles around the eyes.

In the years since, all of the papers, including the original studies on bixonimania, have been retracted. That’s because the disease was entirely invented. The researcher, his university and his hometown are all fictitious. (Both of the studies authored by Izgubljenovic appeared on a preprint platform, meaning neither underwent peer review.)

The experiment was dreamt up by Almira Osmanovic Thunström, a doctoral researcher at the University of Gothenburg and AI strategist at the research organization Chalmers Industriteknik. She wanted to demonstrate how quickly information — and misinformation — is absorbed into the AI ecosystem and disseminated to users.

AI in health and science

AI tools are already being used in hospitals and clinics across the nation. Recent surveys found about one in six adults — and one in four under 30 — use AI tools for medical advice at least once a month. 

Leading tech companies are jostling to cater to this market. Earlier this year, Amazon, OpenAI and Anthropic all announced new AI platforms that provide 24/7 health guidance.

Some say AI could help patients be more informed, especially given how little time they spend face-to-face with doctors. 

“Our health care system doesn’t do a great job of giving them that information,” Sina Bari, a physician and the senior director of medical AI at iMerit Technology, told Straight Arrow News in February. “Now you have the potential for these large language models to just provide so much information and answer so many questions.”

Joann Elmore, a physician and professor at the University of California Los Angeles, said AI can be used pretty reliably to assist with diagnostic imaging. 

“The eyes of AI don’t fatigue like human eyes,” she said.

Others have touted ambient scribe tools, which record doctors’ visits and transcribe them into clinical notes, as alleviating a major burden for physicians, who are burning out at increasing rates.

And while some tools may be helpful, many others are being implemented under lax regulatory systems without any evidence that they work. Under current law, physicians who use an AI tool are still responsible for all treatments and outcomes. 

It is less clear who might be liable for mistakes produced by public-facing tools such as AI chatbots. A number of ongoing lawsuits allege that OpenAI, which created ChatGPT, could be held responsible for the information it provides. Earlier this week, victims of the April 17 Florida State University shooting announced plans to sue over ChatGPT, alleging that it failed to dissuade, and may even have advised, the alleged shooter in carrying out his attack, which led to the deaths of two people.

Catching on to bixonimania

More recently, AI platforms seem to have caught on to the fictitious nature of bixonimania. Last month, several platforms wavered in their stance: Perplexity described it as an emerging disease, while ChatGPT said it was probably a made-up or fringe disease, according to Nature.

Today, neither Gemini, Claude nor ChatGPT mentioned the illness when prompted by Straight Arrow News with questions about dark circles or sore, itchy eyes. When asked directly whether the symptoms could be bixonimania, all three platforms said it was not a real disease.

Ella Rae Greene, Editor In Chief