HIPAA promised to keep your medical data secret. AI threatens to reveal it


In 1996, Congress enacted a law to assure Americans that their sensitive personal health data would never be disclosed without their consent.

Lawmakers had never heard of AI.

Thirty years after passage of the Health Insurance Portability and Accountability Act, known as HIPAA, artificial intelligence is disrupting the patient privacy that the law created. Individuals’ health information shared with AI chatbots can be stripped of identifying details and sold to everyone from data brokers to pharmaceutical companies.

Now, AI can be used to restore a patient’s identifying data, circumventing HIPAA, a New York University research team found.

HIPAA’s protections “are rapidly becoming outdated,” Lavender Jiang, a fifth-year data science PhD student at New York University, told Straight Arrow News. Jiang is part of a team that showed how AI can be used to examine anonymized patient notes and determine patients’ identities.

“We believe HIPAA needs urgent updates to offer more robust protections against the sale of this data and we should exercise care when handling clinical notes,” Jiang said.

In other words, a specially trained AI could use health data collected by a doctor’s AI receptionist to render HIPAA protections useless. Experts believe that updates to the law, and strict regulation of AI’s use, are needed more than ever.

Is AI HIPAA-compliant?

Threats to medical data have long existed. Hackers and data leaks have exposed personal, sometimes embarrassing health care data.

But as AI becomes increasingly integrated into the health care industry, many people wonder whether AI-powered chatbots and automated receptionists used by doctors protect patients’ private medical data.

Whether HIPAA applies to medical data gathered by AI depends entirely on who is deploying the technology. Providers, organizations and agencies subject to the law’s regulations are known as HIPAA-covered entities.

Those entities include providers such as doctors and psychologists and their clinics or practices. Health plans — whether from health insurance companies, an employer or the government — are also covered. 

In practice, HIPAA applies to any individual or entity that comes into contact with or processes protected health information.

Given that a doctor’s office is a HIPAA-covered entity, protections apply whether sensitive health data has been collected by a human or AI receptionist.

Importantly, however, medical information handed over to chatbots used by companies outside the health care industry does not appear to receive the same protections. 

Even as companies such as OpenAI and xAI tout the ability of chatbots to respond to health-related inquiries, experts warn that data protections outlined in terms of service agreements are not the same as those from HIPAA.

De-identifying — and re-identifying — data

Regardless of where it’s collected, health data can be altered to remove HIPAA protections. Protected health data is regularly stripped of identifying information, such as a patient’s name, in a process known as de-identification.

De-identified health data can then be sold to everyone from data brokers to pharmaceutical companies. This industry, currently valued at over $9 billion, has existed for decades. In the pharmaceutical industry, for example, prescription and insurance information can be purchased and used, in turn, to target doctors for marketing.

While such sales may seem trivial given that the data has been anonymized, a team of researchers recently reported on the ease of re-identifying health care information, raising serious questions about HIPAA’s effectiveness in the age of AI.

The New York University research team found re-identifying data to be trivial.

In one example they cited in a research paper, de-identified hospital notes mentioning only that a patient was pregnant and enjoyed horseback riding allowed AI to single her out, correctly inferring her gender, socioeconomic class and the type of neighborhood she lived in.

Even when industry best practices are followed, de-identified clinical notes “remain statistically tethered to identity through the very correlations that confirm their clinical utility,” the research paper says. “The conflict is structural instead of technical.”

However, Jiang said tools exist to help patients keep their data secure.

“For many personal health care uses,” she said, “patients may be able to achieve satisfactory performance using open source models running on secure, local hardware, ensuring the data never leaves the patient’s control.”


Ella Rae Greene, Editor In Chief
