Data privacy poses challenges to AI in healthcare
June 5, 2024
Privacy and data access stand in direct tension, challenging researchers who seek to develop artificial intelligence applications for healthcare.
Dakota State University doctoral student Jennifer Schulte discussed this challenge in New Orleans recently when she presented her paper, “Artificial Intelligence Usage and Data Privacy Discoveries within Health,” at the 39th International Conference on Computers and Their Applications.
“Not many people want to share healthcare data, so it’s hard to get thorough research,” she said, explaining the importance of her presentation.
As a result, the datasets used for healthcare applications are not truly representative, Schulte explained, which limits the effectiveness of artificial intelligence (AI) in the healthcare industry. To improve healthcare applications, the data must become “a true representation of people of the world,” she said.
Schulte, who also completed her undergraduate and graduate work at DSU, is currently a faculty member in The Beacom College of Computer & Cyber Sciences, where she teaches introductory courses to incoming freshmen.
Her passion for healthcare comes from her parents.
“Growing up with them having discussions at home about their work, which was healthcare, I have an interest in that area,” she said.
She understands the impact AI could have. It is already being used to assist in clinical decision-making and to help identify potential health risks earlier, and mobile health (mHealth) tools such as phone applications, wearable devices and sensors also rely upon it.
Schulte explained that although current datasets are not truly representative, people of all backgrounds are being treated based on the information that is available today.
“There’s so much we could learn and improve upon with greater access,” she said.
Federal law poses one of the challenges. The Health Insurance Portability and Accountability Act (HIPAA) of 1996 established standards for the privacy and security of health information. Unless patients authorize its release, healthcare providers generally cannot share this information.
Another challenge is skepticism among healthcare professionals stemming from AI’s “black box” problem. Whereas an airplane’s black box reveals what happened in the event of a tragedy, an AI black box conceals how the system reaches its conclusions, which acts as a barrier to establishing trust with professionals.
“Doctors want to know how the model came to that conclusion,” Schulte said.
Her presentation was well received at the conference.
“There was a lot of agreement: ‘Yes, this is a problem we need to research,’” she said.
In addition, colleagues at the conference approached her about collaborating on future projects. Schulte said completing her dissertation is her first priority; after that, she will pursue further research, though she hopes to continue teaching as well.
“I get joy out of teaching and research. I think I want an even balance of the two,” she said.