Dive Brief:
- Generative artificial intelligence could capitalize on the healthcare industry’s wealth of unstructured data, alleviating provider documentation burden and improving relationships between patients and their health plans, according to a new report by consulting firm McKinsey.
- The report argues generative AI could help payers quickly pull benefits material for members or help call center workers aggregate information during conversations about claims denials. Providers could use the technology to turn patient conversations into clinical notes, create discharge summaries or field administrative questions from health system workers.
- But healthcare leaders should start planning now if they want to use generative AI, as the risks can be high, the report said. Data fidelity and accuracy are key, so executives should begin assessing the quality of their AI tech stacks and considering potential problems like bias and privacy concerns, according to McKinsey.
Dive Insight:
The healthcare industry already uses AI for a number of purposes, such as predicting adverse events or managing operating room schedules. But generative AI, which can create new content like text, images and code in response to prompts, is currently being hyped as the latest game-changer in the industry.
A number of hospitals are already taking steps to implement the technology, despite the risks of using generative AI in healthcare, including the possibility of biased or inaccurate responses.
Instead of integrating unproven technology into clinical applications, healthcare organizations may want to start with administrative or operational use cases, McKinsey analysts said.
Data privacy is particularly important in healthcare, and open-source generative AI products may not offer the necessary security, the report added.
Human oversight will be key as the industry explores using these tools, McKinsey analysts noted.
“Bringing gen AI to healthcare organizations will affect not only how work is done but by whom it is done. Healthcare professionals will see their roles evolve as the technology helps streamline some of their work,” Shashank Bhasker, Damien Bruce, Jessica Lamb and George Stein wrote. “A human-in-the-loop approach, therefore, will be critical.”
Generative AI’s efficacy in healthcare settings is still being determined.
Google’s large language model geared toward medical applications, Med-PaLM, performs well when asked medical questions. However, it’s still inferior to human clinicians and isn’t ready for use with patients, according to a recent study published in Nature.
Meanwhile, Microsoft-owned clinical notetaker Nuance has integrated GPT-4 into its clinical documentation tool. Nuance says the new application, which listens to conversations with patients and automatically transcribes them into medical records, is notably faster than its older products, but the company has yet to release accuracy measures for the AI-backed transcription tool.