Google is expanding access to its large language model that’s specifically trained on medical information through a preview with Google Cloud customers in the healthcare and life sciences industry next month.
A limited group of customers has been testing the artificial intelligence, called Med-PaLM 2, since April, including for-profit hospital giant HCA Healthcare, academic medical system Mayo Clinic and electronic health records vendor Meditech.
Google declined to share how many additional healthcare companies will be using Med-PaLM 2 following the expansion in September, but a spokesperson said, “there are customers across healthcare sectors that have expressed interest and will be getting access.”
“We’re thrilled to be working with Cloud customers to test Med-PaLM and work to bring it to a place where it exceeds expectations,” Google health AI lead Greg Corrado told reporters during a press briefing on the preview.
Med-PaLM was the first AI system to pass U.S. medical licensing exam-style questions. Its second iteration, which Google introduced in March this year, bettered its predecessor’s score by 19%, passing with 86.5% accuracy.
The LLM is not a replacement for doctors, nurses and other medical caregivers, but is instead meant to augment existing workflows and work as an extension of the care team, Corrado said.
However, Med-PaLM faces big questions that have plagued other generative AI in healthcare, including the potential for errors, the complexity of queries it can perform, meeting product excellence standards and a lack of regulation — despite already being piloted in real-world settings.
HCA has been testing Med-PaLM to help doctors and nurses with documentation, as part of the health system’s strategic collaboration with Google Cloud launched in 2021.
The health system has been working with health tech company Augmedix, using Google’s LLM to create an ambient listening system that automatically transcribes doctor-patient conversations in the emergency room, according to Michael Schlosser, HCA’s senior vice president of care transformation and innovation.
HCA is currently testing the system in a cohort of 75 doctors in four hospitals, and plans to expand to more hospitals later this year as the automation improves, Schlosser said during the press briefing.
HCA is also piloting the use of Med-PaLM to generate transfer summaries that help nurses with patient handoffs at UCF Lake Nona Hospital in Orlando.
Meanwhile, Meditech — a major player in the hospital software space — is embedding Google’s natural language processing and LLMs into its EHR’s search and summarization capabilities.
Documentation is an appealing potential use case for generative AI that could cut down on onerous notetaking processes. Along with Google, other tech giants like Amazon and Microsoft have recently announced or expanded AI-enabled clinical documentation plays.
Privacy watchdogs, physician groups and patient advocates have raised concerns around the ethical use of AI and sensitive medical data, including worries about quality, patient consent and privacy, and confidentiality.
In 2019, Google sparked a firestorm of controversy over its use of patient data provided by health system Ascension to develop new product lines without patient knowledge or consent.
Google says that Med-PaLM 2 is not being trained on patient data, and Google Cloud customers retain control over their data as part of the preview. In the case of the HCA pilot, patients are notified of the ambient listening system when they enter the ER, HCA’s Schlosser said.
Doctors are also leery about ceding control to what is, in many cases, a black-box algorithm when it comes to surfacing information and determining the right course of patient care.
Schlosser said that the for-profit operator is first building AI into easy-to-accept use cases, like automating handoffs or scheduling, to make doctors and nurses more comfortable with the technology, before eventually implementing AI into additional parts of clinical practice.
“You get into nudging in the workflow around documentation, and then you could slowly step your way up to higher and higher levels of decision support,” Schlosser said. “But I want clinicians to fully embrace AI as a partner that’s making their life easier before we start getting into some of those more controversial areas.”