LAS VEGAS — When the University of Illinois Hospital and Health Sciences System was testing an artificial intelligence-powered tool to draft responses to patient messages, patients were misspelling the names of medications, Chief Medical Information Officer Karl Kochendorfer recalled during a panel discussion at the HLTH conference last week.
In one case, the AI drafted a response describing the side effects of a medication the patient wasn't taking, and the nurse forgot to double-check it before sending.

Ultimately, it wasn't a big problem, he said; a phone call or a follow-up message to the patient fixed it. But the error could have seriously damaged the tool's rollout.

“It almost killed the pilot. (…) And it happened on the first day,” he said.
As the healthcare industry grapples with how to safely implement AI, investors and health systems are first looking to deploy tools that automate administrative and back-office tasks, easing healthcare worker burnout while limiting risk to patient care, experts said at the HLTH conference.

But pressure to adopt the technology is mounting. Proponents argue AI could help solve major workforce challenges in the healthcare industry: the U.S. will face a shortage of more than 100,000 critical healthcare workers by 2028 as the overall population ages and requires more care, according to a report by consultancy Mercer.
While AI has the potential to be transformative, experts say the field needs to tread carefully when introducing new tools. The stakes are high, with policymakers and experts raising concerns about accuracy, bias, and security.
Deploying AI in healthcare is complex, and the industry needs to learn lessons from some of the predictive tools introduced previously, Rohan Ramakrishna, co-founder and chief medical officer of health information app Roon, said during a panel discussion at HLTH.
“I think one of the things we have learned is that we need to be very careful in applying AI solutions in medical settings,” he said.
How AI can solve “simple mismatches between supply and demand”
Daniel Yang, Kaiser Permanente's vice president of AI and emerging technologies, said during a panel discussion that AI could help alleviate one of healthcare's biggest problems: a growing number of older patients with more complex, chronic conditions, and a shrinking number of providers who can care for them.
As older Americans require more care, millennials, America's largest generation, want a more on-demand consumer experience, he added. But supply constraints make that difficult to achieve: it takes years to train new doctors, and fewer clinicians mean more burnout, delayed care and higher costs.
“This isn't even about AI; it's actually about what I think is plaguing healthcare in general,” Yang said. “What we're seeing is a simple mismatch between supply and demand.”
AI has the potential to augment clinicians' workflows and help them provide better care. Yang said an algorithm developed by Permanente Medical Group researchers saves 500 lives each year by alerting providers to patients at risk of clinical decompensation, when their condition is deteriorating.
The technology could also reduce burnout and improve retention by cutting the amount of time clinicians spend on administrative tasks such as note-taking. Providers have long reported spending hours in electronic records, often to the detriment of patient care.
Christopher Wixson, a vascular surgeon at Savannah Vascular Institute, said he was on the verge of leaving medicine when the industry moved to electronic health records. It was hard to gather information while listening to patients, and nonverbal cues were easy to miss when he was focused on a laptop screen.

But ambient documentation, in which AI tools record conversations between clinicians and patients and draft notes, has been a game changer, he said.
“That's better for me because it saves me time,” Wixson said during a panel discussion. “But at the end of the day, it's better for patients because they feel like they're being heard. It was a really transformative experience for me.”
Investors, health systems focus on administrative burden
Given providers' heavy administrative workloads and concerns about model errors and biases that are more relevant to clinical decision-making, products that address administrative concerns are among the top priorities for AI adoption.
Some investors are also interested in automating these operational tasks.
“We continue to like the not-sexy back-office automation as it relates to AI, really reducing the burden on employees, nothing to do with clinical,” said Payal Agrawal Divakaran, partner at .406 Ventures.
A report released earlier this month by Silicon Valley Bank found administrative AI is attracting even more venture capital funding this year. Administrative AI companies have raised $2.4 billion so far in 2024, compared with $1.8 billion for clinical AI, likely because administrative tools face lower regulatory and institutional hurdles than decision support products.
“We're seeing these efforts not just in administrative work, but also in pre-approval and lower-value work,” Megan Scheffel, head of credit solutions for SVB's life sciences and healthcare banking division, said in an interview. “You can also take your office staff and move them to higher value projects.”
There are now many opportunities to use large language models as drafting tools, for tasks like clinical notes and nurse handoffs, Greg Corrado, distinguished scientist and head of health AI at Google Research, said during a panel discussion.

Oversight is built in, because the provider must review the output before making a final decision. That also makes quality easier to assess, by asking users about their experience and seeing how many revisions the drafts need, he said.
But health systems still need to be methodical when evaluating administrative and operational tools, testing them against their own local patient data, Todd Schwarzinger, partner at Cleveland Clinic Ventures, said in an interview. Cleveland Clinic's governance structure also focuses on issues such as what data a tool uses, how that information is protected, and whether the product is safe and positively impacts patient care.

It's safer to start with an administrative or operational product, like an ambient scribe or a revenue cycle management tool, he said.

“There's no risk to clinical decision-making, right?” he said. “That level of trust doesn't exist. I think it's going to take time.”
Preparing for AI implementation
While AI may hold promise in reducing healthcare worker burnout, health systems face challenges in engaging healthcare workers, setting up pilots, establishing governance policies, and rolling out products.
In one example shared at HLTH, St. Louis-based health system BJC identified clinicians who took days to sign documentation, or who wrote long notes, as candidates for an ambient note-taking pilot, said Michelle Thomas, associate chief medical information officer for BJC Healthcare and ambulatory chief medical information officer for BJC Medical Group.

“I think we got a response from one in 20 people, so right away we wondered whether the people who need it the most just weren't interested,” she said during the panel discussion.
Thomas said the team changed course and invited anyone who was interested to join the pilot, which helped it move quickly.

Still, it's important to think carefully about which providers should participate in a test, she said. Some physicians didn't understand what the pilot required of them, including time spent providing feedback and troubleshooting complaints about the new product.

Health systems also need to consider what outcomes they want from an AI tool. According to Thomas, many physicians spent significant time editing the drafted notes to match their personal style, which limited the time savings. Advanced practice providers, by contrast, signed off on ambient notes more quickly after reviewing them.
“You really have to decide what your ROI is,” she said. “Are you looking for financial savings? Are you looking for time savings? Are you looking for hard numbers to justify this technology? Or is it something softer, like patient satisfaction?”
Cybersecurity, already a challenge for the healthcare sector more broadly, is also key to AI implementation.

Melanie Fontes Rainer, director of the HHS Office for Civil Rights, said in an interview that organizations need to ask the same questions of AI tools that they ask of any system using protected health information.

When entering relationships with developers, are business associate agreements in place? Has the organization thought through data deletion policies for information stored in the cloud? Has it considered who needs access to the data?

“I certainly think it's a balance as we walk here, and it requires all of us to be responsible in how we use this information, to think about how it impacts our systems and our patients, what could cause harm, and how we can take proactive steps to protect them,” she said.