LAS VEGAS—After years of hype, artificial intelligence and machine learning have arrived at a point where anesthesiologists may soon see them increasingly included in clinical workflows if some remaining hurdles can be worked out, speakers said during a presentation at the 2023 annual meeting of the Society for Technology in Anesthesia (STA).
“AI and machine learning are going to come to our clinical practice in the next five to 10 years in the same way that we have seen technology, such as Google and GPS, take hold in our personal lives,” said Hannah Lonsdale, MBChB, FRCA, an assistant professor of pediatric anesthesiology at Vanderbilt University Medical Center, in Nashville, Tenn.
However, she cautioned, “the onus is on clinicians to make sure that the clinical AI tools they’re presented with are appropriate for their own practice and that they are going to add value to their patient care.”
Medicine has traditionally been slower than other industries in adopting new technologies, Lonsdale said. As such, most AI projects published in medical journals have focused on the development and validation of machine learning models. Only a minority of published models have been integrated into clinical workflows, and even fewer have demonstrated improvements in patient outcomes.
To date, few AI programs in anesthesiology have been FDA approved; the field is still largely in a developmental stage, Lonsdale said. However, one potentially useful application is machine vision (the use of imaging to make predictions) to guide intubation during video laryngoscopy. Another is AI-guided regional anesthesia software that overlays labels on an ultrasound image to identify anatomical structures.
“There’s a lot of interest in prediction of patient outcomes, such as which patients are most likely to have adverse events or need a blood transfusion while under anesthesia,” she said. “So if we know in advance that this patient is at an increased risk of having an event, we can put extra planning in place to accommodate that, or tailor the ordering of blood products more specifically by using AI to look at influential patient factors.”
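The kind of risk prediction Lonsdale describes is often framed as a probability model over patient factors. The sketch below is purely illustrative, a toy logistic model with hand-picked feature names and coefficients (all assumptions, not a validated clinical tool), showing how such a score might flag a patient at elevated transfusion risk in advance:

```python
import math

# Illustrative only: hand-picked coefficients for a toy logistic model.
# Feature names and weights are assumptions, NOT validated clinical values.
COEFFICIENTS = {
    "age_decades": 0.30,         # risk rises with age
    "preop_hemoglobin": -0.45,   # lower hemoglobin -> higher risk
    "surgery_duration_hours": 0.55,
}
INTERCEPT = 1.0

def transfusion_risk(patient: dict) -> float:
    """Return a probability-like risk score in (0, 1) via the logistic function."""
    z = INTERCEPT + sum(COEFFICIENTS[k] * patient[k] for k in COEFFICIENTS)
    return 1.0 / (1.0 + math.exp(-z))

# A hypothetical higher-risk vs. lower-risk patient.
high = transfusion_risk({"age_decades": 7.5, "preop_hemoglobin": 9.0,
                         "surgery_duration_hours": 4.0})
low = transfusion_risk({"age_decades": 3.0, "preop_hemoglobin": 14.5,
                        "surgery_duration_hours": 1.0})
print(f"high-risk patient: {high:.2f}, low-risk patient: {low:.2f}")
```

In practice such coefficients would be learned from labeled historical records and validated before any use, but the structure, patient factors in, calibrated risk out, is what enables the advance planning Lonsdale describes.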
Several hurdles remain for wider adoption of AI technologies, Lonsdale said. One is data. Patient data sets need to be optimized for future use with AI and labeled so that AI programs can appropriately retrieve them for study. There are no universally agreed-upon data collection standards to support interoperability among different electronic health record (EHR) systems and other data sources, although more EHRs today are embracing FHIR (Fast Healthcare Interoperability Resources) as a standard interface to facilitate this type of exchange. Data also need to be cleaned, or preprocessed, Lonsdale said, to remove outliers, manage any missing data and select appropriate features for study.
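The cleaning steps Lonsdale lists can be sketched in a few lines. This is a minimal illustration, with assumed field names ("heart_rate", "sbp") and an assumed plausibility threshold, not a clinical preprocessing standard:

```python
import statistics

# Toy records; field names and values are illustrative assumptions.
raw_records = [
    {"heart_rate": 72, "sbp": 118},
    {"heart_rate": 500, "sbp": 120},   # physiologically implausible outlier
    {"heart_rate": 80, "sbp": None},   # missing value
    {"heart_rate": 64, "sbp": 110},
]

def clean(records, hr_limit=250):
    # Step 1: drop records with implausible outlier values.
    kept = [r for r in records if r["heart_rate"] <= hr_limit]
    # Step 2: impute missing systolic blood pressure with the observed median.
    observed = [r["sbp"] for r in kept if r["sbp"] is not None]
    median_sbp = statistics.median(observed)
    return [
        {"heart_rate": r["heart_rate"],
         "sbp": r["sbp"] if r["sbp"] is not None else median_sbp}
        for r in kept
    ]

print(clean(raw_records))
```

Real pipelines add a feature-selection step on top of this, keeping only the variables that carry predictive signal for the question under study.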
Expert interdisciplinary AI care teams also must be created, she said, involving data scientists and engineers, clinicians, information technology support, end-user clinical providers, ethicists, and project managers.
When developing projects, it’s essential to pick the right questions for study, Lonsdale said. These should be questions of high clinical value, in areas of medicine or operations where providers agree on appropriate management, where the necessary data exist for study, and where AI promises to reduce cognitive burden and provide useful insight for providers.
A useful starting point for deploying AI projects is to pilot solutions with a small group of motivated early adopters to iron out kinks, Lonsdale said. The savviest projects embed an evaluation strategy to determine whether new tools are effective.
Trust and regulations present additional challenges, she noted. “As researchers, it’s our responsibility to provide as much information as possible about AI so that clinical anesthesiologists have the tools they need to be able to interpret whether these new programs that we’re offering are useful, and can trust them to use in their practice.” AI developers must also factor in ethical concerns, such as ensuring diversity in data sets.
Finally, she said, anesthesiologists and other clinicians must be able to maintain autonomy when it comes to AI programs and not feel compelled to follow program outputs, just as a driver may override a GPS directive rather than drive down a perilous road covered in mud or branches. “AI is simply a tool like all other tools that support decision making,” she said. “It’s not intended to replace clinical skills and judgment.”
The near future is likely one of augmented, not artificial, intelligence, she said, to help clinicians in patient care efforts.
Traditionally, improving patient care through technology has focused on new devices, drugs or therapies, commented session moderator Jonathan Wanderer, MD, MPhil, the president of the STA. But the past decade has seen the rise of clinical information systems and the importance of applied clinical informatics.
“We’re entering into a new era powered by the availability of cheap computing, as well as advances in machine learning and AI that will power the next set of innovations that we’re going to see in our perioperative space,” said Wanderer, a professor of anesthesiology and biomedical informatics and the medical director of perioperative informatics at Vanderbilt University. “There’s a lot of potential to have some transformation of the way we provide care. Things that previously had been too difficult, such as automated processing of waveform, imaging and video data, are now very much feasible. It’s a very exciting time to be doing anesthesia because it really feels like the future is now.”
—By Karen Blum
Lonsdale, Wanderer and Zapf reported no relevant financial disclosures.