LAS VEGAS—After years of hype, artificial intelligence and machine learning have arrived at a point where anesthesiologists may soon see them increasingly included in clinical workflows if some remaining hurdles can be worked out, speakers said during a presentation at the 2023 annual meeting of the Society for Technology in Anesthesia (STA).

“AI and machine learning are going to come to our clinical practice in the next five to 10 years in the same way that we have seen technology, such as Google and GPS, take hold in our personal lives,” said Hannah Lonsdale, MBChB, FRCA, an assistant professor of pediatric anesthesiology at Vanderbilt University Medical Center, in Nashville, Tenn.

However, she cautioned, “the onus is on clinicians to make sure that the clinical AI tools they’re presented with are appropriate for their own practice and that they are going to add value to their patient care.”

Medicine has traditionally been slower than other industries in adopting new technologies, Lonsdale said. As such, most AI projects published in medical journals have focused on the development and validation of machine learning models. Only a minority of published models have been integrated into clinical workflows, and even fewer have demonstrated improvements in patient outcomes.


To date, few AI programs in anesthesiology have been FDA approved; the field is still largely in a developmental stage, Lonsdale said. However, one potentially useful application is machine vision (the use of imaging to make predictions) to guide intubation during video laryngoscopy. Another is AI-guided regional anesthesia software that overlays labels on an ultrasound image to identify anatomical structures.

“There’s a lot of interest in prediction of patient outcomes, such as which patients are most likely to have adverse events or need a blood transfusion while under anesthesia,” she said. “So if we know in advance that this patient is at an increased risk of having an event, we can put extra planning in place to accommodate that, or tailor the ordering of blood products more specifically by using AI to look at influential patient factors.”

Several hurdles remain for wider adoption of AI technologies, Lonsdale said. One is data. Data sets from patients need to be optimized for future use with AI and labeled in a manner that allows AI programs to retrieve them appropriately for study. There are no universally agreed-upon data collection standards to support interoperability among different electronic health record (EHR) systems and other data sources, although more EHRs today are embracing an interface called FHIR (Fast Healthcare Interoperability Resources) as a standard for exchanging health care data. Data also need to be cleaned, or preprocessed, Lonsdale said, to remove outliers, manage any missing data and select appropriate features for study.
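The cleaning steps Lonsdale describes can be sketched in a few lines of code. This is a minimal illustration only, not a tool from the talk; the feature name, the physiologic range and the sample values are hypothetical. It imputes missing values with the median and drops out-of-range outliers for a single numeric feature.

```python
import statistics

def clean_feature(records, feature, lo, hi):
    """Clean one numeric feature from raw patient records:
    impute missing values with the median of plausible observed
    values, and drop readings outside the physiologic range [lo, hi]."""
    observed = [r[feature] for r in records
                if r.get(feature) is not None and lo <= r[feature] <= hi]
    median = statistics.median(observed)
    cleaned = []
    for r in records:
        v = r.get(feature)
        if v is None:
            cleaned.append(median)   # impute missing value
        elif lo <= v <= hi:
            cleaned.append(v)        # keep plausible reading
        # else: out-of-range outlier (e.g., artifact), dropped
    return cleaned

# Hypothetical heart-rate readings; 400 is an artifact, None is missing.
raw = [{"hr": 72}, {"hr": None}, {"hr": 80}, {"hr": 400}, {"hr": 68}]
clean = clean_feature(raw, "hr", lo=20, hi=250)  # [72, 72, 80, 68]
```

Real pipelines would handle many features at once and keep records aligned across features, but the same three decisions (what counts as an outlier, how to fill gaps, which features to keep) recur at any scale.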

Expert interdisciplinary AI care teams also must be created, she said, involving data scientists and engineers, clinicians, information technology support, end-user clinical providers, ethicists, and project managers.

When developing projects, it’s essential to pick the right questions for study, Lonsdale said. Good candidates are areas of medicine or operations with high clinical value, where providers agree on the appropriate management, where the necessary data exist for study, and where AI promises to reduce cognitive burden and provide useful insight for providers.


A useful starting point for deployment of AI projects could involve piloting solutions with a small group of early adopters motivated to use the programs first, to iron out kinks, Lonsdale said. The most savvy projects embed some type of evaluation strategy to determine whether new tools are effective.

Successful Integration Requires Careful Planning

Moving AI and machine learning algorithms into clinical practice is challenging and involves careful planning, said Matthew Zapf, MD, an assistant professor of anesthesiology and the director of the Center for Evidence-Based Anesthesia at Vanderbilt University, in a companion presentation at the STA meeting.

“It’s really important for these efforts to move from what we call in silico, or in the computer, to actually helping patients in the real world,” he told Anesthesiology News. “It takes a lot of careful thought in order to do so, but there’s an enormous potential for helping patients have better outcomes in the future if we learn from each other, and are thoughtful about our choices and how we want to implement programs.”

When designing algorithms, it’s useful to start by thinking about the end product and where you would like it to fit into your health record systems or clinical workflow, then build your model around that, he said.

In a typical road map, teams would define the clinical problem they want to solve, choose a modeling strategy to try, and establish a process to assess the model, he said. In defining the problem, consider how an algorithm would change management, whether you already collect the data necessary for study, and whether you have enough data to set up a model. Next, think about how your data are being collected and stored. “You want to make sure that the data used to train machine learning models is going to be available when you ask the algorithm to make a prediction,” he said.

Another consideration is what information will be fed into the system, Zapf said: “If we’re looking at the time period before surgery, we want to make sure that we’re capturing information only available before surgery, so that we can feed it into the algorithm and get similar results. It sounds like a simple thing, but it ends up being a little tricky based upon all the different ways data is stored.” Also determine what end users of the program will see. What would improve clinical care the most? Seeing a prediction from the program, or changing something in the EHR, such as surfacing treatment guidelines for a high-risk condition or a nudge toward obtaining specific lab values?
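The train-time/prediction-time match Zapf describes often comes down to filtering by timestamp: a model predicting a perioperative outcome should only ever see data recorded before the surgery starts. The sketch below is a hypothetical illustration of that idea; the event names, values and times are invented for the example.

```python
from datetime import datetime

def presurgical_features(events, surgery_start):
    """Keep only measurements timestamped before surgery start, so the
    model sees the same information at training and prediction time."""
    return {name: value for name, value, ts in events if ts < surgery_start}

# Hypothetical patient timeline: (feature name, value, timestamp)
events = [
    ("hemoglobin", 13.2, datetime(2023, 3, 1, 7, 30)),  # preop lab
    ("asa_status", 2,    datetime(2023, 3, 1, 7, 45)),  # preop assessment
    ("ebl_ml",     350,  datetime(2023, 3, 1, 11, 10)), # intraop: must be excluded
]

feats = presurgical_features(events, surgery_start=datetime(2023, 3, 1, 8, 0))
# Only hemoglobin and ASA status survive; estimated blood loss, recorded
# during the case, is exactly the kind of leakage Zapf warns about.
```

Applying the same cutoff when building the training set and when calling the deployed model is what keeps the two consistent; the tricky part in practice, as Zapf notes, is that timestamps live in different places depending on how each data source stores them.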

Trust and regulations present additional challenges, Lonsdale noted. “As researchers, it’s our responsibility to provide as much information as possible about AI so that clinical anesthesiologists have the tools they need to be able to interpret whether these new programs that we’re offering are useful, and can trust them to use in their practice.” AI developers must also factor in ethical concerns, such as ensuring diversity in data sets.

Finally, she said, anesthesiologists and other clinicians must be able to maintain autonomy when it comes to AI programs and not feel compelled to follow program outputs, just as a driver may opt against a GPS directive to drive down a road covered in mud or branches that could be perilous. “AI is simply a tool like all other tools that support decision making,” she said. “It’s not intended to replace clinical skills and judgment.”

The near future is likely one of augmented, not artificial, intelligence, she said, to help clinicians in patient care efforts.

Traditionally, improving patient care through technology has focused on new devices, drugs or therapies, commented session moderator Jonathan Wanderer, MD, MPhil, the president of the STA. But the past decade has seen the rise of clinical information systems and the importance of applied clinical informatics.

“We’re entering into a new era powered by the availability of cheap computing, as well as advances in machine learning and AI that will power the next set of innovations that we’re going to see in our perioperative space,” said Wanderer, a professor of anesthesiology and biomedical informatics and the medical director of perioperative informatics at Vanderbilt University. “There’s a lot of potential to have some transformation of the way we provide care. Things that previously had been too difficult, such as automated processing of waveform, imaging and video data, are now very much feasible. It’s a very exciting time to be doing anesthesia because it really feels like the future is now.”

—By Karen Blum


Lonsdale, Wanderer and Zapf reported no relevant financial disclosures.