AI (Artificial Intelligence) is often thought of in relation to entertainment, robots, and Hollywood blockbusters circa 1991. However, fields such as healthcare, retail, sports, linguistics, and others have benefited significantly from the introduction of AI, and AI software development is now happening in more and more industries and in companies of all sizes. Before we dive into how it is applied in healthcare and, more specifically, in anesthesiology, let’s define artificial intelligence.
What is AI?
Artificial intelligence essentially refers to algorithms that give machines “the ability to reason and perform functions such as problem-solving, object and word recognition, inference of world states, and decision-making”.
The existence of artificial intelligence and the ongoing rapid developments we are experiencing in this field are the consequence of what has been called a “big bang” of three factors:
- The availability of large datasets
- Improvements in hardware that can now perform large processing tasks in parallel
- A new wave of artificial intelligence architectures and algorithms
The result of this is artificial intelligence that can be used in multiple industries for multiple purposes.
How can AI be used in healthcare?
Healthcare employs AI in different medical fields, from diagnostics to therapy. AI often assists in situations where health professionals disagree on a diagnosis, typically in what is essentially image recognition: identifying pulmonary tuberculosis on chest radiographs or diagnosing pneumonia. For the latter, you can see Elinext’s case study: our developers recently built a tool that analyzes lung X-ray images and identifies signs of pneumonia using machine learning techniques.
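To give a sense of what such a tool looks like under the hood, here is a minimal sketch of a pneumonia image classifier in Python using Keras. The folder layout, model size, and training settings are purely illustrative assumptions, not a description of Elinext’s actual implementation.

```python
# Minimal sketch of a binary pneumonia classifier for chest X-rays (illustrative only).
import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed (hypothetical) directory layout: xray_data/{normal,pneumonia}/*.png
train_ds = tf.keras.utils.image_dataset_from_directory(
    "xray_data", image_size=(224, 224), batch_size=32, label_mode="binary")

model = models.Sequential([
    layers.Rescaling(1.0 / 255, input_shape=(224, 224, 3)),  # normalize pixel values
    layers.Conv2D(16, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(64, activation="relu"),
    layers.Dense(1, activation="sigmoid"),  # probability that the image shows pneumonia
])

model.compile(optimizer="adam", loss="binary_crossentropy", metrics=["accuracy"])
model.fit(train_ds, epochs=10)
```

A real system would of course need a large, clinician-labeled dataset, careful validation, and regulatory clearance before being used for diagnosis.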
In 2018, the U.S. Food and Drug Administration approved the first software system that uses artificial intelligence to diagnose diabetic retinopathy, and a number of studies have shown that AI is better than dermatologists at classifying suspicious skin lesions. For almost a decade now, AI has been used widely in mental health interventions and pain management.
AI is also a significant player when it comes to researching and evaluating new medications. Relatively recently, AI was used to screen existing medications that could be helpful in treating Ebola. Without AI approaches, screening the overwhelming number of existing medications would have taken years.
These examples barely scratch the surface of the vast number of AI applications in healthcare, but let’s move on and talk about the applications of AI in anesthesiology. Anesthesiology is the branch of medicine that focuses on the relief of pain before, during, and after a surgical procedure, and there is a lot artificial intelligence can do in this realm.
How can AI be used in anesthesiology?
AI can impact the practice of anesthesiology in perioperative support, critical care, and outpatient pain management; basically, it can be applied at every step of the process. A meta-analysis from February 2020 identified six global applications of artificial intelligence in anesthesiology. They are:
- Depth of anesthesia monitoring
- Control of anesthesia delivery
- Event and risk prediction
- Ultrasound guidance
- Pain management
- Operating room logistics
Let’s talk in more detail about all six of these applications:
Depth of anesthesia monitoring
Anesthetic depth is the degree to which an anesthetic agent depresses the central nervous system. The depth of anesthesia falls into one of four states (awake, light, general, or deep anesthesia) and depends on the anesthetic agent itself and its concentration during administration. Clinicians monitor the depth of anesthesia to prevent anesthesia awareness, which occurs when a patient becomes aware of events during surgery. For a patient, this can cause pain, breathing difficulties, post-traumatic stress disorder, and other serious issues. Anesthesia awareness is an under-treated problem because it is difficult to notice and recognize amid all the drugs usually involved in the procedure.
AI is used to improve the monitoring of anesthetic depth during surgery. Normally, the depth of sedation is estimated using the bispectral index (BIS) or by measuring cerebral electrical activity with an electroencephalogram (EEG). This is a natural fit for AI, as machine learning methods are well suited to analyzing complex data streams such as EEG signals. In multiple studies, machine learning models that analyzed features extracted directly from EEG signals were more accurate than the BIS index. Other studies showed that AI analysis of EEG features outperformed the response entropy index in classifying awake versus anesthetized patients: the algorithm reached 92.91% accuracy, compared with 77.5% for the response entropy index.
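As an illustration of this kind of classification task, here is a minimal sketch in Python using scikit-learn. The EEG features and labels are randomly generated placeholders, and the model choice is an assumption; it simply shows the shape of an “awake versus anesthetized from EEG features” pipeline, not the models used in the studies above.

```python
# Illustrative sketch: classifying awake vs. anesthetized states from EEG-derived features.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Hypothetical data: each row is one EEG epoch described by spectral band powers
# (delta, theta, alpha, beta, gamma) plus an entropy measure; labels: 0 = awake, 1 = anesthetized.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 6))      # placeholder for real EEG features
y = rng.integers(0, 2, size=500)   # placeholder for clinician-assigned labels

clf = RandomForestClassifier(n_estimators=200, random_state=0)
scores = cross_val_score(clf, X, y, cv=5)   # cross-validated accuracy
print(f"Mean accuracy: {scores.mean():.3f}")
```

With real EEG data, the feature extraction step (band powers, entropy measures) usually matters at least as much as the choice of classifier.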
Control of anesthesia delivery
In the early days of medical anesthesia, delivery was controlled using various clinical signs and measurements, such as blood pressure. Then the BIS index became more widespread, and researchers turned to machine learning to achieve anesthetic control with BIS as the target measure. Similar control systems are also used to automate the delivery of neuromuscular blockade and to forecast drug pharmacokinetics (the time course of drug absorption, distribution, metabolism, and excretion) to further improve the control of infusions of paralytics.
Staying with the control of anesthesia delivery, some studies have also described the use of AI to control mechanical ventilation and to automate weaning from it.
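To make the closed-loop idea concrete, here is a deliberately simplified sketch in Python that adjusts an infusion rate to hold a BIS target. It uses a plain proportional-integral controller and a toy patient model rather than the machine-learning controllers described in the studies, and all gains and numbers are illustrative assumptions only.

```python
# Simplified sketch of closed-loop anesthetic delivery with BIS as the target measure.
# Plain proportional-integral control and a toy patient model, for illustration only;
# any real system would require clinical validation and clinician oversight.

TARGET_BIS = 50.0      # a commonly cited target range for general anesthesia is roughly 40-60
KP, KI = 0.5, 0.05     # illustrative controller gains, not clinically validated

bis = 95.0             # awake baseline
integral_error = 0.0
infusion_rate = 0.0

for minute in range(30):
    error = bis - TARGET_BIS
    integral_error += error
    # Controller: increase the infusion when BIS is above target, never let it go negative.
    infusion_rate = max(0.0, KP * error + KI * integral_error)
    # Toy patient response: BIS relaxes toward an equilibrium set by the infusion rate.
    bis += 0.2 * ((95.0 - 4.0 * infusion_rate) - bis)
    print(f"t={minute:2d} min  BIS={bis:5.1f}  infusion={infusion_rate:5.2f}")
```

The published approaches replace or augment this kind of fixed-gain loop with learned models of how an individual patient responds to the drug.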
Event and risk prediction
Different AI approaches are widely used to predict risks and events during perioperative care. For example, neural networks have been used to predict the hypnotic effect of an induction bolus dose of propofol, the return of consciousness after general anesthesia, the rate of recovery from neuromuscular block, and hypotensive episodes after induction or during spinal anesthesia. Other machine learning approaches have been tested to automatically classify pre-operative patient acuity, identify difficult laryngoscopy findings, detect slow and difficult breathing during conscious sedation, and assist in decision-making about the optimal method of anesthesia during circumcision. A 2018 study created a model that could predict hypotension up to 15 minutes before it occurs. Various other studies used machine learning models to predict morbidity, clinical deterioration, mortality, or readmission, and even to detect sepsis.
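As a rough illustration of event prediction, here is a small Python sketch that trains a classifier to flag an upcoming hypotensive episode from recent vital-sign features. The data is synthetic and the features are assumptions; the 2018 study mentioned above worked with arterial-waveform data, which is not reproduced here.

```python
# Illustrative sketch: predicting an upcoming hypotensive episode from recent vital signs.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(42)
# Placeholder features per patient-minute: e.g., mean arterial pressure, heart rate,
# pulse pressure, and their short-term trends (all synthetic here).
X = rng.normal(size=(1000, 6))
y = rng.integers(0, 2, size=1000)  # 1 = hypotension within the next 15 minutes (synthetic label)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))
```

The key design choice in this kind of model is the prediction horizon: the label encodes whether the event happens within a fixed window, which is what lets the system warn clinicians minutes in advance.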
Ultrasound guidance
Neural networks are the most commonly employed method for ultrasound image classification, and studies find that deep learning significantly improves the accuracy of interpreting these images.
Pain management
When it comes to pain management, AI is used in many different ways. Firstly, machine learning analysis of whole-brain scans can identify pain more accurately than analysis of individual brain regions. Secondly, AI can help estimate correct opioid dosing and identify patients who may benefit from a preoperative consultation with a hospital’s acute pain service. In an attempt to identify more objective biomarkers of pain, researchers have also used machine learning to analyze EEG signals and predict which patients would respond to opioid therapy for acute pain; however, the results were not overly promising, as the method showed only 65% accuracy.
Operating room logistics
Difficult terms, methods, and processes aside, sometimes you just need better logistics to get better results. AI analyzes different factors, such as the scheduling of operating room time or the movements and actions of anesthesiologists, to optimize operating room logistics. In one study, AI approaches were used to optimize bed use for patients undergoing ophthalmologic surgery. Another study analyzed radio-frequency identification tags to determine the location, orientation, and stance of anesthesiologists in the operating room. The researchers used mannequins, but in the future similar tracking applications with real patients could be used to better understand how anesthesiologists’ interaction with the various equipment in the room impacts patient safety.
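For the scheduling side of things, here is a toy Python sketch that assigns surgical cases, each with a predicted duration, to operating rooms so that no room exceeds its block time. The greedy heuristic, case names, and durations are illustrative assumptions; real schedulers also weigh surgeon availability, equipment, and turnover times.

```python
# Toy operating-room scheduling sketch: fit cases into rooms without exceeding block time.
from typing import Dict, List, Tuple

def assign_cases(cases: List[Tuple[str, int]], rooms: int, block_minutes: int) -> Dict[int, List[str]]:
    """Greedy longest-first assignment of cases to the least-loaded room that still fits."""
    schedule = {room: [] for room in range(rooms)}
    load = {room: 0 for room in range(rooms)}
    for name, duration in sorted(cases, key=lambda c: c[1], reverse=True):
        room = min(load, key=load.get)  # least-loaded room has the best chance of fitting
        if load[room] + duration <= block_minutes:
            schedule[room].append(name)
            load[room] += duration
        else:
            print(f"Could not fit case '{name}' ({duration} min); it needs another block")
    return schedule

# Hypothetical cases with ML-predicted durations in minutes.
cases = [("hernia repair", 90), ("cataract", 45), ("knee arthroscopy", 120), ("appendectomy", 75)]
print(assign_cases(cases, rooms=2, block_minutes=240))
```

In practice, the machine learning part of such a system lies mostly in predicting case durations accurately, and the scheduler built on top of those predictions can be far more sophisticated than this greedy pass.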
What are the limitations of the use of AI for anesthesiology?
Researchers warn that the hype and the fascination that surrounds AI may result in unrealistic expectations and eventually disenchant clinicians and patients. They remind us that the possibilities of AI are limited and won’t always result in classifications or predictions that are superior to traditional methods. AI, just like any other tool, should be used in the right situation to answer a specific problem.
Another common criticism is that AI methods can often lack transparency, resulting in “black box” results: an algorithm can make a prediction but cannot explain why such a prediction was made. Clinicians and researchers expect more transparency from this technology in the future.
Another issue is that while AI methods easily demonstrate correlations and identify patterns, they cannot yet determine causal relationships. The good old “correlation does not mean causation” rule that every scientist keeps in mind 24/7 applies to artificial intelligence as well.
Finally, just like humans, artificial intelligence algorithms are prone to biases. The healthcare system is full of implicit and explicit biases that can find their way into large-scale big data implementations, and all of them can affect the predictions that AI makes based on this data.
Conclusion
Many AI applications for anesthesiology are now being tested in clinical settings. We have yet to learn more about how well they work and what limitations and benefits they have; just like the AI technologies themselves, we will learn more as more data becomes available. Even more AI technologies are still in development, and even more big data implementations are underway. At the moment, many AI software solutions show significant improvements over existing methods, so we look forward to future results with great hope and excitement.