Many are familiar with Arnold Schwarzenegger’s Terminator or C-3PO from the Star Wars franchise. Although each is a fictional movie character, both illustrate how advanced computer technologies might one day become. Both were able to communicate, make decisions, and expertly interpret data obtained from their surroundings. Science “fiction” is becoming “fact” much more quickly than any of us could have imagined when these movies were first released in the 1980s. Now, buzzwords such as artificial intelligence (AI), machine learning, and deep learning are being used by nearly every industry, including medicine and electrophysiology.
In fact, the development of AI technologies is now dominating medical innovation. For example, the Butterfly iQ, a smartphone-enabled ultrasound device, was used by a physician to detect his own cancer.1 In Denmark, an AI algorithm is in use to help emergency dispatchers identify cardiac arrest more efficiently.2 AI possibilities are limitless, and new applications for advanced computing are being developed every day.
Artificial Intelligence, Machine Learning, and Deep Learning Defined
The best way to explain these three terms is to imagine a set of Matryoshka dolls, the Russian nesting dolls that begin with a small doll inside progressively larger dolls. At the center of the stack is deep learning, then machine learning, and finally, AI. While all deep learning and machine learning are forms of AI, not all AI involves machine learning or deep learning. In simple terms, AI is the simulation of human intelligence by computer systems — a concept that has been around since the 1950s. With AI, a machine is able to perform tasks that normally require human intelligence, such as visual perception, speech recognition, and decision making, just to name a few. Machine learning relies on the concept of using algorithms to analyze data, learn from it, and then make determinations, decisions, and predictions. In essence, machine learning is the ability of computers to “learn” through complex pattern recognition without being specifically programmed to do so. The capability of machine learning to allow a computer program to modify itself when exposed to new data is what sets it apart from other forms of AI. Machine learning works to do one of two things (or both): 1) minimize errors, or 2) maximize the likelihood that its prediction will be true.
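To make the idea of “learning without being specifically programmed” concrete, here is a minimal sketch (illustrative only, not from the article) of machine learning as error minimization: a toy linear model fits a slope to data by repeatedly adjusting itself to shrink its prediction error. The function name and data are hypothetical.

```python
# Minimal sketch: "learning" as error minimization.
# A toy linear model y = w * x is fit by gradient descent,
# repeatedly nudging w to reduce the mean squared error.

def fit_slope(xs, ys, lr=0.01, steps=1000):
    w = 0.0
    n = len(xs)
    for _ in range(steps):
        # Gradient of the mean squared error with respect to w
        grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / n
        w -= lr * grad  # step downhill to reduce the error
    return w

# Data generated by the rule y = 3x; the program recovers w close to 3
# without ever being told that rule explicitly.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [3.0, 6.0, 9.0, 12.0]
w = fit_slope(xs, ys)
```

The program is never given the rule “multiply by 3”; it discovers it by minimizing error against examples — the essence of the first of the two goals named above.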
Deep learning is a subset of machine learning, and is the smallest doll in the Matryoshka stack. Deep learning relies on two concepts — deep artificial neural networks and deep reinforcement learning. Neural networks are a set of algorithms designed to recognize patterns in an effort to cluster and classify data. Ultimately, deep learning maps inputs to outputs and finds correlations. Reinforcement learning centers around more goal-oriented algorithms. These algorithms are able to learn how to attain a complex objective (goal) or maximize a cumulative reward over many steps — such as maximizing the points won in a game over many moves. One way to think about reinforcement learning is delayed gratification; in reinforcement learning, the machine must learn to correlate immediate actions with delayed results. The machines are rewarded for correct decisions and penalized for those that are incorrect. Deep learning has been a catalyst for the recent growth of the field of AI by enabling many practical applications, particularly in healthcare and medicine.
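The reward-and-penalty idea above can be sketched with tabular Q-learning, a classic reinforcement learning algorithm (chosen here for illustration; the article does not name a specific method). An agent walks a five-cell corridor where the only positive reward sits at the far end, so it must learn to connect its early moves to a delayed payoff — the “delayed gratification” described above. All names and parameters are hypothetical.

```python
import random

# Toy reinforcement learning: tabular Q-learning on a 5-cell corridor.
# Reward is +10 only at the far end and -1 per step, so the agent must
# learn to associate immediate actions with a delayed result.

N, GOAL = 5, 4
actions = [-1, +1]  # move left or right
Q = {(s, a): 0.0 for s in range(N) for a in actions}

alpha, gamma, eps = 0.5, 0.9, 0.2  # learning rate, discount, exploration
random.seed(0)

for _ in range(500):  # training episodes
    s = 0
    while s != GOAL:
        # Mostly act greedily, but explore occasionally
        if random.random() < eps:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), N - 1)
        r = 10.0 if s2 == GOAL else -1.0  # reward at goal, penalty per step
        # Q-update: propagate the delayed reward back to earlier states
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in actions) - Q[(s, a)])
        s = s2

# After training, the greedy policy moves right from every state.
policy = {s: max(actions, key=lambda act: Q[(s, act)]) for s in range(GOAL)}
```

Note that the agent is penalized on every intermediate step; it still learns to keep moving toward the goal because the update rule carries the eventual reward backward through the value table.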
Applications In Electrophysiology and Future Directions
AI is already a part of what we do in the field of cardiac electrophysiology. As innovators, those of us in cardiovascular medicine will continue to lead the way in the development of new technologies and applications for AI through the following:
- Personalized medicine. Defined as the use of individual health data paired with predictive analytics, personalized medicine holds the promise of better and more effective therapies tailored to a single person’s biology. Traditionally, medicine has grouped together populations and suggested treatments based on the average person’s statistics — with limited success. This paradigm is changing. In 2017, the University of Tokyo reported a case in which IBM’s Watson supercomputer saved a woman’s life by correctly diagnosing her rare form of leukemia after traditional therapy had failed. In a matter of minutes, Watson reviewed more than 20 million cancer research papers, provided an accurate diagnosis, and recommended a more appropriate and individualized treatment protocol for the patient.3
- Disease identification. In electrophysiology, there are many new digital tools that utilize AI and deep learning for the diagnosis of disease. For example, a new study found that the KardiaBand (AliveCor) for Apple Watch and the Kardia algorithm successfully detected atrial fibrillation with a high degree of accuracy; additional research demonstrated that, when paired with AI technology, AliveCor's ECG device was also able to detect hyperkalemia with a high degree of accuracy.4 The use of AI in the detection of atrial fibrillation may lead to significantly improved outcomes in these patients through stroke prevention, anticoagulation management, and follow-up after ablation. AI applications also reach beyond cardiovascular disease — other studies have shown that AI can detect signs of Alzheimer’s disease on brain scans more quickly and accurately than highly trained radiologists.5
- Disease management. AI can be used for remote monitoring and updated data access to assist healthcare providers in making real-time treatment decisions. For example, in the electrophysiology space, the automation of remote device monitoring could quickly identify false-positive transmissions, which could ultimately save time, streamline workflow, and improve outcomes. The use of AI in remote monitoring helps to facilitate care and save healthcare dollars by identifying potential complications or changes in clinical status before they become clinically relevant. (Figure 1)
- Therapy development and clinical trials. Using machine learning and AI to better identify potential patients for clinical trials could result in smaller, quicker, and less expensive trials. These technologies can draw on a wider range of data sources including social media, doctor visits, and genetic information. Finally, predictive models that use machine learning and AI could potentially identify adverse events in real time, improving the safety of clinical trials.
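To give a flavor of what rhythm-classification logic involves, here is an illustrative toy check — emphatically NOT the KardiaBand/Kardia algorithm, which is proprietary and based on trained models. This hypothetical sketch simply flags high beat-to-beat variability in RR intervals (the spacing between heartbeats, in seconds), one crude marker of the irregular rhythm seen in atrial fibrillation; the function name, threshold, and sample data are all invented for illustration.

```python
import statistics

# Illustrative only: a toy rhythm-irregularity check. Real AF detectors
# use trained machine learning models on full ECG waveforms; this sketch
# just flags high beat-to-beat variability in RR intervals (seconds).

def flag_irregular_rhythm(rr_intervals, cv_threshold=0.15):
    """Return True when the coefficient of variation of the RR
    intervals exceeds a hypothetical threshold."""
    mean_rr = statistics.mean(rr_intervals)
    cv = statistics.stdev(rr_intervals) / mean_rr
    return cv > cv_threshold

regular = [0.80, 0.82, 0.81, 0.80, 0.79, 0.81]    # steady, sinus-like spacing
irregular = [0.62, 1.05, 0.71, 0.95, 0.55, 1.10]  # erratic, AF-like spacing
```

A single hand-tuned threshold like this is exactly the kind of brittle rule that machine learning replaces: a trained model learns its own decision boundary from thousands of labeled recordings instead of relying on one fixed cutoff.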
Disclosure: Dr. Campbell discloses that he is the CEO of PaceMate.
1. Molina B. Doctor says he diagnosed his own cancer with iPhone ultrasound machine. USA Today. Published October 27, 2017. Available at https://usat.ly/2gHAnmh. Accessed March 26, 2018.
2. Comstock J. AI software helps emergency dispatchers spot cardiac arrests. MobiHealthNews. Published January 15, 2018. Available at https://bit.ly/2DXAaVb. Accessed March 26, 2018.
3. Feldman M. Watson Proving Better Than Doctors at Diagnosing Cancer. Top500.org. Published August 6, 2016. Available at https://bit.ly/2kDYp2y. Accessed March 26, 2018.
4. Cleveland Clinic Study Affirms Accurate Detection of Atrial Fibrillation by KardiaBand for Apple Watch. Cision PR Newswire. Published March 12, 2018. Available at https://prn.to/2pIRh8c. Accessed March 26, 2018.
5. Mullin E. AI can spot signs of Alzheimer’s before your family does. MIT Technology Review. Published March 19, 2018. Available at https://bit.ly/2HKiUVU. Accessed March 26, 2018.