ORLANDO, Fla. – Machine learning and artificial intelligence (AI) are playing increasingly important roles in health care. A panel of experts at the Healthcare Information and Management Systems Society (HIMSS) 2019 conference in Orlando, Fla., discussed how their organizations are using these innovations to improve how patients are treated.

Susan Dentzer, visiting fellow at the Duke-Margolis Center for Health Policy, noted that "AI and machine learning are now real tools that are being used in the health care system to make health care better." For example, they are helping enroll subjects in clinical trials, making large academic centers less stressful for patients, helping institutions make sense of all the unstructured text that comes in electronic health records (EHRs), easing billing and reimbursement, and identifying waste in health care.

Despite the increased emphasis on these modalities, "AI in medicine is not a new concept," Taha Kass-Hout, senior leader, health care and AI at Amazon, reminded the audience. "It goes all the way back to 1964."

Changes in AI

Change has proven rapid. "Two years ago, if you look at The Lancet, or BMJ, or the New England Journal of Medicine, or even JAMA . . . there was no mention of deep learning," he noted.

Amazon Web Services (AWS), a subsidiary of the tech company, last November unveiled Amazon Comprehend Medical, a HIPAA-eligible machine learning service designed to process unstructured medical text and identify the necessary information within it. (See BioWorld MedTech, Dec. 4, 2018.) AWS has already run a few pilot projects with Comprehend Medical, including with the Fred Hutchinson Cancer Research Center and Roche Diagnostics. At Fred Hutchinson, the service was used to identify patients for clinical trials of specific cancer therapies by evaluating millions of clinical notes to extract and index information on medical conditions, medications and cancer treatment options.
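To make the extract-and-index idea concrete, here is a minimal sketch of how output from Comprehend Medical's documented DetectEntitiesV2 operation might be filtered by entity category. The sample response below is invented for illustration (it mimics the documented shape, with `Entities` carrying `Text`, `Category`, `Type` and `Score`); in a real pipeline the dictionary would come from an authenticated `boto3.client("comprehendmedical").detect_entities_v2(Text=note)` call.

```python
# Illustrative stand-in for a DetectEntitiesV2 response; not real
# patient data. A real response comes from the Comprehend Medical API.
sample_response = {
    "Entities": [
        {"Text": "metformin", "Category": "MEDICATION",
         "Type": "GENERIC_NAME", "Score": 0.99},
        {"Text": "stage II breast cancer", "Category": "MEDICAL_CONDITION",
         "Type": "DX_NAME", "Score": 0.97},
        {"Text": "mild rash", "Category": "MEDICAL_CONDITION",
         "Type": "DX_NAME", "Score": 0.42},
    ]
}

def extract(response, category, min_score=0.8):
    """Return high-confidence entity texts for one category."""
    return [e["Text"] for e in response["Entities"]
            if e["Category"] == category and e["Score"] >= min_score]

conditions = extract(sample_response, "MEDICAL_CONDITION")
medications = extract(sample_response, "MEDICATION")
```

Indexing these per-note extractions is what would let a trial-matching system search millions of notes for, say, a specific diagnosis plus a current medication.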

Kass-Hout, who also served as the FDA's first chief informatics officer, discussed the relationship with Fred Hutchinson during the panel, noting that there is only a 48-hour window to match a patient for a trial, and humans alone evaluate documents far more slowly. "Comprehend Medical today . . . we're going way over 11,000 documents per hour."

The center sees great hope for the offering, with Dentzer noting that it "basically believes that this will enable them to get to cures for cancer, as they're able to discretely identify the right patients for the right trials at the right time."

Reducing clinician burden

Manu Tandon, senior VP and chief information officer at Beth Israel Deaconess Medical Center, shared his organization's experience with machine learning. He noted that his organization is one of the few larger institutions that does not use a vendor EHR; instead, it uses a home-grown one. "Our EHR, when it was developed three decades back, the big concern at the time was that clinicians would refuse to use it." In developing it, the organization emphasized ease of use.

While its system has worked, he noted that because computers are stationary, it can prove difficult to get information to wherever the clinician or nurse is. "The whole notion of innovation and cloud and health care, I would say that in general, health care is behind in adopting the cloud," for a number of reasons, he said. Tandon added that a move to the cloud, if designed well, can be more secure than other approaches.

Looking broadly at physician burden, he noted that several technologies can help reduce it. For example, machine learning can help predict appointment no-shows, as well as the patient volume that will hit the hospital at any point in time. Another option is a smart alert that lets a physician know when lab results are ready, allowing a patient to leave sooner and reducing the chance of acquiring an infection.
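No-show prediction of the kind Tandon describes is typically framed as a classification problem. The sketch below is a toy illustration only: the features (appointment lead time and count of prior no-shows) and the eight training rows are invented, and a real system would be trained on historical scheduling records with many more signals.

```python
import math

# Toy training data, invented for illustration.
# Each row: ((days of lead time, prior no-shows), label) where 1 = no-show.
data = [((2, 0), 0), ((30, 3), 1), ((1, 0), 0), ((45, 2), 1),
        ((7, 1), 0), ((60, 4), 1), ((3, 0), 0), ((21, 2), 1)]

def sigmoid(z):
    # Numerically safe logistic function.
    if z >= 0:
        return 1.0 / (1.0 + math.exp(-z))
    ez = math.exp(z)
    return ez / (1.0 + ez)

# Fit a logistic-regression model with plain stochastic gradient descent.
w = [0.0, 0.0]
b = 0.0
lr = 0.05
for _ in range(2000):
    for (x1, x2), y in data:
        p = sigmoid(w[0] * x1 + w[1] * x2 + b)
        err = p - y          # gradient of the log-loss w.r.t. the logit
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def no_show_risk(lead_days, prior_no_shows):
    """Predicted probability that the appointment is a no-show."""
    return sigmoid(w[0] * lead_days + w[1] * prior_no_shows + b)
```

A scheduler could then rank upcoming appointments by `no_show_risk` and, for the riskiest ones, send extra reminders or overbook the slot.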

Precision diagnostics

Karley Yoder, director of product management at GE Healthcare, noted that her company is aiming for precision health, including precision diagnostics. GE has worked with Amazon Sagemaker, an AWS service that allows developers and data scientists to build, train and deploy machine learning models quickly, with the goal of cutting down on incorrect diagnoses.

She provided a couple of examples of how AI can help in imaging. For example, those performing ultrasounds receive less training today than they once did, so it is essential to make the machine easier to operate. Currently, operators concentrate on acquiring the image rather than on whether something in the image is wrong. AI can help by cutting scan time and automating calculations so the sonographer can focus on the findings inside the image, she explained.

Yoder added that AI can also help when X-rays are required. For example, if someone is in a car accident and a mobile chest X-ray is performed, the image goes into the radiology queue until someone can assess it. She suggested training a deep learning model to triage such images, allowing paramedics to act quickly. The same concept could help stroke victims.