PERTH, Australia – It's likely that Australia will not draft separate guidance or regulations for software applications that use artificial intelligence or machine learning (AI/ML) for drug development or medical devices.

Instead, the Therapeutic Goods Administration (TGA) will classify AI and ML under software as a medical device (SaMD) when it is intended for diagnosis, prevention, monitoring, treatment or alleviation of disease.

"The way that the medical device framework has been written facilitates capturing novel technologies; it gives freedom to manufacturers in how they can demonstrate compliance, and so there's really no need to be creating additional frameworks at this stage," said Lee Walsh, technical lead for digital health in the TGA's conformity assessment section.

"Med-tech companies can also use their connected devices to demonstrate compliance," he said, noting that data collection is one of the benefits of these technologies, and companies can feed this data into their evidence dossier with the other evidence they have collected. Another advantage is that it facilitates ongoing post-market monitoring.

Because AI/ML technologies are new and emerging, they're not always straightforward to regulate, so "there are challenges around these products for the TGA, not just for the manufacturers," Walsh said. "While software as a medical device is not new, the volume of products, how they're produced and who's producing them, as well as what they can do, is quite different to when the regulations were written."

"How do we ensure that [AI/ML] learning can improve decisions in the group datasets so that we can make sure that they're improving and not going the other way?" Walsh asked.

SaMD is a bit of a gray area in Australia, Arthur Brandwood, founder and principal consultant at Brandwood Biomedical, told BioWorld.

Australian regulations are flexible, and Brandwood doesn't see the need for new regulations, but there may be some adaptations that need to be made.

"One of the challenges with AI is that medical device regulations are predicated on the idea that you have a defined design that is discrete, characterized, known and unchanged, and that's what gets a regulatory approval.

"The problem with AI is that the thing keeps changing itself every day, so how do you deal with that? The classic way to deal with it is to have an internal control in the system. You set some known standard, such as measuring a physiological process, and then the algorithm checks itself against that known standard as a checkpoint.

"What you're effectively doing is building the validation into the software to build internal controls so it validates itself."
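The internal-control idea Brandwood describes could be sketched along the following lines: an adaptive model re-checks its own output against a fixed, known-good reference set before its results are trusted. All names, thresholds and the toy model here are illustrative assumptions, not any particular vendor's implementation.

```python
def accuracy(model, reference_cases):
    """Fraction of known reference cases the model classifies correctly."""
    correct = sum(1 for inputs, expected in reference_cases
                  if model(inputs) == expected)
    return correct / len(reference_cases)


def checked_predict(model, inputs, reference_cases, min_accuracy=0.95):
    """Run the model only if it still passes its built-in checkpoint."""
    score = accuracy(model, reference_cases)
    if score < min_accuracy:
        # The algorithm has drifted from the known standard:
        # flag it for review instead of returning a result.
        raise RuntimeError(f"model failed internal control ({score:.2f})")
    return model(inputs)


# Toy "model": labels a reading as high if it exceeds 100.
model = lambda reading: "high" if reading > 100 else "normal"

# Fixed reference cases with known correct answers (the "checkpoint").
reference = [(120, "high"), (80, "normal"), (150, "high"), (60, "normal")]

print(checked_predict(model, 130, reference))
```

In this sketch the validation lives inside the software itself: every prediction is gated on the model still agreeing with the known standard, so a self-updating algorithm that drifts gets blocked rather than silently returning worse answers.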

Most AI/ML applications are relatively low risk and are not making decisions on health care; they're making recommendations, because a clinician is looking at the output and making the decisions, Brandwood said.

He said the FDA's precertification program is basically modeled after European regulations, whereby higher-risk products receive more scrutiny and lower-risk products receive little scrutiny as long as the manufacturer is well known. In Australia, only class IIb and higher devices are checked, "and this is exactly what the TGA and European regulators already do," he said.

Low risk or high risk?

Tracey Duffy, who heads the TGA's Medical Devices and Product Quality Division, told BioWorld that the TGA was leaning toward higher regulatory scrutiny when a medical professional was not involved with the device, because patients would be making their own decisions, raising the risk of misinterpreting the data.

As technology advances there are more SaMD products that present a moderate or high risk to patients. These may include software apps that calculate medicine doses or that directly make a diagnosis.

Some examples are apps that make a diagnosis through analysis of electrocardiogram (ECG) data, or that make a diagnosis through the application of AI or ML in the analysis of skin images for detecting melanoma. Such devices should be subject to third-party oversight that is commensurate with the risk they represent to patients, she said.

One of the biggest areas of concern for SaMD is the reporting of post-market incidents and complaints, because many software issues are managed as a "reboot," and often users don't identify issues. Similarly, software issues are often misidentified as "user issues," Brandwood said.

Part of the challenge is that AI/ML sits on the borderline between several categories. Expert systems like IBM Watson, for example, may or may not be medical devices; they may instead be library systems that provide more effective search tools.

Cybersecurity discussions

The TGA sought feedback from industry to shape policy on SaMD and cybersecurity for medical devices (CSfMD). The agency said that in the SaMD arena, new players may not have had an opportunity to engage with the TGA or may not have a full understanding of Australia's regulatory requirements.

Challenges in the CSfMD space are increasing, and the complexity of the cyber threat landscape and lack of regulatory guidelines require immediate attention, the agency said.

The TGA engaged the Commonwealth Scientific and Industrial Research Organization (CSIRO) to conduct research to better understand the innovators in the SaMD space, and how the TGA can support them in demonstrating safety of their products.

CSIRO is reaching out to the emerging cluster of technology developers that are focused on producing health and medical software, including clinical decision support tools and companion apps for devices.

Bronwyn Le Grice, founder and managing director of ANDHealth, Australia's first digital health accelerator, told BioWorld that the TGA is engaging proactively with industry, "so there's some great thought leadership happening in Australia with CSIRO around the ethical use of AI."

"From an industry level we want to make sure the technologies provide appropriate real-world evidence of efficacy underpinning health care. We're seeing regulators trying to ensure that these products and software platforms making claims have the evidence to support those claims, and this really is the critical thing about this space.

"Regulatory AI is one thing, but patient data is another," Le Grice said. "In Australia we can't own our data. We need to fully embrace the concept of an empowered consumer. If we want to use patient data in this way, we need to bring health care consumers along on the journey with us. We can't just say that patients give us this data and we can do what we want with it."

"We've already seen case studies where regardless of what the legal terms of the organization said, the consumer expectation was that clearly their data would not be sold for commercial gain. We need to be cognizant of operating on the social license that consumers give us, and I don't think organizations have demonstrated to date that they are sensitive to that social license."

The secondary use of data framework associated with Australia's "my health record" is one of the first such frameworks in the world, she said.

Health Minister Greg Hunt said the framework would inform how records data can be used for public health policy, planning and research purposes from 2020. Individuals will be able to opt out of sharing their secondary information, but they need to actively do so or their de-identified information could be shared.

Duffy stressed that the TGA views cybersecurity as a shared responsibility among all stakeholders.

The cybersecurity consultation will be an important part of product review, because there are "lots of outdated operating systems," she said.

"As a small regulator in a global market, the key opportunity and challenge is how we keep up with the changes upon us now, as well as into the future," Duffy said. She suggested a shift in focus was needed from gatekeeper to enabler.

"Medical device regulations tend to lag behind when it comes to introducing novel technologies such as SaMD and AI," said Medical Technology Association of Australia CEO Ian Burgess.

"Typically, developers of novel technologies would be concerned by over-regulation or uncertainty. However, our experience in recent years with the TGA consultation processes gives us confidence that the TGA is ready to listen to feedback from diverse stakeholders including industry."
