Medical Device Daily Washington Editor
BETHESDA, Maryland – Most of the chatter about computers in the metro DC area addresses their role in provider settings, but virtual reality, no longer just for teenagers, looks to be joining the conversation for med-tech manufacturers. The use of computer modeling in device development is of interest because it potentially offers lower development costs and shorter turnaround times.
This was the focus of a symposium held earlier this week, the Workshop on Computer Methods for Cardiovascular Devices, jointly sponsored by FDA, the National Science Foundation and the National Heart, Lung and Blood Institute, a unit of the National Institutes of Health.
A subtext of the meeting was that the effort to simulate a device with a computer will require the cooperation of academic researchers, industry, FDA and the software industry, but participants also came away with an appreciation for the gap between the current base of data and the data needed to forge successful computer simulations.
Wednesday’s meeting included breakout sessions by device category.
Reporting from the stent group was Craig Bonsignore of Proximal Design (Corte Margarita, California), a device development consultancy.
He told attendees, “the take-away is that computational methods are for everything,” including manufacturing optimization and sensitivity analyses.
Regarding preclinical evaluation of bare-metal and drug-eluting stents, Bonsignore said a fair amount is known about modeling, but that “some exciting areas to explore include modeling the biology” of features such as restenosis and thrombosis, and that “there’s a huge role for using computational simulations to look at delivery and deployment” of stents.
Regarding changes to device design, “analysis and design need to be one and the same,” Bonsignore said, adding that simulation could help medical science “move away from this one-device-fits-all model” and develop devices for specific sub-populations.
Bonsignore said simulation of patient/device interaction is ripe for development, but he saw “an axis of time” and “an axis of patient variability” that the science has to elucidate before such an effort will bear fruit.
On the question of simulation certification, Bonsignore said, “the notion [is] of certifying not necessarily the code itself but the analyst” who wrote the code.
Presenting for the rhythm management device group, Mike Schendel, PhD, with Medtronic (Minneapolis), said “we decided to focus on the weakest link in the system, which tends to be the leads” of electrophysiology equipment, a nod perhaps to the problems with the company’s Sprint Fidelis leads.
“There are potential reliability issues ... so it’s not something we can ignore,” Schendel said.
Schendel said the group felt that computer models would be “mostly about predictive modeling to address potential reliability issues” and that “one of the eye-opening things” unearthed by the discussion was that many in the group from academe were “more focused on developing new therapies” while those from industry were “more interested in ... reliability issues.
“Maybe the people needed are those in materials sciences,” Schendel said.
Schendel said of computer models for pre-clinical evaluation that “we’re using them already,” but lamented that “what we’re stuck with in industry is comparative data” on the reliability of existing leads.
“One thing we’re asking for is academic support” toward a more basic science of applied mechanics for “boundary conditions on our leads,” he said, adding that the materials of which leads are made “must be characterized in an as-used state” rather than in an idealized state.
Ken Cavanaugh Jr., PhD, an electrical engineer at FDA, discussed the stent group’s views on scientific priorities for academic research. “We felt that the key point is to identify the appropriate input parameters,” he said.
“As far as funding ... a lot of people point to NIST as a model,” Cavanaugh said of the National Institute of Standards and Technology, but “a lot of people think device companies should be responsible for the data collection,” with FDA acting as a facilitator. “There was a lack of consensus on the role of industry” on basic research questions such as tissue data, he said.
As for the question of bringing research from the lab to the clinic, Cavanaugh said the group realized that intellectual property issues will impose some drag. “Being willing to share data” is essential, he said, noting that both academe and industry see this as a big problem. In such a stalemate, “academia may not always know ... what industry and government are looking for.”
Cavanaugh declined to offer specifics on what role the agency is interested in playing in development of computational models.
“Some ways that our role could be helpful is to just talk to us early on” in the development of a computational model, he said, advice that echoes comments made by dozens of other FDAers in similar settings.
Cavanaugh said that developing proficiency in modeling will require “professional training [and] involve professional societies and software vendors,” but he added to the list “development and dissemination of best practices so everyone knows what is the best way to develop an analysis.”
For ventricular assist devices (VADs), Bob Benkowski, CEO of MicroMed (Houston, Texas), said “the priorities of academic institutions” include “integrating biological models ... into simulations.” He said academe is “more suited and qualified” than industry or government to conduct the multi-disciplinary research needed for this pursuit.
Benkowski joked that the VAD group concluded that “government should fund everything” in connection with this effort, but pointed out that some funding is provided in the pediatric provisions of last year’s FDA Authorization Act.
He also noted that “the [computer] laptop industry has funded” a lot of battery technology, and said the software industry should take responsibility for the user interface and hardware compatibility.
Benkowski seconded Cavanaugh’s comment about ownership issues, describing academic research centers as “very savvy about such things.”
However, “it’s not just the algorithms that are important, it’s also the underlying data sets,” he said.
As to who should validate the use of the software in a particular simulation, Benkowski said the VAD group viewed validation as the responsibility “ultimately [of] the end user,” not the software publisher.