Edward Teller's quote that "the science of today is the technology of tomorrow" received empirical support from work published in the March 31, 2017, issue of Science.
Researchers from Harvard Business School and MIT showed that nearly 10 percent of NIH-funded grants end up forming the basis for a private sector patent within 25 years, and roughly 30 percent produced papers that were cited as prior art in patent applications, making them indirectly useful.
Furthermore, so-called basic research was nearly as likely as applied research to end up forming the basis of a patent: 35 percent of "disease-oriented" and 30 percent of "non-disease-oriented" grants were cited in patent applications.
Co-author Pierre Azoulay's dry summary was that "since the second world war, there has been an argument [about] what is most useful, applied or basic research. People have strong opinions, and typically, those views tend to coincide neatly with the kind of work that they actually do," he told BioWorld Today. "One implication of our work is that people should find something else to argue about.
"Trying to prejudge relevance is really, really hard," he added. "And so we should stop pretending that we can do that."
Another implication, he said, is "that things really do take a very long time to become useful. If you think it will take a long time, it takes even longer than that."
In their study, Azoulay and his co-authors, Bhaven Sampat and Danielle Li, examined more than 360,000 NIH grants that were awarded between 1980 – the year the Bayh-Dole Act made it possible for universities and hospitals to apply for patents on publicly funded research – and 2007. Nearly 31,000 of those grants led directly to university or private sector patents, and more than 110,000 grants were cited in more than 81,000 private sector patents.
The NIH is the world's biggest source of public funding for biomedical research, currently giving out grants to the tune of $32 billion annually. The Trump administration has proposed slashing that budget by nearly 20 percent, though with even Republicans calling the proposal an "embarrassment," it seems unlikely that those cuts will make it through Congress. (See BioWorld Today, March 30, 2017.)
Azoulay, who is a professor at MIT's Sloan School of Management, said that although it is tempting to interpret the paper as a timely rebuttal to President Trump's proposed funding cut, "our [data] doesn't speak to whether those funding cuts are a good or a bad idea. They don't speak to what the rate of return is on NIH-funded research."
"NIH-funded research is built upon by private companies; they find it useful," he said. "It is not an ivory tower."
But one possibility is that if academic labs were not doing that research, industry might step up and do – and, as has been noted, pay for – that research itself.
More formally, there are two possibilities. Government funding of research can cause either crowd-out, encouraging industry to stand back and let someone else do the heavy lifting of early-stage research, or crowd-in, encouraging industry to pursue avenues it might not have otherwise considered.
"Very often researchers take grave umbrage that what they do could be replicated in industry," Azoulay said. "To me it's an empirical question."
In other work, Azoulay and his colleagues have looked at that empirical question, and at least those conclusions should not cause further umbrage in the research community.

There are examples of both processes occurring, but overall, crowd-in dominates crowd-out. Allocating more public money to an area of research will generate more downstream R&D by private sector firms, not less, as would be the case if crowd-out dominated.

"Those [NIH] funding cuts are probably a terrible idea," he said.
Perhaps the most salient example of crowd-in is the study of CRISPR/Cas9, which has gone from an obscure subfield of bacterial immunity to generating investments topping $1 billion, and patent battles to match. (See BioWorld Today, Jan. 17, 2012, and Oct. 10, 2016.)