Google DeepMind is shedding light on the dark genome with its latest AI model, which is trained to decipher the 98% of DNA that does not code for proteins. AlphaGenome is designed to predict how variants in the regulatory genome exert their effects on the expression of the genes they control.
Prenaital ApS has filed a patent for technology that may identify risks of spontaneous preterm birth. Spontaneous preterm birth (sPTB) is usually defined as birth occurring before 37 weeks of gestation. The invention relates to a method for predicting potential preterm birth from medical scan data, which medical professionals could use in practice for risk analysis and prognosis of potential pathologies.
Microsurgery is performed to connect small human vessels, but the technical skill this type of surgery demands is highly specialized, limiting the number of surgeons who can perform it.
Investors continue to pour capital into AI-driven health care technologies, from drug discovery and diagnostics to personalized medicine and clinical decision-support tools. However, questions remain about the quality and reliability of the data underpinning these systems, as well as the viability of their business models.
Depending on who you ask, AI will take over the world and either save it or ruin it. Certainly, it is changing it. Science magazine dedicated its first editorial of 2026 to AI. Despite its title – “Resisting AI slop” – editor-in-chief Holden Thorp gave the sort of nuanced review that is typical of him.
Illumina Inc. presented at the J.P. Morgan 2026 Healthcare Conference on Jan. 13 and introduced what it said is the world's largest genome-wide genetic perturbation dataset, being built to accelerate drug discovery through AI across the pharmaceutical ecosystem. This is a move away from its core focus on DNA sequencing technology.
Generative AI has largely escaped the U.S. FDA’s regulatory purview up to now, but OpenAI seems poised to create a new source of regulatory angst for the agency. The company unveiled ChatGPT Health on Jan. 7, a large language model that, when used professionally, could land the company in the FDA’s regulatory crosshairs.
It doesn’t take a meteorologist to see the storm clouds of uncertainty that will continue to roll in on health care across the globe this year. While the prospects for the medical device industry may be sunnier than for other aspects of health care, some high-pressure areas will likely present challenges.
Software as a service has typically been less susceptible to liability than products, but that may soon come to an end if the AI LEAD Act, sponsored by a bipartisan pair of senators, gains sufficient traction.
The U.S. FDA’s final rule for regulation of lab-developed tests was destined to be controversial at best and, according to more than one legal opinion, exceptionally susceptible to legal challenge. The inevitable legal challenge succeeded decisively in a district court decision rendered in March 2025, one of the rare instances in which the courts thwarted FDA rulemaking, making it easily the regulatory story of the year for 2025. Attempts to regulate AI in the U.S. and Europe also dominated the regulatory landscape.