Organ-on-a-Chip technology has been shown to predict DILI with greater confidence than traditional in vitro methods and animal models. Start protecting human health today.
In the beginning, it’s just nausea. But soon, abdominal pain sets in with general fatigue and a growing sense that something has gone wrong. These seemingly innocuous symptoms can be early indications of drug-induced liver injury (DILI). Without quick action, patients developing DILI may progress to multi-organ failure and potentially death.
Bringing harm to a patient is the worst-case scenario that looms large as new drugs transition from preclinical to clinical studies. For more than half a century, DILI has been a frequent cause of post-market drug withdrawal and a common cause of clinical trial failure1,2. It is unfortunately easy to find examples of this. In January of 2020, the development of inarigavir was halted after the tragic death of a patient who presented with clear signs of DILI. This year alone, both Pfizer and Aligos Therapeutics halted development of promising therapeutics as a result of unforeseen DILI in clinical trial patients. It is no exaggeration to say that DILI is a leading patient safety concern.
An insidious condition, DILI develops when a therapeutic turns toxic in a patient’s liver, leading to a decline in liver function and, in severe cases, death. It’s a difficult condition to detect and even harder to predict1. A battery of preclinical models is used to screen prospective compounds for toxicity before they reach patients, with animal models often viewed as the ultimate predictor of drug safety. Though animal models have undoubtedly played an important role in the evolution of modern drug development, ample evidence suggests they are unreliable2-7.
In reviewing the literature, it becomes clear that relying on animal models to predict toxicity generally, and DILI specifically, is unlikely to prevent many toxic compounds from entering clinical trials and causing harm to patients. Such grave errors can be avoided, though, if the right preclinical models are used.
In a recent study posted to bioRxiv, Emulate researchers provided strong evidence that Organ-on-a-Chip technology can be a far more reliable predictor of drug toxicity.
The predictive power of Organ-on-a-Chip technology
Organ-on-a-Chip technology is a type of microphysiological system in which cells can be cultured in a highly controlled, physiologically relevant microenvironment. These microengineered culture systems combine organ-specific cell types, tissue-specific extracellular matrix, and biophysical forces to mimic in vivo microenvironments. Multiple studies have shown that cells grown in Organ-Chips closely mimic in vivo cells both in behavior and in gene expression profiles9-12.
Several studies had previously suggested that Organ-Chips may be superior to conventional preclinical models at predicting drug toxicity13,14. However, the limited scale of these studies left some doubt about the robustness of Organ-Chips. Truly evaluating their potential required a large-scale study.
In December of 2021, such a study was completed by researchers at Emulate9.
The research team used 780 Liver-Chips to assess the model’s ability to predict DILI caused by 27 known hepatotoxic and non-hepatotoxic small-molecule drugs. Importantly, these molecules were not chosen at random but were selected based on guidance from the Innovation and Quality (IQ) consortium, a collaboration of pharmaceutical and biotechnology companies that aims to advance science and technology to enhance drug discovery programs. Toward this goal, the IQ consortium has released guidance stipulating basic expectations for preclinical models of liver toxicity.
Before this study, no microphysiological system had yet met these standards.
The Emulate Liver-Chip closely resembled the human liver, accurately distinguished toxic from non-toxic drugs, and correctly predicted toxicity for the drugs tested.
Going beyond the IQ consortium’s standards, Emulate researchers expanded the study to include an additional eight known hepatotoxic compounds to evaluate the model’s utility in predictive toxicology.
The Emulate Liver-Chip achieved 87% sensitivity and 100% specificity in predicting drug toxicity, far outperforming liver spheroids (a common preclinical model), which showed a sensitivity of only 47%. Notably, each of these toxic drugs had been found safe in animal models but ultimately proved toxic when given to patients.
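As a quick sanity check on how headline numbers like these are derived, here is a minimal Python sketch of sensitivity and specificity computed from confusion-matrix counts. The counts below are illustrative placeholders chosen to be consistent with the reported percentages, not the study’s actual per-drug tallies.

```python
def sensitivity(tp: int, fn: int) -> float:
    """Fraction of truly toxic drugs the model flags as toxic (true positive rate)."""
    return tp / (tp + fn)

def specificity(tn: int, fp: int) -> float:
    """Fraction of truly non-toxic drugs the model correctly clears (true negative rate)."""
    return tn / (tn + fp)

# Illustrative counts only: 20 of 23 toxic drugs flagged, no safe drug misflagged.
print(f"sensitivity: {sensitivity(tp=20, fn=3):.0%}")  # -> 87%
print(f"specificity: {specificity(tn=5, fp=0):.0%}")   # -> 100%
```

Note that sensitivity and specificity trade off against different risks: low sensitivity lets toxic drugs slip into trials, while low specificity would wrongly kill safe candidates.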
The Liver-Chip could save lives and billions of dollars
The mere fact that the Liver-Chip showed 87% sensitivity and 100% specificity in identifying these toxic drugs is impressive on its own, but set against the history of these drugs, the significance of this improvement falls into sharp relief. The 22 toxic drugs in this study had previously advanced to human use and are collectively responsible for more than 200 patient deaths and 10 liver transplants15. Had the Liver-Chip been available when these drugs were being developed, many of these deaths might have been avoided.
The benefits of Organ-Chips go beyond improved patient safety. Roughly 75% of costs in drug development are lost to drug candidates that ultimately fail due to efficacy or safety issues16. A major contributing factor in drug failure is poor model validity. It’s been argued that even small improvements in the predictive validity of preclinical models could have a significant impact on drug development success rates17.
In their study of the Liver-Chip, Emulate researchers modeled the potential impact that routine use of the Liver-Chip could have on drug development productivity. By simply improving our ability to detect hepatotoxicity with 87% sensitivity, it’s estimated that the Liver-Chip could increase research and development productivity by $3 billion annually.
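One way to see how a sensitivity gain translates into dollars is a back-of-the-envelope expected-value model. This is not the model from the Emulate analysis; every parameter below (candidate count, toxicity rate, failure cost) is a hypothetical placeholder, and the simplifying assumption is that each toxic candidate caught preclinically avoids the full sunk cost of a late-stage failure.

```python
def annual_savings(candidates_per_year: int,
                   p_hepatotoxic: float,
                   cost_per_late_failure: float,
                   screen_sensitivity: float) -> float:
    """Expected cost avoided per year by catching hepatotoxic candidates preclinically.

    Deliberate simplification: every toxic candidate caught before the
    clinic avoids the full sunk cost of one late-stage failure.
    """
    toxic_candidates = candidates_per_year * p_hepatotoxic
    caught = toxic_candidates * screen_sensitivity
    return caught * cost_per_late_failure

# Hypothetical industry-wide inputs: 100 candidates/year, 10% hepatotoxic,
# $500M sunk cost per late-stage failure, 87% screening sensitivity.
savings = annual_savings(100, 0.10, 500e6, 0.87)
print(f"${savings / 1e9:.2f}B avoided per year")
```

Even with modest placeholder inputs, the savings scale linearly with sensitivity, which is why a jump from 47% (spheroids) to 87% (Liver-Chip) matters economically as well as clinically.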
Bottom line: The Liver-Chip should be integrated into preclinical development
Emulate’s results provide strong evidence for the use of the Liver-Chip in preclinical drug development. Not only does the Liver-Chip faithfully recreate the liver microenvironment, but it has also proven to be a robust, sensitive, and specific model for assessing a drug’s likelihood of inducing DILI. This means fewer toxic drugs advancing to clinical trials, billions of dollars saved that can be reinvested in other drug candidates, and, most importantly, patients spared the devastating effects of drug-induced liver toxicity.