In a few years' time, or so the conventional wisdom goes, personalized medicine will become a reality, and many (or even most) new drugs will come to market supported by tests that help physicians decide which patients should receive a given drug, at what dose, and who is at greatest risk of serious side effects, all based on a better understanding of biological processes and newly available molecular-level data that characterize patients and disease. We're not there yet, of course. To date, only a handful of drugs are accompanied by molecular-level diagnostic tests. At the 2005 Molecular Medicine Tri-Conference, Ian Massey, PhD, senior vice president and head of research and preclinical development at Roche, expressed the opinion that molecular diagnostics will affect only a few medicines in the near to mid-term.
Before progress toward personalized medicine can be seen in the clinic, changes must occur in the R&D programs of pharmaceutical and biotechnology companies as biomarkers become an essential part of drug development.
The term "biomarker" is used in a variety of ways. In this article, it will refer to genomic, proteomic, or metabolic characteristics that are objectively measured and evaluated as indicators of normal biological processes, pathogenic processes, or pharmacological responses to a therapeutic intervention. Biomarker tests can look at proteins, RNA, DNA, and other molecules. In some cases, the quantity of a biomarker, or simply its presence or absence, is the definitive indicator. In other cases, imaging techniques make it possible to see where a biomarker is localized in the body and whether receptor sites are occupied.
Biomarkers have numerous potential uses, both in the pharmaceutical R&D effort and in the clinic. They can help to:
- diagnose specific diseases, often at early stages
- identify patients who will respond or not respond to a treatment
- set dosage levels for individual patients
- eliminate drug candidates by identifying potential toxicity early
- identify individual patients at high risk of experiencing side effects
- stratify patients to improve clinical trials
In 2003, Stan Bernard wrote that pharmacogenomics is "a reality of business in the pharmaceutical industry today, and its impact is likely to grow dramatically." (See "The Five Myths of Pharmacogenomics," Pharm Exec, October 2003.) Two years later, how have pharmaceutical companies progressed in the discovery, development, and incorporation of biomarkers into clinical trials? Is the use of biomarkers creating a significant competitive advantage, leaving behind companies that are not embracing them? When can we expect biomarkers to change patient stratification, both in most clinical trials and in the treatment of patients? The ability to select the right drug for each patient opens the next series of steps toward fulfilling the promise of personalized medicine.
For this article, seven professionals from five large pharmaceutical companies were interviewed in the second quarter of 2005 to gain their opinions and outlook. The purpose of these interviews was to identify new possibilities for biomarker-based tests in conjunction with new drugs and to understand how pharmaceutical companies are approaching organizational and technical changes to incorporate biomarkers into the R&D/Phase IV process.
In his presentation at the 2005 Biomarker World Congress, Hans Winkler, PhD, senior director of functional genomics for Johnson & Johnson, noted significant changes for biomarkers over the last two years. In his view, biomarkers are now consciously being incorporated into clinical trials, and there is a stronger link between disease and target due to biomarkers. According to multiple presenters at conferences in 2005, pharmaceutical companies appear to be in general agreement that biomarkers should be available by Phase IIa to test and refine the hypothesis in Phase IIb and Phase III trials.