Adaptive Trial Design: Prepping for Adoption

Article

Pharmaceutical Executive, June 2014

As interest grows in this productivity-enhancing tool for seamless drug development, there is a need for a better working consensus on standardized metrics that monitor progress and certify success.

Biopharmaceutical companies are targeting improvements in clinical trial design as a critical factor in pipeline portfolio success. One major driver of change is adaptive trial design, which has been shown to improve the quality and relevance of clinical data, enhancing in turn the likelihood of a faster, more predictable path to market authorization. Draft FDA guidance on adaptive trial design, issued in 2010, has bolstered the appeal of this tool for maintaining productive relationships with the regulator.

The high visibility given to a major breast cancer trial based on adaptive design principles—known as the I-SPY 2 program—promises to continue the industry's evolution toward endorsement of the adaptive approach. The I-SPY 2 trial represents a significant reengineering of the clinical trial design process with the aim to rapidly evaluate the potential of new drugs in the treatment of breast cancer while reducing cost, compressing time, and lowering the number of study volunteers.

The trial incorporates several innovative features, including an adaptive design that enables researchers to use data from patients enrolled early in the trial to guide decisions about which treatments, and which doses of a particular treatment, might be more useful for patients who enter the trial later. It provides a scientific basis for eliminating treatments that are ineffective and for more quickly selecting treatments that show promising efficacy. Another key I-SPY 2 design feature is the collaborative nature of the trial: multiple drug candidates developed by multiple companies are evaluated, and new candidates are added as others either progress to Phase III, based on efficacy in specific subgroups of patients, or are dropped.

Two successful drug candidate transitions were recently reported from the I-SPY 2 trial. Veliparib (an AbbVie compound) proved promising against so-called triple-negative breast cancer, an aggressive form of the disease for which there are few effective treatments; and neratinib (from Puma Biotechnology) was reported to be similarly effective against a different form of breast cancer.

The success of I-SPY 2 has triggered similar designs in other disease areas, such as the Alzheimer's disease collaborative trial recently announced by the European Union's Innovative Medicines Initiative (IMI). This €53-million project will allow evaluation of several drugs at once using an innovative adaptive design similar to that of I-SPY 2, and will involve a number of biotechnology and pharmaceutical companies working together with academic centers, patient groups, and regulators.

Estimating current adoption rates

Although awareness of adaptive trial design use has grown and qualitative reports from biopharmaceutical companies indicate that adoption is increasing, little quantitative data exists to characterize industry-wide adoption of this study design approach. Recently, two independent studies have been conducted to establish and corroborate baseline measures of adoption.

The two independent studies chose a definition of adaptive trial design that is consistent with current FDA regulatory guidance. Specifically, adaptive trial designs involve pre-planned adaptations, developed through trial simulations and scenario planning, of one or more specified clinical trial design elements that are modified and adjusted while the trial is underway, based on an analysis of blinded and unblinded interim data.

The FDA cites numerous adaptations that can be planned and prospectively written into the protocol. Examples include pre-planned changes in study eligibility criteria (either for subsequent study enrollment or for a subset selection of an analytic population); randomization procedure; treatment regimens of different study groups (e.g., dose level, schedule, duration); sample sizes of the study (including early termination); concomitant treatments used; planned schedules of patient evaluations for data collection (e.g., number of intermediate time points, timing of last patient observation and duration of patient study participation); and analytic methods employed to evaluate protocol endpoints (e.g., covariates of final analysis, statistical methodology, or Type I error control).
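
To make the idea of a pre-planned adaptation concrete, the short simulation below sketches a two-arm trial with a single pre-specified futility look at 50% enrollment. It is a minimal illustration, not a design drawn from any of the trials or guidance documents discussed here; the sample sizes, effect size, and stopping threshold are assumed values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(seed=1)

# Hypothetical two-arm trial. The adaptation (a futility look at 50% enrollment)
# is specified before any data are collected, as the guidance requires.
N_PER_ARM = 200              # planned maximum sample size per arm (assumed)
N_INTERIM = N_PER_ARM // 2   # pre-planned interim look after half of enrollment
FUTILITY_P = 0.50            # stop for futility if the one-sided interim p exceeds this

def run_trial(effect_size=0.2, sd=1.0):
    """Simulate one trial with a single pre-planned futility analysis."""
    control = rng.normal(0.0, sd, N_PER_ARM)
    treated = rng.normal(effect_size, sd, N_PER_ARM)

    # Interim analysis on the first half of the data.
    interim = stats.ttest_ind(treated[:N_INTERIM], control[:N_INTERIM])
    one_sided_p = interim.pvalue / 2 if interim.statistic > 0 else 1 - interim.pvalue / 2
    if one_sided_p > FUTILITY_P:
        return "stopped for futility", 2 * N_INTERIM

    # Otherwise continue to the full planned sample size and test at one-sided 0.025.
    final = stats.ttest_ind(treated, control)
    success = final.statistic > 0 and final.pvalue / 2 < 0.025
    return ("success" if success else "failure"), 2 * N_PER_ARM

# Under a null (zero) effect, the futility rule saves patients in many simulated trials.
results = [run_trial(effect_size=0.0) for _ in range(2000)]
stopped = sum(outcome == "stopped for futility" for outcome, _ in results)
print(f"{stopped / len(results):.0%} of simulated null-effect trials stopped at the interim look")
```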

In October 2011, the Drug Information Association's (DIA) Adaptive Design Scientific Working Group (ADSWG) conducted an online survey among 11 pharmaceutical and biotechnology companies and six contract research organizations (CROs). Participating companies reported that 475 adaptive design trials had been conducted between January 2008 and September 2011, suggesting a 22% adoption rate. Nearly two-thirds (65%) of the adaptive clinical trials analyzed used group sequential designs or blinded sample size re-estimation; the remaining third (35%) employed other adaptive design approaches, including unblinded sample size re-estimation, added or dropped treatment arms, and changes in randomization ratios.
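
For context, the reported percentages can be turned back into approximate counts with some back-of-the-envelope arithmetic. The script below is an illustration, not part of either study; the implied totals are approximations derived only from the figures quoted above.

```python
# Back-of-the-envelope arithmetic from the ADSWG survey figures quoted above.
# The implied totals are approximations; the survey report is the authoritative source.
adaptive_trials = 475    # adaptive design trials reported, Jan 2008 - Sep 2011
adoption_rate = 0.22     # reported adoption rate

implied_total_trials = adaptive_trials / adoption_rate
group_sequential_or_blinded_ssr = 0.65 * adaptive_trials
other_adaptive_approaches = 0.35 * adaptive_trials

print(f"implied total trials surveyed: ~{implied_total_trials:.0f}")
print(f"group sequential / blinded sample size re-estimation: ~{group_sequential_or_blinded_ssr:.0f}")
print(f"other adaptive approaches: ~{other_adaptive_approaches:.0f}")
```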

In October 2012, the Tufts Center for the Study of Drug Development (Tufts CSDD) conducted in-depth interviews on the status of adaptive design implementation among 12 major pharmaceutical companies. The study was funded by an unrestricted grant from Aptiv Solutions. Tufts CSDD probed current adoption rates and their impact on study budgets and durations. The results were consistent with those of the ADSWG study. Overall, simple adaptive designs are being used in approximately one out of five (20%) late-stage Phase III clinical trials. Early termination due to efficacy or futility was the most common simple adaptive design used. Sample size re-estimation was also a commonly used adaptive design approach. In-depth interviews with sponsor companies indicated low usage rates (i.e., 10% of clinical trials) of adaptive dose finding and treatment-group adaptations (e.g., dropping unsafe or ineffective doses) and extremely low usage of seamless Phase II/III studies.
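
Sample size re-estimation, one of the approaches both assessments found to be common, lends itself to a brief illustration. The sketch below uses the standard two-sample normal approximation with assumed planning values; it shows the general mechanics of a blinded re-estimation (updating only a nuisance parameter, here the outcome SD, at an interim look) and is not the specific method used by any of the surveyed sponsors.

```python
import math
from scipy.stats import norm

# Illustrative blinded sample size re-estimation: at an interim look, the outcome
# SD is re-estimated from pooled data without unblinding treatment assignments,
# and the planned sample size is recalculated. All planning values are assumed.
ALPHA = 0.025        # one-sided significance level
POWER = 0.90         # target power
DELTA = 0.4          # clinically relevant treatment difference (assumed)
PLANNING_SD = 1.0    # outcome SD assumed when the protocol was written

def n_per_arm(sd, delta=DELTA, alpha=ALPHA, power=POWER):
    """Per-arm sample size from the two-sample normal approximation."""
    z_alpha = norm.ppf(1 - alpha)
    z_beta = norm.ppf(power)
    return math.ceil(2 * (z_alpha + z_beta) ** 2 * sd ** 2 / delta ** 2)

planned_n = n_per_arm(PLANNING_SD)

# Suppose the pooled (blinded) SD observed at the interim look is larger than planned.
interim_blinded_sd = 1.25
revised_n = n_per_arm(interim_blinded_sd)

print(f"planned per-arm n: {planned_n}, revised per-arm n: {revised_n}")
```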

While the two independent assessments indicate that 20% to 22% of all active clinical trials include an adaptive trial design approach, analyses of public and commercial databases of trial activity present a very different picture.

Two separate assessments of the Department of Health and Human Services' ClinicalTrials.gov (CT.gov) registry of FDA-regulated clinical trials found very small numbers of adaptive trial designs listed there. Searching the term "adaptive design," the ADSWG found only 62 adaptive trial design studies listed—among the 103,213 active trials listed in CT.gov since 2008—a 0.06% adoption rate.

Tufts CSDD conducted a search of a broader set of adaptive trial design keywords among the 103,213 active CT.gov trials listed since 2008. Examples of keywords searched include adaptive design, Bayesian design, sample size re-estimation, and group sequential. Tufts CSDD found 119 total trials, suggesting a 0.1% adoption rate. Tufts CSDD also manually searched 37,111 active 2012 clinical trial listings in CT.gov and found a total of 35 adaptive trial designs listed—a 0.09% adoption rate.

Tufts CSDD also conducted searches of adaptive design keywords using two commercially available subscription-based clinical trial databases—Informa Health's Citeline and Thomson Reuters' Cortellis services. An assessment of the former database yielded a 0.2% adoption rate (317 adaptive clinical trials out of 136,000 trials listed). Tufts CSDD found 134 adaptive clinical trial designs out of a total of 146,678 trials listed in the latter database, suggesting a 0.09% adoption rate.
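
For readers who want to check the arithmetic, the few lines below recompute the adoption rates implied by the database counts quoted above. The counts are taken directly from the searches described in the text; the percentages differ from those reported only by rounding.

```python
# Adoption rates implied by the database searches cited above
# (counts as reported; percentages rounded slightly differently than in the text).
searches = {
    "CT.gov, ADSWG 'adaptive design' search": (62, 103_213),
    "CT.gov, Tufts CSDD broader keyword search": (119, 103_213),
    "CT.gov, Tufts CSDD manual review of 2012 listings": (35, 37_111),
    "Citeline (Informa)": (317, 136_000),
    "Cortellis (Thomson Reuters)": (134, 146_678),
}

for source, (adaptive, total) in searches.items():
    print(f"{source}: {adaptive}/{total} = {adaptive / total:.2%}")
```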

Next steps

Given these contrasting results, the establishment of a robust method for monitoring the adoption of adaptive trial designs, and of the specific types of adaptations utilized, would be invaluable in improving senior management decision-making on study design optimization practices and their impact. However, the extremely low adaptive trial adoption rates found in CT.gov and in the commercially available databases are not plausible or credible given qualitative and quantitative assessments of current adoption levels. These findings call into question the quality and integrity of the study design data captured in those databases.

As an immediate next step, Tufts CSDD and the ADSWG plan to meet with CT.gov and EudraCT system administrators and with commercial database developers (e.g., Informa Health, Citeline; Thomson Reuters, Cortellis; and Springer Science and Business Media, Adis) to broadly discuss this problem; to establish consensus-based definitions of adaptive trial designs; and to develop a formal process to capture more detailed, standardized data on various adaptive design approaches that can better inform overall decision-making.

There is a critical need to improve the characterization of adaptive clinical trial designs in these public and commercial databases. Doing so would assist regulators in anticipating changes in adaptive design practices and in assessing the impact of regulatory reform on study design. Improvements in tracking adaptive design use will also benefit drug development sponsors by providing better benchmarks on design practices and stimulating study design enhancements that may ultimately drive higher levels of quality and improvements in drug development success rates.

Ken Getz is Director of Sponsored Research at the Tufts Center for the Study of Drug Development. He can be reached at kenneth.getz@tufts.edu. Phil Birch is Director at Aptiv Solutions. He can be reached at phil.birch@aptivsolutions.com. Stella Stergiopoulos is Senior Project Manager at Tufts CSDD. She can be reached at stella.stergiopoulos@tufts.edu.
