
A New Era in Precision Gene Editing

Article

Pharmaceutical Executive, October 2015
Volume 35, Issue 10

Medicine is entering a new era of “gene editing.” Instead of simply being able to “read” our genes, practitioners will be able to adapt and modify DNA sequences in living cells to treat, cure, and even prevent disease. Defective genes responsible for life-threatening diseases can be rendered harmless by gene editing; it can also be used to develop new, precisely targeted biological drugs based on an individual’s distinct genetic markers. The full list of possibilities, in every area of the life sciences, is simply beyond imagination.

That promise has been made visible through two new technologies now being introduced to the research marketplace. Transcription activator-like effector (TALE) nucleases and clustered regularly interspaced short palindromic repeats (CRISPR) nucleases make it possible to edit genes in a few weeks through routine manipulation in life sciences laboratories, as opposed to months or years of dedicated research.

For the biopharma industry, the technologies offer tremendous potential for both basic research and drug discovery and development. Companies need to make their choices carefully, based on a deep understanding of the advantages and limitations of these exciting technologies.

Gene editing: Current state of the art 

The precise editing of a genome through controlled DNA modification at a targeted location was first performed in the 1980s through a process known as homologous recombination. It relies on delivering into the cell a DNA fragment carrying sequences homologous to the regions upstream and downstream of a target location; the information it contains is inserted into the genome at that location. Homologous recombination progressed slowly and was used mostly by academics in laboratory research. Even though basic knowledge in genetics was continuously increasing, strongly driven by progress in sequencing, the translation of this knowledge into new therapies or other uses was limited by the ability to modify genomes efficiently, safely, and specifically.

We are now on the cusp of a new era. The discovery that generating DNA breaks at the target location considerably enhances the efficiency of homologous recombination, driving it from 0.0001% to as much as 20% in mammalian cells, paved the way for nuclease-based gene editing, with a wide range of engineering possibilities (see Figure 1). Nuclease-based gene editing relies on the use of a nuclease, a specialized protein that recognizes and cleaves a specific DNA sequence, to mediate a DNA break at a targeted location in order to generate a precise DNA modification.

Nucleases can be used to enhance the efficiency of homologous recombination or on their own to edit genes. In the latter case, DNA breaks generated by the nuclease are repaired through an error-prone process called non-homologous end-joining. This introduces DNA changes at the cleavage location, typically inactivating the surrounding gene.
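
For readers who want a concrete picture of why error-prone repair tends to knock out a gene, here is a minimal, purely illustrative Python sketch; the coding sequence and the single-base deletion are invented, and real repair outcomes are far more varied.

```python
# Toy illustration (invented sequence): a single-base deletion left behind by
# error-prone non-homologous end-joining shifts every downstream codon,
# which usually scrambles the encoded protein and inactivates the gene.

def codons(seq: str) -> list:
    """Split a DNA sequence into consecutive three-base codons."""
    usable_length = len(seq) - len(seq) % 3
    return [seq[i:i + 3] for i in range(0, usable_length, 3)]

coding_sequence = "ATGGCTGAAACCGGTTTACAG"  # invented open reading frame
cut_site = 9                               # position of the nuclease-induced break

# Model the simplest error-prone repair outcome: a one-base deletion at the break.
repaired = coding_sequence[:cut_site] + coding_sequence[cut_site + 1:]

print("before repair:", codons(coding_sequence))
print("after repair: ", codons(repaired))
# Every codon downstream of the deletion is now read out of frame, which is
# why small indels at the cut site typically inactivate the surrounding gene.
```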

Any nuclease-based gene editing technology is built from two core elements: a DNA-binding element that recognizes and binds the DNA target, and a cleavage element that mediates the cut.
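
To make that two-part architecture concrete, the short Python sketch below models a nuclease as a pairing of a DNA-binding element and a cleavage element; the class and field names are invented for illustration, and the three instances simply restate descriptions given elsewhere in this article.

```python
from dataclasses import dataclass

@dataclass
class GeneEditingNuclease:
    """Illustrative model of the two core elements of any nuclease-based tool."""
    name: str
    dna_binding_element: str  # what recognizes and binds the target sequence
    cleavage_element: str     # what actually cuts the DNA

# Illustrative instances, following the descriptions in this article.
tools = [
    GeneEditingNuclease("ZFN", "zinc finger protein array", "FokI nuclease domain"),
    GeneEditingNuclease("TALE nuclease", "array of TALE subunits", "FokI nuclease domain"),
    GeneEditingNuclease("CRISPR nuclease", "guide RNA", "Cas9 protein"),
]

for tool in tools:
    print(f"{tool.name}: binds via {tool.dna_binding_element}, cuts via {tool.cleavage_element}")
```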

Nuclease-based gene editing is already widely used in research as a cost-effective, fast, and easy way to conduct genetic experiments. For example, scientists inactivate genes to understand their function or generate cell or animal models that mimic a human disease. The latest technologies are so powerful that new research protocols no longer aim to modify one gene at a time but to alter as many as 20,000 genes in parallel in order to observe which are involved in a given biological process.

Nuclease-based gene editing is also being used to develop new therapeutics, as a new approach to gene therapy: that is, introducing or editing genes to cure a disease. The most advanced nuclease-based gene therapy clinical trial is a Phase II study conducted by Sangamo BioSciences, using zinc finger nucleases (ZFNs) to modify immune system cells and prevent them from being infected with HIV. Sangamo is also collaborating with Biogen to develop ZFN-based therapies against hemoglobinopathies and is independently pursuing a hemophilia program (which was, until September 2015, under a collaboration agreement with Shire).

One very recent and promising approach is being developed in the field of immuno-oncology with “CAR-T cells,” or chimeric antigen receptor T cells: immune system cells engineered to express an artificial receptor at their surface that directs them to fight cancer cells specifically.

While CAR-T cells are being developed by many pharma and biotech players using transgenesis (e.g., Juno Therapeutics, Kite Pharma, and Novartis), most companies are partnering with gene-editing specialists to use nuclease-based gene editing technologies to engineer CAR-T cells (see Figure 2). The main reason is the ability to easily knock out selected genes with gene editing, thereby silencing proteins that activate an immune response. It paves the way for allogeneic CAR-T cell therapies (using a single T-cell bank rather than engineering each patient’s own cells) with reduced risk of immune rejection and increased therapy success rates. Cellectis is a pioneer in this promising approach, followed by other CAR-T cell players.

Choosing the best technology: It depends on your objective 

The available nuclease-based gene editing technologies differ across a range of criteria that fall into four categories: prototyping of the system, performance (including safety), delivery, and manufacturing.

The importance of these criteria varies depending on the final use. Academic research and therapeutics development, the two most important applications today for gene editing, have very different needs (see Figure 3).

Prototyping covers the level of skill, the time, and the costs required to design and obtain a new, effective nuclease. It is of prime importance for academic research, as this step accounts for most of the time needed to perform an experiment and because research lab budgets are constrained. In therapeutics, by contrast, new product development is such a long and expensive process that the investment in nuclease prototyping is marginal.

Performance of a nuclease technology depends on five parameters (summarized in the illustrative sketch after this list):

Efficiency: This is the capacity of a nuclease to effectively cleave the targeted DNA site. Efficiency is crucial in therapeutics; it is directly associated with the success of a therapy. In research, many experiments allow for the screening and selection of modified cells, reducing the need for high cleavage efficiency.

Specificity (off-target): Some nucleases can cleave off-target sequences that differ only slightly from the target sequence. Specificity is of prime importance in therapeutics to ensure the safety of treatment (as undesired DNA modifications can have harmful consequences for patients’ health). In research, the ability to screen cells reduces the need for very low off-target activity.

Precision: This is the capacity of a nuclease to cleave as closely as possible to the desired location. Precision is crucial in therapeutics to perform a fully controlled modification. In research, this criterion is usually less important (e.g., a gene can be inactivated by modifying it at various locations, or even in its vicinity).

Capacity to cleave methylated targets: Methylation is a DNA chemical modification that modulates the level of expression of a gene. Some nucleases are unable to bind a target if it is methylated. 

Multiplexing: This is the capacity to generate several DNA modifications within a single experiment. For research, this is a useful feature for studying gene function. In therapeutics, multiplexing is not critical; safely modifying one gene at a time is already a challenge.
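
As a compact way to keep these parameters straight, here is a small, purely illustrative Python sketch; the class, field names, and importance notes are not a formal scoring system, they simply restate the points made in the list above (methylation sensitivity is omitted from the importance notes because the text ties it to the nature of the target rather than to research versus therapeutics).

```python
from dataclasses import dataclass

@dataclass
class NucleasePerformance:
    """Illustrative container for the five performance parameters discussed above."""
    efficiency: str           # capacity to cleave the targeted DNA site
    specificity: str          # level of off-target cleavage
    precision: str            # how close to the desired location the cut falls
    cleaves_methylated: bool  # whether a methylated target can still be bound and cut
    multiplexing: bool        # whether several modifications can be made in one experiment

# Relative importance of each parameter for the two main uses, as described in the text.
importance_by_use = {
    "efficiency":   {"research": "moderate (modified cells can be screened)", "therapeutics": "crucial"},
    "specificity":  {"research": "moderate", "therapeutics": "crucial for patient safety"},
    "precision":    {"research": "usually less important", "therapeutics": "crucial"},
    "multiplexing": {"research": "useful for studying gene function", "therapeutics": "not critical"},
}

for parameter, uses in importance_by_use.items():
    print(f"{parameter}: research -> {uses['research']}; therapeutics -> {uses['therapeutics']}")
```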

Delivery of the nuclease system into the cell nucleus is mostly done via plasmid DNA, viral vectors, or RNA. These vectors carry the genetic information used by the cell to produce the nucleases. In academic research, this criterion is of moderate importance, as researchers usually have more flexibility to use different delivery systems. In gene therapy, achieving efficient delivery into the desired cell type remains one of the biggest hurdles to overcome. As of today, the main vector type used for gene therapy is the viral vector, but this type of vector can trigger side effects through sustained expression of the nuclease in the cell, which is toxic. RNA systems, or other systems enabling transient expression, are therefore preferred; however, the process is not yet mastered.

Manufacturing is the large-scale production of the delivery vector. It is especially important for therapeutics, for which high quantities (and high quality) are needed.

Each nuclease-based gene editing technology offers distinct features, strengths, and limitations along these four criteria (see Figure 4). The first tools developed for nuclease-based gene editing, meganucleases and ZFNs, were costly and hard to engineer. TALE nucleases and CRISPR nucleases, two recently developed tools, have considerably broadened the ability to manipulate genome sequences easily and effectively.

TALE nucleases are artificial enzymes developed in 2009, based on foundational work by teams led by Daniel Voytas, Ulla Bonas, and Adam Bogdanove. Like ZFNs, they are built from a fusion between a DNA-binding element (an array of TALE subunits) and a cleavage element (FokI). Each TALE subunit can recognize a specific DNA nucleotide independently of the others. Herein lies the revolution: it makes TALE nucleases highly flexible, able to target virtually any sequence in the genome with very high precision, and easy to design (taking about one week and a few hundred dollars).
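
To illustrate that one-subunit-per-nucleotide modularity, here is a minimal Python sketch; it assumes the commonly reported TALE repeat-variable diresidue (RVD) code, which this article does not spell out, and the target sequence is invented.

```python
# Minimal sketch of modular TALE design: one subunit (identified here by its
# repeat-variable diresidue, or RVD) is chosen per target nucleotide.
# The RVD-to-base pairing below is the commonly reported code; the target is invented.

RVD_FOR_BASE = {
    "A": "NI",
    "C": "HD",
    "G": "NN",  # NN is commonly used for G, although it also tolerates A
    "T": "NG",
}

def design_tale_array(target_dna: str) -> list:
    """Return one RVD per base: each TALE subunit reads a single nucleotide."""
    return [RVD_FOR_BASE[base] for base in target_dna.upper()]

target_half_site = "TGCATGACCT"  # invented target sequence
print(design_tale_array(target_half_site))
# ['NG', 'NN', 'HD', 'NI', 'NG', 'NN', 'NI', 'HD', 'HD', 'NG']
```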

TALE nucleases are both efficient and specific, with the lowest level of off-target activity when compared with ZFNs and CRISPR nucleases. For example, researchers have been able to reach cleavage efficiencies of 76% to 80% in T cells without any detectable off-target activity.

The DNA-binding capacity of the first TALE nucleases developed was sensitive to methylation. This drawback has since been overcome: it is now possible to design nucleases that are either sensitive or insensitive to methylation, providing enhanced flexibility to discriminate between a methylated and an unmethylated allele.

Prototyping of TALE nucleases still requires specific expertise and know-how in molecular biology and protein engineering. Complex molecular operations are necessary because of the repetitive content of the DNA sequence encoding the TALE nuclease. 

Delivery with viral vectors is limited because of the rate of recombination (again due to the repetitive content of the DNA sequences). In addition, the large size of the TALE nuclease gene (2,800 nucleotides) and the two-component structure increase the complexity of delivery with any vector. This also translates into higher manufacturing costs for RNA and viral vectors.

Interestingly, BurrH nucleases could be an alternative to TALE nucleases, as the sequence encoding BurrH does not show the repetitive content found in the TALE domains. MegaTALs and compact TALE nucleases, two other recent developments, are easier to handle and deliver: the FokI cleavage domain is replaced with a meganuclease or a meganuclease cleavage domain, allowing them to work as monomers.

CRISPR nucleases are newer tools, developed in 2012. Their strength lies in the DNA-binding element, which is a small RNA sequence called a “guide RNA” rather than a protein. Guide RNAs are very easy to design, and far simpler and cheaper to manipulate than proteins. This makes CRISPR a groundbreaking tool, enabling gene editing in a few days for less than two hundred dollars. The technology also requires only basic know-how in molecular biology and no specific expertise. The cleavage element is the nuclease Cas9, which can cleave any site in the genome as long as it is flanked by a short sequence called a PAM (occurring on average every 13 nucleotides in the human genome).
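
To make the guide RNA and PAM requirement concrete, here is a small, illustrative Python scan of one DNA strand for candidate target sites; the sequence is invented, and the NGG PAM of Streptococcus pyogenes Cas9 is assumed, since the article does not name a specific PAM sequence.

```python
import re

# Illustrative only: list candidate Cas9 target sites on one strand of an
# invented sequence. Each site is a 20-nt protospacer (the region a guide RNA
# would match) immediately followed by an "NGG" PAM, the motif used by
# Streptococcus pyogenes Cas9 (assumed here; not specified in the article).

def find_candidate_sites(dna: str, spacer_length: int = 20):
    sites = []
    for match in re.finditer(r"(?=([ACGT]GG))", dna):  # every NGG on this strand
        pam_start = match.start(1)
        if pam_start >= spacer_length:
            protospacer = dna[pam_start - spacer_length:pam_start]
            sites.append((protospacer, match.group(1)))
    return sites

example_dna = ("ATGCGTACCGTTAGCTAGGCTAGCTTAGGCGTACGATCGTAGGCTAACG"
               "TTAGCGGATCCATGGCTAAGGTCCGATAGG")
for protospacer, pam in find_candidate_sites(example_dna):
    print(f"guide RNA would match: {protospacer}  PAM: {pam}")
```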

CRISPR has already been widely adopted by academic labs. This fast and massive adoption is due to the simplicity, speed, and low cost of designing a new CRISPR nuclease, combined with high efficiency. In human primary T cells, protocols are improving fast: while the first studies showed low efficiency (approximately 10%), researchers have now reached 55%, and even 94% using chemically modified guide RNA.

Multiplexing, easily achieved by using several guide RNAs with the same Cas9 nuclease, is also seen as a powerful characteristic for some experiments, such as genome-wide studies in cell lines. Rather than inactivating one gene and observing which biological attribute is modified, scientists can inactivate all of the 20,000 genes of the human genome in parallel to screen for those linked to the biological attribute.
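
As a purely illustrative picture of multiplexing, the sketch below pairs a single Cas9 with a small collection of guide RNAs, one per gene of interest; the gene names and guide sequences are invented placeholders.

```python
# Purely illustrative: in a multiplexed experiment a single Cas9 nuclease is
# combined with many guide RNAs, each directing it to a different gene.
# Gene names and 20-nt guide sequences below are invented placeholders.

guide_library = {
    "GENE_A": "GACCTTAGCGTACGATCCAA",
    "GENE_B": "TTGGCATCGATCGGAACTGA",
    "GENE_C": "CCATGGTTAACCGGTTACGT",
}

for gene, guide_rna in guide_library.items():
    print(f"Cas9 + guide {guide_rna} -> targeted cut in {gene}")

# A genome-wide screen scales the same idea up: a pooled library of guides
# covering each of the ~20,000 human genes is delivered to a population of cells.
```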

As of today, the main drawback of CRISPR nucleases is their high level of off-target activity (low specificity), raising safety concerns, especially for use in therapeutics. The CRISPR recognition site seems to tolerate a high number of mismatches: recent studies suggest that only between five and 12 nucleotides in the guide RNA are really important for the binding of the nuclease system. Such short sites are far less likely to be unique in the genome. To deal with this limitation, new generations of CRISPR nucleases are being developed. These new versions are modified to work as a pair, like ZFNs or TALE nucleases, in order to increase specificity by doubling the length of the recognition site. However, this comes at a price, as paired nuclease systems are more complex to deliver. Other approaches to reducing CRISPR off-target activity rely on optimizing the composition and structure of the guide RNA.
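
A rough back-of-the-envelope calculation shows why such short effective recognition sites raise concern. The Python sketch below uses a deliberately crude model, a uniform-random genome with equal base frequencies (which real genomes are not), to estimate how often a site of a given length would recur by chance in roughly three billion bases.

```python
# Crude uniformity assumption: every position and base equally likely.
# Expected number of exact matches to a site of length k, counting both
# strands of a genome of ~3 billion bases.

GENOME_SIZE = 3_000_000_000  # approximate size of the human genome, in bases

def expected_random_matches(site_length: int) -> float:
    return 2 * GENOME_SIZE / (4 ** site_length)

for k in (20, 12, 5):
    print(f"{k:2d}-base site: ~{expected_random_matches(k):,.3f} expected chance matches")
# A full 20-base guide match is expected to be essentially unique, whereas a
# site effectively defined by only five to 12 bases recurs hundreds to millions
# of times, which is why the seed-length findings raise off-target concerns.
```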

The large size of the DNA sequence encoding the Cas9 nuclease (4,200 nucleotides) and the nature of the guide RNA bring several limitations, adding complexity to both delivery and manufacturing. Delivery complexity is increased by the difference in nature between the Cas9 protein and the guide RNA. To reduce it, smaller versions of Cas9 are being developed.
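
Some rough size bookkeeping, sketched below, shows why payload size matters for delivery. The roughly 4.7-kilobase packaging limit commonly cited for adeno-associated virus (AAV) vectors is an assumption not taken from this article, and the promoter and guide-cassette sizes are hypothetical ballpark figures; only the nuclease gene sizes come from the text.

```python
# Rough, illustrative size bookkeeping. The AAV capacity figure is a commonly
# cited ballpark (assumption, not from the article); promoter and guide-cassette
# sizes are hypothetical; the nuclease gene sizes are the ones quoted in the text.

AAV_CAPACITY_NT   = 4_700  # approximate packaging limit of an AAV vector (assumed)
CAS9_GENE_NT      = 4_200  # Cas9 gene size quoted in the article
TALE_NUCLEASE_NT  = 2_800  # TALE nuclease gene size quoted in the article
PROMOTER_NT       = 500    # hypothetical compact promoter
GUIDE_CASSETTE_NT = 400    # hypothetical guide-RNA expression cassette

cas9_payload = CAS9_GENE_NT + PROMOTER_NT + GUIDE_CASSETTE_NT
tale_pair_payload = 2 * TALE_NUCLEASE_NT + PROMOTER_NT

print(f"Cas9 construct: {cas9_payload:,} nt vs. ~{AAV_CAPACITY_NT:,} nt capacity "
      f"(over by {cas9_payload - AAV_CAPACITY_NT:,} nt)")
print(f"TALE nuclease pair: {tale_pair_payload:,} nt vs. ~{AAV_CAPACITY_NT:,} nt capacity")
# Tight or impossible fits like these are one reason smaller Cas9 variants
# are being developed and delivery remains a central engineering problem.
```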

The revolution generated by these technologies is also apparent from the increasing intensity of intellectual property activity. Over the last five years, patent applications increased more than 40% per year on average, driven by applications related to TALE and CRISPR technologies.

TALE and CRISPR technologies have two quite different intellectual property landscapes, similar to Apple versus Android operating systems in mobile phones. The TALE nuclease technology IP landscape is well-defined and consolidated around Cellectis (similar to Apple), whereas the CRISPR IP landscape is more scattered. This creates a perceived “freedom to operate,” which, combined with the high accessibility of the CRISPR technology, is not unlike the Android situation. On top of that, a legal dispute among the key stakeholders over the CRISPR IP could prevent any clarification in the near future. Despite this scramble, three companies founded in the last two years are shaping the market today, using patents on CRISPR technology to develop therapeutic approaches: CRISPR Therapeutics, Editas Medicine, and Intellia Therapeutics. In total, they have raised more than $330 million in venture capital since 2013.

Key developments in the race to commercialization  

Many new developments in gene editing are expected over the next few years in several fields, including plants, bioproduction, and most obviously in healthcare and human therapeutics. Consequently, we anticipate a number of disruptive changes to impact the biopharmaceutical industry.

Overall, we can expect several market shifts through a redefinition of the standards of care in some disease areas. Gene editing technologies have the potential to capture a significant portion of existing markets by altering the course of treatment. CAR-T cell therapy, for example, could eventually replace chemotherapy and other standard cancer treatments, starting with second-line treatments and progressing from there. Similarly, these technologies could be used to replace stents in cardiovascular patients. And the market for HIV triple therapies would plummet if the genomes of certain cells could be edited to become impervious to HIV, enabling patients to avoid infection altogether. Companies may have to take these market shifts into account in their revenue projections, just as they have done for losses of patent exclusivity.

In addition, companies that decide to be part of the gene editing game should anticipate significant internal changes across the value chain, from R&D and external innovation capabilities to pricing and reimbursement models and manufacturing operations. A few of these changes are outlined below; they represent just the tip of the iceberg.

Pricing. The pharmaceutical industry has engaged in a vigorous debate over appropriate pricing for innovative drugs, particularly with the launch of the latest hepatitis C drugs as well as PCSK9 inhibitors, the next generation of cholesterol drugs. With gene-editing technologies, there will be questions about how to price one-shot treatments versus the repeated administration of proteins or small molecules over time, on top of the usual considerations of innovation and improvements in efficacy. New business models will likely be required to adapt to these changes, not unlike the new models that have emerged in recent years in drug development for rare diseases. Pharma will need to work with regulatory bodies, payers, and care providers to anticipate concerns and find optimal solutions.

Competencies and capabilities. To take advantage of new gene-editing technologies, companies will need to monitor developments in the industry and quickly seize opportunities as they arise in what is likely to be a highly competitive space. Management will need to be able to differentiate between niche technologies and players and select the best partner, which will require an upgrade in the depth and substance of project evaluation capabilities. Companies considering a move toward in-house innovation models will require even more competencies. The decision on whether to be a leader or a fast follower will become increasingly important as these gene-editing technologies become more widely adopted.

Manufacturing. Just as the emergence of biologics transformed manufacturing operations, requiring new capabilities, organization and processes (such as specific chemistry, manufacturing and controls [CMC], regulatory expertise, timing of investments, etc.), gene-editing-based drugs will impose a fresh mandate to transform manufacturing operations. Companies will need to define and implement good manufacturing practices for large-scale production of gene editing therapies, including manufacturing of cells (in some cases, patient cells) or of delivery systems, especially viral vectors. Companies will need to move quickly to acquire new capabilities and adapt to new processes.

Finally, as the potential behind gene-editing technologies has ignited the life sciences community, it has also sparked ethical and legal debates. We face the usual switch from “Can we do it?” to “Should we do it?,” similar to current and past debates on other groundbreaking innovations, from human genome sequencing today back to the printing press in the fifteenth century. Triggered by the recent and highly controversial work published by Chinese biologists on the first experiments to edit the DNA of human embryos, the National Academy of Sciences and the National Academy of Medicine have launched an initiative to inform decision-making in this contested area of human gene editing. In December, the initiative will convene a meeting of global experts to review scientific, ethical, and governance issues and discuss a way forward. To realize the field’s full commercial potential over the long term, it will be important to establish a legal and ethical framework that guides future research and development.

Gene-editing technologies have demonstrated incredible potential to treat, cure, and prevent disease. The progress is generating great hope for researchers, clinicians, and especially for patients. We foresee a future where gene editing will fundamentally reshape the treatments available to patients. As these technologies continue to advance and find new uses, the message for biopharmaceutical companies is simple: start getting ready now to benefit from the changes that are coming.

 

Elsy Boglioli is a Partner and Managing Director at Boston Consulting Group. She can be reached at boglioli.elsy@bcg.com. Magali Richard is a Project Director at Boston Consulting Group. She can be reached at richard.magali@bcg.com.
