Labs and Computing Power: The Missing Link for a Revolution by 2030

Pharmaceutical Executive, Pharmaceutical Executive-11-01-2022, Volume 42, Issue 11

A shift in mindset toward the experiment itself could lead the way.

If you’re ever in a meeting and want to elicit as wide a range of reactions and emotions as possible—think eye-rolling, raised eyebrows, deep sighs, inquisitive looks, furrowed brows, and everything else in between—look no further than the phrase “lab digitalization.” When I speak at conferences, these two words seem to have a magic effect on people, including on their heart rate and blood pressure.

I think this is no surprise. Why, in 2022, is this still a topic for discussion? I’m not aware of any other industry that is still trying to “digitalize” anything. So what exactly is causing the hold-up in pharma? The short answer is that there’s a missing link. The longer answer is that I believe I know what it is, why it went missing in the first place, and what happens to our industry in the coming decade if we’re able to fix it. To begin, let’s look at how things stand right now.

We have all the parts, but no sum

To date, digital tools in the lab environment have been biased toward individual and discrete tasks, especially record-keeping and standalone operational execution. There are the design tools we use before entering the lab, the automation tools we use while we are in the lab, and still others—electronic lab notebooks among them—that we use after the fact to manually record what took place. On top of all of this, we have an entire world of lab hardware, each instrument with its own interfaces and modes of operation.

All of these different systems and tools cover the entirety of the experimental process: the loop we cycle through in designing experiments, running them, analyzing experimental data, and starting all over again. But here’s the problem: there’s no throughline, no thread that links all of these disparate tools together into a unified whole.
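That loop can be sketched in a few lines of code. Everything below is hypothetical: the "instrument" is a simulated function and the factor values are invented, but the shape of the cycle (design, run, analyze, redesign) is the point.

```python
import random

def run_in_lab(temperature):
    """Stand-in for a real instrument: simulated yield that peaks at 37 degrees C."""
    return -abs(temperature - 37) + random.uniform(-0.1, 0.1)

best_temp, best_yield = None, float("-inf")
candidates = [30, 34, 37, 40]                          # design: choose conditions to test
for _ in range(3):                                     # three passes around the loop
    results = {t: run_in_lab(t) for t in candidates}   # run the experiment
    top = max(results, key=results.get)                # analyze the data
    if results[top] > best_yield:
        best_temp, best_yield = top, results[top]
    candidates = [top - 1, top, top + 1]               # redesign around the best result
print(best_temp)  # prints 37, the simulated optimum
```

The article's argument is that today each arrow in this loop (design, run, analyze) lives in a different, disconnected tool.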

Powerful capabilities just out of reach

At the same time, there are three ideas whose convergence and reliable, lab-based implementation have been stalled by this fractured landscape:

  • Design of experiments (DOE)
  • Next-generation lab automation, or “Lab Automation 2.0”
  • Artificial intelligence (AI)/machine learning (ML)

Multifactorial experiments are vital in the study of biology because they help us understand the interactions that are fundamental features of all living systems. While DOE will not transform the industry on its own, it allows us to reach far deeper into the complexities of biology than we could otherwise.
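To make the multifactorial idea concrete, here is a minimal sketch of a full-factorial design in Python. The factors and levels are hypothetical and purely illustrative; real DOE work would typically use fractional or optimal designs to keep run counts manageable as factors multiply.

```python
from itertools import product

def full_factorial(factors):
    """Enumerate every combination of factor levels (a full-factorial design)."""
    names = list(factors)
    return [dict(zip(names, levels)) for levels in product(*factors.values())]

# Hypothetical factors for a cell-culture experiment (illustrative only).
design = full_factorial({
    "temperature_C": [30, 37],
    "pH": [6.8, 7.2, 7.6],
    "media": ["A", "B"],
})
print(len(design))  # 2 x 3 x 2 = 12 runs
```

Even this toy design of 12 runs hints at why varying one factor at a time misses the interactions between factors that characterize living systems.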

Similarly, we’ve had the capacity for high-throughput automation for some time now and, while higher throughput is always welcome, there’s only so much value it can drive. The next generation of lab automation offers complete traceability, reproducibility, and usability, and is capable of handling enormous complexity without code and with much higher levels of walk-away time.

Finally, the buzz around AI/ML is remarkably strong, but we have yet to see the full realization of its potential. The work of biology is difficult to represent in code, and the data and metadata that lab work produces are difficult to digitize. Until we can do both, AI/ML remains a pipe dream, the preserve of “big tech.”

We have already seen the results of combining DOE and next-generation lab automation in some of the public statements made by one of our company’s customers, AstraZeneca, who described how they had begun running “experiments that were previously impossible.”

The missing link? The way we think

I believe the missing link is the mental model we use. Much is made of “the lab of the future” as our industry’s solution for success. When we think about the lab of the future, we ask questions like: “Will this piece of software make my lab better? Will this new piece of kit?” Questions like these aren’t good enough.

We create a subtle but profound shift if we instead think about the experiment of the future. This makes us ask different questions: What do I need to change to improve the quality of our experiments? How do I increase the value of our scientific output? What’s missing from that picture? When we think of the experiment itself, we stop treating processes, equipment, data, and methodologies as separate problems to be solved in isolation. We begin instead a journey toward a holistic approach that considers the entire picture in a new light. For those like AstraZeneca who have already started that journey, the future looks incredibly bright.

Markus Gershater, Chief Scientific Officer and Co-founder, Synthace