Exploring Luma & Luma Lab Connect

In this Pharmaceutical Executive video interview, Kalim Saliba, Chief Product Officer at Dotmatics, discusses the Luma platform, Luma Lab Connect, and what this technology could mean for the industry.

Can you tell me briefly about Dotmatics Luma and how Luma Lab Connect integrates into it?

The Luma platform, which we introduced last year, is a multimodal R&D platform. It combines low-code data management capabilities with integration into our leading scientific tools to help our customers not just structure their data but ingest it and correlate it, so that they can make faster decisions based on good data. There's a lot of talk about AI in the industry, of course, and rightfully so. When we think about the AI we're introducing into the platform, it comes in two flavors. One is about facilitating end users' use of the platform, for example to execute queries; you could imagine natural-language execution of queries. But more interesting than that will be the ability for our customers to train their own models on their own datasets to help with in silico simulation of different experiments. That centralized data platform is really the piece that makes this capability set stand out. And then think about where data comes from, because a data platform is only as good as the data you put into it.
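For illustration only, here is a minimal sketch in Python of the general idea of training a model on structured assay data so that candidates can be scored in silico before running experiments. The column names, values, and model choice are assumptions made for the example; this is not the Luma platform's API.

# Hypothetical sketch: train a model on structured assay records, then
# "simulate" an experiment for an untested candidate by predicting its endpoint.
# All columns and values are invented for illustration.
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

assays = pd.DataFrame({
    "mol_weight":    [312.4, 287.1, 401.9, 355.2, 298.7],
    "logp":          [2.1, 1.4, 3.8, 2.9, 1.9],
    "h_bond_donors": [1, 2, 0, 1, 3],
    "ic50_nm":       [45.0, 120.0, 15.0, 60.0, 210.0],  # measured endpoint
})

X, y = assays.drop(columns="ic50_nm"), assays["ic50_nm"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)

# Score a candidate that has not been through the wet lab yet
candidate = pd.DataFrame([{"mol_weight": 330.0, "logp": 2.5, "h_bond_donors": 1}])
print("Predicted IC50 (nM):", model.predict(candidate)[0])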

One of the most critical sources of data is, in fact, all of these instruments. And instrument vendors don't always make it easy to get data into a system. The formats are different; in some cases it's a binary format or an encrypted format. Enter Lab Connect. Luma Lab Connect is our offering that has parsers for many different instruments; we support over 100 and growing right now. It supports the seamless ingestion, annotation, and parsing of instrument data before it flows into the Luma platform. If you think about the lifecycle of that data, I like to describe it in terms of three different formats. At its most raw, it's unstructured; you can't really do much with it. The next step of enrichment is semi-structured: it's readable, but maybe it's not correlated well enough with other datasets. By the time it gets to the Luma platform, we and our customers can model data structures and the relationships between them without writing code, and that is the final state of the data in its fully structured, normalized format. What that means is it becomes much easier for customers to create relationships between different datasets that came from instruments but ended up in this nice, structured, well-behaved model that they can then build visualizations against and use for additional decision support, ultimately gaining insights that they otherwise might not have been able to reach.
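To make those three states concrete, here is a small sketch of one instrument reading moving from raw to semi-structured to structured. The raw line format, field names, and Reading class are invented for illustration and do not reflect any real instrument export or the Luma data model.

# Hypothetical sketch of the unstructured -> semi-structured -> structured lifecycle.
from dataclasses import dataclass

# 1. Unstructured: a raw line as it might come off an instrument export
raw = "2024-03-14T09:21:05;PLATE7;A01;OD450=0.873"

# 2. Semi-structured: readable key/value pairs, but not yet correlated
#    with anything else in the research data landscape
timestamp, plate, well, measurement = raw.split(";")
semi = {
    "timestamp": timestamp,
    "plate": plate,
    "well": well,
    "channel": measurement.split("=")[0],
    "value": float(measurement.split("=")[1]),
}

# 3. Structured and normalized: a typed record with explicit relationships,
#    ready to join against samples, assays, and experiments
@dataclass
class Reading:
    sample_id: str   # link to a sample registered elsewhere
    assay_id: str    # link to an assay definition
    channel: str
    value: float
    timestamp: str

structured = Reading(
    sample_id=f"{semi['plate']}:{semi['well']}",
    assay_id="ELISA-OD450",
    channel=semi["channel"],
    value=semi["value"],
    timestamp=semi["timestamp"],
)
print(structured)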

What do you think is the potential of this combination of technologies?

In terms of some of the key features, I'd say first and foremost that it simplifies instrument connectivity. That's really important because as the number of instruments grows at our customer sites, it becomes really hard to govern and centralize the data coming off of them. One of the things that's unique to our solution is that we have a way to remotely manage all of the agents that are connected to the instruments. When you think about getting instrument data, you actually need a piece of software running on the computer that is attached to the instrument. It would be very time-consuming and labor-intensive if you had to go to every single instrument and every single computer to make changes to that agent.

So probably one of the most important things we can do is remote management from the cloud: our customers can manage those agents, push updates, and change configurations. That's really powerful because it helps get them up and running in a really short period of time. The second piece is increasing the accuracy of the data. Right now, when we talk to customers, a lot of this is manual: it's about finding the right piece of data that came off an instrument, manually uploading it into an ELN, and then correlating it with the correct endpoint data or assay data that might have been processed. So take all this instrument data and correlate it with, say, experimental metadata from an ELN, assay data, and endpoint data, including things that come from our scientific tools. Imagine being able to pull curve fits from Prism, flow cytometry data from OMIQ or FCS Express, or protein characterization data from our Protein Metrics offering. That is very differentiating, because it enables our customers to take these datasets that exist in these disparate tools and, just as they're doing with the instrument data, pull the data from Prism, pull the data from Geneious Prime and Geneious Biologics, pull the data from our flow cytometry tooling, and create a correlated landscape of what that data looks like, in a way that drives decisions they might not otherwise have been able to get to. Which brings me to the third benefit, which is really accelerated decision-making.
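As a rough sketch of what that correlation step amounts to, the snippet below joins instrument readings with experiment metadata and processed endpoints into one view. The tables, columns, and join keys are assumptions for the example, not the Luma schema.

# Hypothetical sketch: correlate instrument readings with ELN metadata and
# assay endpoint data. All tables and columns are invented for illustration.
import pandas as pd

instrument = pd.DataFrame({          # parsed off the instrument
    "sample_id": ["S-001", "S-002", "S-003"],
    "od450":     [0.87, 1.42, 0.15],
})

eln_metadata = pd.DataFrame({        # experimental context from the ELN
    "sample_id":     ["S-001", "S-002", "S-003"],
    "experiment_id": ["EXP-42", "EXP-42", "EXP-43"],
    "antibody":      ["mAb-7", "mAb-9", "mAb-7"],
})

endpoints = pd.DataFrame({           # processed results, e.g. curve fits
    "sample_id": ["S-001", "S-002", "S-003"],
    "ic50_nm":   [45.0, 12.0, 310.0],
})

# One normalized, correlated view that visualizations and decisions build on
correlated = instrument.merge(eln_metadata, on="sample_id").merge(endpoints, on="sample_id")
print(correlated)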

We talk a lot about the make-test-decide cycle. Our customers are making small molecules, cell and gene therapies, antibodies, things related to materials science, and they have to test those materials, those compounds, those biologics to see if they meet certain criteria. To do that, we support integration with some of our leading scientific tools, but we are also going to be providing out-of-the-box dashboards and visualizations in the Luma platform itself that will display the data that has been ingested into the Luma platform in a way where those insights become clearer: for example, being able to see which antibodies in a batch of different tests have the lowest liabilities with the greatest efficacy, and to see that nicely and visually within the Luma platform. It's not something you typically think of when you think about instrument integration. So Lab Connect is really about more than just instruments. It's about getting the data from our instruments and from our scientific tools into a platform where you can make decisions faster and get to a therapeutic more quickly.
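For a sense of the kind of ranking such a dashboard might surface, here is a minimal sketch that shortlists antibodies combining low liabilities with high efficacy. The columns, scores, and weighting are invented assumptions for the example.

# Hypothetical sketch: rank antibodies in a test batch by efficacy and liabilities.
import pandas as pd

batch = pd.DataFrame({
    "antibody":        ["mAb-1", "mAb-2", "mAb-3", "mAb-4"],
    "efficacy_pct":    [92.0, 78.0, 95.0, 88.0],   # higher is better
    "liability_score": [0.12, 0.05, 0.40, 0.10],   # lower is better
})

# Penalize liabilities, reward efficacy (weights are arbitrary for the sketch)
batch["rank_score"] = batch["efficacy_pct"] - 100 * batch["liability_score"]
shortlist = batch.sort_values("rank_score", ascending=False)
print(shortlist)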

What are the implications of this technology for the industry?

I think the implications of widespread adoption of Luma Lab Connect are really tied to the widespread adoption of the Luma platform, where science data and data science can begin to help our customers innovate with AI in ways they haven't been able to. It's not that there aren't successful AI stories in this space; there are. Lots of people are using it for high-throughput screening, identifying targets, and identifying compounds that can hit those targets effectively. But it's a pretty laborious process to do things that are more innovative, for example in the protein space, where there are lots of different datasets and the processes aren't necessarily well captured, so the data isn't structured well enough to really train models to make decisions.

So, in my mind, the greatest implication of widespread adoption is that the data coming off of these more sophisticated R&D processes begins to have enough structure and enough context to meaningfully apply artificial intelligence, and ultimately to drive down the cost of the R&D process, which, as many know, can take up to 10 years and $6 billion, and which we have to drive down. To me, that's the greatest implication. And it's interesting, because if you think about where the industry is headed, it's actually towards more simulation. So the question might be: if everyone is adopting this instrument technology and pulling more data off of instruments, what does that mean for a future where the industry is going to increasingly drive towards simulating more of this work in silico? I think you can't have one without the other. In order to get to a place where you're able to simulate and use intelligent models that are based on wet-lab data, you need the wet-lab data first. And yes, lots of it exists today, but can our customers get their existing data into the right format and the right structure so that they can extract the right insights out of it? I hope so, with our tool. But it's not as if those wet-lab processes are going to stop now. If anything, I think they're going to keep going, and hopefully keep going with the structure and context that drives more in silico simulation using AI. They really go hand in hand.

So my hope, way down the line, is that simulation becomes something that's much more at the forefront of R&D, and that working in the wet lab, while I think it will be a necessity for the foreseeable future and for our lifetimes, won't be as prevalent or as necessary as it is today in every single scenario. So: reduce wet-lab processes, and increase the accuracy and context of the data in the Luma platform, such that AI can be meaningfully applied to the simulation processes, in terms of whether this drug has the right effect with the fewest liabilities in the therapeutic area we're investigating. Long answer to your question, but hopefully that gets the point across.
