A Match Made in Clinic


Pharmaceutical Executive

How artificial intelligence, wearable devices, and translational informatics are changing healthcare.

In the modern, connected world, smartphones and wearable devices have become ubiquitous tools for checking social media, responding to email, and documenting our lives with pictures and video. An increasing number of people rely on their wearable devices not only to stay connected to their social networks, but also to continuously monitor another vital aspect of their lives: their health. Such devices, like the Fitbit, Apple Watch, and numerous others, can collect a wealth of data, such as the user’s heart rate, diet, weight, and sleep quality.1

While many wearable devices were designed for consumer use, the libraries of ‘real-world’ data they generate can be of great use to clinicians and healthcare professionals for gathering important information from patients.2 Application in the clinic now goes beyond the theoretical: in early 2018, the FDA cleared the first wearable device to monitor patients for dangerous types of epileptic seizures.3 The device, called Embrace, uses multiple indicators to detect seizures, and in its pivotal clinical trial it detected 100% of participants’ seizures.3 This important milestone for wearable devices has been followed by nearly 100 clinical trials over the past year that have incorporated wearable devices into their study designs, spanning therapeutic areas that include oncology, neurology, and cardiology.2,4

In addition to detecting dangerous medical events such as seizures in the real world, wearable devices are being used in a variety of ways in current clinical research studies. They can collect data remotely and have been used to sensitively measure specific disease markers, such as gait and tremor, in movement disorders like Parkinson’s disease.2 This benefits both patients and clinicians: clinical data can be collected without patients traveling to the clinic or clinicians performing on-site visits. Other devices may also be able to monitor adherence to treatment regimens and may even make it possible to perform therapeutic monitoring of the efficacy and safety of medication.2
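
To give a flavor of the signal processing behind tremor measurement, the sketch below estimates the dominant frequency of a synthetic wrist-accelerometer trace. Everything here is an assumption for illustration (the 50 Hz sampling rate, the noise level, and the 5 Hz tremor, chosen to fall in the typical 4–6 Hz Parkinsonian range); it is not how any particular device actually works.

```python
import numpy as np

def dominant_frequency(signal, sample_rate_hz):
    """Return the frequency (Hz) with the largest spectral peak."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate_hz)
    return freqs[np.argmax(spectrum)]

# Synthetic trace: a 5 Hz tremor plus noise, sampled at 50 Hz for 10 s
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / 50)
trace = np.sin(2 * np.pi * 5 * t) + 0.3 * rng.standard_normal(len(t))
print(round(dominant_frequency(trace, 50), 1))  # 5.0
```

A real pipeline would add filtering, artifact rejection, and clinically validated thresholds, but the core idea, turning a raw sensor stream into a disease-relevant number, is the same.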

 

Taking aim at the ‘big data’ problem

The successes and potential future applications of wearable devices also present a number of challenges for clinical researchers. For example, the breadth and depth of the data they generate creates a ‘big data’ problem. Big data refers to the massive and ever-increasing amounts of data generated from a variety of sources and in a variety of formats. Managing and organizing this data requires a large amount of computational processing power and storage space. These real-world data sets are often large, noisy, and sometimes missing critical data points, so analyzing them with traditional statistical methods and sifting out results that are clinically meaningful for patients presents an additional complication.
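
The noise and gaps in real-world data mean that, before any analysis, wearable streams usually need cleaning. A minimal sketch of that step, using hypothetical heart-rate readings with a dropped sample (the timestamps and values are invented for illustration):

```python
import pandas as pd

# Hypothetical heart-rate readings; one sample failed (None) and one
# minute has no reading at all -- both common in real-world streams
readings = pd.Series(
    [62.0, 64.0, None, 70.0, 68.0],
    index=pd.to_datetime([
        "2018-10-10 08:00", "2018-10-10 08:01", "2018-10-10 08:02",
        "2018-10-10 08:04", "2018-10-10 08:05",
    ]),
)

# Regularize to one-minute intervals, then fill gaps by
# time-weighted linear interpolation
cleaned = readings.resample("1min").mean().interpolate(method="time")
print(cleaned.loc["2018-10-10 08:02"])  # 66.0 (interpolated)
```

This only scratches the surface; deciding which gaps are safe to interpolate and which signal a clinically meaningful event is itself part of the big data problem.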

Data scientists are tackling the big data problem created by wearable devices by using advanced computing power and complex data-mining algorithms such as artificial intelligence (AI).5,6 AI, the concept that computers or other machines can be programmed to carry out cognitive functions similar to those of the human brain, can provide unprecedented insights into a number of fields that are struggling with big data problems.6 While AI is a fairly old concept, a more recent evolution in the field, called deep learning (DL), has led to additional advances in data analysis, allowing machines to essentially ‘learn for themselves’ in an unsupervised way through real-world data input over time, rather than being directed by regular interaction with a human user. While unsupervised learning can detect complex patterns, those patterns still require human interpretation in the clinical setting.
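
The idea of ‘learning without labels’ can be illustrated with one of the simplest unsupervised methods, k-means clustering, here written out in a few lines rather than taken from a library. The data are invented (steps-per-minute and heart-rate pairs that might come from a wearable); the algorithm groups them into ‘rest’ and ‘exercise’ clusters with no human ever labeling a single reading.

```python
import numpy as np

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means: group unlabeled points by repeatedly assigning
    each point to its nearest center and recomputing the centers."""
    rng = np.random.default_rng(seed)
    centers = points[rng.choice(len(points), k, replace=False)]
    for _ in range(iters):
        dists = ((points[:, None] - centers) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        centers = np.array([points[labels == j].mean(axis=0) for j in range(k)])
    return labels

# Hypothetical (steps/min, heart rate) readings: three at rest, three exercising
data = np.array([[2, 60], [3, 62], [1, 58],
                 [110, 140], [120, 150], [115, 145]], dtype=float)
labels = kmeans(data, k=2)
print(list(labels))  # first three readings share one label, last three the other
```

Note that the algorithm only reports that two groups exist; deciding that one group is ‘rest’ and the other ‘exercise’, or that either is clinically meaningful, is exactly the human interpretation step the paragraph above describes.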

 

DL, pattern recognition, and translating data into the clinic

AI and machine learning (ML) algorithms have been particularly useful for pattern recognition in genomics data. To do this, data scientists have created specific DL models that “learn” patterns from input genomics data; the more data that is analyzed, the better the algorithm becomes at recognizing specific patterns. With the cost of next-generation sequencing continuing to drop, and with genomic biomarkers increasingly informing treatment decisions in critical therapeutic areas such as oncology, the ability to identify genetic patterns (i.e., mutations in specific genes) and their associations with specific therapeutic responses or other phenotypes can be extremely useful in the clinic.7
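
Before a DL model can “learn” from DNA, the sequence has to be turned into numbers. A common convention, sketched below, is one-hot encoding: each base becomes a row with a single 1 in the column for A, C, G, or T, producing a matrix the model can treat much like an image. The function is a simplified illustration of the convention, not any particular model’s input pipeline.

```python
import numpy as np

BASES = "ACGT"

def one_hot(seq):
    """Encode a DNA sequence as a (length x 4) matrix: one row per
    base, with a 1 in the column for A, C, G, or T."""
    mat = np.zeros((len(seq), 4), dtype=np.float32)
    for i, base in enumerate(seq.upper()):
        mat[i, BASES.index(base)] = 1.0
    return mat

encoded = one_hot("GATTACA")
print(encoded.shape)  # (7, 4)
```

Stacked over thousands of sequences, matrices like this are what genomics DL models consume when learning to associate sequence patterns with phenotypes.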

Sophisticated DL methods have been applied to large genomic data sets to recognize specific gene expression patterns or genetic biomarkers and their association with specific disease phenotypes.7 The ability to do this type of analysis has led to the birth of a new field, called translational informatics, where insights from bioinformatics and DL algorithms can be used to advance drug development and the treatment of various diseases.

Translational informatics has a broad range of applications, such as identifying biomarkers for specific disease states or therapeutic responses. This has the potential to improve the success rate of clinical development, as drug targets that are genetically supported have been estimated to have twice the success rate of those that are not.8 One cloud-based example of translational informatics and DL in practice is the Kipoi Model Zoo, a collaborative platform where researchers can upload genomics data (in a variety of formats) and DL models developed for its analysis.9 In doing so, the platform creates a mechanism for additional DL model training and further refinement of these models, as well as a pipeline for using large amounts of different types of genomics data to gain additional phenotypic insights. Taken together, the Kipoi Model Zoo and similar platforms can help uncover the deep and complex links between genomics data and particular phenotypes, with the potential to improve drug discovery and clinical development.

Another area in which AI and DL have been applied is pattern recognition in medical image analysis. Medical imaging data obtained through computed tomography (CT), X-ray, magnetic resonance imaging (MRI), and positron emission tomography (PET) are used in the diagnosis and prognosis of many different diseases, and typically rely on interpretation by expert personnel. Such interpretations are prone to both false-positive and false-negative errors and may even differ from clinician to clinician.10 AI has the potential to identify subtle patterns that might be missed by busy clinicians, and thus to assist in making more accurate diagnoses.
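
The building block of most image-analysis DL models is the convolution: a small filter slid across the image that responds wherever a particular local pattern appears. The toy sketch below (a tiny invented ‘scan’ with a bright region against a dark background, nothing like clinical data in scale) shows a single hand-written edge filter; a real model stacks thousands of such filters and learns their values from data rather than hard-coding them.

```python
import numpy as np

def convolve2d(image, kernel):
    """Valid-mode 2D convolution: slide the kernel over the image and
    record its response at each position."""
    kh, kw = kernel.shape
    out_h = image.shape[0] - kh + 1
    out_w = image.shape[1] - kw + 1
    out = np.zeros((out_h, out_w))
    for i in range(out_h):
        for j in range(out_w):
            out[i, j] = (image[i:i + kh, j:j + kw] * kernel).sum()
    return out

# Toy "scan": dark background (0.0) with a bright region (1.0) on the right
image = np.zeros((5, 6))
image[:, 3:] = 1.0

edge_kernel = np.array([[-1.0, 1.0]])  # responds where intensity jumps
response = convolve2d(image, edge_kernel)
print(response[0])  # large values mark the boundary column
```

The response is zero over uniform regions and spikes exactly at the dark-to-bright boundary, which is the elementary version of how a trained model localizes a structure of interest in a scan.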

 

Barriers to clinical implementation

The application of AI and DL algorithms to data analysis, whether the data come from wearable devices, medical imaging instruments, or next-generation sequencers, is a controversial topic that is met with enthusiasm by some clinicians and with resistance and skepticism by others. While the clearance of Embrace, which uses a DL algorithm for data analysis, lends wearable devices and DL algorithms some legitimacy, the absence of other well-designed, properly executed, and peer-reviewed trials has inhibited widespread adoption in healthcare.2 In part, this may be due to the separation of device engineers, data scientists, healthcare professionals, and drug developers into distinct research fields with little overlap.2

The complex nature of AI and DL and the need for regulatory approval may be another barrier to implementation. AI algorithms are often referred to as ‘black boxes’ because it is difficult or impossible to interpret the underlying logic of why a certain result was obtained from a given input. This has led to a debate among data scientists about whether the inner workings of an algorithm need to be fully understood before it is accepted in the field; regulatory bodies, such as the FDA, have even greater difficulty accepting devices with uninterpretable results. This uncertainty could make device engineers, drug developers, and regulatory agencies uncomfortable with devoting time and money to the development of wearable devices and accompanying DL algorithms.

DL and wearable devices have the potential to change the way patients, in a wide variety of therapeutic areas, are diagnosed and treated by clinicians. Though many studies are in their early stages, there is a great deal of hype surrounding this research area.2 As more exploratory studies are done, reliable data is generated, and additional collaboration between data scientists, clinicians, device engineers, and regulatory personnel occurs, wearable devices and DL will likely continue to make inroads into the clinic and represent a major step forward in healthcare.

 

References:

1. Why Fitbit? Fitbit website: https://www.fitbit.com/whyfitbit. Accessed October 10, 2018.

2. Izmailova ES, Wagner JA, Perakslis ED. Wearable Devices in Clinical Trials: Hype and Hypothesis. Clin Pharmacol Ther. 2018;104(1):42-52.

3. Embrace by Empatica is the world's first smart watch to be cleared by FDA for use in Neurology. Cision website: https://www.prnewswire.com/news-releases/embrace-by-empatica-is-the-worlds-first-smart-watch-to-be-cleared-by-fda-for-use-in-neurology-300593398.html. Published February 2018. Accessed October 11, 2018.

4. ClinicalTrials.gov search results. Search term “wearable devices.” Start date: 10/10/2017 and End date: 10/10/2018.

5. Schmidt B, Hildebrandt A. Next-generation sequencing: big data meets high performance computing. Drug Discov Today. 2017;22(4):712-717.

6. What is the difference between artificial intelligence and machine learning? Forbes website: https://www.forbes.com/sites/bernardmarr/2016/12/06/what-is-the-difference-between-artificial-intelligence-and-machine-learning/#5f898feb2742. Accessed October 10, 2018.

7. Libbrecht MW, Noble WS. Machine learning in genetics and genomics. Nat Rev Genet. 2015;16(6):321-332.

8. Nelson MR, Tipney H, Painter JL, et al. The support of human genetic evidence for approved drug indications. Nat Genet. 2015;47(8):856-860.

9. Kipoi: Model Zoo for Genomics. Kipoi website: https://kipoi.org. Accessed November 9, 2018.

10. Shen D, Wu G, Suk HI. Deep learning in medical image analysis. Annu Rev Biomed Eng. 2017;19:221-248.

 

Jason Chin is Sr. Director, Deep Learning in Genomics, DNAnexus; and John Didion, PhD, is Principal Scientist, FDA Specialist, xVantage Group, DNAnexus
