

To Genzyme, Quality Data Depends on Quality Governance

Pharmaceutical Executive

Standards in Clinical Trial Development

The CDISC standards have been developed to streamline processes across medical research, from the production of clinical research protocols to reporting and regulatory submission, warehouse population, archiving, and post-marketing studies and safety surveillance. (All CDISC-developed standards are available from CDISC.) Using standards proactively in clinical trial development benefits the business in many ways (see graphic, adapted from HL7 RCRIM).

Given these advantages, Genzyme has implemented, or is in the process of implementing, CDISC standards end-to-end. At the same time, we are building a Metadata Repository, which will help us develop consistent and reliable means of applying standards in our business processes and of leveraging information enterprise-wide.

Implementation of Standards

At Genzyme, we have developed a data governance process and are establishing processes for data design, from protocol development and data collection through reporting and submission.

Data quality is an area fraught with tough challenges; for instance, the actual damage done by dirty data is not always tangible. High-quality clinical trial data is critical for the pharmaceutical industry: it benefits patients, sponsors, and regulatory agencies. To improve data quality, data validation is a must-have process. It ensures that the correct metadata is used in data collection, transmission, and derivation, and it identifies outliers and errors. Data validation can generally be defined as a systematic process that compares a body of data against the requirements in a set of documented acceptance criteria. Through our many standards initiatives at Genzyme, we have implemented or plan to implement a standard protocol, standard CRFs, standard central lab data, SDTM/ADaM, and many other standards. All of these efforts will help improve data quality; however, to ensure that our data quality remains consistent with our standards as specified, a data validation tool is needed.
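The definition above, comparing a body of data against documented acceptance criteria, can be sketched in code. The following is a minimal, hypothetical illustration (the criteria, variable names, and record layout are invented for the example, loosely echoing SDTM-style vital-signs variables), not the implementation of any actual validation tool:

```python
# Rule-based data validation sketch: each acceptance criterion is a
# named predicate, and every record is checked against all of them.
# Criteria and field names here are illustrative assumptions only.

from dataclasses import dataclass
from typing import Callable

@dataclass
class AcceptanceCriterion:
    name: str
    check: Callable[[dict], bool]  # returns True if the record passes

# Example criteria for a simplified vital-signs record
CRITERIA = [
    AcceptanceCriterion("subject ID present",
                        lambda r: bool(r.get("USUBJID"))),
    AcceptanceCriterion("test code is controlled",
                        lambda r: r.get("VSTESTCD") in {"SYSBP", "DIABP", "PULSE"}),
    AcceptanceCriterion("result is numeric",
                        lambda r: isinstance(r.get("VSSTRESN"), (int, float))),
]

def validate(records):
    """Return (record index, failed criterion name) for every failure."""
    findings = []
    for i, record in enumerate(records):
        for criterion in CRITERIA:
            if not criterion.check(record):
                findings.append((i, criterion.name))
    return findings

records = [
    {"USUBJID": "GZ-001", "VSTESTCD": "SYSBP", "VSSTRESN": 120},
    {"USUBJID": "", "VSTESTCD": "HEIGHT", "VSSTRESN": "tall"},
]
print(validate(records))  # only the second record produces findings
```

Keeping each criterion as a named, self-contained rule mirrors the idea of documented acceptance criteria: the rule set can be reviewed, versioned, and extended independently of the checking engine.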

The data validation tool (DVT) can provide data quality checks based on implemented standards and provide metrics to gauge our data quality. The vision and ultimate goals resulting from this tool are to:

Evaluate CRF, central lab, SDTM, and ADaM data, as well as Define-XML files, against CDISC standards and Genzyme-specific requirements, to ensure that Genzyme receives, produces, and submits quality data;

Align with Genzyme Metadata Repository to ensure metadata validation; and

Automate and streamline data validation processes.
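One way a tool like the DVT can "provide metrics to gauge data quality" is to report the share of records passing each check. The sketch below is a hypothetical illustration of that idea (the checks and field names are invented for the example):

```python
# Data-quality metrics sketch: compute a pass rate per check across a
# set of records. Check names and record fields are assumptions made
# for illustration, not an actual tool's rule set.

def quality_metrics(records, checks):
    """checks: mapping of check name -> predicate over a record.
    Returns a mapping of check name -> pass rate in [0.0, 1.0]."""
    total = len(records)
    return {
        name: sum(1 for r in records if check(r)) / total
        for name, check in checks.items()
    }

checks = {
    "domain populated": lambda r: r.get("DOMAIN") == "VS",
    "units present": lambda r: bool(r.get("VSSTRESU")),
}
records = [
    {"DOMAIN": "VS", "VSSTRESU": "mmHg"},
    {"DOMAIN": "VS", "VSSTRESU": ""},
    {"DOMAIN": "LB", "VSSTRESU": "mmHg"},
]
print(quality_metrics(records, checks))
# each check passes on 2 of 3 records, so both rates are ~0.667
```

Pass rates like these can be tracked per study or per data source, turning "is our data quality consistent with our standards?" into a number that can be trended over time.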


By implementing standards, developing the data and process governance, defining the standard process, educating and training users, and maintaining the standards/processes, we will be able to reduce:

Potential risk of accepting poor quality data from CROs;

Potential risk of analyzing poor quality data;

Potential risk of submitting low-quality data to regulatory agencies;

Potential risk of rework or duplicated effort on similar data issues due to the lack of a data validation process;

Potential for an inefficient or nonexistent data validation process; and

Potential for not using resources smartly.

The end result? Improved business process efficiency and effectiveness, and real business ROI.

Julia Zhang is Associate Director, Standards and Architecture, Global Biomedical Informatics, at Genzyme.

Sue Dubman is Senior Director, Global Biomedical Informatics, at Genzyme.



Source: Pharmaceutical Executive