Global Harmonization of Customer Data


Cherry Cabading and Mona Rakibe outline how the use of modern data management techniques can help companies see improvements in sales effectiveness, data steward productivity and overall data quality.

Data is the lifeblood of the life sciences industry. Reliable data and a complete understanding of the relationships among healthcare providers, integrated delivery networks and payer organizations are crucial for achieving sales effectiveness and meeting compliance needs.

With globalization, life sciences organizations are looking for ways to establish data governance processes that help them manage data with required corporate governance across countries, while providing local groups the flexibility to adapt to regional business models and regulations.

Today, pharmaceutical companies employ tens of thousands of sales representatives spread across the globe. In such a global, distributed sales model, providing accurate healthcare provider (HCP) and healthcare organization (HCO) data and maintaining data quality can be a challenge. Building complete HCP and HCO profiles requires organizations to blend data flowing in from dozens of internal applications and third-party sources, in different types and formats. Traditional, on-premises data management systems cannot bring all of this data together effectively. Such systems are rigid, so adding new attributes to profiles or adding new data sources takes too much time, and any change to the data model requires reloading the data, causing business disruption.

In today’s commercial environment, speed to market is extremely important. Life sciences organizations must make sure that sales reps across the globe have access to reliable data, and that any changes required to HCP and HCO profile attributes are implemented quickly. Access to accurate data has a direct impact on sales rep productivity and on the effectiveness of plans of action (POA). Taking years to implement a master data management platform and months to make updates simply does not work.

Plan for consistent and contextual data

The first step towards data consistency and personalization is understanding your markets and countries. Group countries into markets based not necessarily on geography or proximity, but on their business data requirements and local regulations. The next step is to connect all internal, external and third-party data sources, and match and merge the data in real time to create a single pipeline of reliable data. Cloud-based Modern Data Management systems allow you to quickly connect to all required data sources, including third-party subscriptions, and speed up matching and merging to create reliable HCP and HCO profiles with affiliation information. Such platforms provide the agility to add new data sources and update attributes when the business demands it, as well as the flexibility to create contextual, data-driven applications needed for global deployments. The reliable data blended from all sources becomes the foundation for your subsequent data-driven applications, and a single source of truth for other operational systems such as CRM and expense management.
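To make the match-and-merge step concrete, here is a minimal Python sketch that groups incoming HCP records by a match key and merges attributes using a source-precedence (survivorship) rule. The record fields, source names and precedence order are hypothetical assumptions for illustration, not a description of any particular platform.

```python
from collections import defaultdict

# Hypothetical HCP records arriving from different internal and third-party sources.
records = [
    {"source": "crm",      "npi": "1234567890", "name": "Dr. Jane Smith", "specialty": "Cardiology", "phone": None},
    {"source": "vendor_a", "npi": "1234567890", "name": "Jane Smith",     "specialty": None,         "phone": "555-0100"},
    {"source": "claims",   "npi": "9876543210", "name": "Dr. John Doe",   "specialty": "Oncology",   "phone": "555-0199"},
]

# Source precedence used to resolve conflicts (survivorship); purely an assumption for this sketch.
PRECEDENCE = {"vendor_a": 0, "crm": 1, "claims": 2}

def match_and_merge(records):
    """Group records by a match key (NPI here) and merge attributes by source precedence."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec["npi"]].append(rec)

    profiles = []
    for npi, recs in groups.items():
        recs.sort(key=lambda r: PRECEDENCE.get(r["source"], 99))
        merged = {"npi": npi, "sources": [r["source"] for r in recs]}
        for field in ("name", "specialty", "phone"):
            # Take the first non-null value from the highest-precedence source.
            merged[field] = next((r[field] for r in recs if r[field] is not None), None)
        profiles.append(merged)
    return profiles

for profile in match_and_merge(records):
    print(profile)
```

In a real deployment the match key would rarely be a single identifier; fuzzy matching on names, addresses and affiliations typically supplements it.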

Ensuring the right access is also critical. Document the data attributes that are required by the various markets and understand which attributes require transformation or masking. Discussing security and access during the design phase prevents costly rework and exposure to compliance risks. Once the rules are well defined and understood, implement them in a rules engine so that properly transformed data is provisioned to the various markets, roles and systems. Externalizing the rules gives further agility to change them as business requirements evolve, and the flexibility to deliver the right data for the right locale. This design ensures that there is only one consistent data stream flowing across all systems and functional groups, and that data is visualized in the context of each user’s role and business objectives. Different teams and geographies should not create their own silos of customer data, as this degrades data quality and impairs decision-making across business functions. As W. Edwards Deming put it, “Improve quality, you automatically improve productivity.”
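A minimal sketch of the externalized-rules idea follows: access and masking rules live in a data structure outside the application code, so they can be changed as requirements evolve without redeploying. The rule format, markets, roles and attribute names here are assumptions for illustration.

```python
# Hypothetical, externalized access rules: which profile attributes each market/role
# combination may see, and which must be masked before provisioning.
RULES = {
    ("EU", "sales_rep"):    {"allow": ["name", "specialty"], "mask": ["phone"]},
    ("US", "sales_rep"):    {"allow": ["name", "specialty", "phone"], "mask": []},
    ("US", "data_steward"): {"allow": ["npi", "name", "specialty", "phone"], "mask": []},
}

def provision(record, market, role):
    """Shape a mastered record according to the externalized rules for a market and role."""
    rule = RULES.get((market, role), {"allow": [], "mask": []})
    shaped = {field: record.get(field) for field in rule["allow"]}
    for field in rule["mask"]:
        shaped[field] = "***"  # masked rather than dropped, so downstream schemas stay stable
    return shaped

hcp = {"npi": "1234567890", "name": "Dr. Jane Smith", "specialty": "Cardiology", "phone": "555-0100"}
print(provision(hcp, "EU", "sales_rep"))     # phone is masked for this locale
print(provision(hcp, "US", "data_steward"))  # full profile for stewards
```

Because the rules are data, a compliance or governance team can review and update them directly, rather than filing a change request against application code.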

Right data to the right users at the right time

Once the reliable data foundation is created by bringing together data from all sources and the access rules are established, the next task is data provisioning. The data can be offered to employees and systems in various forms. Sales reps may need data on their mobile devices, while data stewards who are augmenting data sets and reviewing change requests need desktop applications. Ease of use is critical for such applications: with tens of thousands of users spread across the globe, adoption is the key. If adoption is poor, users quickly revert to their old systems and habits, putting the whole initiative in jeopardy. User experiences that resemble today’s consumer applications, such as Facebook and LinkedIn, ensure fast and effortless adoption.

Similarly, give proper consideration to data provisioning for your operational systems. Do they need real-time access, or does information need to be provided in batches? Consider exposing the data as web services around master data objects, which provides the required level of abstraction and ensures standardization, governance and security.
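As an illustration of wrapping master data objects in web services, the sketch below exposes a read-only HCP lookup endpoint using Flask, with an in-memory dictionary standing in for the real master data store; the endpoint path, data shape and port are assumptions.

```python
from flask import Flask, abort, jsonify

app = Flask(__name__)

# Stand-in for the harmonized master data store; in practice the service would sit
# in front of the MDM platform rather than an in-memory dictionary.
HCP_PROFILES = {
    "1234567890": {"npi": "1234567890", "name": "Dr. Jane Smith", "specialty": "Cardiology"},
}

@app.route("/hcp/<npi>")
def get_hcp(npi):
    """Return a single mastered HCP profile, abstracting callers from the source systems."""
    profile = HCP_PROFILES.get(npi)
    if profile is None:
        abort(404)
    return jsonify(profile)

if __name__ == "__main__":
    app.run(port=8080)
```

Consumers such as CRM or expense management systems call the service instead of the underlying sources, so governance and security checks can be applied at one point.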

Plan for ongoing data curation

Provisioning the data is just one part of the whole data supply chain equation. Collaborative curation of data is necessary to maintain and improve data quality. Well-defined data change processes ensure that proper mechanisms are in place to gather feedback from the global sales force, and that updates to the data are made in a repeatable and efficient manner. Give sales reps and other data users the ability to make data change requests (DCRs) from their mobile devices. If they encounter a change in an HCP’s specialty or phone number, sales reps should be able to send an update request with a single click.

A workflow process then takes over and routes the DCR to the global data steward team, which reviews the request and makes the updates. If a third-party vendor provides the profile attribute, the change request goes directly to that vendor and updates are received quickly. In the absence of a Modern Data Management platform and integrated workflow capabilities, these changes can take months – an unacceptable delay. Building a data provisioning and DCR system on a modern cloud-based data management platform makes operational management easier and reduces the cost of data management across geographies. It speeds up information access across functional areas and maintains data consistency. For example, if the compliance team has to pull data for transparency reporting, they do not have to spend months requesting and aggregating data from multiple systems; it is available on demand. The system improves compliance while keeping compliance costs low.
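The DCR routing described above can be sketched as a simple function: a request touching an attribute mastered by a third-party vendor goes to that vendor, and everything else goes to the internal steward queue. The attribute-ownership map, queue names and request fields are hypothetical.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical ownership map: which party masters each profile attribute.
ATTRIBUTE_OWNER = {"specialty": "third_party_vendor", "phone": "internal_stewards"}

@dataclass
class DataChangeRequest:
    npi: str
    attribute: str
    new_value: str
    submitted_by: str
    submitted_at: datetime = field(default_factory=datetime.utcnow)
    route: str = ""

def route_dcr(dcr: DataChangeRequest) -> DataChangeRequest:
    """Send the DCR to the vendor that masters the attribute, or to the internal steward queue."""
    dcr.route = ATTRIBUTE_OWNER.get(dcr.attribute, "internal_stewards")
    return dcr

# A sales rep flags an updated specialty from a mobile device.
dcr = route_dcr(DataChangeRequest("1234567890", "specialty", "Interventional Cardiology", "rep_042"))
print(f"DCR for NPI {dcr.npi} routed to: {dcr.route}")
```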

Key success factors

While there are tremendous benefits with this approach towards global harmonization of data and global DCR management, there are a few key considerations for successful deployment:

●      Understand and conform to the data privacy laws of each market in which you operate.

●      Make sure you understand the data access requirements for various countries, roles and systems.

●      Give data segmentation serious thought. Segmenting the data of a global implementation solely on the basis of geographic proximity may not be the best approach; give strong consideration to an enterprise-wide approach that supports speed to market.

●      Define data policies, governance and standards – agreeing upon what constitutes good data and which data quality metrics to use is extremely important.

●      Build a strong data steward team that understands these metrics and the impact of data quality on commercial business functions.

●      Last but not least, an executive mandate makes sure that all groups and geographies are behind the initiative. Executive involvement ensures that all teams use best-of-breed data, adopt the required standards, and do not fall back on old methods.

Remember, a global HCP and HCO master data implementation is very different from a data warehouse implementation or an analytics project. Data is in motion, and building a closed-loop system that meets the needs of a diverse set of users while maintaining data quality and consistency requires a new approach to data management. Life sciences organizations using Modern Data Management techniques benefit from collaborative data management and curation that spans global geographies, breaking down functional silos and cross-organizational boundaries, and will see improvements in sales effectiveness, data steward productivity and overall data quality.
 

Cherry Cabading is Senior Enterprise Architect, Global Commercial Architecture, AstraZeneca Pharmaceuticals. Mona Rakibe is Director, Product Management, at Reltio.
