Cloud Journey Series:
- Cloud Journey — Part 1. A basic introduction of cloud, applying PACE layering and The 6R’s.
- Cloud Journey — Part 2. A quick review on what is the good organization chart to enable cloud journey.
- Cloud Journey — Part 3. A quick view on Business Values and Business Drivers on a cloud journey.
- Cloud Journey — Part 4. What does cloud mean for your “Talents & Culture”?
- Cloud Journey — Part 5. Using Platform Ops to accelerate DevSecOps adoption
- Cloud Journey — Part 6 | Foundations of Cloud Architecture
- Cloud Journey — Part 7 | Customer Data Platform (CDP)
- Cloud Journey — Part 8 | Customer Data Strategy
In this post:
- Data Governance framework
- Customer Data Management Capabilities
- Customer Data Services
- Customer Profiling with CDP & MDM Reference Architecture
- Customer Consent Management
- Components of the Economic Framework
Data and analytics governance has become more challenging as data spans edge, on-premises and multiple cloud environments. New regulations are driving demand for effective data governance. Governed self-service is not an oxymoron. Rather, appropriate governance is necessary for self-service, which enables and empowers data democratization. Data governance is a big subject. There are many methodologies that attempt to apply a standard model of governing data. Similarly, the number of commercial and open-source products that support governance has exploded over the past several years, all claiming, to various degrees, to meet your governance needs. While they all rightly support governance functions in one form or another, aligning precisely what is being offered with your requirements is not always clear.
A governance framework for an operational transactional system is different from the framework required for a data and analytics environment which is different again from governing a master data management (MDM) application. With the growth of data and analytics and increasing requirements for speed and agility of service, effective governance is more important than ever to:
- Operationalize data pipelines and reporting and machine learning models faster
- Meet data protection compliance requirements
- Enrich customer experience and speed up insights
- Better manage business processes
- Perform impact analysis
- Enhance data quality
The main steps are defined as:
- Prework: Define use cases, roles, organizations, architectural pattern and data strategy
- Step 1: Data discovery and policy alignment
- Step 2: Transform the data from multiple sources and identify data quality issues
- Step 3: Curate and remediate enterprise data using data catalog
- Step 4: Harmonize data at the BU level and perform analytics and ML
- Step 5: Operationalize data governance
This structured framework is described as a sequential process, but technical professionals may implement certain aspects per their requirements and even revisit some aspects to refine them. Finally, certain aspects of the framework are sometimes oversimplified to convey the spirit of the endeavor, but implementation will require deeper deliberation. For example, each of the boxes requires authentication, and lineage information is collected at each box, even though these aspects are not repeated throughout the process flow.
It is also important to note the scope of this framework in the context of data governance as a whole. The prework step is critically important to the successful implementation of the five main steps of this framework. The following figure provides insight into how the steps in this framework fit into the overall scope of data governance. You can read more on Gartner’s governance framework here.
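The five steps above can be sketched as a simple pipeline skeleton. This is a hedged illustration only: every function name, field and quality rule below is invented for the example, not drawn from any specific governance product.

```python
# Illustrative sketch of the five governance steps as a pipeline.
# All names and rules here are hypothetical, not a product API.

def discover(records):
    """Step 1: tag each record with the policies that apply to it."""
    for r in records:
        r["policies"] = ["pii"] if "email" in r else []
    return records

def transform(records):
    """Step 2: normalize sources and flag data quality issues."""
    for r in records:
        r["issues"] = [] if r.get("email", "").count("@") == 1 else ["bad_email"]
    return records

def curate(records):
    """Step 3: keep only records that pass quality checks (catalog-worthy)."""
    return [r for r in records if not r["issues"]]

def harmonize(records):
    """Step 4: build a BU-level view ready for analytics and ML."""
    return {r["email"]: r for r in records}

def operationalize(view):
    """Step 5: emit simple governance metrics for ongoing monitoring."""
    return {"governed_profiles": len(view)}

raw = [{"email": "a@example.com"}, {"email": "broken"}]
metrics = operationalize(harmonize(curate(transform(discover(raw)))))
print(metrics)  # {'governed_profiles': 1}
```

In practice each step would be backed by catalog, quality and lineage tooling; the point of the sketch is only the ordering and the hand-off between steps.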
Customer Data Management Capabilities
Most customer data management capabilities are concentrated in the data management and analytics layer — except customer identity and access management (CIAM) and customer data platform (CDP). CIAM plays a role in facilitating customer experience.
A CIAM application can combine and reconcile customer data. Yet, that’s not its primary purpose. A CIAM application should provide single sign-on, multifactor authentication, API access control and other identity and access-management-related capabilities. The CIAM application also provides some identity resolution and profile unification capabilities, but they are mainly intended for the scope of registration and login.
The customer data management scope of business intelligence (BI) platforms, data lakes, data warehouses and CDPs is broader as it extends to data preparation, data delivery, insights and customer 360 view. Data lakes are often combined with a CDP for identity resolution, profile unification, segmentation and data delivery to marketing (activation). CDPs are similar to MDM systems with their identity resolution and profile unification capabilities. However, most CDP vendors lack data quality and data governance capabilities, which is why you cannot reach maximum maturity when implementing an MDM initiative with a CDP. Yet, a CDP architecture can help you reach basic maturity levels and meet your marketing requirements simultaneously. A good strategy is to kick off your MDM initiative with a CDP at a basic maturity level. This can be achieved with the consolidation style.
Customer Data Services
Many organizations have siloed customer data. Architects must consider master data management, customer data platform, customer identity and access management, and consent and preference management solutions to maximize customer experience.
- High availability: Make data accessible by leveraging middleware to move data between multiple systems in the desired format and composition. Ensure that the servers of your applications have sufficient capacity to handle customer data API requests. SaaS applications in a multi-tenant cloud are subject to API request limits over a specific period (second, day or month). Augment your use of APIs with event consumption, use an API management platform to monitor your usage, and mix synchronous and asynchronous integration patterns to avoid hitting contractual limits.
- Consistency: Inconsistent customer data can result from deep design choices across distributed enterprise applications, mainly in microservices architectures. To support consistency, you need to avoid distributed transactions. Transactions should occur within the scope of a single API request. Data cannot be synchronized across all systems at the exact same time. Once you have achieved eventual consistency, you are ready to implement customer data mastering services.
- Relevance and compliance: Create a data catalog to maintain an inventory of data assets and to understand which datasets are relevant for extracting business value. Without the ability to extract relevant data, you cannot process data lawfully or respond to data subject rights requests on time.
- Security: A CIAM application provides additional security measures for identity and access management of customer data. But several issues must be addressed beyond a customer data mastering implementation, such as network security, encryption, data recovery, security information and event monitoring (SIEM), and endpoint protection.
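As one concrete sketch of the high-availability guidance above, the following example wraps a rate-limited API call in exponential backoff. The `FakeSaaSApi` class simulates a multi-tenant SaaS endpoint's request limit; all names and the limit-reset behavior are invented for illustration.

```python
import time

class RateLimitError(Exception):
    pass

class FakeSaaSApi:
    """Simulates a SaaS endpoint that allows 2 requests per window."""
    def __init__(self, limit=2):
        self.limit = limit
        self.calls = 0

    def get_customer(self, cid):
        self.calls += 1
        if self.calls > self.limit:
            self.calls = 0  # pretend the window resets while we back off
            raise RateLimitError("429 Too Many Requests")
        return {"id": cid}

def call_with_backoff(fn, *args, retries=3, base_delay=0.01):
    """Retry on rate-limit errors with exponential backoff."""
    for attempt in range(retries):
        try:
            return fn(*args)
        except RateLimitError:
            time.sleep(base_delay * (2 ** attempt))
    raise RuntimeError("rate limit not cleared after retries")

api = FakeSaaSApi()
results = [call_with_backoff(api.get_customer, i) for i in range(5)]
print(len(results))  # 5 — every request eventually succeeds despite the limit
```

A real integration would also monitor consumption through an API management platform and shift bulk traffic to asynchronous event consumption, as described in the bullet above.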
If you are looking to eliminate silos, you cannot start synchronizing data across multiple applications without knowing which data points should be synchronized, in which format, at what time, and for which purpose, even when it is communicated as part of a business requirement. Blindly implementing business requirements from specific departments will put the consistent standard across enterprise applications at risk. Once the customer data initiative is defined, you need to establish technical design principles, focusing on the areas described above: high availability, consistency, relevance and compliance, and security.
These foundational principles apply to any customer data initiative. Use these principles when you synchronize data across multiple applications. A typical use case for data synchronization is a customer registration for a self-service portal that facilitates customer support, interaction with other customers, knowledge articles, billing, product catalog, marketing and other functions. The figure below provides an example of a customer registration scenario without customer data mastering. It shows the flow of sequences between enterprise applications for the customer creation process.
The figure below leverages all customer data mastering services. The CRM shape in the figure can represent a group of multiple customer-facing applications (CRM, e-commerce, loyalty platform, etc.). The customer identities for multiple applications are managed in a standardized identity directory of the CIAM application, so that all applications can be accessed via single sign-on (SSO) with the same credentials. When you synchronize customer profile data with the MDM, each application holds the global ID of the MDM. You can trace data lineage, govern customer data and reconcile data based on the most trusted source applications. Finally, you enable and maintain a very granular consent flow. Every customer-facing application collects data for different purposes, and customers can opt in or out of every purpose individually. It is safe to collect behavioral customer data via the web and mobile software development kits (SDKs) of the CDP once the CDP receives the consent from the CPM system.
Customer data mastering services rely on successful integration implementation and maintenance. You need fast integration processes to achieve market-leading customer engagement. Speed should be a priority for your integration design. Use asynchronous patterns where possible so that critical customer-facing processes, like account activation, can remain synchronous and fast. For example, the creation and distribution of the global ID can be an asynchronous process; consider using Kafka as a streaming platform for it.
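To illustrate this split between the synchronous, customer-facing path and the asynchronous global ID distribution, here is a hedged sketch using Python's standard `queue` module as a stand-in for a streaming platform such as Kafka. The topic name, message shape and ID scheme are all invented for the example.

```python
import queue

# Stand-in for a Kafka topic: account activation stays synchronous,
# while global ID propagation is handled later by consumers.
global_id_topic = queue.Queue()

def activate_account(email):
    """Synchronous, customer-facing path: must return fast."""
    global_id = f"GID-{hash(email) % 10000:04d}"  # illustrative ID scheme
    global_id_topic.put({"email": email, "global_id": global_id})
    return {"email": email, "status": "active"}

def sync_consumers(applications):
    """Asynchronous path: each downstream app stores the MDM global ID."""
    while not global_id_topic.empty():
        event = global_id_topic.get()
        for app in applications.values():
            app[event["email"]] = event["global_id"]

crm, commerce = {}, {}
result = activate_account("jane@example.com")       # fast, synchronous
sync_consumers({"crm": crm, "commerce": commerce})  # later, asynchronous
print(crm["jane@example.com"] == commerce["jane@example.com"])  # True
```

With a real broker, each application would run its own consumer group, so a slow downstream system never delays the customer-facing activation.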
A customer profile is defined by a collection of attributes. The underlying structure is a configured customer entity model which supports different relationship types (one to one, one to many, and many to many). A common customer profile contains master data such as first name, last name, address, email and phone number at its core. You can extend the customer profile with transactional data such as customer lifetime value, contact preferences, housing demography, behavioral data, recent purchases, communication history, and more. Anything related to a customer can be part of the profile, but it should contain master data at its core. Quality trumps quantity: a profile with many data points but invalid core attributes (such as email or phone number) is not valuable.
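The structure described above can be sketched as a small data model: master data at the core, optional extensions around it, and a validity check that looks at the core first. Field names and the validation rule are illustrative, not a standard schema.

```python
from dataclasses import dataclass, field

@dataclass
class CustomerProfile:
    # Master data core: the profile is only valuable if these are valid.
    first_name: str
    last_name: str
    email: str
    phone: str
    # Extended attributes: transactional and behavioral enrichment.
    extensions: dict = field(default_factory=dict)

    def core_is_valid(self):
        """Quality over quantity: check the core attributes first."""
        return "@" in self.email and self.phone.strip() != ""

p = CustomerProfile("Jane", "Doe", "jane@example.com", "+1-555-0100",
                    extensions={"lifetime_value": 1250.0,
                                "recent_purchases": 3})
print(p.core_is_valid())  # True
```

A profile with a rich `extensions` dict but a malformed email would fail `core_is_valid()`, which is exactly the "quality over quantity" point.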
Organizations are struggling under the weight of “centralize all or nothing.” Centralizing everything can become uneconomical as not all data is equal and not all data should be governed the same way. For example, accounts and contacts should be in the MDM system and CDP, while orders can stay in the CDP. You should store information about the order, such as inventory and invoice, in an ERP system or data warehouse. Successful implementation of customer data shared services requires mediation and data sharing.
Customer Profile Mastering With a CDP Reference Architecture
While there are shifts in vendor capabilities, CDPs primarily focus on marketing data use cases, taking a consolidation approach. CDPs ingest data from multiple enterprise applications and data sources, then use this data to drive personalization and customer experience optimization.
The reference architecture below shows how a CDP can consolidate customer data from a digital experience platform (DXP), CRM, commerce and marketing platform into a common schema with global customer profiles.
The activity of customers on the website (first-party cookies) and mobile app is also captured into the schema depending on the customer’s consent. The capture of the customer consent is facilitated through the web and mobile SDK of the CDP, or by integration with a CPM.
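As a hedged sketch of that consolidation approach, the following merges records from several source systems into unified profiles keyed on a matching attribute (email here; real CDPs use far richer identity resolution rules) and drops behavioral data when the customer has not consented. All record shapes are invented for the example.

```python
# Illustrative consolidation: unify DXP, CRM and commerce records by email.
dxp = [{"email": "jane@example.com", "page_views": 42, "consent_web": True}]
crm = [{"email": "jane@example.com", "name": "Jane Doe"}]
commerce = [{"email": "jane@example.com", "orders": 3}]

def consolidate(*sources):
    profiles = {}
    for source in sources:
        for record in source:
            profile = profiles.setdefault(record["email"], {})
            # Only keep behavioral data if the customer consented.
            if "page_views" in record and not record.get("consent_web"):
                record = {k: v for k, v in record.items()
                          if k != "page_views"}
            profile.update(record)
    return profiles

unified = consolidate(dxp, crm, commerce)
print(unified["jane@example.com"]["orders"])  # 3
```

Production identity resolution would match on multiple attributes (phone, device IDs, hashed identifiers) with fuzzy rules, but the merge-into-a-global-profile shape is the same.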
Customer Profile Mastering With an MDM Reference Architecture
An MDM solution focuses on operational use cases. Compared with a CDP, an MDM solution doesn't offer as many connectors to SaaS applications like e-commerce, marketing or sales. Instead, it offers connectors to major database providers, APIs and an integration framework with real-time, near-real-time and batch processing to facilitate synchronization of data among multiple applications. API integration is the best practice because you can build an event-based integration where the MDM and other applications subscribe to certain events to take appropriate actions. Consider an event broker, such as Kafka, as middleware. For example, you can subscribe to the "delete user" event to delete the user in every application in the correct order. However, if you were to use a database connector to delete the user from the database before deleting them from customer-facing applications, you'd risk breaking referential integrity.
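The "delete user" example can be sketched as a tiny event-based integration in which subscribers run in a defined order, so customer-facing applications are cleaned up before the database and referential integrity is preserved. The broker, ordering mechanism and handler names are all invented for illustration.

```python
# Minimal pub-sub sketch: ordered subscribers per event type.
subscribers = {}  # event name -> list of (order, handler)
deleted = []

def subscribe(event, order, handler):
    subscribers.setdefault(event, []).append((order, handler))

def publish(event, payload):
    # Lower order runs first: customer-facing apps before the database.
    for _, handler in sorted(subscribers.get(event, []), key=lambda s: s[0]):
        handler(payload)

subscribe("delete_user", 1, lambda uid: deleted.append(("crm", uid)))
subscribe("delete_user", 2, lambda uid: deleted.append(("ecommerce", uid)))
subscribe("delete_user", 3, lambda uid: deleted.append(("database", uid)))

publish("delete_user", "user-42")
print([app for app, _ in deleted])  # ['crm', 'ecommerce', 'database']
```

With a real broker such as Kafka, the ordering would typically come from consumers acknowledging their step before the next topic is produced, rather than from a sort key, but the principle is the same.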
The following figure shows the role of an MDM reference architecture among SaaS applications such as DXP, CRM, e-commerce and marketing platforms.
Customer Consent Management
Broadly speaking, data privacy is the right of individuals to control how their personal data is disclosed and to whom. As the data privacy landscape has become increasingly complex and consequential, countries have reacted by enacting new legislation. Most notable are the EU General Data Protection Regulation (GDPR) and California Privacy Rights Act (CPRA). Consent management is the leading principle of GDPR and CPRA. It should be the starting point for organizations that are implementing or improving data privacy.
Since organizations must obtain consent to use personal information in many instances, consent collection is part of various technologies like CDPs, CIAM and CRM systems. Decentralized consent collection results in multiple siloed consent repositories. Consent and preference management (CPM) vendors specialize in collecting, tracking and synchronizing consent. The figure below shows how to integrate consent and PII when using multiple customer data mastering services.
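A centralized consent repository of the kind a CPM maintains can be sketched as follows: one opt-in/out decision per customer and purpose, timestamped for the audit trail. The class, purpose names and storage shape are illustrative assumptions, not a vendor API.

```python
from datetime import datetime, timezone

class ConsentStore:
    """Centralized consent records: one decision per (customer, purpose)."""
    def __init__(self):
        self._records = {}

    def record(self, customer_id, purpose, granted):
        self._records[(customer_id, purpose)] = {
            "granted": granted,
            # Timestamp supports the audit trail regulators expect.
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }

    def is_granted(self, customer_id, purpose):
        rec = self._records.get((customer_id, purpose))
        return bool(rec and rec["granted"])

cpm = ConsentStore()
cpm.record("cust-1", "marketing_email", True)
cpm.record("cust-1", "behavioral_tracking", False)

# The CDP checks consent before collecting behavioral data via its SDKs.
print(cpm.is_granted("cust-1", "behavioral_tracking"))  # False
```

Note the default: an unknown (customer, purpose) pair returns `False`, matching the opt-in stance that GDPR-style regulation requires.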
Practically speaking, you will have to support multiple integration types to enable the full set of customer data capabilities. The following figure shows the range of integration services required to support each component of your customer data architecture, as well as the ideal integration approach for each. Event-driven integrations require event broker middleware to enable their pub-sub approach to data flows.
The figure below shows an example event-driven flow for creating integrations between your customer applications and a customer MDM service that manages the customer master and global ID.
Components of the Economic Framework
An economic framework is a model for understanding the impact of architectural decisions on the economic outcomes of any development effort. Product development consultant and author Donald G. Reinertsen describes five key economic metrics for measuring the performance of a given effort:
- Product Value: The value of the capability to the business and customer.
- Lead Time: How long it takes to design, build and deliver the capability, from initial request to delivery.
- Risk: The chance that the capability will not be delivered, or will fail to meet expectations.
- Operating Cost: How much it costs to deploy and operate the capability.
- Development Cost: The cost in labor and materials to implement the capability.
A change to any one of these metrics affects the others. Consider this trade-off: hiring junior developers reduces hourly rates, lowering development cost per hour of labor, but it increases lead time because the team is less experienced than senior developers.
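That trade-off can be made concrete with a back-of-the-envelope model. All figures below are invented for illustration: cheaper labor lowers development cost, but a longer lead time delays when the product starts earning value, so the net outcome can still be worse.

```python
def net_value(value_per_month, lead_time_months, hourly_rate, hours,
              horizon_months=24):
    """Value earned over a fixed horizon minus development cost."""
    dev_cost = hourly_rate * hours
    earning_months = max(horizon_months - lead_time_months, 0)
    return value_per_month * earning_months - dev_cost

# Hypothetical scenarios: the senior team is pricier but ships sooner.
senior = net_value(50_000, lead_time_months=4, hourly_rate=150, hours=2_000)
junior = net_value(50_000, lead_time_months=7, hourly_rate=80, hours=3_500)
print(senior, junior)  # 700000 570000 — senior wins despite higher dev cost
```

The model ignores risk and operating cost for simplicity; adding Reinertsen's other two metrics would only widen or narrow the gap, not change the method.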