Master Data Management (MDM)

Definition

The processes, governance, policies, and technology used to ensure that an organisation's critical shared data entities — such as customers, products, suppliers, and accounts — are accurate, consistent, and controlled across all systems and business units. MDM creates a single trusted source of master data, reducing duplication, resolving conflicts, and enabling reliable reporting and analytics. Effective MDM is foundational to data-driven decision-making and is a prerequisite for successful ERP integration, M&A data migration, and regulatory compliance.
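
To make the consolidation idea concrete, here is a minimal sketch in Python of building a "golden record" from duplicate customer entries held in two systems. The field names and the survivorship rule (keep the most recently updated non-empty value) are illustrative assumptions, not a prescription of how any particular MDM platform works.

```python
from datetime import date

# Hypothetical duplicate customer records held in two source systems.
records = [
    {"source": "crm", "updated": date(2024, 3, 1),
     "name": "ACME Ltd", "email": "sales@acme.example", "phone": ""},
    {"source": "billing", "updated": date(2024, 6, 15),
     "name": "Acme Limited", "email": "", "phone": "+44 20 7946 0000"},
]

def golden_record(duplicates):
    """Merge duplicates into a single record: for each field, keep the most
    recently updated non-empty value (a simple survivorship rule)."""
    merged = {}
    fields = {k for r in duplicates for k in r if k not in ("source", "updated")}
    for field in sorted(fields):
        candidates = [r for r in duplicates if r.get(field)]
        if candidates:
            newest = max(candidates, key=lambda r: r["updated"])
            merged[field] = newest[field]
    return merged

print(golden_record(records))
# {'email': 'sales@acme.example', 'name': 'Acme Limited', 'phone': '+44 20 7946 0000'}
```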

Complementary Terms

Concepts that frequently appear alongside Master Data Management (MDM) in practice.

Data Pipeline

An automated sequence of data processing steps that extracts, transforms, and loads data from source systems into target systems for analysis, reporting, or machine learning model training. Well-architected data pipelines are critical infrastructure assets that enable data-driven decision-making and AI deployment, and their reliability directly impacts downstream business processes.
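
For orientation, the sketch below shows the extract-transform-load pattern in miniature; the in-memory lists and field names are stand-ins for real source and target systems such as databases or APIs.

```python
# Minimal extract-transform-load sketch; the in-memory lists stand in
# for real source and target systems.

source_rows = [
    {"order_id": "1001", "amount": "19.99", "currency": "gbp"},
    {"order_id": "1002", "amount": "5.00", "currency": "GBP"},
]

def extract():
    # In practice this would query a database or call an API.
    return list(source_rows)

def transform(rows):
    # Normalise types and formats so downstream reporting is consistent.
    return [
        {"order_id": int(r["order_id"]),
         "amount": float(r["amount"]),
         "currency": r["currency"].upper()}
        for r in rows
    ]

def load(rows, target):
    # In practice this would write to a warehouse table.
    target.extend(rows)

warehouse = []
load(transform(extract()), warehouse)
print(warehouse)
```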

Data Mesh

A decentralised data architecture paradigm that treats data as a product owned by domain-specific teams rather than centralising all data management in a single platform team. Data mesh is built on four principles: domain ownership, data as a product, self-serve data infrastructure, and federated computational governance.
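
One way to make the "data as a product" principle tangible is to publish each dataset with explicit ownership, schema, and service-level metadata. The dataclass below is an illustrative sketch; the field names are assumptions rather than any standard data mesh contract.

```python
from dataclasses import dataclass, field

@dataclass
class DataProduct:
    """Illustrative descriptor for a domain-owned data product: the owning
    team publishes the dataset together with its contract and SLA."""
    name: str
    domain: str               # owning domain team
    owner: str                # accountable contact
    schema: dict              # column name -> type
    freshness_sla_hours: int  # how stale the data may be
    tags: list = field(default_factory=list)

orders = DataProduct(
    name="orders_daily",
    domain="sales",
    owner="sales-data-team@example.com",
    schema={"order_id": "int", "amount": "decimal", "order_date": "date"},
    freshness_sla_hours=24,
    tags=["gold", "pii-free"],
)
print(orders)
```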

Data Lineage

The documented lifecycle of data as it moves through an organisation's systems, showing its origin, transformations, dependencies, and destinations. Data lineage provides visibility into how data is created, processed, and consumed, enabling organisations to ensure data quality, comply with regulatory requirements (particularly GDPR's right to explanation), debug data pipeline issues, and assess the impact of system changes.
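
Lineage is naturally modelled as a directed graph from sources to consumers. The sketch below (with invented dataset names) records each dataset's direct upstreams and walks the graph to find every origin feeding a given report.

```python
# Each dataset maps to the datasets it is derived from (its direct upstreams).
lineage = {
    "revenue_dashboard": ["orders_clean"],
    "orders_clean": ["orders_raw", "fx_rates"],
    "orders_raw": [],
    "fx_rates": [],
}

def upstream(dataset, graph):
    """Return every dataset that feeds `dataset`, directly or indirectly."""
    found = set()
    stack = list(graph.get(dataset, []))
    while stack:
        current = stack.pop()
        if current not in found:
            found.add(current)
            stack.extend(graph.get(current, []))
    return found

print(upstream("revenue_dashboard", lineage))
# {'orders_clean', 'orders_raw', 'fx_rates'} (set order may vary)
```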

Data Governance

The framework of policies, standards, and processes that ensures data assets are managed consistently, securely, and in compliance with regulations throughout their lifecycle. Strong data governance increases the reliability and value of data as an intangible asset, directly supporting analytics, AI applications, and data monetisation strategies.

Data Quality Score

A quantitative measure of data fitness for its intended use, typically assessed across dimensions including accuracy, completeness, consistency, timeliness, uniqueness, and validity. Data quality scores enable organisations to monitor and improve the reliability of their data assets, prioritise remediation efforts, and establish trust in analytical outputs.
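
As a rough illustration of how such a score might be computed, the sketch below combines per-dimension scores into a single weighted figure; the dimensions, weights, and values are all invented for the example.

```python
# Hypothetical per-dimension scores (0.0 to 1.0) for one dataset.
dimension_scores = {
    "accuracy": 0.96,
    "completeness": 0.88,
    "consistency": 0.92,
    "timeliness": 0.75,
    "uniqueness": 0.99,
    "validity": 0.94,
}

# Illustrative weights reflecting how much each dimension matters here.
weights = {
    "accuracy": 0.30,
    "completeness": 0.20,
    "consistency": 0.15,
    "timeliness": 0.15,
    "uniqueness": 0.10,
    "validity": 0.10,
}

def quality_score(scores, weights):
    """Weighted average of the per-dimension scores."""
    total_weight = sum(weights.values())
    return sum(scores[d] * w for d, w in weights.items()) / total_weight

print(f"Data quality score: {quality_score(dimension_scores, weights):.2%}")
# Data quality score: 90.75%
```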

Customer Data Platform (CDP)

A software system that creates a unified, persistent customer database accessible to other systems by collecting and integrating customer data from multiple sources — including CRM, website analytics, email, social media, transactions, and customer service interactions. CDPs resolve customer identities across channels and devices to build comprehensive individual profiles, enabling personalised marketing, customer journey orchestration, and advanced segmentation.
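
The identity-resolution step can be pictured in a very simplified form: events from different channels are grouped into one profile whenever they share an identifier. The sketch below matches on email address only, which is an assumption made for illustration; real CDPs use much richer deterministic and probabilistic matching.

```python
from collections import defaultdict

# Hypothetical events arriving from different channels.
events = [
    {"channel": "web", "email": "jo@example.com", "page": "/pricing"},
    {"channel": "email", "email": "jo@example.com", "campaign": "spring"},
    {"channel": "store", "email": "sam@example.com", "amount": 42.50},
]

def unify_profiles(events):
    """Group events into per-customer profiles by matching on email address."""
    profiles = defaultdict(lambda: {"channels": set(), "events": []})
    for e in events:
        profile = profiles[e["email"]]
        profile["channels"].add(e["channel"])
        profile["events"].append(e)
    return dict(profiles)

for email, profile in unify_profiles(events).items():
    print(email, sorted(profile["channels"]), len(profile["events"]), "events")
```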

First-Party Data

Data collected directly by an organisation from its own customers, users, or audience through owned channels such as websites, apps, CRM systems, transactions, and surveys. First-party data is considered the most valuable data category because it is collected with consent, is unique to the organisation, and provides direct insight into customer behaviour and preferences.

Data Clean Room

A secure, privacy-preserving technology environment that enables multiple parties to combine and analyse their datasets without any party gaining access to the others' raw data. Data clean rooms use cryptographic techniques, aggregation rules, and access controls to enable collaborative analytics while maintaining data privacy compliance.
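
A minimal sketch of two clean-room controls follows: identifiers are pseudonymised by hashing so neither party handles the other's raw identifiers, and results are released only for segments above a minimum size. The threshold, data, and matching rule are illustrative assumptions.

```python
import hashlib

MIN_GROUP_SIZE = 3  # illustrative aggregation threshold

def pseudonymise(identifier):
    """Hash identifiers so neither party handles the other's raw values."""
    return hashlib.sha256(identifier.lower().encode()).hexdigest()

# Party A (advertiser) contributes hashed identifiers with its own segments.
advertiser = {pseudonymise(e): seg for e, seg in [
    ("a@example.com", "loyal"), ("b@example.com", "loyal"),
    ("c@example.com", "loyal"), ("d@example.com", "new"),
]}

# Party B (publisher) contributes only the hashed identifiers it showed ads to.
publisher_exposed = {pseudonymise(e) for e in [
    "a@example.com", "b@example.com", "c@example.com", "d@example.com",
]}

# Aggregate the overlap by segment, releasing only groups above the threshold.
counts = {}
for pid, segment in advertiser.items():
    if pid in publisher_exposed:
        counts[segment] = counts.get(segment, 0) + 1

released = {seg: n for seg, n in counts.items() if n >= MIN_GROUP_SIZE}
print(released)  # {'loyal': 3} -- the 'new' segment (1 person) is suppressed
```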
