Consistency, timeliness and quality
All elements of a solid Master Data Management (MDM) strategy. In this day and age, if you just let data find its own way in your organisation, you are asking for trouble. As per Moore's law, computing power grows exponentially, and internet traffic is said to double each year. Whether you want it or not, data on your organisation's servers will grow - and fester, if you let it. The result: fragmented, duplicated and out-of-date data.
We don't need to explain that this is problematic, both from an obvious hygiene perspective and from a regulatory perspective, especially considering data privacy and cyber security. The exponential growth of data in general means that MDM is no longer solely something to be worried about at the Enterprise level. All organisations, small and medium included, should have some idea of how they manage their data, which data is most important to them (and conversely most valuable to malicious actors), and which data is regulated and needs extra care.
Horses for courses
Luckily, there are multiple flavours where MDM is concerned. And it makes a lot of sense to opt for an approach to MDM that fits your organisation's footprint, in terms of:
- industry, which will in part determine what type of data you are dealing with and associated responsibilities
- size / complexity of your organisation, which is one of the important drivers of what kind of solution you need to take care of your data
- IT landscape, as some organisations have a very deliberate / limited compilation of applications, perhaps mostly from one vendor, and others have grown quite organically over the years and their systems landscape might include legacy or homegrown applications. This has a massive impact in terms of the extra steps that need to be taken in looking after your data.
Why are we talking about this?
You might wonder why we make a point of MDM. Sure, we are a data company, but we don't specialise in MDM solutions. Although this is correct, we do believe - as we have alluded to on other topics in the past - that quality data integrations can solve multiple problems in one go.
If the aim of MDM is to have consistency across data in multiple applications, ensure robust data quality and have data in destination applications on time, these are certainly requirements that integration services can fulfil.
The type of integration you need is determined by your organisation's size and complexity, and some best practices for how to set up integrations can be found here.
Now, time for more detail on MDM and integrations
As indicated above, integrations can assist with MDM objectives. But of course there is some finer detail to consider.
First of all, there are various implementation models of MDM*:
- Source-of-record, where a single application, database or other source is viewed as the source of truth
- Registry, where cleaning and matching algorithms spot duplicates, prompting the organisation to update various source systems
- Consolidation, where the source of truth is a separate, centralised data repository
- Coexistence, where the hub as mentioned in consolidation is synced back and forth with the various connected applications
- Transaction/centralised, where the central hub is used for all activities around the data and periodically pushes out data to all other applications that need it.
Quality application integrations, in combination with well-defined business processes for data input, can function as a source-of-record MDM approach with the benefits of the coexistence model. After all, once entered, your data can be carried over to any other application in your IT environment that needs the same data.
If the data integrity and performance of the integrations are monitored adequately, there are many scenarios where creating a separate data repository for MDM purposes becomes redundant. And if the security of the integration is monitored as well, why enlarge your attack surface by creating more copies of data than necessary?
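To make the idea concrete, here is a minimal sketch of the source-of-record pattern described above: one master store fans every update out to its connected applications, and a simple integrity check confirms they stay in sync. All names and structures here are illustrative, not a real MDM product or integration API.

```python
from dataclasses import dataclass, field

@dataclass
class SourceOfRecord:
    """Hypothetical master store that propagates updates to destinations."""
    records: dict = field(default_factory=dict)       # the single source of truth
    destinations: list = field(default_factory=list)  # connected application stores

    def connect(self, app_store: dict) -> None:
        """Register a destination application's local data store."""
        self.destinations.append(app_store)

    def upsert(self, key: str, value: dict) -> None:
        """Write to the master, then fan the change out immediately."""
        self.records[key] = value
        for app_store in self.destinations:
            app_store[key] = dict(value)  # copy, so apps cannot mutate the master

    def consistent(self) -> bool:
        """Integrity check: every destination matches the master exactly."""
        return all(store == self.records for store in self.destinations)

# Two destination applications, represented here as plain dicts
crm, billing = {}, {}
master = SourceOfRecord()
master.connect(crm)
master.connect(billing)
master.upsert("cust-001", {"name": "Acme Ltd", "country": "NL"})
```

In a real landscape the "fan-out" step would be an integration calling each application's API, and the consistency check would be part of integration monitoring - but the shape of the argument is the same: one place to write, many places kept up to date.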
In a similar vein, master data transmission in MDM can be approached in various ways:
- Data consolidation, where data gets captured in a separate hub, for replication in destination applications
- Data federation, where a single virtual version of the truth is created from many different applications
- Data propagation, where 'point-to-point' interfaces copy data from one application to the next
The latter is sometimes seen as an approach for legacy systems, but this is where modern cloud technology helps. Cloud computing, in combination with standardised methods for exchanging data between applications, enables 'point-to-point' integrations to become much simpler than they once were. This is where a managed integration service can significantly unburden IT teams as well as others, and remove the need for separate MDM solutions (and all the time and money they need to run properly) completely.
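The standardised exchange that makes point-to-point integration simple can be as plain as an agreed JSON shape: the source serialises a record, the destination maps the fields it needs. The sketch below illustrates this with hypothetical field names; in practice the payload would travel over an API call rather than a function call.

```python
import json

def export_customer(record: dict) -> str:
    """Source side: emit the record in an agreed JSON shape (hypothetical fields)."""
    return json.dumps({
        "id": record["id"],
        "name": record["name"],
        "updated_at": record["updated_at"],
    })

def import_customer(payload: str, destination: dict) -> None:
    """Destination side: parse the agreed shape and upsert into the local store."""
    data = json.loads(payload)
    destination[data["id"]] = {
        "name": data["name"],
        "updated_at": data["updated_at"],
    }

# One 'point-to-point' hop: source application -> destination application
erp_store: dict = {}
payload = export_customer({"id": "42", "name": "Acme Ltd",
                           "updated_at": "2024-01-01T00:00:00Z"})
import_customer(payload, erp_store)
```

Because both sides agree on the format rather than on each other's internals, adding another destination is another small mapping - not another bespoke interface.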
Why create a separate hub or even a virtual view for master data transmission, which needs to be monitored and managed, if you can be sure the right data is in the right place at the right time in a fully automated fashion?
We would love to be challenged on our thinking, and hereby call on all MDM experts to correct us if we are wrong. We have certainly seen integrations fulfil the roles of keepers of data quality, timeliness and consistency very well.
And should you want to challenge our service offering in a different way - by requesting a demo to see what we are all about :) - we would love that too!
Or just reach out to have coffee with us. Coffee is always on us, don't be shy.
Image by Yan Wong from Pixabay