<img height="1" width="1" style="display:none;" alt="" src="https://px.ads.linkedin.com/collect/?pid=4011258&amp;fmt=gif">
ARTICLES

January 4th, 2023  |  9 min read

How a solid data foundation helps you do more with less 

Knowledge is power, and having near real-time intelligence about your customers, products, locations and assets is a given for any organisation that wants to remain competitive and grow. This is particularly true during times of uncertainty and economic pressure, such as pandemics and recessions, when businesses seek not only to manage costs but also to pivot and respond to change. Consider the manufacturing company with multiple locations, product lines and suppliers. Insights into the total cost of production by product line combined with customer intelligence on buying preferences by region could lead such a company to divest or cease production of certain products. 

Little wonder, then, that a recent EY article claimed that 93% of organisations plan to increase their data and analytics spending between 2022 and 2027. In fact, despite the current economic slowdown, Gartner expects global IT spending to climb to $4.6 trillion in 2023, a year-over-year increase of 5.1%. According to Gartner analysts, “enterprise IT spending is recession-proof.” This makes sense, as technology in general – and analytics and business intelligence systems in particular – is a vital enabler of automation, improved digital experiences and reduced complexity, all of which are geared towards reducing costs and boosting the bottom line.

Hang on a minute though! Isn’t this exactly what many businesses have been trying to do for years? Digital transformation is nothing new, and neither is the need to collect and analyse swathes of data in order to fuel that transformation. Organisations have already made significant investments in Data Lakes, Data Warehouses, Business Intelligence tools and Analytics platforms – and in the people who administer them. Yet something still seems to be missing. Technology and business leaders are still searching for the insights they need, and are prepared to pump in more investment to find them.

But is the answer really to spend more money on more tools, and on more data stewards, scientists and engineers to deploy and manage them? Or is there a more fundamental problem that needs addressing? We think there is. We believe that one of the major reasons enterprises are struggling is that they have a fundamental data quality issue: they have no way of managing data quality in a consistent, automated fashion. They have the data, and they have the tools to analyse it, but the data itself is siloed, inconsistent, stale, inaccurate… or all of the above. The output of analytics and business intelligence tools can only ever be as strong as the input – and if the data is poor, the results will be too.
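
To make that concrete, here’s a tiny, entirely hypothetical Python sketch of the problem: the same customer recorded slightly differently in two silos is enough to skew even the simplest metric.

```python
# Hypothetical example: the same customer held in two silos with
# inconsistent formatting, so a naive cross-system metric over-counts.
crm_records = [{"name": "ACME Ltd.", "country": "UK"}]
billing_records = [{"name": "Acme Limited", "country": "United Kingdom"}]

# A BI tool that simply unions the silos sees two customers, not one.
naive_customer_count = len({r["name"] for r in crm_records + billing_records})
print(naive_customer_count)  # 2 -- but the real answer is 1
```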

Surely there’s a way of taking poor-quality data from different sources and bringing it together in a way that is consistent, accurate and scalable? This has been the promise of Master Data Management (MDM) systems for the past 30 years, but in reality the process still demands a great deal of time and effort from Data Engineers whose time would be better spent elsewhere. It is one of the reasons why 75% of MDM projects still fail: instead of automatically fixing the majority of data quality issues and empowering business users to address the exceptions, human involvement from data and IT specialists is still required most of the time.
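
For illustration, here’s a minimal Python sketch of how a traditional matching-and-survivorship pipeline tends to work (the threshold, field names and rules are all hypothetical): high-confidence matches are merged automatically, and everything else lands in a queue for a human steward – which, in practice, is where much of the time goes.

```python
# A minimal, hypothetical sketch of the classic "Golden Record" flow:
# merge confident matches with a survivorship rule, and route
# low-confidence matches to a human steward.
from difflib import SequenceMatcher

MATCH_THRESHOLD = 0.85  # hypothetical confidence cut-off

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def merge(records: list[dict]) -> dict:
    # Survivorship rule: prefer the most recently updated non-empty value.
    golden = {}
    for record in sorted(records, key=lambda r: r["updated"]):
        golden.update({k: v for k, v in record.items() if v})
    return golden

def master(a: dict, b: dict):
    score = similarity(a["name"], b["name"])
    if score >= MATCH_THRESHOLD:
        return merge([a, b])        # automated: build the golden record
    return ("steward_queue", a, b)  # exception: a human must decide

crm = {"name": "Acme Limited", "phone": "", "updated": "2022-11-01"}
erp = {"name": "ACME Ltd", "phone": "+44 20 7946 0000", "updated": "2023-01-02"}
print(master(crm, erp))  # this pair scores below the threshold -> steward
```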

The problem is not that organisations have not invested enough in managing and analysing their data estate. The issue, until now, has been that the tools available to master that data have not been fit for purpose. Traditional MDM systems can only deliver a single view or Golden Record AFTER the modelling work has been done upfront – by you. That work typically takes months, requires heavy involvement from IT staff and is out of date by the time it’s complete, fuelling the disconnect between MDM and business value.

We get it! In order for you to have control over your data, it’s always easier to build the end state and work back from there, right? Yes, in some ways it is, but when you put this into practice, it never works. Or maybe it worked for the first few systems, and then became far too hard as more data came into your systems or you were required to make model changes. This is not unique to MDM; in fact, it is exactly why there has been a new influx of "modern Data Warehouse" vendors. They follow the same principle of not forcing everything to conform to the same model on entry, but rather keeping the data in an unmodelled state and building different models on the fly. This was ONLY made possible by the economic model of the cloud, whereby you can suddenly access hundreds of machines and pay per second.
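
As a rough Python sketch of that schema-on-read principle (with made-up sources and field names): the raw records are stored exactly as they arrive, and a model is simply a projection computed at query time.

```python
# Hypothetical schema-on-read sketch: raw records are kept as-is;
# nothing is forced to conform to a single model on entry.
raw_events = [
    {"source": "webshop", "sku": "A-100", "qty": 2, "region": "EMEA"},
    {"source": "pos", "item": "A-100", "units": 1, "store": "London"},
]

def sales_by_product(events):
    # One possible model, built on the fly at query time.
    totals = {}
    for e in events:
        sku = e.get("sku") or e.get("item")
        qty = e.get("qty") or e.get("units") or 0
        totals[sku] = totals.get(sku, 0) + qty
    return totals

print(sales_by_product(raw_events))  # {'A-100': 3}
```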

Modern MDM platforms remove many of the barriers to creating value from data by automating or eliminating manual tasks, and by using advanced technologies like Graph, AI and ML to ingest data from thousands of sources and allow the natural data model to emerge. This means leaving data in its natural state and projecting it into your Information Models on demand. It also opens the door to an MDM system that can host your data model in 15, 20 or 25 different shapes. These platforms are also built for the Cloud, which means they are easy to install, simple to set up and able to scale up or down as you require. Perhaps most importantly, they break the traditional model of technology teams assuming sole responsibility for mastering data, by giving business users access to the data and the power to run their own queries, build their own information models and interrogate data in their own ways.
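
To illustrate the idea of projecting natural-state data on demand – again with entirely hypothetical record and shape names – the same record can be served in whichever Information Model the consumer requests:

```python
# Hypothetical sketch: the natural record stays untouched, and each
# "shape" is a projection built at query time for a particular consumer.
natural = {"name": "Acme Limited", "country": "UK", "revenue_gbp": 1_200_000}

projections = {
    "crm_shape": lambda r: {"accountName": r["name"], "territory": r["country"]},
    "finance_shape": lambda r: {"legal_entity": r["name"], "revenue": r["revenue_gbp"]},
}

def project(record, shape):
    # Build the requested shape on demand; add more shapes as needed.
    return projections[shape](record)

print(project(natural, "crm_shape"))
print(project(natural, "finance_shape"))
```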

In many ways, the new breed of MDM platform doesn’t look much like MDM at all. We believe that MDM as we know it will cease to exist, and that we are entering a new category – that of Augmented Data Management – which seeks to simplify and speed up the process of preparing data for insights. It will still have many of the inner workings and expectations of MDM, but will capitalise on the influx of modern technology. With so much riding on the output of analytics and intelligence initiatives, it is crucial that a fully governed, reliable, compliant and accurate supply of data is made available to the whole organisation.

So before building the case for increasing your investment in data and analytics, it is worth considering that it might not be more investment that is needed, but a change in focus. A solid data foundation – one that doesn’t discriminate between different types and sources of data – should always be the first priority when it comes to providing actionable intelligence and commercial insights to the business. Without it, analytics and BI tools will never deliver the outcomes demanded by organisations today.
By Tim Ward
Chief Executive Officer at CluedIn

Want to learn more about how to optimise data management to deliver better outcomes?