The Financial Value of Data Quality

We hear constantly that data quality is moving rapidly up the corporate agenda to become a board-level discussion, ushering in the age of the Chief Data Officer.

One of the key drivers for this is the realisation that data has a financial value, either in its own right or through the impact it has on the business processes and outcomes that drive the profitability of the organisation at large.

Despite this, there is still all too often a sense of apathy towards tackling the data quality challenge. As a result, many organisations are still struggling to make the case for larger, corporate-wide data improvement initiatives. This is largely because the ‘data quality champions’ within the organisation are poorly equipped to make the necessary connections between data inaccuracy and overall business performance.

When it comes to data quality, it’s essential to start thinking about the long game, particularly as it pertains to customer or party data. Customer data is not only the lifeblood of an organisation’s effective operation; it also has commercial value, and this will become increasingly apparent as business models around data evolve. Gartner state that by 2016, 30% of businesses will have begun directly or indirectly monetising their information assets by bartering them or selling them outright.

So why do organisations struggle to put a value on their customer data assets?

One of the key factors here is the lack of visibility and ownership at a corporate-wide level. Many organisations today hold data within a multitude of silos, each perceived to be owned by different individuals within lines of business or within the IT department itself. This is best exemplified by statistics on the roll-out of data quality technology: most deployments pertain to a single project or department, and very few span more than three.

Another key challenge is that any technology investment brings both ‘hard’ and ‘soft’ benefits, and many of the benefits of investing in data quality are perceived to sit in the ‘soft’ (difficult to prove) bucket, because much of the upside lies in improved operational efficiency. Take labour productivity as an example: Gartner state that data quality affects overall labour productivity by as much as 20%, which underlines its role as a critical enabler of process quality. Data champions within organisations need to start mapping the impact of data quality back to real-life, and ideally measurable, business processes such as customer care performance or on-time delivery.

One size fits all never fits anyone particularly well…

To convince any board to move forward with an investment in a data quality initiative, it’s therefore essential that they see for themselves the cost of data inaccuracy as it pertains to their own organisation. The good news is that technology can make this possible, and the market at large is starting to wake up to that fact, as evidenced by the roughly 10% increase in the adoption of data profiling and discovery tools between 2012 and 2013.

The key is to select technology that can tell you not just the percentage of inaccuracy in your organisation’s customer data today, but can also connect the dots between inaccurate customer records and measures such as customer value, helping to put a financial figure on your data quality problem. Taking this approach in the early stages of scoping a data quality initiative will give you the ammunition you need at board level, whilst also identifying the low-hanging fruit for data improvement.
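To make this tangible, here is a minimal, back-of-the-envelope sketch in Python of how an inaccuracy rate uncovered by profiling might be translated into a monetary figure. Every number in it, from the record count to the value-at-risk factor, is a hypothetical placeholder rather than a benchmark; substitute the figures your own profiling results and finance team agree on.

```python
# Illustrative sketch only: a simplified way to put a rough financial figure
# on customer data inaccuracy. All figures below are hypothetical placeholders.

total_customer_records = 500_000       # customer records held across silos (assumed)
inaccuracy_rate = 0.12                 # share of records with a material defect, from profiling (assumed)
avg_annual_value_per_customer = 250.0  # average annual revenue per customer, in GBP (assumed)
value_at_risk_factor = 0.05            # share of that value lost per defective record to failed
                                       # contact, mis-delivery, churn and rework (assumed)

defective_records = total_customer_records * inaccuracy_rate
estimated_annual_cost = defective_records * avg_annual_value_per_customer * value_at_risk_factor

print(f"Defective customer records: {defective_records:,.0f}")
print(f"Estimated annual cost of inaccuracy: £{estimated_annual_cost:,.0f}")
```

Even a rough figure like this, grounded in your own profiling results rather than a generic industry statistic, tends to land far better in a boardroom conversation.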
