Not perfect, but close: why a ‘good enough’ mindset for data might be holding your business back
Jamie Beaumont
Data and AI product manager
It’s long been said that data is the lifeblood of AI. However, data is also difficult to get right.
While we know it’s inherently valuable, few people understand how to distil its insights and act on that intelligence. Take the existing workforce: a fierce war for talent makes it challenging for organisations to acquire data skills. According to the UK Government, there is a huge disparity between the 234,000 data-related vacancies and the 10,000 data professionals entering the market each year.
It’s also often overlooked how hard it is to gain a single, complete view of data when it sits in disparate repositories across a business. Finding and consolidating data is a burden. Meanwhile, ‘cleaning’ data to address quality issues – missing, inconsistent and duplicate information, as well as logical conflicts and dependencies – incurs an additional cost. And security and governance are too often viewed as stifling innovation, rather than as the disciplines that produce reliable, quality data and foster a culture of insight-driven activity across the business.
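To illustrate what that cleaning step actually involves – a minimal sketch only, with a hypothetical dataset, column names and rules – a first pass in Python with pandas might tackle exactly those three issues:

```python
import pandas as pd

# Hypothetical customer extract; the columns and rules are illustrative only.
df = pd.DataFrame({
    "customer_id": [101, 102, 102, 103],
    "email": ["a@example.com", "B@EXAMPLE.COM", "b@example.com", None],
    "country": ["UK", "United Kingdom", "UK", None],
})

# Inconsistent values: normalise casing and map aliases to one canonical form.
df["email"] = df["email"].str.lower()
df["country"] = df["country"].replace({"United Kingdom": "UK"})

# Duplicate records: keep the first occurrence per business key.
df = df.drop_duplicates(subset=["customer_id", "email"])

# Missing values: flag them for review rather than silently filling them in.
incomplete = df[df[["email", "country"]].isna().any(axis=1)]
print(f"{len(incomplete)} record(s) need manual review")
```

Even a pass this simple surfaces the records that need human judgement, instead of guessing at them downstream.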
Whether it’s used to power AI or to run cloud applications, every business needs data to move forward.
Today, 90% of organisations struggle with reduced performance as a result of low-quality data. Becoming one of the 10% who don’t could provide your business with a significant competitive advantage.
Many problems only become apparent once you deploy a new data product. This could be due to multiple change management issues, such as a lack of resources, stakeholder buy-in or alignment between teams, or perhaps the project was ill-defined, so the business isn’t clear on the outcome it wants to achieve with its data.
Ultimately, these factors mean your business is unable to scale its operations in line with its needs, leaving you reliant on intuition for decision-making. While that intuition might come from an experienced voice, it’s not scalable and can’t be passed on to everyone.
Does data really need to be perfect?
Strive for perfection and you’ll never achieve anything. That’s the thinking of many organisations that settle for ‘good enough’ data to keep the business moving and avoid costly investment in perfect data. Yet, according to research from Gartner, poor-quality data costs organisations an average of $12.9 million each year.
If perfect is the enemy of good and good is the enemy of great, good enough is the enemy of quality data, right?
Within incident management, there is the 1x10x100 rule. Addressing a data quality issue at the point of entry costs 1x. Fail to address it early and the cost escalates to 10x once it hits your systems. Allow it to reach the stage where the business is making decisions based on that poor-quality data, and the cost hits an incredible 100x due to downtime, remediation efforts, lost opportunities and customer dissatisfaction.
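To make that escalation concrete, here’s a minimal worked example with hypothetical unit costs (real figures will vary by business):

```python
# Hypothetical unit costs illustrating the 1x10x100 rule; real figures vary.
stages = [
    ("at the point of entry", 1.00),      # 1x: caught as the record is created
    ("once in your systems", 10.00),      # 10x: remediated after it has spread
    ("after decisions are made", 100.00), # 100x: the business has already acted on it
]

bad_records = 10_000
for stage, unit_cost in stages:
    print(f"Fixing {bad_records:,} bad records {stage}: £{bad_records * unit_cost:,.0f}")
```

At 10,000 bad records, the difference between catching issues at entry and catching them after decisions have been made is £10,000 versus £1,000,000.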
While your data doesn’t need to be perfect, an incremental improvement – such as moving your data quality from 5/10 to 7/10, for example through some small-scale automation or integration – will help you transition towards a data-driven organisation that isn’t reliant on gut instinct.
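One way to make that 5/10-to-7/10 journey tangible is to score each dataset automatically and track the number over time. The sketch below assumes a simple scoring rule based on completeness and uniqueness – not a standard metric, just an illustration of the idea:

```python
import pandas as pd

def quality_score(df: pd.DataFrame) -> float:
    """Score a dataset out of 10 using two simple checks: completeness
    (share of populated cells) and uniqueness (share of non-duplicate rows).
    A real scorecard would add validity, consistency and timeliness checks."""
    completeness = 1 - df.isna().to_numpy().mean()
    uniqueness = 1 - df.duplicated().mean()
    return round(10 * (completeness + uniqueness) / 2, 1)

df = pd.DataFrame({"id": [1, 2, 2, 4], "value": [10.0, None, None, 7.0]})
print(quality_score(df))  # 7.5 on this toy data; re-run after each small fix
```

A number like this won’t capture everything, but it turns ‘improve our data quality’ from a sentiment into a target you can report on.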
Flip the script to reset C-suite expectations on data
Rather than looking from the top down and asking what data can do for a business, consider what can’t be done because the data isn’t good enough. This could be anything from achieving greater organisational output to avoiding potential risks. Thinking differently about data helps to build business cases for investment in robust data initiatives.
The impact of any data or AI product will live and die by the quality of the data it ingests. As the saying goes, “garbage in, garbage out”. We know 80-90% of data is unstructured, and it is likely to exist in many different formats, be fragmented and reside in functional silos. Simply gaining visibility of your data estate at scale can therefore significantly improve the quality of your data. When you can see what data you have, where it is, who has access to it and the controls that protect it, you can bring it all together more easily.
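At its simplest, that visibility starts with an inventory. As a sketch – the root path and captured fields below are assumptions, and a real data catalogue would also record access controls, lineage and sensitivity – a first pass might just walk the estate and record what’s there:

```python
from pathlib import Path

def inventory(root: str) -> list[dict]:
    """Walk a directory tree and record basic metadata for every file:
    where it lives, what format it is and how big it is."""
    records = []
    for path in Path(root).rglob("*"):
        if path.is_file():
            records.append({
                "location": str(path),
                "format": path.suffix.lstrip(".") or "unknown",
                "size_bytes": path.stat().st_size,
            })
    return records

# "/data" is a placeholder for wherever your estate lives.
for item in inventory("/data"):
    print(item)
```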
Find out more about how Claranet design and deliver a data platform that scales with your business and helps you keep pace with change in our Platform Services guide.
Discover our full Data and AI solutions or speak to an expert.