by Stephanie Collis
Head of Marketing, Zetaris
Many businesses are beginning to realise that there’s a potential trap implied in the term ‘big data’. These organisations have placed a greater focus on quantity than on quality, assuming that to succeed at big data they simply need the biggest data sets.
This has led to a reliance on unfocused, disorganised large data sets, constructed internally or bought from third-party providers, and often integrated with little scrutiny into the decision-making process. While still a valuable resource for many organisations, this practice can lead to poorly informed business decisions, which has some executives concerned: the Forbes Insights and KPMG “2016 Global CEO Outlook” reported that 84 per cent of chief executive officers were concerned about the quality of the data on which they were basing their decisions.
There is a real hunger in the market for certainty, as organisations look to make their data sets more readable and useable. Currently, the majority of data cleansing solutions – as the process of organising, curating and preparing large data sets is known – are delivered by high-earning data scientists through a manual process, but this is tipped to change. According to the International Data Corporation, spending on machine learning systems and tools that provide self-service visual discovery and data preparation will outstrip similar IT-controlled tools by 250 per cent, eventually bringing an end to costly manual data wrangling operations.
A strong foundation for your business
But why is there such a focus on quality? Fundamentally, a company’s data is the foundation of every business decision. The old adage that ‘you are what you eat’ is as true for organisations as it is for human beings; the quality of your business’ data has a significant impact on what it can do and how well it can do it.
As we’ve explored in a previous blog, there is a direct flow from data to information to insight to action. What’s done at each step influences the next and closes off potential avenues, ideally narrowing you down to a series of actionable insights supported by your data. Just as an athlete needs to closely scrutinise what they eat to get the best performance from their body, decision-makers need to closely scrutinise incoming data to get the most from their organisation.
The risks of low-quality data
Low-quality data can be an existential threat to even the largest organisations. Everything from minor public relations embarrassments to lawsuits for regulatory non-compliance has been the direct result of companies becoming laissez-faire about the quality of their data. IBM estimated that bad data cost the US economy alone a staggering US$3.1 trillion (nearly A$4 trillion) in lost revenue and productivity.
One of the most famous commercial flops of all time – New Coke – is partially attributed to bad data. Coca-Cola failed to make the connection between the data they were getting (hundreds of thousands of testers liked New Coke) and the data suggesting that the existing formula was essential to their brand, resulting in a PR disaster so bad it has become the cautionary tale about aggressive rebranding.
On a more everyday level, unclean data can obscure the real value of an asset or of a market. DataScopic described a case in which a block of transactions in the Albuquerque area worth $500,000 was spread across 31 different misspellings of the word ‘Albuquerque’. Anyone looking at the data in that form would get a scattered and untrue picture of the state of the market. If they were making hard decisions about pulling out of certain markets and erroneously thought that their contracts in Albuquerque were worth only $300,000, that might be enough to have them abandon the city.
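To make the idea concrete, here is a minimal sketch of how misspelled city names might be consolidated programmatically using fuzzy string matching. The transaction records and the similarity cutoff below are illustrative inventions, not DataScopic’s actual data, and real cleansing pipelines are considerably more involved:

```python
from difflib import get_close_matches

# Hypothetical transaction records: the city field contains several
# misspellings of 'Albuquerque', so a naive group-by would split the
# market's true value across multiple buckets.
transactions = [
    {"city": "Albuquerque", "amount": 120000},
    {"city": "Albuqerque", "amount": 95000},
    {"city": "Alburquerque", "amount": 150000},
    {"city": "Albequerque", "amount": 135000},
]

# A reference list of known-good spellings (illustrative).
KNOWN_CITIES = ["Albuquerque"]

def clean_city(name: str) -> str:
    """Map a possibly misspelled city name to its closest known spelling."""
    matches = get_close_matches(name, KNOWN_CITIES, n=1, cutoff=0.8)
    return matches[0] if matches else name

# Re-aggregate transaction values under the cleaned city names.
totals = {}
for t in transactions:
    city = clean_city(t["city"])
    totals[city] = totals.get(city, 0) + t["amount"]

print(totals)  # all four rows now roll up under 'Albuquerque'
```

With the misspellings normalised, the four rows aggregate to a single $500,000 figure for the city rather than a scattering of smaller, misleading totals.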
The stakes are even higher in industries subject to tight governmental regulation. Analytics software company SAS sketched out a (fictionalised) account of what happens when an organisation – in this case a US-based healthcare network – has a disconnect between the healthcare team that creates data about patients and the team responsible for data management. The account asks hard questions of anyone involved in managing sensitive data:
“What if a drug recall is issued and a healthcare organization cannot notify the population that is potentially taking the drug? What if a knee implant is recalled but the organization cannot identify the patients who received that particular device?”
“These are serious public health and quality of care issues that structured data quality processes could potentially avoid.”
But low-quality data doesn’t even have to be used or acted upon to be a risk or a liability to your business. Simply by existing on your servers, bad-quality data becomes an obstacle that has to be navigated around, increasing the lag between a piece of data arriving and it being acted upon. This can leave you vulnerable to disruption by more prepared and informed competitors who have a more detailed and mature understanding of the market thanks to their data.
Smarter, faster, better decisions
Having good-quality data you can rely on is a boon for your business at every level. Not only will you be able to make better-informed decisions, you’ll be more certain of them and able to make them in a more timely manner, allowing you to capitalise on opportunities as they arise. It even has benefits lower down in your organisation, increasing the productivity of your boots-on-the-ground staff, who will spend less time correcting errors in the data and more time using it to accomplish their jobs. As mentioned above, it can also make passing compliance checks or audits a breeze, as good data hygiene practices naturally lend themselves to regulatory compliance.
At Zetaris, it’s all about accuracy and precision. We have a proven track record of helping organisations across the public and private sectors do better; recently, Melbourne Business School students doing their Practicum with Zetaris discovered anomalies in some Australian Government datasets. The government was informed and the datasets were fixed, meaning it’s likely you’ve benefitted from Zetaris’ data cleansing practices without even knowing it.
If you’re looking to make smarter decisions, start by building smarter data. Zetaris can provide comprehensive data cleansing services, ensuring that you’re building the future of your business on solid ground.