‘Data is the new oil’ is a phrase coined by British mathematician Clive Humby in 2006. It has become an overused term, largely meaning that if your organization has access to vast amounts of data, you can use it to aid decision making and drive results.
While it’s true that access to data can lead to greater business intelligence insights, what companies actually need is access to ‘good’ data and the insights it provides. However, knowing what makes data valuable is something many still struggle with. With considerations often including quantity, age, source or variety, not truly understanding what type of data is good for business means it’s easy to get lost in data sets that are ultimately poor quality and bad for decision making.
The big cost of the wrong big data
The cost of handling poor quality data is high. On average, the price is $13m per business, or 10-30% of revenue, and for companies of any size that’s a huge burden.
Companies have become used to making decisions based on big data sets. They use spreadsheet software to analyze the data and use that analysis to make decisions. But this approach fuels the need for ever more data in order to spot trends of ‘statistical significance’. The challenge is that it’s difficult to truly scrutinize the source and authenticity of information. Take consumer insights, for example. If a business acquires its data sets from a third party, can it ever be 100% certain that all of that information was provided by authentic respondents, with none of it coming from bots or people who weren’t being entirely truthful?
For case studies on why large data sets don’t always mean more accurate results, we can look to politics. 2024 is due to be a monumental year, with both the US and UK set for elections, and political polling will once again have a role to play in predicting outcomes. However, the polls aren’t always right. In the run-up to both the 2016 and 2020 US elections, they got some very big calls wrong, in the former case even predicting that Hillary Clinton was heading for a decisive win. Reasons for the wildly incorrect predictions include nonresponse bias, where Trump voters were less likely to interact with polls, skewing the results towards Clinton.
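To illustrate how nonresponse bias works, here is a minimal, purely hypothetical sketch (not drawn from the article or any real poll data): it assumes an electorate split 50/50, where one candidate’s supporters are half as likely to answer the pollster. Even with a very large sample, the poll lands nowhere near the true figure.

```python
# Toy simulation of nonresponse bias (illustrative assumptions only).
import random

random.seed(42)

TRUE_SUPPORT_A = 0.50          # actual share of voters backing candidate A
RESPONSE_RATE = {"A": 0.30,    # A supporters answer the poll 30% of the time
                 "B": 0.60}    # B supporters answer the poll 60% of the time
POPULATION = 100_000

responses = []
for _ in range(POPULATION):
    choice = "A" if random.random() < TRUE_SUPPORT_A else "B"
    if random.random() < RESPONSE_RATE[choice]:
        responses.append(choice)

polled_support_a = responses.count("A") / len(responses)
print(f"True support for A:   {TRUE_SUPPORT_A:.0%}")
print(f"Polled support for A: {polled_support_a:.1%}")  # roughly 33%, despite a huge sample
```

The point of the sketch is that collecting more responses does nothing to fix the skew; only understanding who is (and isn’t) answering, and why, does.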
Similarly in the UK, the ‘Shy Tory factor’ has been cited in elections where the Conservative Party performed better than polls predicted. In this case, respondents said they intended to vote one way but ultimately did the opposite.
While a handful of such respondents in a large data set may not have much influence on the final analysis, the aforementioned election polls show what can happen when data isn’t truly reflective of the external world. For businesses that use such analysis to drive decision making, acting on that information can cost them heavily.
Listening vs understanding
Relying on big data sets is also a sign that businesses are often set up to listen to, not understand, their consumers. This means that while they can use big data to see trends, they don’t understand why those trends exist. For instance, if an organization knows that consumers like the color blue but doesn’t seek any further information, it has just listened. In the short term this may prove successful, but if that trend suddenly shifts and consumers start liking green, it will be slow to react.
Now, if a business knows consumers like blue, but then goes a step further and discovers why they do, it will understand what actually influences them. Perhaps the preference for blue is a response to an event or a particular mood, and when an organization has that information, not only can it make decisions that are more empathetic towards consumers, it can also better prepare for any evolution in requirements.
Ushering in a new age of empathy
Empathy is critical at a time when the world is facing significant challenges. With various major geopolitical events unfolding, understanding consumers is one of many things that can help bring about a new age of empathy. Companies also have work to do to keep consumers onside, as there is a growing distrust of brands driven by a number of factors. For instance, consumers are frequently exposed online to unfair practices, including fake reviews and data concerns around targeted advertising.
To break the cycle, businesses need to revisit how they discover insights. Collecting insights has typically involved huge time and cost investments, and the resulting big, cumbersome data sets that reduce respondents to a number are no longer suitable in a world where people’s views are constantly shifting. Not only do they take too long to collect, but the data might be incorrect in the first place.
Organizations need to place more emphasis on understanding consumers. They need to know why consumers think a certain way, not just that they do. AI-driven qualitative insights enable businesses to quickly understand what audiences truly want. The AI can run survey tools with respondents from demographics across the globe before delivering analysis in hours, and with the same quality as traditional methods. By then watching the recordings back, brands not only see what respondents say, but how and why they say it.
Ultimately, bad data costs businesses a lot. Acting on inaccurate information can have significant repercussions, ranging from slightly unhappy consumers to complete failure. Companies have to do away with their old processes and adopt a new approach to insight collection. Bigger data sets don’t mean better insights; a more thoughtful, targeted approach does. And when businesses truly understand consumers, it drives empathetic decision making, brand trust and greater results.
We’ve listed the best customer database software.
This article was produced as part of TechRadarPro’s Expert Insights channel, where we feature the best and brightest minds in the technology industry today. The views expressed here are those of the author and are not necessarily those of TechRadarPro or Future plc. If you are interested in contributing, find out more here: