Poor Data Quality is the Biggest Barrier to Operational Efficiency for Modern Organisations, Says STX Next
9 September 2024
Employing data specialists, selecting the right tech and understanding the value of a patient and meticulous approach to validation are all fundamental elements of an effective data strategy.
Recent research shows that many organisations undervalue data as an asset: businesses generating over $5.6 billion in annual global revenue lose an average of $406 million a year as a direct result of low-quality data.
Bad data hurts company bottom lines primarily through underperforming business intelligence reports and AI models. Built or trained on inaccurate and incomplete data, these produce unreliable outputs that businesses then use as the basis for important decisions.
According to Tomasz Jędrośka, Head of Data Engineering at STX Next, a global leader in IT consulting, significant work behind the scenes is required for organisations to be confident in the data at their disposal.
Jędrośka said: “Approaches to data quality vary from company to company. Some organisations put a lot of effort into curating their data sets, ensuring there are validation rules and proper descriptions next to each attribute. Others concentrate on rapid development of the data layer with very little focus on eventual quality, lineage and data governance.
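To illustrate what validation rules and descriptions “next to each attribute” can look like in practice, the sketch below pairs each column of a small dataset with a rule. It is a minimal Python/pandas illustration only; the dataset, column names and rules are assumptions, not STX Next’s implementation.

```python
# Minimal sketch of attribute-level validation rules (illustrative only:
# the dataset, column names and rules are assumptions, not STX Next's code).
import pandas as pd

# One rule per attribute: a human-readable description plus a check.
RULES = {
    "customer_id": ("must be present and unique",
                    lambda s: s.notna().all() and s.is_unique),
    "email":       ("must contain an @ sign",
                    lambda s: s.dropna().str.contains("@").all()),
    "order_total": ("must be a non-negative amount",
                    lambda s: (s.dropna() >= 0).all()),
}

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable rule violations."""
    failures = []
    for column, (description, check) in RULES.items():
        if column not in df.columns:
            failures.append(f"{column}: column missing ({description})")
        elif not check(df[column]):
            failures.append(f"{column}: failed rule '{description}'")
    return failures

orders = pd.DataFrame({
    "customer_id": [1, 2, 2],           # duplicate id -> violation
    "email": ["a@example.com", "bad-address", None],
    "order_total": [99.0, -5.0, 20.0],  # negative total -> violation
})
for problem in validate(orders):
    print(problem)
```

Keeping the description alongside the check means one definition serves as both documentation and enforcement, which is the spirit of curating data sets rather than bolting quality on afterwards.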
“Both approaches have their positives and negatives, but it’s worth remembering that data tends to outlive all other layers of the application stack. Therefore, if the data architecture isn’t designed correctly, there could be issues downstream. This often stems from aggressive timelines set by management teams, as projects are rushed to meet unrealistic objectives, leading to a less-than-desirable outcome.
“It’s important to remember that the data world is unrecognisable compared with where we were 20 years ago. Whereas before we had a handful of database providers, development teams can now pick from a whole host of available data solutions.
“Businesses should carefully consider the requirements of the project and potential future areas that it might cover, and use this information to select a database product suitable for the job. Specialist data teams can also be extremely valuable, with organisations that invest heavily in highly skilled and knowledgeable personnel more likely to succeed.
“High-quality data is integral to today’s business landscape in part because companies across industries are rushing to train and deploy classical ML as well as GenAI models. These models tend to multiply whatever issues they encounter; some AI chatbots hallucinate even when trained on a perfect set of source information. If data points are incomplete, mismatched or even contradictory, a GenAI model won’t be able to draw satisfactory conclusions from them.
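The failure modes listed above can be screened for before any training run. The hypothetical sketch below flags incomplete records and contradictory question-and-answer pairs in a toy fine-tuning set; the record format and field names are illustrative assumptions, not a reference to any particular pipeline.

```python
# Hypothetical pre-training hygiene check for the failure modes quoted
# above: incomplete and contradictory records. The record format and
# field names are assumptions made for illustration.
from collections import defaultdict

def audit(records: list[dict]) -> dict:
    """Flag incomplete rows and questions with conflicting answers."""
    answers_by_question = defaultdict(set)
    incomplete = []
    for i, record in enumerate(records):
        if not record.get("question") or not record.get("answer"):
            incomplete.append(i)
        else:
            answers_by_question[record["question"]].add(record["answer"])
    contradictory = {q: sorted(a) for q, a in answers_by_question.items()
                     if len(a) > 1}
    return {"incomplete_rows": incomplete, "contradictions": contradictory}

records = [
    {"question": "What is our refund window?", "answer": "30 days"},
    {"question": "What is our refund window?", "answer": "14 days"},  # conflict
    {"question": "Do we ship abroad?", "answer": None},               # incomplete
]
print(audit(records))
# {'incomplete_rows': [2], 'contradictions':
#  {'What is our refund window?': ['14 days', '30 days']}}
```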
“To prevent this from happening, data teams should analyse the business case and the root causes of ongoing data issues. Too often, organisations apply tactical fixes and allow the original issue to grow bigger and bigger.
“At some point, a holistic analysis of the architectural landscape needs to be carried out. Depending on the scale of the organisation and the impact of its data issues, this can take the shape of a lightweight review or a more formalised audit, with the resulting recommendations then implemented. Fortunately, modern data governance solutions can mitigate a lot of the pain connected with such a process and, depending on the size of the technical debt, make it considerably smoother.”
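As a rough idea of where such a lightweight review might start, the sketch below profiles completeness and duplication for each table so an audit can prioritise the worst offenders first. The table names and the pandas-based approach are assumptions for illustration, not a description of any specific governance tool.

```python
# Illustrative starting point for a lightweight data quality review:
# rank tables by basic quality signals. Table names are assumptions.
import pandas as pd

def profile(tables: dict[str, pd.DataFrame]) -> pd.DataFrame:
    """Summarise basic quality signals for each table."""
    rows = []
    for name, df in tables.items():
        rows.append({
            "table": name,
            "rows": len(df),
            "null_pct": round(100 * df.isna().mean().mean(), 1),
            "duplicate_rows": int(df.duplicated().sum()),
        })
    return pd.DataFrame(rows).sort_values("null_pct", ascending=False)

tables = {
    "customers": pd.DataFrame({"id": [1, 2, 2],
                               "email": ["a@x.com", None, None]}),
    "orders": pd.DataFrame({"id": [10, 11], "total": [99.0, 20.0]}),
}
print(profile(tables))
```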
Jędrośka concluded: “Employees who trust and rely on data insights work far more effectively, feel more supported and drive improvements in efficiency. Business acceleration powered by data-driven decision-making is a true signal of a data-mature organisation, and it is these traits that differentiate companies from their rivals.”