
How Data Quality Drives Digital Transformation – Part II

Written by John Kosturos on September 24, 2020

Second in a two-part series of articles on the role of data quality in Digital Transformation

This is the second of two articles about the role of data in Digital Transformation (DX). The earlier article defined DX as companies and their customers using technology to renegotiate the fundamental terms of their relationship. In some cases, the renegotiation ends with the company going out of business. Borrowing a term from evolutionary biology, Thomas Siebel, the author of Digital Transformation: Survive and Thrive in an Era of Mass Extinction, calls this “Punctuated Equilibrium.”

Data quality and the four pillars of DX

According to Siebel, there are four pillars of DX: Big Data, data analytics, Artificial Intelligence (AI) and IoT. Data is critical for success with each pillar. However, it’s not just data that you need for DX. You need data that’s accurate and up-to-date. You need data that’s formatted in ways that allow it to serve the goals of DX. This means having normalized data. Without such high-quality, normalized data, DX efforts will be sub-optimal or fail altogether.

  • Big Data – Big Data sets are heterogeneous, making consistent data quality essential to success. For example, a transformative approach to business development might involve linking Account-Based Marketing (ABM) processes with external data sources, such as social media and other publicly available information (PAI). This Big Data setup would enable salespeople to target key people inside major accounts on an accurate and timely basis. However, it will only work if the account data is correct and aligns with the social media data and PAI. For example, if the name of an executive at the target account is spelled differently on social media and in an account database, the Big Data solution may miss the connection (a simple name-matching sketch follows this list).
  • Data Analytics – The analysis of data, including data visualization and reporting, depends on having data of high quality. Duplicative or mis-formatted data will cause problems—perhaps hard to spot—in data analytics workloads.
  • Artificial Intelligence (AI) – As with Big Data and analytics, AI needs good information to be effective. An AI algorithm that’s parsing bad data leads to the ultimate “garbage in, garbage out” process. It’s particularly bad because AI algorithms often include Machine Learning (ML) techniques that enable the algorithm to “teach itself” to “get smarter.” If it’s learning from junk data, it will get progressively more “stupid,” so to speak.
  • IoT – IoT is data-intensive. To make DX work with IoT, operational systems have to be able to correlate device and customer data. For example, if a retailer wants to use location sensors on shopping carts to send coupons to shoppers using SMS texts in real time, it has to be able to match the sensor output with the exact customer record. If John Q. Smith is on Aisle 1 in Store A and John L. Smith is on Aisle 2 in Store B, the retailer has to know who is who and where they are—or the whole project will fail to achieve its goals.
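To make the name-matching problem in the Big Data example concrete, here is a minimal sketch of cross-source fuzzy matching using Python’s standard library. The record values, the field choices and the 0.85 similarity threshold are illustrative assumptions, not a prescription for any particular product.

```python
# A minimal sketch of cross-source name matching; names and the threshold
# are illustrative only.
from difflib import SequenceMatcher


def normalize_name(name: str) -> str:
    """Lowercase, trim, and collapse whitespace so cosmetic differences
    (capitalization, stray spaces) do not block a match."""
    return " ".join(name.lower().split())


def names_match(crm_name: str, social_name: str, threshold: float = 0.85) -> bool:
    """Return True when two spellings are similar enough to be treated
    as the same person (e.g. a middle initial present in one source only)."""
    ratio = SequenceMatcher(
        None, normalize_name(crm_name), normalize_name(social_name)
    ).ratio()
    return ratio >= threshold


# A CRM spelling and a social-media spelling of the same executive match...
print(names_match("Jonathan A. Smith", "Jonathan Smith"))  # True
# ...while a clearly different name does not.
print(names_match("Jonathan A. Smith", "Joan Smythe"))     # False
```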

These pillars are interdependent, so data quality can drive improved outcomes across the entire organization. Good data from IoT, for instance, will lead to better data analytics and AI. Poor data will create a negative synergy, with useless AI outputs and ineffective Big Data processes.

Not everyone is up to this

According to recent research from NTT DATA Services, a global digital business and IT services leader, there is a big discrepancy between companies’ embrace of data for DX and their follow-through. Their study, “The Big Pivot: From Data Islands to Data Insights,” found that while 79% of organizations recognize the strategic value of data, just 10% are using data effectively for transformational purposes.

In particular, NTT found that companies face major challenges from what it calls “siloed islands of data” spanning the organization. Combined with a lack of data skills and talent, these silos take a toll: only 37% of the companies surveyed are very effective at using data to adopt or invent a new business model, and only 31% are able to use data to enter new markets.

The data orchestration solution

How can companies span the gulf between the current reality of data silos and inconsistent data and the promise of the four pillars of DX? Data orchestration offers a solution. Data orchestration solutions automate the processes that convert raw data into the coherent, accurate, normalized data sets that can serve DX. Specific workflows vary extensively, of course, but in general data orchestration performs the tasks of normalization, verification, enrichment, scoring and de-duplication.
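As a rough illustration of how those stages chain together, here is a simplified pipeline sketch. Records are assumed to be plain dictionaries, and the field names, the verification rule and the email match key are hypothetical; a real orchestration solution would apply far richer rules (enrichment and scoring are omitted here for brevity).

```python
# A minimal orchestration-style pipeline sketch; field names and rules are
# illustrative assumptions, not a specific product's workflow.
def normalize(record: dict) -> dict:
    """Standardize formatting, e.g. trim whitespace and lowercase emails."""
    record = dict(record)
    record["email"] = record.get("email", "").strip().lower()
    record["company"] = record.get("company", "").strip()
    return record


def verify(record: dict) -> bool:
    """Keep only records with a plausibly well-formed email address."""
    email = record.get("email", "")
    return "@" in email and "." in email.split("@")[-1]


def dedupe(records: list) -> list:
    """Collapse duplicates, treating the email address as the match key."""
    seen, unique = set(), []
    for record in records:
        if record["email"] not in seen:
            seen.add(record["email"])
            unique.append(record)
    return unique


def orchestrate(raw_records: list) -> list:
    cleaned = [normalize(r) for r in raw_records]
    valid = [r for r in cleaned if verify(r)]
    return dedupe(valid)


raw = [
    {"email": " Jane.Doe@Example.com ", "company": "Acme Corp"},
    {"email": "jane.doe@example.com", "company": "Acme Corp "},
    {"email": "not-an-email", "company": "Acme Corp"},
]
print(orchestrate(raw))  # one clean, de-duplicated record
```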

The data orchestration process can also handle linking and routing of new and historical data. Data arrives at a business from many different directions. An orchestration solution is able to match inbound data with existing datasets as well as with people inside the business who need to see it. For example, if an account representative is handling sales leads for a given company, and new leads arrive from that company via a web registration form, the orchestration solution will route those leads to the account rep.
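A minimal sketch of this kind of routing might look like the following; the domain-to-rep mapping and the email addresses are hypothetical placeholders standing in for whatever ownership rules a real orchestration solution would apply.

```python
# A minimal lead-routing sketch; the mapping and addresses are made up.
ACCOUNT_REPS = {
    "acme.example": "rep.alvarez@ourcompany.example",
    "globex.example": "rep.chen@ourcompany.example",
}
DEFAULT_QUEUE = "unassigned-leads@ourcompany.example"


def route_lead(lead: dict) -> str:
    """Return the owner for an inbound lead based on its email domain,
    so a new lead from an existing account reaches the rep who owns it."""
    domain = lead.get("email", "").split("@")[-1].lower()
    return ACCOUNT_REPS.get(domain, DEFAULT_QUEUE)


lead = {"email": "cfo@acme.example", "source": "web registration form"}
print(route_lead(lead))  # rep.alvarez@ourcompany.example
```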

Data orchestration can also enrich data sets with external data. In a customer contact use case, for example, a data orchestration solution can add SIC codes and Zip+4 postal codes to an existing customer list. The process can correlate data between different sources and identify duplications and opportunities to add missing fields. Such enrichments make the customer data more valuable to DX processes.
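The sketch below illustrates the enrichment idea; the lookup tables stand in for an external data provider, and the SIC code and ZIP+4 values are invented for the example.

```python
# A minimal enrichment sketch; the reference tables and codes are invented.
SIC_BY_COMPANY = {"Acme Corp": "3559"}                      # hypothetical SIC code
ZIP4_BY_ADDRESS = {"123 Main St, Springfield": "62704-1234"}  # hypothetical ZIP+4


def enrich(record: dict) -> dict:
    """Fill in missing SIC code and ZIP+4 fields from reference data,
    leaving any values the record already has untouched."""
    enriched = dict(record)
    enriched.setdefault("sic_code", SIC_BY_COMPANY.get(record.get("company", "")))
    enriched.setdefault("zip4", ZIP4_BY_ADDRESS.get(record.get("address", "")))
    return enriched


customer = {"company": "Acme Corp", "address": "123 Main St, Springfield"}
print(enrich(customer))
# {'company': 'Acme Corp', 'address': '123 Main St, Springfield',
#  'sic_code': '3559', 'zip4': '62704-1234'}
```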

Companies will leverage a broad range of technologies in DX, but ultimately, DX is a data-driven phenomenon. Data makes DX work. And, to ensure that data is suitable for its DX role, it must be normalized, tagged, classified and routed to the right people. The data has to be clean, enriched and purged of duplicates. Making this happen requires a data orchestration solution, which automates the critical processes needed to prepare data for DX.
