How to Create a Data-Driven Business

White Paper

This white paper provides guidance for those looking to move data quality initiatives up the corporate agenda, helping organisations achieve company-wide adoption of best practice in managing their data assets.

Catalyst

Organizations are beginning to understand the potential of data to improve customer engagement, reduce operational cost, and protect themselves from regulatory risk, which is driving the requirement for enterprise-wide data governance programs. But while there is plenty of discussion of, and enthusiasm for, a corporate-wide approach to data management, few organizations today can claim to have achieved that state.

This report sets out to explain the business value of data quality and to provide guidance on how to advance your current data quality initiatives.

Ovum view

Ovum’s engagements with its clients are always eye-opening, providing a clear view of the state and maturity of technology adoption beyond vendor marketing hype. Different market segments and industry verticals face differing pressures, of course, but data quality represents an ongoing battle for all of Ovum’s clients. The issue has challenged enterprises for the last two decades and has spawned a large software vendor community, and one might be forgiven for thinking that data quality problems have been resolved and gone away. Clearly, this is not the case. While considerable progress has been made in raising awareness of the importance of data quality for business-driven IT projects, and investment in data management software remains robust, plenty of work still needs to be done. Recent business and technological trends continue to raise the bar for data quality, and Ovum finds many enterprises still struggling to get a trusted view of their business data. The sheer volume and pace at which data enters and exits the organization, coupled with the increased reliance on data for a business to function smoothly and stringent regulatory compliance mandates in some industries, has raised the stakes considerably.

Recent client interactions and enquiries fielded by Ovum around data quality highlight a gap between vendor perception of the market and the reality of where their customers are in the adoption curve.

Many organizations remain in the dark about the quality of data flowing through their systems and the financial impact that poor quality is having on business performance. Many persist with tactical, piecemeal approaches to data quality tooling (primarily around customer data, though some are starting to address other domains, notably product data). There is still confusion about how and where data quality tooling fits within an overall data management strategy and alongside other tooling – especially master data management (MDM). And relatively few organizations have formalized data governance programs in place. That is starkly different from how vendors envisage their data quality technologies being implemented and used. What is clear is that many enterprises are seeking step-by-step guidance on how to advance their data quality initiatives, and that a significant vendor opportunity remains up for grabs in this important segment of the information management market.

Key messages

  • Organizations are only just starting to assess and count the costs of poor data quality in their business.
  • While vendors push a vision of integrated data quality suites, point tooling decisions are still the purchasing norm.
  • While data quality initiatives remain focused primarily on customer data, organizations are straying into multi-data-domain quality.
  • Enterprises are thinking about data governance, but are unsure about where to start.
  • Enterprises are still unclear on the relationship between data quality and MDM.

Understanding the Current State and Cost of Data Quality is a Starting Point

Data management remains a high-priority investment

According to Ovum's ICT Enterprise Insights (ICTEI) research, data management and integration technologies, which include data quality, remain a key target for corporate IT investment (see Figure 1 below).

[Figure 1 omitted: corporate IT investment priorities for data management and integration, from Ovum's ICTEI research]

According to our data, over 95% of the 6,700 companies surveyed by ICTEI already recognize data management and integration as keys to success and have invested in various tooling. A significant number are now looking to advance their existing capabilities beyond core data integration (ETL, migration, etc.). Many see data quality and MDM as key enablers of such advancement, recognizing a single, trusted view of their data as a necessary foundation for successful business-driven IT projects.

Understanding your data is the first step towards improving its quality

Investment continues to rise; however, Ovum found that well over half of the enterprise IT clients we engaged with in 2013 still have little or no idea of the state of data quality in their systems.

We believe this situation exists because:

  • Data quality, like many other things in life, is only noticed when it's missing.
  • Too many organizations are running blind because they either do not have all the timely and accurate information they need or, in some ways even worse, think they do have it, but in fact the data is flawed.
  • Organizations are only now starting to recognize that many downstream business and IT problems – such as operational process inefficiencies, bad strategic decision-making, and ineffective or unsuccessful IT projects – are rooted in poor data quality.
  • Existing data quality tooling investments deployed in main corporate systems are not in widespread or systematic use across the organization.
  • Data quality is a subject that is admittedly hard to get excited about; for many organizations it is still considered largely an IT problem, and has yet to become a fixture on corporate boardroom agendas.

Companies can save themselves some headaches by proactively planning for poor data quality. They should start by recognizing that poor data quality is a key barrier to running an effective business, and then measure the state of their data. This is starting to happen as more enterprises recognize data profiling as an important first step and an initial discovery tool for assessment. Enterprises can also turn to a number of independent or vendor-driven services and knowledge resources for best-practice advice and guidance on data quality implementation, including online educational resources, best-practice self-assessment templates, and online forums. These are more cost-effective than buying lengthy, rigorous, and expensive business consulting services. In particular, they act as a useful educational aid for fostering internal data quality champions looking to raise awareness across the organization and (later on) to guide and align automated data quality software tooling decisions.
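To make that profiling step concrete, here is a minimal sketch in Python (with pandas) of the kind of initial discovery a profiling pass performs: per-column completeness, cardinality, and format conformance. The file name, column names, and postcode pattern are illustrative assumptions rather than the output of any particular tool.

    import pandas as pd

    def profile(df: pd.DataFrame) -> pd.DataFrame:
        """Summarize each column: volume, completeness, and cardinality."""
        return pd.DataFrame({
            "rows": len(df),                # broadcast to every column
            "null_rate": df.isna().mean(),  # share of missing values
            "distinct": df.nunique(),       # number of unique values
        })

    def conformance(series: pd.Series, pattern: str) -> float:
        """Share of non-null values matching an expected format."""
        values = series.dropna().astype(str)
        return values.str.fullmatch(pattern).mean() if len(values) else 1.0

    # Hypothetical extract of a customer table
    customers = pd.read_csv("customers.csv")
    print(profile(customers))
    # UK-style postcode pattern, purely illustrative
    print("postcode conformance:",
          conformance(customers["postcode"], r"[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}"))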

Once that understanding exists, organizations are better positioned to put in place effective data quality improvement processes and the right set of technologies, clearly targeted at the specific business pain points caused by poor-quality data at the source.

Most companies are early in the data quality maturity curve

Data quality governance is not achieved overnight. Initiatives typically progress on the following maturity curve (Figure 2).

[Figure 2 omitted: the data quality maturity curve]

The maturity model is marked by four distinct phases:

  • Phase 1 (Undisciplined) – The organization lacks any defined processes and policies for integrating data and specifying data quality rules.
  • Phase 2 (Reactive) – The organization takes positive steps to identify and address data quality issues, but only after the fact. Typically, dirty data has already entered corporate IT systems (e.g. ERP and CRM) and business processes, and the company is effectively in a reactive "fix-it" mode.
  • Phase 3 (Proactive) – The organization takes a more "proactive" and centralized approach to data quality, driven primarily by a domain-specific focus (split either horizontally by business process or vertically by industry need).
  • Phase 4 (Governed) – Few organizations have attained this level, which means a self-sustaining, fully governed data quality environment, complete with well-communicated and documented rules and controls around data quality across all aspects of the business.

At least 60% of the organizations we have had interactions with are at either Phase 1 or 2. But most are now working steadily towards Phase 3, with planned investments in data quality software and governance programs.

A good starting point for an organization is to honestly assess where it sits on this maturity curve, and to outline the steps needed to move up towards a fully managed and governed environment. Progression from haphazard to governed data quality management will depend on several factors, not least an efficient IT and organizational infrastructure backed by strong automated tooling. However, implementing an effective and sustainable enterprise-wide data governance program involves changing organizational processes, culture, and behaviors as much as implementing new technology.

Counting the business cost of dirty data

Most companies have no idea of the cost impact of poor data quality on their business. Many of the companies we have engaged with admit to having no formal system for measuring the cost of poor-quality data. Yet the harmful impact of poor data quality has been estimated at as much as a third of revenues, costing US companies a staggering $700bn per year and exerting a similar drain on the productivity and profitability of European firms.

Without hard data, it is difficult for data quality to get the attention of senior business executives. We often find that organizations enter a "vicious circle": some people (in IT and the business) may be aware that a data quality problem exists, but do not necessarily feel motivated to address it. Often senior management is unaware of the problem, and those lower down the corporate hierarchy (for example, LOB managers) do not want to highlight the issue for fear of accountability and of being the messenger who gets shot.

The key is ensuring better collaboration between IT and business users, especially coordination of activities and defining roles, responsibilities, and ownership of data quality processes. The encouraging news is that this is slowly happening. For example, a manufacturing client, after initiating close collaboration between LOBs and IT, found out that data duplication carried a real cost in effectively managing its product specifications and inventories.

Securing commitment and budget for data quality is key

Counting the cost can be a powerful driver for getting the business side of the organization to take data quality seriously and for securing budget for the initiative. Data cleansing for "data cleansing's sake" is rarely embraced by key senior business stakeholders and is often the reason why data quality initiatives struggle to sustain themselves over time. One of the fastest ways to kill a data quality initiative is to tell your management that you are spending budget on cleaning data. The hard lesson behind this quip is that data quality is often disregarded as a "sunk" IT cost rather than a value-generating initiative for the business. The key for organizations is to instill self-awareness across the enterprise that not only traces business problems back to poor data quality but also attributes a real cost directly to it.

To help, organizations should also consider applying techniques such as Six Sigma frameworks to improve data quality management processes and highlight the business benefits of good data quality. They should investigate incorporating core Six Sigma processes and methodologies – documenting process flows, inputs, and outputs, and understanding the voice of the customer (data). Companies can also tailor business process management frameworks to understand data quality (e.g. identifying corporate mandates that depend on it). For a credit card company, these mandates might include financial reporting, credit risk, and management reporting on trends that would signal predatory or discriminatory practices. Each of these mandates also requires organizations to document their information dependencies, i.e. determine where data is provisioned within the company and whether it is created by upstream business processes or obtained from customers or third parties. This insight can subsequently be used to determine criteria for "good" or "bad" data – completeness, accuracy, consistency, reasonableness, and other relevant quality dimensions. A variety of formats are available to present measurement and analysis; consider adapting statistical process control charts to track data quality via scorecards that juxtapose critical data elements on one axis and mandates on the other. Again, this is as much an exercise in organizational action as in measurement. Finally, organizations that have the budget should aim to formalize data quality as an institutional activity, perhaps as part of an integration competency center (ICC), that explicitly defines, manages, and measures data quality and governance efforts.
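As a hedged illustration of the scorecard idea above, with critical data elements on one axis and mandates on the other, the following Python/pandas sketch computes per-rule pass rates and pivots them into such a grid. The sample table, rules, and the suggested 0.95 control limit are invented for the example.

    import pandas as pd

    # Toy customer extract; columns and values are invented
    customers = pd.DataFrame({
        "customer_id": [1, 2, 3, 4],
        "ssn": ["123-45-6789", None, "987-65-4321", "bad-value"],
        "credit_limit": [5000, -100, 12000, 7000],
    })

    # rule name -> (critical data element, mandate, check flagging "good" rows)
    RULES = {
        "ssn_present": ("ssn", "financial reporting",
                        lambda df: df["ssn"].notna()),
        "ssn_format": ("ssn", "credit risk",
                       lambda df: df["ssn"].fillna("").str.fullmatch(r"\d{3}-\d{2}-\d{4}")),
        "limit_positive": ("credit_limit", "credit risk",
                           lambda df: df["credit_limit"] > 0),
    }

    rows = [{"element": elem, "mandate": mandate, "pass_rate": check(customers).mean()}
            for elem, mandate, check in RULES.values()]
    scorecard = pd.DataFrame(rows).pivot_table(index="element", columns="mandate",
                                               values="pass_rate")
    print(scorecard.round(2))  # flag cells below an agreed control limit, e.g. 0.95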

Additionally, once budget is secured and the project is up and running, companies should also put in place good documentation processes that detail to business stakeholders how the investment is achieving expected ROI. Hence, while data quality is implemented tactically from below, it needs to be championed from above.

Point Data Quality Tooling Decisions are Still the Norm

Insights into current buying patterns

The message from vendors and consultants is now clear: data quality is not implemented as a single technology, but as a discipline supported by multiple tools – core data cleansing, matching, de-duplication, profiling, standardization, enrichment, etc. – ideally delivered as a complete, tightly integrated platform or suite.

Yet Ovum continues to field enquiries from clients asking specifically for tool-specific functionality, particularly around basic data cleansing and (increasingly) profiling. This says a great deal about current buying patterns for data quality, and we can draw several interpretations from these requests:

  • Companies are still taking a tactical, project-by-project approach to data quality, and are looking to solve each problem with discrete tooling.
  • Companies that have a vested interest in a particular data management vendor are disappointed with some of its discrete functionality, and are looking to supplement it with best-of-breed tooling.
  • Companies recognize that a full best-of-breed approach is the best way forward in terms of functional prowess, yet acknowledge the potential challenges of integrating disparate vendor tooling.

Rising interest in data profiling signals a change

While Ovum still fields many requests for basic data cleansing tools (notably for data de-duplication and name and address cleansing), we see a rising interest in data profiling. The former is understandable but represents a tactical "fix-it" approach. The latter shows that many companies are looking to tackle data quality issues upstream, capturing and resolving bad data before it enters their business systems.

This is an encouraging trend, and it points to an increased awareness among enterprises that quality is not just about reactively correcting syntactically incorrect data after the fact, but requires a more proactive approach – actively finding and fixing data quality issues before they adversely impact business-critical IT systems and business processes. At enterprise scale, that can only be achieved with an integrated platform approach. Organizations should therefore look to integrate data quality into a broader data management and integration strategy and platform.
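To give a flavour of the matching logic behind the de-duplication tooling discussed above, here is a minimal sketch using only Python's standard library; production tools add blocking, phonetic keys, and richer similarity models. The company names and the 0.8 threshold are purely illustrative.

    from difflib import SequenceMatcher
    from itertools import combinations

    def normalize(name: str) -> str:
        """Cheap standardization: lowercase, strip punctuation and extra spaces."""
        return " ".join(name.lower().replace(".", "").replace(",", "").split())

    def similarity(a: str, b: str) -> float:
        return SequenceMatcher(None, normalize(a), normalize(b)).ratio()

    records = ["ACME Corp.", "Acme Corp", "Globex Ltd", "Acme Corporation"]
    THRESHOLD = 0.8  # tune against a manually reviewed sample

    for a, b in combinations(records, 2):
        score = similarity(a, b)
        if score >= THRESHOLD:
            print(f"possible duplicate ({score:.2f}): {a!r} ~ {b!r}")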

Quality is Becoming a Multi-Data Domain Issue

Expanding beyond customer data

Customer data remains the core focus of many data quality initiatives; hardly surprising, considering that the data quality market has grown up around scores of suppliers that were good at dealing with customer data attributes (e.g., cleansing and de-duplication of names and local postal addresses).

But we see that changing as more companies become aware of the multidimensional nature of business data. Many of the data quality problems they are looking to solve are by definition multi-domain and rely on an understanding of the relationships that exist with other data types – financial, product, asset, and location data, for example. Organizations are slowly recognizing this and the quality disconnect that exists between domains (for instance, an online retail client found that its ordering processes relied not just on accurate customer data, but equally on clean financial payment, credit history, demographic, and location data).

Organizations should therefore aim to take a broad view of their data and align data quality efforts across non-customer data types as well. However, they should bear in mind that these domains are often more complex and less structured than customer names and addresses, so the simple, well-known processing algorithms and rules commonly applied to customer data are tougher to apply.
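As a hedged illustration of why other domains resist the familiar name-and-address algorithms, consider product data, where a pack size may be buried in free text in several notations and each attribute tends to need its own bespoke parsing rules. The descriptions and unit table in this Python sketch are invented for the example.

    import re

    UNIT_ALIASES = {"milliliter": "ml", "millilitre": "ml",
                    "liter": "l", "litre": "l", "ltr": "l"}

    def extract_size(description: str):
        """Pull a (value, unit) pack size out of a free-text product description."""
        m = re.search(r"(\d+(?:\.\d+)?)\s*"
                      r"(ml|milliliters?|millilitres?|ltr|liters?|litres?|l)\b",
                      description.lower())
        if not m:
            return None  # no recognizable size: route to manual review
        value, unit = float(m.group(1)), m.group(2).rstrip("s")
        return value, UNIT_ALIASES.get(unit, unit)

    for desc in ["Shampoo 250ml", "SHAMPOO 0.25 Litre", "Shampoo family pack"]:
        print(desc, "->", extract_size(desc))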

Enterprises are Thinking About Governance

But they don't know where to start

While data quality ensures that data is correct, consistent, complete, and current, data governance ensures that it stays that way and remains fit for business purpose. Without sophisticated governance controls, processes, and workflows in place, a data quality tooling investment can quickly become a "headless" application and the long-term returns on the technology investment are quickly diminished or lost. Data governance programs often strive to address this by putting into place stewardship controls (policy-based approvals, decision steps, issue resolution, change control, etc.) as well as constant measurement and monitoring. Hence, data governance becomes a key "second" part of data quality.
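As a rough sketch of what that "constant measurement and monitoring" can look like in practice, the following Python snippet recomputes a few quality metrics on a schedule and raises a stewardship issue when a metric breaches its control threshold. The metrics, thresholds, and the issue-raising stub are all hypothetical stand-ins for a real workflow integration.

    import pandas as pd

    THRESHOLDS = {"email_null_rate": 0.05, "customer_id_duplicate_rate": 0.01}

    def measure(df: pd.DataFrame) -> dict:
        return {
            "email_null_rate": df["email"].isna().mean(),
            "customer_id_duplicate_rate": df["customer_id"].duplicated().mean(),
        }

    def raise_issue(metric: str, value: float, limit: float) -> None:
        # Stand-in for a real stewardship workflow or issue-tracker integration
        print(f"STEWARDSHIP ISSUE: {metric}={value:.3f} breaches limit {limit}")

    def monitor(df: pd.DataFrame) -> None:
        for metric, value in measure(df).items():
            if value > THRESHOLDS[metric]:
                raise_issue(metric, value, THRESHOLDS[metric])

    # Toy data: one duplicate ID and one missing email trip both checks
    customers = pd.DataFrame({
        "customer_id": [1, 2, 2, 3],
        "email": ["a@example.com", None, "b@example.com", "c@example.com"],
    })
    monitor(customers)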

Although many organizations understand that data quality tooling can only be effective as part of a continuous and sustainable enterprise-wide data governance program, few have put such a program in place, formally at least. We find that data quality governance initiatives often fail to get off the ground because they are over-scoped and overly dependent on consulting services; simply knowing where to start and where to go next is challenging. Those that have embarked on a governance strategy tend to rely on simple and narrow data stewardship roles and activities, primarily focused on enforcing business rules at the source system for particular IT projects or certain key lines of business (often manually).

The next step for organizations is to automate and expand these activities into an enterprise-wide, standards-oriented form of data governance. Encouragingly, we see growing interest in data quality and data governance engagements and discussions at the enterprise level rather than at a project level, signaling a possible shift in that direction. Many clients are looking to evolve their initial data quality efforts into more institutionalized data quality governance programs within the next three years. While they are still in the early stages of planning and a long way from operationalizing data quality, this is the right direction to be headed.

Vendors are stepping in to help

Data governance is certainly an area of increasing interest in the market. Organizations should look to their data quality software supplier of choice to provide more than just technology. Fortunately, vendors are responding not only with services but also with sponsored online networking and collaboration communities that deliver a wealth of knowledge and guidance for individuals who recognize that data needs to be managed as an enterprise asset, and that provide pragmatic information resources for getting data governance initiatives off the ground – raising awareness, communicating industry best practices, and providing maturity assessment tools.

All of these efforts are helping to educate business and IT end users of data quality software on the importance of implementing it alongside a governance framework; without stewardship controls and measurement and monitoring activities, the returns on data quality tooling investments will be lost. Data quality stewards should not only be made accountable for maintaining data quality; they will also need a proper set of software tooling. That is best served by an integrated platform approach that tightly integrates built-for-purpose data quality tooling as part of a process and also taps into complementary technologies.

Enterprises are Still Unsure of the Relationship Between Data Quality and MDM

A symbiotic relationship

Clearly, a variety of data management disciplines can learn from data quality techniques and stewardship, which is resulting in more integrated platforms and suites. One such discipline is MDM, which is now starting to absorb data quality techniques.

While integration continues, Ovum has found that many organizations are still unsure of the relationship between data quality and MDM. Data mastering is indeed an insidious problem in large organizations. Speaking with clients, we find on average they maintain six to nine different systems holding supposedly "master" data about customers and products.

It is of course possible to have a data quality initiative without considering MDM, and many do. However, Ovum believes an initiative should also have a strong MDM hub component attached to it. By the same token, ideally, every MDM project should have a data quality component because without it, data quality efforts will consume an unexpectedly large part of the MDM effort and budget. Why? Because the state of data quality is always worse than people realize, and this is particularly true when building an MDM hub.

While it might be obvious that there is a strong data quality component to an MDM initiative, and vice versa, several customers that have implemented data quality solutions remain confused about where each technology plays. Based on client feedback, we see two sticking points:

  • Many data quality and MDM initiatives are still managed separately, as different IT projects, and are driven by different business units, which we think is a mistake.
  • Many organizations tend to take an upfront, "big bang" implementation approach to both data quality and MDM – often falling into the trap of viewing the undertaking as a huge hill (and bill) that once climbed will require no more hard work.

Acquisitive organizations should, however, look to pair MDM principles with data quality capabilities to drive the resulting data integration and conversion initiatives. In particular, they should look at ways to integrate off-the-shelf data quality tools with existing or newly developed MDM platforms – ideally from the same vendor – and vice versa.

Data quality should therefore be a central part of MDM hub-building initiatives and considered carefully when evaluating software and planning projects. How your MDM platform handles data quality is ultimately going to be a big question. Start by looking into it early on in the MDM evaluation and selection process. In particular, evaluate which data quality tools your potential or current MDM platform can work with and whether the MDM hub has a built-in data quality tool – either as part of the overall platform or integrated on an OEM basis.

An acquisitive market will continue to drive convergence

We find that one of the biggest drivers of data quality and MDM convergence has been the spate of mergers and acquisitions across nearly all industry sectors. Acquiring companies need to integrate customer, product, supplier, and other important master data into their enterprise systems quickly and effectively.

Hence, it is no coincidence that a large number of the clients we have engaged with on data quality have undergone, or are undergoing, significant merger and acquisition activity. Nearly every acquired company is going to have some overlap with the acquiring company's core customer base. And the quality of the acquired data is suspect, because in many cases little is known about the acquired company's source systems or applications (and about what acquisitions and data conversions it went through itself) prior to the acquisition.

Recommendations

How to create a data-driven business

Look at data profiling as a launch pad for improving data quality

You can't improve what you don't understand. Data profiling should represent your first major investment in data quality tooling. But make sure it can be easily integrated as part of a broader solution set – ideally from the same platform vendor or a best-of-breed suite approach.

Aim to shift from reactive to proactive mode

Aim to proactively find and capture poor-quality data before it enters and pollutes your IT systems and business processes. The key to safeguarding data quality is making it a continuous and sustainable process. That will require a robust supporting governance structure and controls as well; as new data issues are identified, new data quality rules will invariably be needed to manage them proactively, requiring constant supervision and revision.
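As a minimal sketch of this "capture it before it enters" approach, the following Python snippet applies validation rules at the point of ingestion and quarantines failing records rather than loading them; new rules are simply appended as new issues are identified. The rule set and record shape are illustrative assumptions.

    import re

    # Each rule: (name, predicate returning True when the record is acceptable)
    RULES = [
        ("email_format", lambda r: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+",
                                                str(r.get("email", ""))) is not None),
        ("country_code", lambda r: r.get("country") in {"GB", "US", "DE", "FR"}),
        ("age_range", lambda r: isinstance(r.get("age"), int) and 0 <= r["age"] <= 120),
    ]

    def validate(record: dict) -> list:
        """Return the names of all rules the record violates."""
        return [name for name, check in RULES if not check(record)]

    incoming = [
        {"email": "jane@example.com", "country": "GB", "age": 34},
        {"email": "not-an-email", "country": "XX", "age": 34},
    ]

    accepted, quarantined = [], []
    for record in incoming:
        failures = validate(record)
        (quarantined if failures else accepted).append((record, failures))

    print(f"accepted {len(accepted)}, quarantined {len(quarantined)} for review")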

Act tactically, but think strategically about data quality

Point tools provide only a tactical fix for certain data quality problems. Aim for a broader and long-term view of advancing your data quality efforts on a scalable, integrated, and high-performance data management platform. Make sure your platform is linked to governance controls for enterprise scale and sustainability.

Secure a budget for data quality

Senior business-level buy-in is especially critical for winning and maintaining funding for a data quality program into the next year and beyond. For sustained yearly funding, you will need to demonstrate quick and clear ROI on your investment. That is achieved through a mix of internal championing, good documentation and metrics, and communication. Make sure all three happen.
