Why data quality can make or break accountability

Data that are ‘fit for purpose’ are a vital foundation for marketing strategy, for ensuring that the Metrics model provides an accurate picture of the market, and for measuring the performance of the organization in meeting its goals. This has become an increasingly important issue in marketing owing to the growing use of technology applications such as customer databases, data warehouses and CRM systems.

The importance of data quality

One of the key challenges faced by marketers in developing an effective metrics strategy is to ensure that the quality of the data used as a source for the measures is appropriate to provide reliable information.

A data quality workgroup consisting of members of the Cranfield Marketing Measurement and Accountability Forum (MMAF) identified the following as constituting ‘best practice’ in data management to support marketing strategy:

● An enterprise-wide data strategy is essential in achieving high levels of data quality. Marketing strategy is often supported by data managed in other parts of the business, for example in operational areas such as customer service centres, underlining the importance of having an enterprise-wide strategy.
● Earlier research undertaken on data management at Cranfield by one of the authors (Mouncey and Clark, 2005) indicates that a company-wide strategy is still a rare situation.
● Data need to be collected with the wider needs of the enterprise in mind, rather than being collected for a single purpose (as is often the case).
● Data definitions (metadata) need to be consistent.
● A business case for data quality is essential for identifying and quantifying the real costs and lost opportunities.
● Data quality needs to be ‘owned’ by business units, not IT.
● Overall data strategy needs to be ‘owned’ at board level and made the responsibility of a dedicated team.
● ‘Soft’ and ‘derived’ data are becoming increasingly important in developing competitive advantage, and pose particular challenges within a data management strategy.
● Data quality must be viewed as an iterative issue, requiring constant attention, its own defined metrics framework, continual investment and regular auditing.
● Communication is an essential component within a data management strategy to ensure commitment.
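The ‘defined metrics framework’ for data quality called for above can start very simply. The sketch below (Python, with invented field names and records) computes two common data quality measures – completeness and freshness – over a handful of customer records:

```python
from datetime import date

# Hypothetical customer records; field names and values are illustrative only.
customers = [
    {"id": 1, "name": "A. Smith", "postcode": "MK43 0AL", "last_verified": date(2023, 5, 1)},
    {"id": 2, "name": "", "postcode": "SW1A 1AA", "last_verified": date(2020, 1, 15)},
    {"id": 3, "name": "C. Jones", "postcode": None, "last_verified": date(2023, 8, 20)},
]

def completeness(records, field):
    """Share of records where the field is present and non-empty."""
    filled = sum(1 for r in records if r.get(field))
    return filled / len(records)

def freshness(records, field, cutoff):
    """Share of records verified on or after the cutoff date."""
    fresh = sum(1 for r in records if r[field] >= cutoff)
    return fresh / len(records)

for f in ("name", "postcode"):
    print(f"{f} completeness: {completeness(customers, f):.0%}")
print(f"freshness: {freshness(customers, 'last_verified', date(2022, 1, 1)):.0%}")
```

Tracked over time against agreed thresholds, even crude measures like these give the ‘regular auditing’ the workgroup recommends something concrete to audit.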

Are data the weakest link in your marketing strategy?

The Cranfield research study (Mouncey and Clark, 2005) on data management strategies referred to above found that practices within organizations were polarized: in some organizations data management was the weakest link in their strategy; in others it was a core strength. The following quotation underlines the importance of good data governance practices within one area that has grown to be of significant importance for marketers in recent years and has required large-scale investment in new processes, hardware, software and capabilities within many companies – customer relationship management (CRM): ‘Managers wishing to fail at CRM or sabotage a CRM project need look no further than “Data” to find the weakest link in the CRM project.’ This quotation from Carving Jelly, a guide to CRM project management by Nick Siragher (2001), is equally appropriate to any investment in marketing activity, especially as CRM systems and associated databases have become increasingly important sources of information for measuring the performance of marketing.

Taking marketing communications as an example, one of the key reasons why entries to the IPA Advertising Effectiveness awards fail to impress the judges is that the evidence to link expenditure on advertising to any impact on business results is either lacking or deeply flawed. The quality of the data used to support the case is also often suspect (Institute of Practitioners in Advertising, 2006).

The above quotation from Siragher also implies that some employees may have a hidden agenda and see this as a way to frustrate or derail the ambitions of their organization, or may be seeking to justify or defend a course of action that does not in reality represent a good investment for the enterprise’s scarce resources. In either case, it underlines the need to ensure that the strategy being developed by marketing has the full support of all the other areas of the organization needed for its implementation. This commitment also needs to exist at all levels in the organization.

A meaningful data management strategy to support marketing activity must not simply focus on one area, such as customer-related data. It is not uncommon for the aims within a marketing plan to be frustrated by bad practice in data management within other parts of the organization. The following real-life case study illustrates how the poor management of data, in this case within the manufacturing division, can increase costs and create an uncompetitive situation in a B2B marketplace. In this example this led to increasing difficulties for the marketing and sales teams in achieving their goals in a particular geographic market. 

Sometimes, a whole market can be transformed by a new entrant with an innovative vision of data management that can create a step-change in the relationships between suppliers and customers. Take the case of GHX’s arrival in the medical supply market (Mouncey, McDonald and Ryals, 2004). Historically, each manufacturer of a particular medical product would sell direct to hospitals and clinics, or to specialist intermediaries. Mostly these transactions also took place off-line, using paper catalogues produced by each supplier. However, many hospitals had more than one source for particular products and as each manufacturer used its own description and numbering, the inventory lists needed to manage medical supplies were extremely lengthy and confusing. This led to many incorrect orders, slow delivery times, large numbers of returns, high costs, withheld payments and lots of heated discussions when things went wrong! None of this was good for the relationships between suppliers and hospitals (or for staff and potentially, patient care) as meetings tended to focus on errors and delayed orders, rather than on new ideas and more collaboration. GHX came into this chaotic marketplace with a radical, but highly practical, solution. They set up as an intermediary between the manufacturers and hospitals, but they developed a single electronic catalogue of products where each product, no matter who manufactured it, was given a common description and stock number. Also, the whole order process was put online. Figures 10.1 and 10.2 illustrate the process.
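The heart of the GHX solution is the single shared catalogue. A minimal sketch of the idea, with invented supplier names, codes and descriptions, might look like this:

```python
# Hypothetical illustration of a single canonical catalogue: each supplier's
# own code maps to one shared stock number with one shared description.
canonical = {
    # (supplier, supplier_code) -> shared stock number
    ("MedCo",  "MC-1178"): "GHX-000421",
    ("SurgCo", "SG-88-A"): "GHX-000421",   # same product, different supplier code
    ("MedCo",  "MC-2290"): "GHX-000777",
}

descriptions = {
    "GHX-000421": "Sterile latex gloves, size M, box of 100",
    "GHX-000777": "Disposable scalpel, No. 10",
}

def resolve(supplier, code):
    """Translate a supplier-specific code into the shared catalogue entry."""
    stock_no = canonical.get((supplier, code))
    if stock_no is None:
        raise KeyError(f"No canonical mapping for {supplier}/{code}")
    return stock_no, descriptions[stock_no]

# Two hospitals ordering under different supplier codes resolve to one item:
print(resolve("MedCo", "MC-1178"))
print(resolve("SurgCo", "SG-88-A"))
```

Once every order passes through a mapping like this, the ‘multiple descriptions and numbers for the same product’ problem disappears at the point of transaction rather than having to be reconciled afterwards.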

As you can see from Figure 10.3, this created immense benefits for all stakeholders. This innovative, data-led solution has provided error-free transactions, improvements in cash flow and major cost savings, and has removed immense frustration from the process of ordering and supplying medical supplies. Most importantly, trust between customers and suppliers improved, leading to the manufacturers being able to develop collaborative-level key account relationships with hospitals, enabling discussions to be strategically focused on identifying new opportunities for innovation in products and services instead of ‘fire-fighting’ day-to-day operational issues.

Data and competitive advantage

When responsibly managed and creatively used, data to support and facilitate marketing-related activities (including sales and customer service activities – as contained in the definition of marketing used in this book) can provide organizations with significant differentiation from competitors and transform their relationship with the marketplace. However, by failing to develop and implement effective strategies for data management, organizations are likely to underachieve within their sector and suffer from higher, and costly, levels of customer churn. Their ability to accurately measure outcomes and progress in achieving goals will also be significantly compromised.

The availability of data to support, facilitate and measure the impact of marketing has grown exponentially in the past decade. Sean Kelly (2006), a leading expert on data management in marketing, describes today’s world as the era of ‘data wars’, where the competitive success of organizations is increasingly dependent upon their data management competencies in supporting marketing. Kelly has also described the rise of ‘information intermediaries’, as illustrated in Figure 10.4, organizations within the overall demand/supply chain that have recognized the competitive advantage of customer-related data as a key weapon in controlling marketplaces. For example, in the FMCG sector, a retailer with a sophisticated customer loyalty programme can decide to charge its suppliers for access to this valuable store of data, thereby potentially limiting suppliers’ knowledge of customers and the market, or increasing the cost of being able to access the data. This has obvious implications for organizations that rely on intermediaries for access to end-users and their ability to identify segments in the market and collect data about them and their needs. In Kelly’s view, it is very difficult for a single organization to effectively exist on both sides of the dotted line (see Figure 10.4).

The availability of data on customers and the granularity of the data, coupled with the opportunities to use the data to create differentiated value propositions for different types of customer, were highlighted within a further Cranfield research report (Clark, McDonald and Smith, 2002). Understanding the data flows available to an organization is a key factor in identifying, firstly, viable marketing-related strategies for an organization and, secondly, the most appropriate strategies to adopt.

One energy company believes that data quality creates a differentiation from competitors, especially as its strategy is based on creating a single view of the customer. For example, customer services can resolve queries at the first call.

A leading engineering company identified that, owing to multi-numbers for the same part plus difficulties in identifying the geographic source of a product, order delivery times and distribution costs, and therefore prices, had become uncompetitive. By integrating product codes and source data in the data warehouse, the part could be sourced direct to the customer, leading to a cut in delivery times from 15 to 5 days and a significant reduction in distribution costs.

Data literacy

English (1999) provides ample evidence of the cost, often highly significant, to organizations of poor data management. It is not simply that organizations have no, or an inadequate, strategy for managing their data; it is much more fundamental than that. It is as if these organizations suffer from poor data literacy, a type of black hole within their culture – and data literacy is a prerequisite for marketing in today’s world. For example, poor quality of data was cited by respondents within a survey conducted by Strathclyde University for the Institute of Direct Marketing (IDM) (Mouncey et al, 2002) as the key factor that inhibited the value and application of their customer databases. Any organization that has substantial numbers of customer records that cannot be included within marketing programmes because of data quality issues, or that are inaccurate, leading to poor targeting, is sacrificing substantial future flows of revenue – rather like having half the production line out of action, or the shop shut at times of peak demand. An illustration of the cost to an organization of poor data quality is given in section 10.10. Records that are inaccurate or lack key data items lead to dissatisfied customers, inappropriate offers being made, and invalid metrics, and could potentially contravene the fourth principle within the UK Data Protection Act 1998 requiring personal data to be both accurate and, where necessary, up to date.

Challenges to data integration

In addition to struggling with data quality issues, organizations also quickly discover that the allied challenge of integrating data captured through a disparate range of sources also creates numerous problems. For example, how can (if at all) data collected through traditional market research surveys, a rich source of customer profiling and the essential ‘why’ (attitudinal and behavioural) information, be combined with the narrowly focused transaction records commonly the main basis for a customer database? How can all that be analysed alongside that which customers post on social media sites, videos they upload and SMS messages they leave with the firm?
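One common, partial answer to the survey question above is to fuse the two sources at segment level rather than per individual, since survey respondents usually cannot be matched to named customers. A minimal Python sketch, with invented segments and scores:

```python
# Hedged sketch: fuse survey ('soft') data with transaction ('hard') records
# at segment level. All segments, scores and spend figures are invented.
survey_by_segment = {   # attitudinal scores from (hypothetical) survey research
    "young_urban": {"price_sensitivity": 0.8, "brand_loyalty": 0.3},
    "family":      {"price_sensitivity": 0.5, "brand_loyalty": 0.6},
}

transactions = [
    {"customer": "C001", "segment": "young_urban", "spend_12m": 420.0},
    {"customer": "C002", "segment": "family",      "spend_12m": 910.0},
]

# Attach the segment-level soft data to each hard transaction record.
enriched = [{**t, **survey_by_segment[t["segment"]]} for t in transactions]

for row in enriched:
    print(row["customer"], row["spend_12m"], row["brand_loyalty"])
```

The result is coarser than a true individual-level join, but it lets the attitudinal ‘why’ sit alongside the behavioural ‘what’ in the same analysis.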

What are the legal and ethical boundaries that organizations face when attempting to integrate personal-level data obtained from a variety of internal and external sources? The challenge increases exponentially as organizations implement increasingly complex multichannel strategies, particularly if real-time information becomes essential. Organizations also tend to forget that data are generated through business processes, and process mapping therefore needs to be a key competency in the marketing data strategy toolbox.

A report on data management strategies (Information Age, 2005) included a comparison between the mobile phone manufacturer Nokia and Barclays Bank. The report cited the 120 separate databases within Nokia containing customer data – ‘a patchwork built up by its different divisions for their own (legitimate) purposes as the company has grown at breakneck speed’. The structure includes, for example, individual data marts for:

● analysing the performance of mobile operators;
● tracking third-party resellers;
● logging end customers who registered their product.

Overall, the situation had led to high levels of data duplication, wasted effort and confusing multiple versions of the truth. As a result, Nokia had problems answering questions such as:

● How many active customers are there (rather than phones shipped)?
● Who are the most profitable customers and what are their profiles?
● How loyal are Nokia customers?
● Which sales are primarily for business use?

Barclays, on the other hand, took three years to solve similar problems by building an enterprise data warehouse to improve the interaction with its 12 million customers. This led to a saving of £10 million in its annual marketing budget by improved targeting. Barclays also claims other economies, as there are fewer systems to support or maintain – estimated as around £1.1 million per mart within a large organization (including software licences).

The objectives for an enterprise warehouse were cited as:
● a single version of the data;
● a single view of the customer;
● improved data quality (one source for cleansing and ensuring accuracy);
● accessibility by users throughout the organization;
● a quicker response to changing business needs;
● more frequent updates;
● an enhancing of regulatory compliance.

These are all issues faced by many organizations. Data quality issues simply become magnified when data integration projects are attempted, leading to potentially severely flawed decision making and contact strategy, and major challenges in accurately measuring the performance of marketing.

Creating a business case (return on investment) for data quality

A further dilemma faced by organizations is that they have no real framework for identifying the return on investment (RoI) for the data that they hold or need. While individual items of data can be stored at relatively low cost, to this must be added the more substantial ongoing investment in collecting or acquiring the information and keeping it up to date. Organizations need a framework that can identify the core data essential to achieving their business goals (including performance measurement), and that also enables them to demonstrate the added value created by the data. The logic of the BDN suggests that the data alone do not generate RoI, but that they enable the implementation of other strategies that do. The monetization of investments that create capabilities enabling ambitious strategies is a perennial issue for IS research (Goh and Kauffman, 2005).

Corporate priorities can play a major role in addressing data quality issues. For example, a leading international manufacturer based in the UK was able to establish the real contribution to the overall business of accurate data once finance realized the importance of the data warehouse as a key source of management information and managed it as a company-wide strategic asset. This example underlines the impact of different data standards within an organization. As finance wanted to use the data to provide accurate and up-to-date information to help improve the operational management of the business, they introduced more stringent requirements than those previously applied for marketing purposes.

Creating insight

Organizations increasingly talk about customer (or consumer) insight instead of market research, but unless there is a structured approach to knowledge management, real insight will be extremely difficult to achieve in practice. Customer segmentation is a key tool in deriving insight, but it needs to be tailored to the data available to the organization and to the market sector, and to be multidimensional. In some sectors, such as travel and personal electronic devices, a customer-managed segmentation may be more appropriate. Some firms have moved beyond this into the mass-customized zone – practising one-to-one or segment-of-one marketing. However, some of the biggest challenges facing any organization developing a segmentation-led strategy include a lack of data, insufficient granularity for decision making, and the poor quality of available data.

Some insight data can be defined as hard (factual) data (eg name and address, transaction details) and others as soft data, such as attitudes and behaviour. Data captured through internal financial systems are usually hard data, whereas traditional survey research-based data are classed as soft. Most customer databases and data warehouses contain primarily hard data. A third category used to support marketing needs is derived data, generated through modelling – such as trend and propensity scores used for both inbound and outbound marketing analysis. Similarly, despite pleas from leading exponents of the customer equity concept, there is as yet no recognized accounting methodology that allows the customer base, and the knowledge held about it, to be treated as some form of capital asset, in the same way that brands can be valued on a company balance sheet.
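As an illustration of derived data, the sketch below turns hard transaction variables into a propensity-style score. The weights and intercept are invented stand-ins for a fitted model, not a real scoring formula:

```python
import math

# Hard data: transaction history per customer (illustrative values only).
customers = {
    "C001": {"purchases_12m": 8, "months_since_last": 1},
    "C002": {"purchases_12m": 1, "months_since_last": 9},
}

# Invented weights standing in for a statistically fitted propensity model.
WEIGHTS = {"purchases_12m": 0.4, "months_since_last": -0.3}
INTERCEPT = -1.0

def propensity(record):
    """Derived data: a probability-like score that the customer buys again."""
    z = INTERCEPT + sum(WEIGHTS[k] * v for k, v in record.items())
    return 1 / (1 + math.exp(-z))   # logistic transform onto 0..1

for cid, rec in customers.items():
    print(cid, round(propensity(rec), 2))
```

The point is structural rather than statistical: the derived score exists only because the hard data feeding it are complete and accurate, which is why derived data inherit every quality problem upstream.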

Key data for most organizations include data that can identify the customer, together with some form of transaction information. Typical customer-related data that are critical to marketing are shown in Table 10.1. The primary type of each data item is shown by an ‘X’, and subsidiary types, used to create the derived data, by ‘y’. This indicates the importance of, firstly, analytical/modelling tools and associated competencies in order to create the derived variables and, secondly, ‘soft’ data in gaining a comprehensive picture of customers.

One leading IT company also divides data by appropriateness: static – how well the data provide an accurate description of a customer; and dynamic – whether the data are suitable for predicting future behaviour, and whether the data match the strategic future needs of the organization.

Technology and Information Systems

Information Systems facilitate data capture, hold data, and provide the tools with which to extract value and deploy the knowledge. Customer data, the database platforms, data warehouses and integration systems, tools and deployment technologies are all now key components within the core infrastructure of many organizations. These tools require constant investment in order to keep them up to date – best practice data management is a complex, enterprise-wide, iterative journey rather than a one-off, functional project, with data at the core. But ‘garbage in, garbage out’ will be the result unless data quality issues are addressed as part of the overall strategy.

Data need to be viewed as a key corporate asset if the continual investment in the necessary infrastructure and application tools is to be readily accepted at board level. The data asset therefore needs a long-term strategy all of its own.

Success factors

Evidence from the rich databank compiled by QCi, a then subsidiary of the Ogilvy Group, comprising audits of over 5,000 companies using their CMAT benchmarking tool (Woodcock, 2000), clearly indicates, however, that despite the undoubted importance of information, technology and processes, the three key factors that make the difference within customer-related strategy are the people (culture, training, etc), measuring what happens, and the customer management practices devised by the organization. QCi advise that these three should be the priority for attention and should be developed to support the overall business model – not the other way round. They conclude that: ‘Companies who manage customers well using sensible, observable, well-implemented business practices are likely to be best-in-class performers. Conversely, companies who do not set up good customer management practices are likely to be poor performers’ (Mark Say, QCi).

A global survey of 600 CIOs and IT directors undertaken in 2001 by PricewaterhouseCoopers posed six questions that CEOs need to consider in deciding whether the organization is paying sufficient attention to data issues and at the right level within the company structure:
● Have we suffered significant problems, costs or losses in any area because of data quality?
● In two years’ time will more of our business depend on automated decisions and processes based on electronic data?
● Are we paying sufficient attention to data issues at board level?
● Who is ultimately responsible for the quality of our data?
● Do we have a data management strategy – or just a series of fragmented policies?
● Do we trust the quality of our own data – or of anyone else’s?

The same survey showed that effective data management had led to the following important benefits for companies interviewed:
● reduced processing (59 per cent of companies interviewed);
● increased sales through improved prediction (35 per cent);
● winning a significant contract (32 per cent);
● increased sales through better analysis (43 per cent).

Identifying the cost of poor data quality

As illustrated earlier in this chapter, many organizations have to date either underestimated the importance of data quality or failed to address it as an enterprise-wide issue. As described earlier, a survey of companies commissioned by the IDM (Mouncey et al, 2002) found that data quality was the most frequently mentioned barrier limiting the role of the customer database, even in those organizations claiming to gain high value from their database.

According to QCi, 39 per cent of organizations have no data quality standards in place, and 56 per cent have no capability for tracking whether their data quality is improving or not. QCi have several examples within their ‘Data Roll Call of Shame’ that illustrate the consequences of inadequate standards of quality:
● In a mailing of 20,000 mugs, 5,000 were returned as undelivered or ‘gone away’.
● A holiday company specializing in holidays for women did not include a title field in their file sent to a mailing house, which inserted a default of ‘Mr’.
● Counter staff at a bank used the name field to flag customers whom they suspected of fraud by adding ‘(Care fraud)’ after the surname. As the direct marketing team were unaware of this practice, a mailing was sent out including letters addressed to customers with ‘(Care fraud)’ printed after their name.

These types of errors will inevitably have impacted on customer retention and other revenue streams and have incurred additional costs in rectifying the problems. In addition, there are also likely to have been negative consequences for brand image – more of an issue today because of the rise of ‘culture jamming’ (Lawes, 2007) through blogging on the internet.

Similar problems also occur in the public sector – owing to an incorrect look-up table, court offenders were sent letters requesting payments for the wrong offence.

In terms of personal data held about customers, examples such as those above could lead to these organizations having breached the Data Protection Act 1998 principles covering accuracy and the holding of up-to-date personal data. Privacy Laws & Business, an advice service on data privacy, believe from their survey data that many leading organizations are failing to take this legislation seriously enough, with a significant minority transferring personal data to third parties without the permission of the data subject (Privacy Laws & Business International, 2004).

The cost to business of inadequate data quality is high – some experts put this as being between 15 and 25 per cent of operating profit (Cooper and Murray, 2004).

The following example, based on a real calculation made in the late 1990s, may not be up to date but provides a graphic illustration of the revenue lost as a result of poor data quality. Table 10.2 shows the predicted loss of revenue in two categories (future sales of the core product, cross-/upsell opportunities) from the inability to contact customers through direct mail methods for three reasons:

● ‘Gone away’ markers attached to the record – records suppressed for mailing owing to mail having been returned by the Royal Mail marked as ‘No longer at this address’ (ie no up-to-date address for that customer).
● ‘Do not mail’ markers – records suppressed because of Mail Preference Service markers, other requests not to mail, or poor internal processes that lead to such markers being applied for other non-related reasons.
● Missing or incorrect data items – markers indicating that key personal identifiers or product holding details are missing from the records, or records known to contain inaccurate data or suspected of being incorrect.
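A Table 10.2-style calculation can be reduced to a few lines. All the volumes, response rates and revenue figures below are invented for illustration; they are not the figures from the original late-1990s example:

```python
# Hypothetical sketch of the lost-revenue calculation: records that cannot be
# mailed represent responders, and therefore revenue, that will never arrive.
SUPPRESSED = {            # records unavailable for direct mail, by reason
    "gone_away": 40_000,
    "do_not_mail": 25_000,
    "missing_or_incorrect": 60_000,
}
RESPONSE_RATE = 0.02          # assumed campaign response rate
REVENUE_PER_RESPONSE = {      # assumed average revenue per responder
    "core_product": 150.0,
    "cross_upsell": 90.0,
}

def lost_revenue():
    """Expected revenue forgone per campaign, by revenue category."""
    total_suppressed = sum(SUPPRESSED.values())
    expected_responders = total_suppressed * RESPONSE_RATE
    return {cat: expected_responders * rev
            for cat, rev in REVENUE_PER_RESPONSE.items()}

for category, amount in lost_revenue().items():
    print(f"{category}: £{amount:,.0f}")
```

Even with deliberately conservative assumptions, multiplying suppressed volumes through a campaign plan usually produces a figure large enough to fund the data quality work many times over, which is precisely the business-case logic of the original example.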

Rigby and Ledingham (2004) underline the point that perfect data come at a cost – in terms of processes, systems and the actions that may be necessary to respond to the data. The extra accuracy may deliver little or no real incremental value to either the company or its customers. They describe why a leading global printer equipment manufacturer opted for real-time information to stem a growing tide of customer dissatisfaction with the service provided by its call centre. The article describes the impressive results in terms of increased call centre productivity, lower training costs, reduced call waiting times, lower product returns, and increased insight into customer needs and behaviour that can be used to target customer communications more effectively. The key point is that the pay-off could be measured and that the benefits were more widespread than initially anticipated. The article also includes a framework for identifying the true value of information and addressing the key questions:

● How good is the information?
● What is it good for?
● What are the costs (of keeping/putting it right)?
● Which business results matter most (and therefore where are accurate data vital)?

Data management strategy

The Gartner Group, which has published several estimates of CRM project failure rates, has cited ignoring customer data as the number one reason for the failure of CRM investment (Nelson and Kirby, 2001), echoing the quotation from Siragher (2001) earlier in this chapter. The survey conducted by PricewaterhouseCoopers (2001) mentioned earlier found that only 40 per cent of ‘traditional’ (excluding dotcoms) organizations had a formal and board-level approved data strategy, and 57 per cent of boards only occasionally, rarely or never discussed data issues.

According to QCi (QCi Assessment, 2002), organizations implementing CRM tend to invest heavily in technology without sufficient investment in data management. Out of the 260 best practices covered by the CMAT audit process used by QCi to audit organizations’ customer management capabilities, no fewer than 140 required evidence of the effective management and use of customer data (Foss et al, 2002). Organizations are acquiring increasing quantities of data, but the objectives for doing this are often unclear and, in addition, the problem of how to maintain the data is not being adequately addressed. The result is what QCi call ‘data chaos’. Based on their in-company assessments, ‘best practice’ customer-focused companies:

● have recognized the implications of EU data privacy legislation and are improving the accuracy and understanding of the data they hold;
● are increasing the visibility of customer-related data and making it accessible to customer-facing staff, business partners and intermediaries;
● are displaying a more trusting and mature attitude towards their customers by increasing the visibility of customer data, thus enabling these customers to gain a measure of control over their relationship with the organization and maintain the information held about them (usually resulting in a higher level of accuracy).

Only 9 per cent of the organizations assessed through the CMAT audit process in 2002 had developed effective business cases for customer strategies that would enable progress to be tracked over time. This has major implications for the extent to which issues to do with data are recognized and actively addressed within the overall strategy – what gets measured gets managed.

A key problem facing organizations is that existing processes and data are fragmented and uncoordinated across and between traditional business silos or functions – sales, marketing, customer service, call centres, retail outlets, websites, etc. Front-office and back-office systems are not effectively linked together. For example, the call centre support system may not be directly linked to the customer database and therefore agents are denied access to contacts and transactions through other channels – or these updates are not sufficiently frequent to provide a ‘real-time’ picture. Local systems may be designed to meet purely local needs. In addition, organizations are often dependent upon ‘legacy systems’ as key sources of data, where the processes and definitions used for data may be poorly documented.

The key question is the extent to which organizations have strategies for data management in place that can help resolve these types of issues and support the overall marketing strategy. The evidence from earlier research investigating customer-focused strategies indicates two fundamental factors that lead to data quality issues inhibiting progress: 1) Any strategy for data tends to lag behind the decision to implement customer strategies (Mouncey and Clark, 2005). 2) A comprehensive, enterprise-wide data strategy is rare. QCi believes, for example, that few organizations (4 per cent in 2002) have an enterprise-wide information strategy or plan.

Data to support marketing may be sourced from many different points within the organization. Data may also be obtained from external sources, such as business partners or information providers (eg research agencies, advertising/media agencies). Overall, this diversity creates problems of ensuring consistency, integrating the different feeds, and overcoming resistance from data owners and conflicting business objectives across the enterprise. Company mergers and acquisitions cause further problems in confidently identifying individual customers through problems with integrating data from different systems and data management regimes.
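The customer-matching problem created by mergers can be illustrated with a deliberately naive sketch: a normalized matching key groups likely duplicates across systems. Real matching needs much fuzzier logic, and the names and postcodes here are invented:

```python
import re
from collections import defaultdict

# After a merger, the same customer may appear in two systems under
# slightly different details (illustrative records only).
records = [
    {"source": "sys_a", "name": "John  Smith", "postcode": "mk43 0al"},
    {"source": "sys_b", "name": "SMITH, John", "postcode": "MK43 0AL"},
    {"source": "sys_b", "name": "Jane Doe",    "postcode": "SW1A 1AA"},
]

def match_key(record):
    """Normalize name and postcode into a crude duplicate-matching key."""
    name = record["name"].replace(",", " ")
    tokens = sorted(re.split(r"\s+", name.lower().strip()))
    postcode = record["postcode"].replace(" ", "").upper()
    return (" ".join(tokens), postcode)

groups = defaultdict(list)
for r in records:
    groups[match_key(r)].append(r["source"])

for key, sources in groups.items():
    print(key, "->", sources)
```

Production-grade matching adds phonetic encoding, address standardization and probabilistic scoring, but even this crude key shows why inconsistent capture conventions across systems make ‘confidently identifying individual customers’ so hard.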

Why an enterprise-wide approach to data management is vital

Issues that arise in this situation can include the metrics that drive operational units, such as customer service/contact centres, where an emphasis on productivity metrics conflicts with requirements from other areas of the business for updating existing customer data or collecting new data items. These issues can be resolved only by having an enterprise-wide strategy for data, where everyone understands the importance of accurate and comprehensive data in achieving business goals, or by a process that requires a cost–benefit case to be made, identifying the enterprise-wide opportunities that particular data might provide, and then measuring the results.

A case study based on a leading telecommunications company (Reid and O’Brien, 2005) describes what happened when inadequate processes and data quality issues were left unaddressed: the organization’s initial attempt to build a single customer view ‘failed to model anything close to a real-world customer entity’. The authors conclude that:

● Organizations should not assume that data held in dispersed databases will be of a similar format.
● Data from secondary sources may be out of date.
● Organizations need to engender a culture where data are viewed as being for the greater good of the whole enterprise rather than for the exclusive use of a business unit or in a single operational process.

For example, in an organization where strategy focuses on customer retention, and ownership of that strategy sits within part of the sales and marketing team, difficulties can arise whenever this team tries to gain the support of other teams that collect and process customer data – such as the call centres. In an Asian company, where strategy hinges on the accuracy of the data collected in a questionnaire completed in-store by new customers (Mouncey and Clark, 2005), specially trained customer service staff in each shop help ensure that the necessary information is obtained, by focusing on the subsequent benefits that customers and their households can enjoy. Despite this emphasis, there are still residual data quality issues.

Within an international telecommunications company, the responsibilities for customer strategies have been devolved into the business units, and there is no longer a board role with this title. However, despite there having been a senior champion in the past, the initial strategy did not lead to a truly ‘data literate’ culture across the constituent parts of the overall business unit. To help address this ‘black hole’, an information management steering board was formed with the responsibility for creating a corporate data strategy covering this business (Mouncey and Clark, 2005).

External pressures can sometimes act as a catalyst for change, especially if the pressure comes from an industry regulatory body. For example, despite the emphasis on the customer within a leading UK mutual financial services organization, marketing has historically been carrying the metaphorical torch for data quality as a groupwide issue. However, the changing external regulatory framework for the industry sector as a whole is now driving data strategy on to the corporate agenda. New standards for integrating and reconciling data have been introduced, and the increased requirement for ensuring quality may well lead to a main board member having data strategy added to his or her portfolio of responsibilities.

Developing an enterprise-wide information strategy

English (1999) describes one method to assess the current state of information management within an organization and the associated criteria for measuring progress, the Information Quality Management Maturity Grid, adapted from the methodology for assessing quality management devised by Philip Crosby. This is illustrated in Table 10.3.

This methodology maps five stages in information strategy maturity against six measurement criteria, describing the factors for each cell within the matrix. Such a framework can help senior management identify the current position, develop an effective strategy and then measure progress towards the defined goals. Definitions or rules need to be agreed for factors such as:
● accuracy (including the level of confidence);
● matching/integration;
● updating;
● archiving;
● discarding;
● compliance (with any sector regulations or legislation);
● fit with business goals;
● setting markers covering usage.
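To make the idea of agreed data definitions concrete, such rules can be recorded as machine-readable metadata alongside the data themselves. The sketch below is purely illustrative: the field names, confidence thresholds and retention periods are invented, not drawn from any organization described in this chapter.

```python
# Hypothetical illustration: recording agreed data-definition rules
# (accuracy, updating, archiving, discarding, compliance, usage markers)
# as machine-readable metadata. All names and values are invented.
DATA_RULES = {
    "date_of_birth": {
        "accuracy": {"min_confidence": 0.95},
        "updating": "verify at every inbound customer contact",
        "archiving": "retain for 7 years after the relationship ends",
        "compliance": ["data protection: keep accurate and up to date"],
    },
    "email_address": {
        "accuracy": {"min_confidence": 0.90},
        "discarding": "remove after two consecutive hard bounces",
        "usage_markers": ["marketing_opt_in"],
    },
}

def rules_for(field):
    """Return the agreed rules for a field, or an empty dict if none exist."""
    return DATA_RULES.get(field, {})

print(rules_for("date_of_birth")["updating"])
```

The point of such a registry is not the technology but the agreement: once rules are written down in one place, every business unit can be audited against the same definitions.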

Within one leading energy company, ‘data’ are viewed as an integral part of the corporate planning cycle, as illustrated in Figure 10.5.

A process like this moves data quality to the top of the corporate agenda by establishing a clear link between data quality and its vital role in achieving strategic goals. This process helps reduce the risks to marketing strategy, as data quality is viewed as a strategic issue for the whole organization.

Data governance

Developing a strategy for data management is only the starting point. The continuing challenge is to ensure that the agreed policy and defined processes are adequately actioned throughout the organization at an operational level. As a key part of their strategy, organizations therefore need to audit formally the extent to which information is being effectively managed to support marketing strategy, identify the gaps, develop an improvement plan and measure progress over time. Without this they will be unable either to pin down the costs of poor quality or to quantify the benefits that will flow from a programme of improvement. This analysis will also help scope the budget necessary to achieve the level of quality required to meet business goals. Where data are critical to measuring the performance of key business functions, the board could agree to this responsibility being given to the internal audit function.

Defining what is meant by data quality is a key issue. ‘Fit for purpose’, rather than absolute quality, should be the aim. For example, some gaps and inaccuracies may be acceptable within a data-set used for modelling, but the standard would need to be much higher where transaction data and records of customer contact history, through all channels, are used in real time to support a service call centre or a self-service website.

‘Fit for purpose’ may also be defined by the need to meet regulatory requirements (eg Basel 2 within financial services organizations) and legal requirements. Examples of the latter include European data protection legislation (keeping data accurate and up to date; meeting subject access requirements; being able to differentiate between SMEs and domestic customers, or to separate personal data from non-personal data held about business contacts) and safety legislation, such as being able to contact car owners to recall vehicles and rectify safety defects. According to a survey conducted by Privacy Laws & Business International in 2004, many organizations are failing to take data privacy issues seriously, and QCi (QCi Assessments Ltd, 2002) found that only 37 per cent of the companies it had assessed had adequate plans in place to meet the requirements of the 1998 Act. Finally, ‘fit for purpose’ considerations also apply to the capture of source data and the user situation: for example, the competence of employees involved in capturing data, and of those who have access to it, needs to be taken into account.

Data quality also covers the need to ensure that critical data items are identified and appropriate strategies are developed to ensure that any deficiencies are addressed. A US insurance company (Pula, Store and Foss, 2003) identified that ‘roof year’ (the date that a new roof is put on a building) was a key data item in assessing risk within buildings insurance. Subsequent analysis of its database showed that:
● Seven per cent of records contained a null value for this item.
● Many records held ‘default’ years – 1900 or 1908.
● Nearly two-thirds of values were for 1997 as a result of a major data file conversion in that year, as any record with a null value or a roof year equal to the building’s date of construction was assigned the 1997 default to ensure policyholders were not penalized because of incorrect information.
● There were varying and inconsistent business rules for assigning a ‘roof year’.
● There was an assumption (proved wrong) that the system introduced in 1997 was built and maintained to a higher quality in terms of data than earlier systems.

In fact, it was discovered that no data cleansing of source files for the new system had been undertaken as part of the migration process.

Similarly, the data quality programme at the Bank of Scotland (Clark, 1998) discovered that a very high proportion of customers were shown as being the same age as the century. This was due to 1900 having been used as the default for this field if the date of birth was unknown!
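Issues like the ‘roof year’ defaults and the 1900 birth dates are typically surfaced by a simple profiling pass over the field in question: counting nulls, measuring value concentration and flagging suspected default values. A minimal sketch, using invented records and hypothetical field names:

```python
from collections import Counter

def profile_field(records, field, suspect_defaults=(1900, 1908, 1997)):
    """Count nulls, value concentration and suspected default values."""
    values = [r.get(field) for r in records]
    total = len(values)
    counts = Counter(v for v in values if v is not None)
    return {
        "null_rate": sum(v is None for v in values) / total,
        "top_values": counts.most_common(3),
        "suspect_default_rate": sum(counts[d] for d in suspect_defaults) / total,
    }

# Invented policy records for illustration.
policies = [
    {"roof_year": None}, {"roof_year": 1997}, {"roof_year": 1997},
    {"roof_year": 1900}, {"roof_year": 2001},
]
report = profile_field(policies, "roof_year")
print(report["null_rate"], report["suspect_default_rate"])  # 0.2 0.6
```

A concentration of records on one or two values, as with the 1997 roof years, is often the footprint of a past migration or default rule rather than a fact about the real world.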

Other examples of poor data definitions include 12 different spellings of the colour ‘beige’ (Automobile Association roadside services database) and 37 reasons for cancelling an insurance policy (Pula, Store and Foss, 2003)!

An energy supplier has implemented a data quality audit process with the following objectives:

● to develop a data quality scorecard and supporting framework;
● to explain clearly the methodology so it can be replicated;
● to answer the key question: ‘How accurate is the information held about customers?’;
● to produce a detailed data quality audit report, including data quality metrics and business rules;
● to assess the cost to the business of data quality issues;
● to describe clearly the methodology behind the costing model;
● to document recommendations and the value to the business of improvements;
● to provide a framework to support the data governance strategy and input to future work programmes;
● to identify and highlight the value of ‘quick wins’.

The scorecard covers: completeness, conformity, accuracy, consistency and duplication. The governance strategy is managed within the customer insight team, with commitment at board level and from senior finance management. A specialist consultancy undertook the initial data quality audit, with a remit as a partner in achieving, and embedding, best practice within the organization. The strategy has created a much more open and transparent approach to discussing data issues – not just about applications.
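Some of the five scorecard dimensions lend themselves to direct computation. The sketch below shows completeness and duplication only, with invented customer records and hypothetical field names; it is an illustration of the idea, not the energy supplier’s actual scorecard.

```python
# Illustrative scorecard metrics for two of the five dimensions
# (completeness and duplication). Records and field names are invented.
def completeness(records, required_fields):
    """Share of required cells that are populated."""
    cells = len(records) * len(required_fields)
    filled = sum(1 for r in records for f in required_fields if r.get(f))
    return filled / cells if cells else 0.0

def duplication(records, key_fields):
    """Share of records whose key already appears on another record."""
    keys = [tuple(r.get(f) for f in key_fields) for r in records]
    return 1 - len(set(keys)) / len(keys) if keys else 0.0

customers = [
    {"name": "A Smith", "postcode": "MK43 0AL", "dob": "1970-01-01"},
    {"name": "A Smith", "postcode": "MK43 0AL", "dob": None},
    {"name": "B Jones", "postcode": "EH1 1AA", "dob": "1985-06-30"},
]
print(round(completeness(customers, ["name", "postcode", "dob"]), 2))  # 0.89
print(round(duplication(customers, ["name", "postcode"]), 2))          # 0.33
```

Accuracy, conformity and consistency usually need reference data or business rules to test against, which is why the audit above starts by documenting the rules themselves.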

It has enabled data flows within the company to be identified, along with the impacts caused by inadequate data quality. Workshops are used to identify priorities, including the cost and revenue implications.

This is against a background within the company where there was a perception that data quality was poor, but no assigned responsibility for addressing the problem, no policies, no impact assessments and no clear understanding of the costs to the business of poor data quality. In addition, the situation probably infringed legal and regulatory frameworks. The strategy is now firmly positioned to identify and communicate the cost to the business of poor data quality, especially from a customer relationship perspective. Communication is a vital part of the strategy, together with a ‘no blame’ approach, engendering a passion to improve and engagement across all parts of the business through a cross-functional, flexible approach.

The leading international manufacturing company described earlier has also included data quality in its scorecard and board-level dashboards as part of its corporate performance management strategy. As mentioned earlier, board-level interest in data quality was fuelled by finance taking over the responsibility for the company’s data warehouse.

A key initial step in the quality process is to audit all the ways that the organization collects particular types of information, for example the processes for collecting information from new customers or prospects – application forms, call centres, websites, and third parties such as agents, retailers, business partners, data providers, etc – to ensure that a common format for collecting core customer details is in place. One organization found that basic data on customers were recorded in 32 sources, but there was no consistent approach to the format used: sometimes full forenames were collected but on other occasions only one or more initials; date of birth was recorded in some sources but age in others; and some data were treated as optional in one business area but essential in others.
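Part of imposing a common format is mapping each source’s layout into one agreed schema, and being honest about what is lost in translation (an age, unlike a date of birth, only yields an approximate birth year). A hedged sketch, with entirely invented source layouts:

```python
# Illustration of normalizing customer identity data collected in
# inconsistent formats across sources (full forename vs initial,
# date of birth vs age). All source layouts and names are invented.
from datetime import date

def to_common_format(record, source):
    out = {}
    name = record.get("forename") or record.get("initial") or ""
    out["forename_or_initial"] = name.strip().title()
    if "date_of_birth" in record:
        out["birth_year"] = int(record["date_of_birth"][:4])
    elif "age" in record:
        # An age only gives an approximate birth year.
        out["birth_year"] = date.today().year - int(record["age"])
    out["source"] = source  # keep provenance for later audits
    return out

print(to_common_format({"forename": "alan", "date_of_birth": "1970-05-01"}, "web"))
print(to_common_format({"initial": "A", "age": "54"}, "call_centre"))
```

Retaining the source of each record, as above, is what makes it possible to trace a quality problem back to the collection process that caused it.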

A further step in the overall audit process is to ensure that checks are regularly undertaken to ensure that agreed standards are being adhered to and that processes deliver the required level of quality. For example, a leading mutual organization commissioned a market research agency to undertake a survey of members to assess the accuracy of data held about them prior to demutualization in order to estimate the likely extent to which voting papers would be received by only those entitled to vote. The results of this survey could have been used as evidence if any member had challenged the validity of the voting process. Regularly auditing customer data in this way should be part of a best-practice data management strategy.

Particularly in the early stages of implementing strategy, there needs to be a dedicated data quality team. Within a telecommunications company interviewed in earlier research (Mouncey and Clark, 2005), this responsibility was a defined role within the central customer insight team, which reports to the marketing director. An engineering company has ‘data champions’ in all its business units round the world. These are not data specialists; generally they are experienced managers in a variety of roles, but with a common appreciation of how good-quality data are an essential foundation to creating an efficient, well-managed business.

The responsibility for defining and implementing a data strategy must be business unit owned, rather than being left to the IT department. The same applies to any team put in place to manage data quality – this must represent business interests and be managed from a business perspective. The tools described in this and other chapters are designed to be used by business units. The IT specialists will play an important role in supporting the business units to achieve their goals, and have tools and solutions available to help facilitate the implementation of the agreed data strategy. In addition, the data quality programme must be business led. The key criteria for the data quality business case should include:

● productivity improvements (eg shorter-duration phone calls);
● reduced costs (eg reduced errors in the order process, fewer complaints to resolve);
● increased revenue (eg cross-/upsell, improved customer lifetime value);
● reduced customer churn.

Proving the business case for the improved quality of information over time may also be incorporated into the measurement of the incremental value generated by marketing activities. For example, Vauxhall Motors measured the incremental effectiveness of its overall CRM programme (Boothby, 2002) by having a representative control cell of 10 per cent of the overall customer and prospect base who received no communications from the company. Control samples could also be applied to measuring the value of improved data management processes in terms of the impact on revenue, customer satisfaction and image.
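The control-cell arithmetic behind such measurement is straightforward: incremental value is the difference in response rates between the contacted group and the no-contact control cell, applied to the contacted base. A sketch with invented figures (the Vauxhall case did not publish numbers of this kind):

```python
# Illustrative control-cell calculation for incremental campaign value.
# All volumes, response counts and values per response are invented.
def incremental_value(treated_n, treated_resp, control_n, control_resp,
                      value_per_resp):
    """Lift in response rate vs control, applied to the contacted base."""
    lift = treated_resp / treated_n - control_resp / control_n
    return lift * treated_n * value_per_resp

# 90% of a 1m base contacted, 10% held out as a no-contact control.
print(round(incremental_value(900_000, 27_000, 100_000, 2_000, 250.0)))
# 2250000: a 1-point lift in response rate, worth 250 per response
```

The same structure applies to measuring data management improvements: hold out a cell whose records are not cleansed or enhanced, and compare outcomes.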



The Digital Media Strategy Blog: Why data quality can make or break accountability?