The importance of accurate and consistent data


Large organisations typically have datasets relating to operational processes, customer contact details and much more, stored across multiple systems and locations. This creates a challenge, because every additional system and hand-off is an opportunity for errors and inaccuracies, such as missing, duplicated or inconsistent data, to be introduced.

To optimise decision making – and to support mission-critical processes such as sales, production and customer service – this data needs to be accurate and consistent, which requires an effective data reconciliation framework. A framework like this delivers multiple business benefits, including:

  • Enabling a business to gain a single customer view across all their systems
  • Improving operational data quality
  • Creating consistency across business systems and assets.

In this blog we explore some of the data challenges that businesses face and how data reconciliation can help.

What is Data Reconciliation?

Data reconciliation is a term used to describe a set of tools and technologies that verify the accuracy and consistency of data – either during a data migration from one system to another, or in business-as-usual (BAU) scenarios such as a routine check of production data, order data, or customer contact details.

There are many approaches to data reconciliation; these range from counting the overall number of columns and rows in a dataset and comparing the figures to the source data, through to more detailed checks such as checksums, which compare blocks of data to determine whether errors or inconsistencies have been introduced during data migration or storage.
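
As a simple illustration of these two approaches, the Python sketch below compares row counts, column counts and a file-level checksum between a source extract and a target extract. The file names and CSV layout are hypothetical, and the comparison assumes both extracts were exported in the same format:

```python
import csv
import hashlib

def dataset_profile(path):
    """Return (row_count, column_count, checksum) for a CSV extract."""
    sha256 = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha256.update(chunk)          # checksum over the raw bytes of the file
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader, [])
        columns = len(header)
        rows = sum(1 for _ in reader)     # data rows, excluding the header
    return rows, columns, sha256.hexdigest()

# Hypothetical source and target extracts of the same customer table
source = dataset_profile("customers_source.csv")
target = dataset_profile("customers_target.csv")

print("Row counts match:   ", source[0] == target[0])
print("Column counts match:", source[1] == target[1])
print("Checksums match:    ", source[2] == target[2])
```

Matching counts and checksums give a quick pass/fail signal; any mismatch would then prompt a more granular, record-level comparison.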

Data reconciliation techniques and technologies enable organisations to identify and fix:

  1. Errors that occur when data is keyed into systems
    Data errors generated by customers or customer service staff, resulting in inaccurate or poorly formatted data that cannot be understood or used by other systems or teams (a short format-check sketch follows this list).
  2. Inaccuracies that are introduced into data over time
    This may occur due to industry, organisational or customer changes, such as updates to product catalogues, product and service pricing, and changes in customers’ addresses or contact details.
  3. Structural differences in source systems and data stores that compromise the data stored
    This may occur when storage capacity is limited, or when data has been compressed to save space and speed up transactional processing.
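
As an example of the first category, the short Python sketch below applies simple format rules to hypothetical customer contact records to flag values that other systems would struggle to use. The field names and the simplified email and postcode patterns are illustrative assumptions rather than production-grade validation:

```python
import re

# Hypothetical customer contact records keyed in by customers or staff
records = [
    {"id": 1, "email": "jane.doe@example.com", "postcode": "SW1A 1AA"},
    {"id": 2, "email": "john at example dot com", "postcode": "12345"},
]

# Deliberately simplified patterns, for illustration only
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
UK_POSTCODE_RE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]? ?\d[A-Z]{2}$", re.I)

def validate(record):
    """Return a list of formatting problems found in a single record."""
    problems = []
    if not EMAIL_RE.match(record["email"]):
        problems.append("badly formatted email")
    if not UK_POSTCODE_RE.match(record["postcode"]):
        problems.append("badly formatted postcode")
    return problems

for record in records:
    issues = validate(record)
    if issues:
        print(f"record {record['id']}: {', '.join(issues)}")
```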

Why is data reconciliation important?

For all modern businesses, timely and accurate data is key to making the best-informed decisions across a wide range of activities and functions. Data on customers’ buying preferences can help to inform new marketing campaigns and product development decisions, for example, while operational data can be used to improve process efficiency and management and to support everything from resource allocation to employee training and sustainability initiatives.

Conversely, inaccurate data can negatively impact the decision making process, reduce visibility of new business opportunities, and affect customer experiences and relationships. In the most severe cases, customers may be locked out of their accounts, unable to access their funds, or otherwise impacted in catastrophic ways, leading to lost revenues and irreparable reputational damage.

To ensure that organisations can fully trust their data for business decision making, a dynamic and automated reconciliation process needs to be performed regularly as part of a company’s business-as-usual activity, configured according to the business’s specific requirements.

When is data reconciliation needed?

Whether data resides in multiple systems in multiple formats across multiple business divisions, or whether it’s being migrated from a ‘source’ system to a new ‘target’ system or database, there are multiple opportunities for errors and inaccuracies to be introduced. Data reconciliation helps to verify where data is missing, duplicated, incorrect, or where formatting errors exist, in three key scenarios:

1. Data accuracy and consistency during data migrations

Data reconciliation allows organisations to identify and fix omissions or errors that occur when data is migrated between systems. This is achieved by checking that the data held in the original source system and the data in the target system after processing are the same, or differ only according to a set of pre-defined and understood rules.

Different technologies and approaches are used to check the consistency of data after migration to a new target system. These often look at the total number of columns and rows migrated, and ensure that totals – such as the total value of sales for the year – are consistent between the original data and the target system. Other technologies, such as checksums, can be used to compare small blocks of data and to check for inconsistencies between systems.
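
A minimal sketch of this kind of ‘control total’ check is shown below, assuming hypothetical order records exported from the source and target systems; a real migration would run similar comparisons for every table and key measure:

```python
# A minimal sketch of an aggregate ("control total") check after migration.
# The column names and values are hypothetical.
source_orders = [
    {"order_id": "A-100", "net_value": 125.00},
    {"order_id": "A-101", "net_value": 89.50},
    {"order_id": "A-102", "net_value": 42.75},
]
target_orders = [
    {"order_id": "A-100", "net_value": 125.00},
    {"order_id": "A-101", "net_value": 89.50},
    {"order_id": "A-102", "net_value": 42.75},
]

def control_totals(rows):
    """Record count plus the summed order value, rounded to two decimal places."""
    return len(rows), round(sum(r["net_value"] for r in rows), 2)

if control_totals(source_orders) == control_totals(target_orders):
    print("Control totals match: this table can be signed off for go-live")
else:
    print("Discrepancy found: investigate before go-live")
```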

These kinds of checks and data migration reconciliation best practices provide organisations with the confidence to take new systems live, safe in the knowledge that all data has been migrated from the source system correctly and accurately.

2. General data quality and consistency assessments in business-as-usual (BAU) scenarios

The second key scenario for a data reconciliation process is checking data accuracy and consistency on an organisation-wide basis.

In this context, a range of checks are used periodically to ensure that data is, and remains, consistent across multiple business divisions and systems. These checks will report discrepancies found in records where further cleansing and validation of data is required.

This kind of reconciliation process might be used to ensure that customer details are correct in the Customer Relationship Management (CRM) system and billing systems, or that product pricing tallies in the product catalogue, on the e-commerce platform and in other related systems, for example. In many cases, data reconciliation tools use a set of reference data to ensure consistency of information across multiple systems and/or business divisions.
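
The sketch below illustrates this kind of cross-system check, under the assumption that the CRM acts as the reference (system of record) for customer email addresses; the customer IDs and records are hypothetical:

```python
# A minimal sketch of a BAU consistency check between two systems.
# The record layouts are hypothetical; the CRM is treated here as the
# system of record for customer email addresses.
crm = {
    "C001": {"email": "anna@example.com"},
    "C002": {"email": "ben@example.com"},
}
billing = {
    "C001": {"email": "anna@example.com"},
    "C002": {"email": "ben@oldmail.example"},   # stale contact detail
    "C003": {"email": "carla@example.com"},     # missing from the CRM
}

discrepancies = []
for customer_id, billing_record in billing.items():
    crm_record = crm.get(customer_id)
    if crm_record is None:
        discrepancies.append((customer_id, "present in billing, missing from CRM"))
    elif crm_record["email"] != billing_record["email"]:
        discrepancies.append((customer_id, "email differs between CRM and billing"))

for customer_id, issue in discrepancies:
    print(customer_id, "->", issue)
```

Each discrepancy can then be routed for cleansing and validation in whichever system holds the out-of-date or missing value.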

Importantly, BAU data checks need to be conducted continually as data is frequently updated in multiple systems and the opportunities to introduce errors or inconsistencies into the data are ongoing.

This kind of data reconciliation method is also invaluable for organisations growing inorganically through mergers and acquisitions, where custom data integrations have been implemented to connect systems in different areas of the business, or where offline processes have been used to move data between systems. In these scenarios, data reconciliation can support techniques such as gross error detection (GED), which is used to identify possible systematic errors in measurements or data.
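
By way of illustration only, the sketch below uses a robust median-based outlier test as a simplified stand-in for formal gross error detection, flagging a reading that sits implausibly far from the rest of a hypothetical measurement series:

```python
import statistics

# A simplified stand-in for gross error detection: flag readings that sit an
# implausible distance from the median of the series. The readings and the
# threshold of five median absolute deviations are illustrative assumptions.
meter_readings = [101.2, 99.8, 100.5, 100.1, 250.0, 100.3, 99.9]

median = statistics.median(meter_readings)
mad = statistics.median([abs(r - median) for r in meter_readings])

for position, reading in enumerate(meter_readings):
    if abs(reading - median) > 5 * mad:
        print(f"possible gross error at position {position}: {reading}")
```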

3. Complex financial services scenarios

Data reconciliation in financial services requires specific or specialised functionality due to fluctuations in macro-economic factors (such as inflation), and because of other variables, including the currency of the original transaction (currency conversions) and tolerances for rounding errors and margins.

All of this means that data validation and reconciliation in financial services requires additional functionality beyond comparing data from multiple systems. To meet these needs, additional algorithms or business logic can be overlaid to ensure that all variables are accounted for, and that data is formatted in a way that streamlines monthly, quarterly or annual financial and accounting reporting.
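
As an illustration, the sketch below layers a simple currency conversion and a rounding tolerance on top of a basic value comparison. The exchange rate, tolerance and transaction amounts are hypothetical:

```python
from decimal import Decimal

# A minimal sketch of a tolerance-based financial reconciliation check.
# The FX rate, tolerance and transaction values are illustrative assumptions.
FX_RATE_EUR_TO_GBP = Decimal("0.8450")
TOLERANCE_GBP = Decimal("0.01")          # allow a one-penny rounding margin

source_transactions = {                  # original amounts, in EUR
    "TX-1": Decimal("150.00"),
    "TX-2": Decimal("99.99"),
}
ledger_entries = {                       # booked amounts, in GBP
    "TX-1": Decimal("126.75"),
    "TX-2": Decimal("84.50"),
}

for tx_id, amount_eur in source_transactions.items():
    expected_gbp = (amount_eur * FX_RATE_EUR_TO_GBP).quantize(Decimal("0.01"))
    booked_gbp = ledger_entries[tx_id]
    difference = abs(expected_gbp - booked_gbp)
    status = "reconciled" if difference <= TOLERANCE_GBP else "breaks tolerance"
    print(f"{tx_id}: expected {expected_gbp}, booked {booked_gbp} -> {status}")
```

Here the one-penny tolerance absorbs the rounding difference on the second transaction, so only genuine breaks would be reported for investigation.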

What are the top four data reconciliation challenges?

To reconcile data across multiple systems, many organisations develop customised tools based on mathematical algorithms, usually in SQL or a similar environment. But doing this is costly and time-consuming. It also means that business decision makers need support from developers or other technical team members to understand and act on data inconsistencies or other data quality issues.

In an attempt to overcome these challenges, businesses have the option to work with ‘out-of-the-box’ data reconciliation tools. Unfortunately, these frequently require customisation involving additional coding work, as well as excessive processing power and storage resources to function correctly, making them impractical for some projects and business needs. The use of these ‘out-of-the-box’ data reconciliation tools often leads to the following challenges:

  1. No single version of the truth
    Most businesses have different versions of the same datasets in multiple systems across the business, with no way to check which information is current and accurate.
  2. Complex, expensive solution development
    In-house solutions are typically expensive and require significant internal technical capabilities.
  3. Limited data reconciliation insights
    Custom data reconciliation solutions can only be operated and used by technical team members, making it hard for business decision makers to access insights on data quality and consistency.
  4. Poor connectivity
    Little or no connectivity between siloed systems and data sources can result in data inaccuracies and inconsistencies across systems.

How can we help?

Our solutions allow you to analyse large volumes of information quickly and efficiently to identify errors and inconsistencies between data held in different systems across your business. Critically, you can spot and prevent errors, such as formatting mistakes and changes to customer details, and take immediate action to ensure data consistency.

There’s no need for custom coding or ‘black box’ solutions that restrict data insights to technical teams. Instead, all decision makers – including business decision makers – can access data reconciliation checks with minimal training and effort, increasing organisation-wide trust in data and effectively supporting your mission-critical decisions, processes and activities.

As an additional benefit, our solutions connect easily with your systems and workflows, making it fast and simple to improve accuracy and consistency across all your corporate information assets. They’re also highly scalable, making them ideal for even the largest data migration projects and BAU data reconciliation scenarios.

Take the next step on your data reconciliation journey

Our Aperture Data Studio platform is designed to help you tackle your most pressing data quality challenges, from gaining a single customer view to improving operational data quality, assisting with data migrations and enhancing accuracy in compliance reporting.

Aperture Data Studio combines self-service data quality and globally curated data sets in an intelligent data quality and enrichment platform. It empowers modern data practitioners to rapidly build the most consistent, accurate, and holistic view of their customers.

Get in touch

We can help you to quickly profile, standardise, cleanse, transform and enrich your data with Aperture Data Studio.

Let's talk
