Data quality management for SMEs – Part 2
Posted by Experian
Estimated read time: 5 mins
5 main points to achieve higher levels of data quality
Data quality begins at source, so establishing best practice around verification at the point of capture is key.
- Discuss, create and finalise a business strategy: What are you trying to achieve as a business? Work with all relevant teams to establish what the data can be used for and how. This step is essential and shapes the way the rest of the data quality management programme works.
- Data capture: After the business has established what it wants to do and why, the data to be captured needs to be defined. The business needs to be sure that it is capturing information that will ultimately lead to better decision making; collecting only what is needed keeps the business working efficiently.
- Humans: Human error and deliberate human actions can fill a database with inaccurately recorded data. The element of human error needs to be identified and reduced, and this is only achievable by undertaking a series of internal audits and observations. Create procedures for data capture and incorporate training on those procedures into your induction and development plans.
- Machines: The software you use may be relatively dated but still does the job. However, some simple rules can be put in place to make the system more effective: decide which fields are essential, make them mandatory at entry, and apply basic format checks before a record is saved.
- Hygiene at point of capture: Even with the necessary controls in place, bad data will still get through. Bad addresses never seem to go away, but there are a few free and low-cost solutions available in the marketplace to support your data quality endeavours.
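The "Machines" and "Hygiene" points above can be sketched as a simple point-of-capture rule set. This is a minimal Python illustration; the field names, the rules themselves and the `validate_record` helper are hypothetical examples, not part of any particular product:

```python
import re

# Hypothetical rule set: which fields are mandatory at entry
# and what a "valid" value looks like for each.
RULES = {
    "name": lambda v: bool(v.strip()),
    "email": lambda v: re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v) is not None,
    "postcode": lambda v: bool(v.strip()),
}

def validate_record(record):
    """Return the list of fields that fail the capture rules."""
    return [field for field, check in RULES.items()
            if not check(record.get(field, ""))]

# A record with a malformed email fails on exactly that field.
validate_record({"name": "Jo Smith", "email": "jo@example", "postcode": "AB1 2CD"})
```

Rejecting (or flagging) a record the moment a rule fails is far cheaper than cleaning it out of the database later.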
Duplication – free: Your database more than likely has the capability to identify duplicates, and this can be done when the user tries to insert a new customer. Duplicates should be checked for at the point of capture, before new records are transferred into the database. Automation is the key to performing duplication checks.
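A minimal sketch of an automated duplicate check at point of capture, assuming records are simple dictionaries; the `normalise` match key (lower-cased name plus email) is an illustrative choice, and real-world matching is usually fuzzier than this:

```python
def normalise(record):
    """Build a simple match key: lower-cased name and email, whitespace collapsed."""
    return (" ".join(record["name"].lower().split()),
            record["email"].strip().lower())

def is_duplicate(new_record, existing_records):
    """True if the new record's match key already exists in the database."""
    keys = {normalise(r) for r in existing_records}
    return normalise(new_record) in keys

existing = [{"name": "Jo  Smith", "email": "JO@EXAMPLE.COM"}]
is_duplicate({"name": "jo smith", "email": "jo@example.com"}, existing)  # True
```

Running a check like this before the insert, rather than deduplicating afterwards, is what "automation at point of capture" means in practice.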
Verification – low cost: Verifying the correct postal address at point of capture is easily achievable and goes a long way towards ensuring that the data you put into your database is not garbage. These services are available to bring in-house, but are also accessible as an online service, e.g. Address Validation – Pro.
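To illustrate the kind of check involved, here is a rough UK postcode format test in Python. This is only a sketch: a format check like this catches obvious garbage but is no substitute for a full address-validation service, which verifies addresses against a live reference file:

```python
import re

# Rough UK postcode shape: 1-2 letters, a digit, an optional letter/digit,
# then the "inward" part (digit plus two letters). Format only, not existence.
POSTCODE = re.compile(r"^[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}$", re.IGNORECASE)

def looks_like_postcode(value):
    """True if the value is shaped like a UK postcode."""
    return POSTCODE.match(value.strip()) is not None
```

A check like this belongs at the data-entry form, where the customer or operator can still correct the value.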
How to avoid data degradation
Data can sit in the database for months, years and even decades. It sometimes doesn’t see the light of day and has no associated value. The customer may have lapsed many years ago, but businesses still hold this data in the hope that it will someday be of some use. However, that day will probably never come, which raises the question: why keep it?
This is where the phenomenon of data hoarding comes in. Collecting masses of information may not cost the business much by way of operational costs, but the real value of the data can become lost within the overwhelming volumes the business has decided to keep.
Data continually gets pumped into the database, and every so often the business may decide to send out a mailshot to its entire base, hoping that someone will respond. Unsurprisingly, the response rates and ROI will be very low. It is essential for businesses to recognise the value of their data: however good your point-of-capture processes are, data will degrade very fast.
Around 18,000 people move house every day. Thirty per cent of people change email address every year. Consumers specify Mailing and Telephone preferences, and the number of deceased contacts grows daily. All businesses must be proactive in the maintenance of their data to extend the data lifecycle, and to ensure that brand reputation is protected and money is saved.
The maintenance of data isn’t something that occurs only before you use it for marketing purposes; it encompasses the processes both before and after the data is used. Processes need to be in place so that all changes and updates are fed back into the database to ensure accuracy and consistency.
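The feedback loop described above can be sketched as a small update routine; the record layout and the `apply_feedback` helper are hypothetical illustrations of feeding post-campaign changes (returned mail, opt-outs, corrected addresses) back into the master records:

```python
def apply_feedback(database, feedback):
    """Apply post-campaign updates to the master records.

    database: dict keyed by customer id, each value a dict of fields.
    feedback: list of (customer_id, field, new_value) tuples, e.g. an
    opt-out flag or a corrected address coming back from a mailing.
    """
    for customer_id, field, value in feedback:
        if customer_id in database:  # ignore feedback for unknown customers
            database[customer_id][field] = value
    return database
```

The key point is that the loop is routine and automated: every campaign produces feedback, and every piece of feedback updates the database rather than sitting in a spreadsheet.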
In Part 3, I will look at 7 simple steps to help extend data life and retain data worth.
Did you miss Part 1? Read it here.