Aug 2017 | Data Quality

Data migration

In the recent Data Migration Research Study (carried out by Data Migration Pro in partnership with Experian) we took a detailed look at what’s happening in today’s data migration space.

I’ve been exploring some key observations in my ongoing blog series and today I’d like to look at a particularly important factor – reliance on the ‘Waterfall Method’ of software development.

This quote from one respondent summed up the experiences of many people:

“[the customer] saw the data migration project as a “normal” development project so used their existing software development framework. They didn’t follow the proposed data migration strategy because they didn’t get that this was not about creating software but transforming data.”

To be fair, it’s understandable for project managers to adopt a Waterfall approach: after all, a substantial element of the project requires developing some form of logic to manage the data transformation. The classic Analyse, Design, Build, Test and Release sequence therefore appears to be a natural fit.

Why doesn’t the Waterfall Method work for data migration?

The problem is that data migration projects are more complicated (and risky) than most people realise: at the start, you simply don’t have enough cast-iron facts and accurate requirements to produce an end-to-end design for a Waterfall migration.

To reduce the level of risk, the Data Migration Research Study found that 62% of data migration projects are now using a phased approach to delivery.

Phasing a project means breaking it up into manageable data delivery iterations, instead of a one-off ‘Big-Bang’ movement of data that puts increased strain and risk on the business.
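To make the contrast concrete, here is a minimal sketch of what a phased delivery loop might look like in code. The helper names (`validate`, `migrate_in_phases`) and the per-batch quality gate are purely illustrative assumptions, not part of any specific migration toolset:

```python
def validate(batch):
    # Hypothetical quality gate: reject records missing a customer id.
    return [rec for rec in batch if rec.get("customer_id")]

def migrate_in_phases(source_records, batch_size=2):
    """Move data in small iterations, validating each batch before load."""
    migrated = []
    for start in range(0, len(source_records), batch_size):
        batch = source_records[start:start + batch_size]
        clean = validate(batch)
        # In a real project each iteration would load to the target system,
        # be reconciled, and feed lessons back into the next cycle.
        migrated.extend(clean)
    return migrated

source = [
    {"customer_id": 1, "name": "Ada"},
    {"customer_id": None, "name": "Unknown"},
    {"customer_id": 3, "name": "Grace"},
]
result = migrate_in_phases(source)
```

The point of the loop is that problems (like the record with no customer id) surface in a small, recoverable batch rather than at the end of one monolithic load.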

As I mentioned earlier, another challenge with Waterfall project structures is the assumption that all requirements are fully documented at the outset.

While some technical elements can be specified explicitly early on, there is always a significant amount of uncertainty around the quality of data, target platform selection, change of business model and other factors. These grey areas mean you can’t specify (with a high degree of accuracy) what your end-to-end requirements will be.
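One practical way to shrink those grey areas early is simple data profiling. The sketch below, using illustrative sample records and a hypothetical `profile_completeness` helper, shows how a quick completeness check can reveal how much of the source data is actually fit for the design you have in mind:

```python
from collections import Counter

def profile_completeness(records, fields):
    """Return, per field, the fraction of records with a non-empty value."""
    totals = Counter()
    for rec in records:
        for field in fields:
            if rec.get(field) not in (None, ""):
                totals[field] += 1
    return {f: totals[f] / len(records) for f in fields}

sample = [
    {"email": "a@example.com", "phone": ""},
    {"email": "", "phone": "555-0100"},
    {"email": "c@example.com", "phone": "555-0101"},
]
scores = profile_completeness(sample, ["email", "phone"])
```

A profile like this won’t remove the uncertainty, but it turns “we don’t know the quality of the data” into measurable numbers you can plan iterations around.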

A sketchy requirements phase can often lead to incorrect assumptions being made about the scope and design of the project (not to mention the wrong budget and resourcing allocation being set).

Why should you adopt a more Agile delivery method?

A smarter approach is to work in cycles, using a more Agile delivery method.

By delivering ‘early-and-often’ with a leaner, Agile approach, as opposed to ‘late-and-once’ with a static Waterfall strategy, you stand a much greater chance of delivering what the business needs.

The reality is that you can’t map out the entire process plan from cradle to grave, so moving forward ‘through the fog’ in high-value cycles of delivery is an excellent way to reduce risk and costs.

The delivery method you choose is just one of a multitude of factors when it comes to successfully negotiating your way through the migration process. Check out my other blogs for further insights into what’s going on in the world of data migration.

To get the latest research and learn more about leading a data migration, visit the Data Migration Leadership Hub.