Data migration is the process of moving data from one location to another, one format to another, or one application to another. Generally, this is the result of introducing a new system or location for the data. The business driver is usually an application migration or consolidation in which legacy systems are replaced or augmented by new applications that will share the same dataset. These days, data migrations are often started as firms move from on-premises infrastructure and applications to cloud-based storage and applications to optimize or transform their company.
The short answer is "data gravity." Although the concept of data gravity has been around for some time, the challenge is becoming more significant because of data migrations to cloud infrastructures. In brief, data gravity is a metaphor that describes:
To move applications and data to more advantageous environments, Gartner recommends "disentangling" data and applications as a means of overcoming data gravity. By taking time at the beginning of the project to sort out data and application complexities, firms can improve their data management, enable application mobility, and improve data governance.
The main issue is that every application complicates data management by introducing elements of application logic into the data management tier, and each one is indifferent to the next data use case. Business processes use data in isolation and then output their own formats, leaving integration for the next process. Therefore, application design, data architecture, and business processes must all respond to each other, but often one of these groups is unable or unwilling to change. This forces application administrators to sidestep ideal and simple workflows, resulting in suboptimal designs. And, although the workaround may have been necessary at the time, this technical debt must eventually be addressed during data migration or integration projects.
Given this complexity, consider promoting data migration to "strategic weapon" status so that it gets the right level of awareness and resources. To ensure that the project gets the attention it needs, focus on the most provocative element of the migration – the fact that the legacy system will be turned off – and you’ll have the attention of key stakeholders, guaranteed.
There are numerous business advantages to upgrading systems or extending a data center into the cloud. For many firms, this is a very natural evolution. Companies using cloud are hoping that they can focus their staff on business priorities, fuel top-line growth, increase agility, reduce capital expenses, and pay for only what they need on demand. However, the type of migration undertaken will determine how much IT staff time can be freed to work on other projects.
First, let’s define the types of migration: storage migration (moving data to new storage hardware or to cloud storage), database migration (moving data to a new or upgraded database engine), application migration (moving an application and its data to a new vendor or environment), and cloud migration (moving data, applications, or an entire infrastructure to a cloud environment, or from one cloud to another).
Data migration involves 3 basic steps: extracting the data from its source, transforming it so that it fits the format and requirements of the target system, and loading it into that target system.
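To make those three steps concrete, here is a minimal sketch in Python. The file names, field names, and transformation rules (legacy_customers.csv, CUST_NO, and so on) are hypothetical placeholders, not part of any particular tool; in a real project they come from the mapping and validation requirements of the target system.

```python
import csv
import json

def extract(path):
    """Step 1: Extract records from the legacy source (a CSV export here)."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def transform(records):
    """Step 2: Transform records to fit the target schema (hypothetical field mapping)."""
    transformed = []
    for row in records:
        transformed.append({
            "customer_id": int(row["CUST_NO"]),           # rename and retype
            "full_name": row["NAME"].strip().title(),     # basic cleansing
            "country": row.get("COUNTRY", "US") or "US",  # fill a default value
        })
    return transformed

def load(records, path):
    """Step 3: Load the transformed records into the target (a JSON file here)."""
    with open(path, "w") as f:
        json.dump(records, f, indent=2)

if __name__ == "__main__":
    source_rows = extract("legacy_customers.csv")
    target_rows = transform(source_rows)
    load(target_rows, "migrated_customers.json")
    # A simple reconciliation check: source and target record counts should match.
    assert len(source_rows) == len(target_rows)
```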
Moving important or sensitive data and decommissioning legacy systems can put stakeholders on edge. Having a solid plan is a must; however, you don’t have to reinvent the wheel. You can find numerous sample data migration plans and checklists on the web. For example, Data Migration Pro, a community of data migration specialists, has a comprehensive checklist that outlines a 7-phase process:
This may appear to be an overwhelming amount of work, but not all these steps are needed for every migration. Each situation is unique, and each company approaches the task differently.
Even though data migration has been a fact of IT life for decades, horror stories are still reported every year. Here are the top 10 challenges that firms encounter in moving data:
Not contacting key stakeholders. No matter the size of the migration, there is someone, somewhere who cares about the data you’re moving. Track them down and explain the need for this project and the impact on them before you get going on the task. If you don’t, you’ll certainly hear from them at some stage, and chances are good that they’ll disrupt your timeline.
Not communicating with the business. Once you’ve explained the project to the stakeholders, be sure to keep them informed of your progress. It’s best to provide a status report on the same day every week, especially if things get off track. Regular communication goes a long way in building trust with all those affected.
Lack of data governance. Be sure you’re clear on who has the rights to create, approve, edit, or remove data from the source system, and document that in writing as part of your project plan.
Lack of expertise. Although this may seem like a straightforward task, there's a lot of complexity involved in moving data. Having an experienced professional with excellent references helps the process go smoothly.
Lack of planning. On average, families spend 10 to 20 hours planning their vacation, while IT teams may spend as little as half that time planning a small data migration. Hours spent planning don't always guarantee success, but having a solid data migration plan does save hours when it comes to actually moving the data.
Insufficient data prep software and skills. If this is a large migration (millions of records or hundreds of tables), invest in first-class data quality software and consider hiring a specialist firm to assist. Good news: An outside firm will probably rent you the software to help conserve costs.
Waiting for perfect specs for the target. If the implementation team is sorting out design criteria, press on with steps 2 and 3. Target readiness will matter later in the project, but don’t let it stop you now.
Unproven migration methodology. Do some research to be sure that the data movement procedure has worked well for other firms like yours. Resist the temptation to just accept the generic procedure offered by a vendor.
Supplier and project management. Vendors and projects must be managed. If you're still doing your day job too, be sure that you have the time to manage the project and any related suppliers.
Cross-object dependencies. Even with the technology and capabilities of today's data management tools, it's still shocking to learn about a dependent dataset that wasn’t included in the original plan. Because cross-object dependencies are often not discovered until very late in the migration process, build in a contingency for them so that your entire delivery date isn’t thrown off; a simple pre-migration dependency check, like the one sketched below, can also catch them early.
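As a rough illustration of that kind of check, the sketch below flags one hypothetical dependency before migration: orders that reference customers not included in the migration scope. The dataset and field names are invented for the example; the same idea applies to any parent-child relationship in your data.

```python
def find_missing_dependencies(orders, customers):
    """Return order IDs that reference a customer not included in the migration scope."""
    migrated_customer_ids = {c["customer_id"] for c in customers}
    return [o["order_id"] for o in orders
            if o["customer_id"] not in migrated_customer_ids]

# Hypothetical datasets already extracted from the source system.
orders = [
    {"order_id": 1, "customer_id": 100},
    {"order_id": 2, "customer_id": 999},  # depends on a customer outside the plan
]
customers = [{"customer_id": 100, "full_name": "Ada Lovelace"}]

orphans = find_missing_dependencies(orders, customers)
if orphans:
    print(f"Orders with unmigrated customer dependencies: {orphans}")
```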
The terms data migration and data conversion are sometimes used interchangeably on the internet, so let’s clear this up: They mean different things. As pointed out earlier, data migration is the process of moving data between locations, formats, or systems. Data migration includes data profiling, data cleansing, data validation, and the ongoing data quality assurance process in the target system. In a typical data migration scenario, data conversion is only the first step in a complex process.
The term data conversion refers to the process of transforming data from one format to another. This is necessary when moving data from a legacy application to an upgraded version of the same application or an entirely different application with a new structure. To convert it, data must be extracted from the source, altered, and loaded into the new target system based on a set of requirements.
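As a small, hedged illustration, the sketch below converts one hypothetical legacy record into a new structure: it renames fields, reformats a date, and maps a legacy status code to the value the target expects. The field names and mapping rules are invented for the example; real conversion rules come from the target system's requirements.

```python
from datetime import datetime

# Hypothetical conversion rules: the target system expects ISO dates and
# descriptive status values instead of the legacy single-letter codes.
STATUS_MAP = {"A": "active", "I": "inactive", "P": "pending"}

def convert_record(legacy):
    """Convert one legacy record into the structure the target system expects."""
    return {
        "account_id": legacy["ACCT_ID"],
        "opened_on": datetime.strptime(legacy["OPEN_DT"], "%m/%d/%Y").date().isoformat(),
        "status": STATUS_MAP[legacy["STATUS_CD"]],
    }

legacy_record = {"ACCT_ID": "A-1001", "OPEN_DT": "07/04/2019", "STATUS_CD": "A"}
print(convert_record(legacy_record))
# {'account_id': 'A-1001', 'opened_on': '2019-07-04', 'status': 'active'}
```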
Another term that is sometimes confused with data migration is data integration. Data integration refers to the process of combining data residing in different sources to provide users with a unified view of all the data. Integrating data from multiple sources is essential for data analytics. Examples of data integration include data warehouses, data lakes, and NetApp® FabricPool, which automates data tiering between on-premises data centers and the cloud, or between AWS EBS block storage and AWS S3 object stores.
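For a small-scale illustration of the idea (not of FabricPool itself, which operates at the storage layer), the sketch below combines two hypothetical sources, a CRM export and billing totals, into one unified view keyed on a shared customer ID.

```python
# Hypothetical sources: customer profiles from a CRM export and lifetime order
# totals from a billing system, combined into one unified view per customer.
crm_customers = [
    {"customer_id": 100, "full_name": "Ada Lovelace"},
    {"customer_id": 101, "full_name": "Alan Turing"},
]
billing_totals = [
    {"customer_id": 100, "lifetime_value": 1250.00},
    {"customer_id": 101, "lifetime_value": 430.50},
]

def integrate(customers, totals):
    """Merge the two sources into a single view, one record per customer."""
    totals_by_id = {t["customer_id"]: t for t in totals}
    return [{**c, **totals_by_id.get(c["customer_id"], {})} for c in customers]

for row in integrate(crm_customers, billing_totals):
    print(row)
```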
Move to Infrastructure as a Service (IaaS): Applications and data are rehosted on cloud compute and storage with little or no change to the applications themselves (often called "lift and shift"), so the migration effort centers on moving data and virtual machines onto the new infrastructure.
Move to Platform as a Service (PaaS): Applications are refactored or rebuilt on a cloud provider's managed platform services, which usually means converting data into the formats and structures those services expect in addition to moving it.
Choosing a deployment model that aligns with business requirements is essential to ensuring that a data migration goes smoothly and delivers business value in terms of performance, security, and ROI.