Migrating Data from Legacy Application to ServiceNow

Kamal17
Kilo Sage

In implementations where clients replace their legacy systems with ServiceNow, we often come across requests for data migration. Migrating transactional data such as incidents, tasks, and cases is generally not recommended, but there are exceptional situations where we need to migrate it for various reasons.

 

In this article, I'll share one of my experiences where my team was tasked with such a migration. Though I had done migrations in the past, this one was considerably larger. We had an awesome team who went above and beyond to make this migration happen and meet customer expectations. This article covers the overall solution without going too deep into the technical details.

 

Requirement:

What?
Migrate over 150GB of transactional data, and some foundation data, from the legacy system to ServiceNow (on-prem setup).

 

Why?
Primarily, the customer support operations team refers to this data for their day-to-day work. There were also secondary reasons such as auditing and data retention requirements. Moreover, storing this data in an external data repository would make it difficult to access while supporting customers.

The point is, there was a strong business justification for this migration, and it was a significant part of the project.

 

Before I go into how we did this migration, I would like to highlight a few key items that influenced our migration strategy and solution.

 

Considerations:
- Ease of access: all data should reside in ServiceNow.
- Migrated data should not result in significant performance issues.
- The OOTB CSM setup should not be customized.

 

Challenges/Issues:

We faced several challenges during this migration; I have covered some of the key ones in this article.

1. A direct connection between the legacy system and ServiceNow was not allowed for security reasons.
2. Sensitive information like Social Security numbers could not be stored in ServiceNow, yet it was used as the unique identifier for a couple of tables, so we had to find alternatives.
3. The data model in the legacy application was different and heavily customized, so we had to find a way to store the legacy data without customizing the OOTB tables.
4. We had to import all necessary data while minimizing the number of custom tables used for the migration.
5. There were over 7 million cases; a trial run to import the data caused the process to hang.
6. A MID Server could not be used due to its limits on the data import size; as a result, we could not use integrated Windows authentication.
7. Open cases from the legacy application had to be converted into ServiceNow cases. On top of that, legacy groups and group hierarchies were different from the new ones, so we had to map old data to new data during the migration while converting the open cases.

 

Solution:

Now let's look at the five steps we followed to complete this migration.

 

How?

 

Step 1: Assess the source data

Finalize tables - This is key to a successful migration. We finalized the tables by working with the customer to identify their business needs and determine which table data was required to meet the customer support team's operational needs. For example, we selected tables that stored case data, customer data, groups, choice values, and a few others.

 

Finalize fields - Since the legacy application was heavily customized, some tables had over a hundred fields, so we had to work closely with the team currently using the application to identify only the necessary fields. The number of fields included in the migration has a significant impact on system performance, both during the migration and afterwards when the team starts using the system, and it also has a major impact on the time taken to complete the migration.

 

This is a time-consuming activity, so make sure to call it out in advance and identify a POC from the team that built the legacy system to share relevant details during the initial phase. In our case, we didn't have a POC from that team, so we worked with the support team, who had limited knowledge. This resulted in more work for our team and consumed additional time, because in some cases we had to go through the legacy application's product documentation to understand the data model and the purpose of each field before deciding whether it was needed for the migration.

If step 1 is completed correctly, it can avoid a lot of challenges during the migration.

 

Step 2: Clean the data at source

This is something we strongly recommended the customer do before migrating the data, because during our initial assessment we learned that some tables (e.g., consumer/customer profiles) had a lot of duplicates, which was impacting their overall reporting accuracy. However, the customer didn't have the resources or time to do the cleanup and instead asked our team to handle it during the migration.

 

To handle the cleanup, our approach was to identify the unique identifiers for each table and then merge the duplicate records. Some tables, however, had no unique identifier, so we identified combinations of fields to detect duplicates instead (a sketch of this check is shown below). This approach didn't solve the duplicate problem 100% due to data discrepancies, but it helped us remove a significant portion of the duplicate records during the migration.
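To give a rough idea of this kind of check (not our actual script), a duplicate scan based on a combination of fields can be run as a background script using GlideAggregate. The table and field names below are placeholders, not the real ones from the project:

```javascript
// Sketch: find duplicate consumer records by a combination of fields.
// 'u_legacy_consumer' and its fields are placeholder names for illustration.
var dupKeys = [];
var ga = new GlideAggregate('u_legacy_consumer');
ga.addAggregate('COUNT');
ga.groupBy('u_first_name');
ga.groupBy('u_last_name');
ga.groupBy('u_email');
ga.query();
while (ga.next()) {
    if (parseInt(ga.getAggregate('COUNT'), 10) > 1) {
        // This field combination occurs more than once - flag it for merge/cleanup
        dupKeys.push(ga.getValue('u_first_name') + '|' +
                     ga.getValue('u_last_name') + '|' +
                     ga.getValue('u_email'));
    }
}
gs.info('Duplicate key combinations found: ' + dupKeys.length);
```

The flagged combinations can then be reviewed and merged, either in the staging tables or in the target tables after import.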

 

So, as part of step 2, clean the data at the source. If that's not possible, at least finalize the fields that can be used as unique identifiers so the duplicates can be cleared during the import or after the data has been imported into ServiceNow.

 

Step 3: Finalize the migration approach

We preferred a JDBC import to connect directly to the database and pull data from the legacy application. There was no development team available for the legacy application to make the changes needed for a web service based setup, so web service imports were never an option for us.


However, for security reasons, ServiceNow was not allowed to connect to the legacy application database directly. To address this, we set up an intermediary database to which a backup of the legacy application database was restored. We then connected to this intermediary database via JDBC and imported the data.


Some PII had to be masked before it was imported into ServiceNow, so we masked the fields containing PII in the intermediary database before starting the import.

 

Step 4: Prepare ServiceNow to store the incoming data

Our key objective was to ensure we didn't end up customizing the OOTB CSM data model. So, as part of data modeling, we grouped the tables chosen for migration into three groups.


1. OOTB tables - Non-transactional data like consumer/customer information was stored in OOTB tables, because the data model in the legacy application and ServiceNow matched for some of the tables. Only a couple of custom fields were created in these tables.


2. Custom tables - For transactional data like cases, we created custom tables, for two key reasons:
              1. The data model in the legacy application was heavily customized and didn't match ServiceNow's data model.
              2. Storing over 7 million case records in the ServiceNow CSM case table on day 1 would have an impact on performance.
So, we decided to keep the transactional data in custom tables. Our plan is also to drop/delete these tables once the retention period is over.


3. Transition tables - These are temporary custom tables created in ServiceNow to store migrated data such as choice lists, categories, and similar dependent table data needed by the primary tables like customer and case. In the legacy application, fields in the primary tables referring to these secondary tables behaved like reference fields and stored only the sys_id, not the label. We therefore had to migrate these tables into ServiceNow and later run a transformation script to replace the sys_ids with labels (a simplified sketch is shown below). After the migration, we deleted these transition tables as they were no longer needed. In this way, we ensured only a limited number of custom tables were created for the migration.
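Conceptually, that sys_id-to-label transformation looked something like the sketch below. The table and column names ('u_legacy_case', 'u_legacy_category', 'u_category', etc.) are hypothetical stand-ins; the actual names and the number of fields handled were project specific:

```javascript
// Sketch: replace legacy sys_ids stored on the migrated case table with the
// labels held in a transition table, then discard the transition table later.
var caseGr = new GlideRecord('u_legacy_case');
caseGr.query();
while (caseGr.next()) {
    var lookup = new GlideRecord('u_legacy_category');          // transition table
    if (lookup.get('u_legacy_sys_id', caseGr.getValue('u_category'))) {
        caseGr.u_category = lookup.getValue('u_label');          // swap sys_id for label
        caseGr.setWorkflow(false);                                // skip BRs/notifications during migration
        caseGr.update();
    }
}
```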

 

Note: 
Tables storing migrated transactional data were set to read-only for everyone, including system administrators.
We also configured controls (e.g., before insert/update business rules) to prevent the creation of duplicate entries based on the unique identifiers identified for specific tables, e.g., the consumer/customer profile tables (a sketch of such a rule is shown below).
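As an illustration, a before-insert business rule along these lines can abort duplicate inserts. The unique-identifier field 'u_legacy_customer_id' is a placeholder for whatever identifier applies to your table:

```javascript
// Before-insert business rule (sketch) on a consumer/customer profile table.
(function executeRule(current, previous /*null when async*/) {
    var existing = new GlideRecord(current.getTableName());
    existing.addQuery('u_legacy_customer_id', current.getValue('u_legacy_customer_id'));
    existing.query();
    if (existing.hasNext()) {
        gs.addErrorMessage('A record with this identifier already exists; insert aborted.');
        current.setAbortAction(true);   // prevent the duplicate from being created
    }
})(current, previous);
```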

 

Step 5: Configure, Migrate and Test

For the migration, we used a combination of default import sets and concurrent import sets, depending on the table data size. For the case table, we had to migrate 13 years of data, with each year having over 50K records and in some cases over a million. Import jobs got stuck when we did a trial run to evaluate the load-handling capacity of the system using a single data source to pull all the data.

 

So, we decided to configure multiple data sources to pull records sequentially. We configured the query for each data source so that it pulled around 100K records for optimal performance. We then grouped these data sources into scheduled data imports and executed them in sequence.


In some cases, to bring down the migration time, we executed import jobs on specific nodes in parallel. Since the PROD instance had 8 nodes, we used 4-5 of them to import different sets of case data by executing a different job on each node. This significantly reduced the migration time.

 

To shorten the maintenance window during go-live, our approach was to migrate the data in phases. As part of this, 1-2 weeks before go-live we took a full backup of the legacy application's PROD database, restored it to the intermediary database, imported it into ServiceNow, and completed the testing. Then, during the go-live cutover window, we took a fresh backup from the legacy application and restored it to the intermediary database, but this time we imported only the data that had been created or updated after the first import. This approach significantly reduced the workload and the maintenance window during go-live.

 

We also created a transformation script to convert open cases migrated from the legacy application into ServiceNow cases so that support could continue without interruption, and we made sure the migrated cases were linked to the converted ServiceNow cases (a simplified sketch is shown below).
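At a high level, the conversion worked along the lines of the sketch below. The custom table 'u_legacy_case', the group-mapping table 'u_group_mapping', and the custom fields are hypothetical names used only for illustration; the real script mapped many more fields and states:

```javascript
// Sketch: convert open migrated cases into ServiceNow CSM cases and link them back.
var legacy = new GlideRecord('u_legacy_case');
legacy.addQuery('u_state', 'open');            // only open legacy cases
legacy.addNullQuery('u_converted_case');       // not converted yet
legacy.query();
while (legacy.next()) {
    var csmCase = new GlideRecord('sn_customerservice_case');
    csmCase.initialize();
    csmCase.short_description = legacy.getValue('u_short_description');
    csmCase.description = legacy.getValue('u_description');

    // Map the legacy group to the new assignment group via a mapping table
    var map = new GlideRecord('u_group_mapping');
    if (map.get('u_legacy_group', legacy.getValue('u_group'))) {
        csmCase.assignment_group = map.getValue('u_new_group');
    }

    var newCaseId = csmCase.insert();
    legacy.u_converted_case = newCaseId;        // link migrated case to the new case
    legacy.update();
}
```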

 

As part of our testing strategy, we did:

  1. Preload test: this is critical for validating that the setup works and that all transformations happen as expected. During this preload test we imported small data sets, identified issues in advance, and fixed them promptly to avoid major rework after the full import.
  2. Count validation: we compared the number of records created in ServiceNow per table with the corresponding table in the legacy application (a simple version of this check is sketched after this list). For tables where we did the duplicate clean-up, there was a difference in the count, which was expected.
  3. Sampling validation: we selected random samples from different years and compared the values with the legacy application.
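On the ServiceNow side, the count validation can be as simple as a GlideAggregate count per migrated table, compared against a row count taken from the legacy/intermediary database. The table names below are placeholders:

```javascript
// Sketch: count records per migrated table on the ServiceNow side.
var tables = ['u_legacy_case', 'u_legacy_consumer'];   // placeholder table names
tables.forEach(function (tableName) {
    var ga = new GlideAggregate(tableName);
    ga.addAggregate('COUNT');
    ga.query();
    if (ga.next()) {
        gs.info(tableName + ': ' + ga.getAggregate('COUNT') + ' records');
    }
});
```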

We created a runbook to track pre-migration, migration, and post-migration activities along with the approximate duration of each activity. We also created an Excel sheet to track the testing results and later shared this test tracker with the customer to receive sign-off on the migration.

 

I hope you find this information helpful.

 

Regards,

Kamal S

 

4 Replies

ersureshbe
Giga Sage

Hi,

I have noticed that you have submitted your writing as an article; however, it should be categorized under your 'Architect Forum Article' page. Kindly remove the content from the question page.

 

Regards,
Suresh.

Thanks @ersureshbe, I couldn't find an option to create a new article or blog in the architect forum, hence I posted it here.

 

Hi, please find the screenshot below and add your item on the article page, not the Q&A page.

(screenshot: ersureshbe_0-1726479806337.png)

 

 

Regards,
Suresh.

Hi Suresh, I actually posted this article from there, but it still got posted in the architect forum.

 

Regards,

Kamal S