Import bulk data into ServiceNow
08-21-2024 11:32 PM
Hi Team,
We have an external application (XYZ) that stores its data in a SQL database.
There is now a plan to import the XYZ application's data, which is quite large, into ServiceNow.
What is the best practice for importing bulk data from an external application into ServiceNow?
08-22-2024 06:58 AM
Hi, I suggest you sit down with the data source (app XYZ) team and re-discuss the data structures. Keep in mind that producing that many files is most likely additional effort on the data source side as well; I doubt they really have that many tables in the source SQL database, and the project will need less effort overall if the data is carried in bigger chunks.
08-21-2024 11:54 PM
Here’s how you can best import bulk data from an external XYZ application into ServiceNow.
Plan Data Mapping: First, ensure that you have a clear mapping between the fields in the external application and the corresponding fields in ServiceNow.
Data Cleaning: Before importing, review and clean the data from the external source to remove duplicates and inconsistencies, ensuring high data quality post-import.
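As an illustration of the cleaning step, a short pandas sketch like the one below can flag and drop duplicate rows before anything reaches ServiceNow. The file name, the "case_number" business key and the "updated_on" column are placeholders for whatever the XYZ export really contains.

```python
import pandas as pd

# Hypothetical export from the XYZ SQL database; adjust the path and column names.
df = pd.read_csv("xyz_export.csv")

# Report fully duplicated rows before dropping anything.
dupes = df[df.duplicated(keep=False)]
print(f"{len(dupes)} fully duplicated rows found")

# De-duplicate on the assumed business key, keeping the most recently updated row.
df = (df.sort_values("updated_on")
        .drop_duplicates(subset=["case_number"], keep="last"))

df.to_csv("xyz_export_clean.csv", index=False)
```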
Use Import Sets: Utilize ServiceNow's "Import Set" feature to initially load the data into staging tables. After that, use "Transform Maps" to map the data to the appropriate ServiceNow tables.
Leverage CSV Files: For large datasets, consider exporting data from the SQL database to CSV files and then importing those into ServiceNow. You can split large CSV files into smaller chunks to reduce system load.
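If you go the CSV route, the splitting can be done with nothing but the Python standard library. A rough sketch; the chunk size and file names are arbitrary assumptions to tune for your instance:

```python
import csv

CHUNK_SIZE = 10_000  # rows per output file; tune to what your instance handles comfortably

def split_csv(source_path: str, chunk_size: int = CHUNK_SIZE) -> None:
    """Split a large CSV export into numbered part files, repeating the header in each."""
    with open(source_path, newline="", encoding="utf-8") as src:
        reader = csv.reader(src)
        header = next(reader)
        rows, part = [], 1
        for row in reader:
            rows.append(row)
            if len(rows) == chunk_size:
                write_chunk(source_path, part, header, rows)
                rows, part = [], part + 1
        if rows:
            write_chunk(source_path, part, header, rows)

def write_chunk(source_path, part, header, rows):
    out_path = f"{source_path.rsplit('.', 1)[0]}_part{part:03d}.csv"
    with open(out_path, "w", newline="", encoding="utf-8") as out:
        writer = csv.writer(out)
        writer.writerow(header)
        writer.writerows(rows)

split_csv("xyz_export_clean.csv")
```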
Use Web Services or APIs: For handling large volumes of data, you might consider using ServiceNow’s REST or SOAP APIs to programmatically import data. This allows for automated scripts or scheduled jobs.
Plan for Batch Import: Instead of importing all data at once, set up batch processing to import the data in smaller portions, which helps avoid performance issues.
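To make the API and batching points concrete: ServiceNow's Import Set REST endpoint (POST /api/now/import/{stagingTable}) inserts a row into the staging table and runs its transform map. The sketch below pushes CSV rows through that endpoint in throttled batches; the instance URL, staging table name, credentials, batch size and pause are all assumptions to adapt to your environment.

```python
import csv
import time

import requests

# Placeholders -- replace with your own instance, staging table and credentials.
INSTANCE = "https://yourinstance.service-now.com"
STAGING_TABLE = "u_xyz_case_import"       # import set staging table in ServiceNow
AUTH = ("integration.user", "password")   # use a dedicated integration account
BATCH_SIZE = 200                          # records per burst
PAUSE_SECONDS = 5                         # breathing room between bursts

def post_record(record: dict) -> None:
    """Insert one row into the staging table; the transform map attached to it
    then maps the row into the target ServiceNow table."""
    url = f"{INSTANCE}/api/now/import/{STAGING_TABLE}"
    resp = requests.post(url, json=record, auth=AUTH,
                         headers={"Accept": "application/json"}, timeout=30)
    resp.raise_for_status()

def load_in_batches(csv_path: str) -> None:
    with open(csv_path, newline="", encoding="utf-8") as src:
        for i, row in enumerate(csv.DictReader(src), start=1):
            post_record(row)
            if i % BATCH_SIZE == 0:
                print(f"{i} records sent, pausing {PAUSE_SECONDS}s...")
                time.sleep(PAUSE_SECONDS)

load_in_batches("xyz_export_clean_part001.csv")
```

Keeping the CSV column names aligned with the staging table fields lets the transform map do the heavy lifting instead of the script.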
Test and Validate: Always test the import process in a sandbox or test environment before running it in production to ensure everything works smoothly.
08-21-2024 11:57 PM
Hi, we have had a few situations like this. I suggest you start by agreeing with the stakeholders / application XYZ owner on the scope of the data to transfer. There is usually no need to transfer all history, only the currently active/open cases/records plus a small slice of history (for example, cases closed in the current year).
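If it helps, that scoping rule usually boils down to a single query on the source side. A sketch, assuming a SQL Server source and made-up table/column names; adjust the driver, connection string and schema to what app XYZ really uses:

```python
import pyodbc  # assumes an ODBC driver for the source database is installed

# Connection string, table and column names are illustrative only.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};SERVER=xyz-db;DATABASE=xyz;"
    "UID=readonly_user;PWD=secret"
)

# Scope: open records, plus records closed in the current year.
scope_query = """
    SELECT case_number, short_description, state, closed_at
    FROM xyz_cases
    WHERE state <> 'Closed'
       OR YEAR(closed_at) = YEAR(GETDATE())
"""

rows = conn.cursor().execute(scope_query).fetchall()
print(f"{len(rows)} records in scope for the migration")
```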
When it comes to the technical transfer, I recommend the "classic" approach: offload the old database records to Excel files, manipulate the data offline in Excel to do the necessary conversions/remapping (some action of this type is usually needed to make the data structure comply with the ServiceNow table model and attributes), and then use a file-based import into ServiceNow with a transform map to load the data.
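For the offline conversion/remapping step, pandas is often less painful than hand-editing in Excel once the files get big. A sketch with made-up column names, value mappings and file names, just to show the idea:

```python
import pandas as pd

# Illustrative mapping from XYZ column names to the staging table's field names.
COLUMN_MAP = {
    "CaseNo":     "u_case_number",
    "Title":      "u_short_description",
    "StatusCode": "u_state",
    "ClosedDate": "u_closed_at",
}

# Illustrative value conversion: XYZ status codes -> ServiceNow choice labels.
STATE_MAP = {"O": "Open", "P": "In Progress", "C": "Closed"}

df = pd.read_excel("xyz_offload.xlsx")  # the offloaded source records
df = df.rename(columns=COLUMN_MAP)[list(COLUMN_MAP.values())]
df["u_state"] = df["u_state"].map(STATE_MAP)
df["u_closed_at"] = pd.to_datetime(df["u_closed_at"]).dt.strftime("%Y-%m-%d")

# This is the file you feed to the ServiceNow data source / import set.
df.to_csv("xyz_staging_load.csv", index=False)
```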
At this step you have two procedural choices; pick one depending on the nature of your data:
Option 1: load into non-production (test), review the data there, validate and correct it, and then move the validated data from test to your production ServiceNow instance with XML data export and import.
Option 2: load into non-production to see whether the data loads correctly, fix the transform maps until it does, then move the import definition (data source, staging table, and all transform maps with scripts) to the production instance and load the data there.
Option 1 allows for better data control, but Option 2 is a must when the transferred data has multiple references to other data, especially if that data is highly dynamic (as in some business cases).
Hope this helps!