Bulk Data Migration
How can we migrate ~876K records, which come from many tables (including attachments, if needed), from one instance to another instance in one go? Can this be achieved using Scheduled Jobs or the REST API?
Hi @CA5
It depends on whether this is a one-time migration (e.g., a "lift and shift" to a new instance) or a continuous sync.
Option 1 - One-Time Migration: XML Export/Import (Best for Integrity)
For a one-time move of nearly 1 million records, exporting to XML is often the cleanest method. XML preserves sys_id values, which is critical for maintaining references between tables (e.g., ensuring an Incident still points to the correct User).
Process: Export records from the source instance in chunks (to avoid timeouts) and import them into the target instance.
Attachments: You must specifically export/import the sys_attachment and sys_attachment_doc tables.
Caveat: You cannot do 876K in a single export. You should batch them (e.g., 50K–100K at a time) using encoded queries.
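The batching above can be sketched as URL planning. Note the assumptions: the `<table>_list.do?XML` unload URL and the `sysparm_first_row`/`sysparm_last_row` window parameters are common export techniques but should be verified against your own instance, and the 100K chunk size is illustrative.

```javascript
// Sketch: plan chunked XML export URLs for a large table.
// Assumptions (verify on your instance): the `<table>_list.do?XML`
// unload URL and the sysparm_first_row / sysparm_last_row row window.
function planExportChunks(instance, table, totalRecords, chunkSize) {
  const chunks = [];
  for (let first = 1; first <= totalRecords; first += chunkSize) {
    const last = Math.min(first + chunkSize - 1, totalRecords);
    chunks.push(
      'https://' + instance + '.service-now.com/' + table + '_list.do?XML' +
      // A stable sort order keeps the row windows from overlapping.
      '&sysparm_query=' + encodeURIComponent('ORDERBYsys_created_on') +
      '&sysparm_first_row=' + first +
      '&sysparm_last_row=' + last
    );
  }
  return chunks;
}
```

For 876K records at 100K per chunk this yields nine export URLs; each resulting XML file is then loaded on the target via the list header's "Import XML" action.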
Option 2 - Instance Data Replication (IDR)
If you have the licensing, IDR is the modern, ServiceNow-native way to sync data between instances.
Pros: It handles bulk data efficiently and manages the transformation of data.
Cons: Requires a specific subscription and might be overkill if you never intend to sync these instances again.
Option 3 - Import Sets & Transform Maps
This involves pulling data from the source using a Data Source (like a JDBC connection or a REST call) into a Staging Table.
Pros: Allows you to clean and map data during the migration.
Cons: More configuration effort; requires careful handling of attachments via a separate script.
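To illustrate the "clean and map" step, here is a minimal sketch of the kind of per-row transformation a transform map's script performs. The field names and the state mapping are purely illustrative examples, not taken from the thread:

```javascript
// Sketch: the sort of cleanup/mapping done while moving a staging-table
// row to its target table. Fields and the state mapping are illustrative.
function transformRow(sourceRow) {
  const stateMap = { Open: '1', 'In Progress': '2', Closed: '7' };
  return {
    number: (sourceRow.number || '').trim(),
    short_description: (sourceRow.short_description || '').trim(),
    // Fall back to a default state when the source value is unrecognized.
    state: stateMap[sourceRow.state] || '1'
  };
}
```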
Yes, you can use Scheduled Jobs and REST APIs
These are perfect for orchestrating the migration in the background: a scheduled job can trigger a script that fetches data in batches (using sysparm_offset and sysparm_limit) to prevent the instance from hanging.
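A minimal sketch of the batch loop such a job could drive. `fetchPage` and `pushBatch` are hypothetical stand-ins for the real REST calls (a GET on the source's `/api/now/table/<table>` with `sysparm_offset`/`sysparm_limit`, and an insert on the target); they are injected so the paging logic itself can be tested without a live instance:

```javascript
// Sketch: page through the source in fixed-size batches until empty.
// fetchPage(offset, limit) stands in for a Table API GET using
// sysparm_offset / sysparm_limit; pushBatch(rows) stands in for the
// insert into the target instance. Both are hypothetical callbacks.
async function migrateInBatches(fetchPage, pushBatch, pageSize) {
  let offset = 0;
  let migrated = 0;
  for (;;) {
    const rows = await fetchPage(offset, pageSize);
    if (rows.length === 0) break; // source exhausted
    await pushBatch(rows);
    migrated += rows.length;
    offset += pageSize;
  }
  return migrated;
}
```

In practice you would also log progress and persist the last offset, so a failed run can resume instead of restarting from zero.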
https://support.servicenow.com/kb?id=kb_article_view&sysparm_article=KB0727636
If this helped answer your query, please mark it helpful & accept the solution.
Thanks!
Krishnamohan
