Transferring a large number of records from table to table - best practice

07-04-2025 02:10 AM
Hi All,
Working with a couple of scoped applications, I have to transfer records from a table in one app to another (similar) table in a different app (on the same instance). There are around a million records, so I believe it would be best to process them in batches.
I was just wondering if there's a best practice approach to this?
Thanks,
Tim.
07-04-2025 02:37 AM
Perhaps this applies to GlideRecord scripts only, but consider calling autoSysFields(false) as well. It controls the created, created by, updated, and updated by fields, which would otherwise be overwritten on insert, and it is usually desirable to keep the original values from before the move. Test it on one or a few records first and check how they look in the target table.
Note: the method name is plural - autoSysFields, with an "s".
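A minimal sketch of how that call might fit into a copy script; the table and field names here are placeholders, and it is worth confirming autoSysFields() behaves as expected inside your application scope:

```javascript
// Placeholder table names; map whatever fields your real tables share.
var source = new GlideRecord('x_app_a_source');
source.setLimit(5); // test on a handful of records first
source.query();
while (source.next()) {
    var target = new GlideRecord('x_app_b_target');
    target.initialize();
    target.short_description = source.short_description;
    target.sys_created_on = source.sys_created_on;   // carry the original audit values over
    target.sys_created_by = source.sys_created_by;
    target.sys_updated_on = source.sys_updated_on;
    target.sys_updated_by = source.sys_updated_by;
    target.autoSysFields(false); // keep the copied created/updated values instead of "now" and the current user
    target.setWorkflow(false);   // skip business rules, workflows, and notifications on insert
    target.insert();
}
```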
/* If my response wasn’t a total disaster ↙️ ⭐ drop a Kudos or Accept as Solution ✅ ↘️ Cheers! */
07-04-2025 02:37 AM
@Tim D Scott more details on that here:
https://www.linkedin.com/posts/lukasz-szumilas-servicenowdeveloper_why-i-use-setworkflowfalse-and-au...

07-04-2025 02:17 AM
Hi @Tim D Scott ,
When moving a large volume of records (about a million) between similar tables in two different scoped applications on the same ServiceNow instance, the best practice is definitely to process the transfer in batches to avoid performance issues and possible timeouts.
Since you’re dealing with scoped apps on the same instance, you can absolutely write a background script, use Flow Designer, or even set up scheduled jobs; a minimal batching sketch follows below.
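For illustration only, here is a hedged background-script sketch that walks the source table in fixed-size chunks. The table names x_app_a_source and x_app_b_target and the mapped fields are placeholders, not from this thread:

```javascript
// Sketch: copy records in chunks of 1000, ordered by sys_id so the loop
// always resumes after the last record it copied. Placeholder names only.
var BATCH_SIZE = 1000;
var lastSysId = '';
var more = true;

while (more) {
    more = false;

    var source = new GlideRecord('x_app_a_source');
    if (lastSysId) {
        source.addQuery('sys_id', '>', lastSysId);
    }
    source.orderBy('sys_id');
    source.setLimit(BATCH_SIZE);
    source.query();

    while (source.next()) {
        var target = new GlideRecord('x_app_b_target');
        target.initialize();
        target.short_description = source.short_description; // map remaining fields as needed
        target.setWorkflow(false); // avoid firing business rules, workflows, notifications
        target.insert();
        lastSysId = source.getUniqueValue();
        more = true;
    }
}
```

For a full million records, a single background-script run like this can still take too long; a scheduled job that processes one batch per execution (discussed further down the thread) is usually the safer option.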
However, what I’d strongly recommend here — especially when you have two scoped apps that need to keep data in sync or transfer large data sets efficiently — is to use the Precision Bridge tool.
Precision Bridge is specifically designed for this type of controlled, large-scale data movement between applications. It handles things like batching, error handling, and record mapping out-of-the-box, which makes your life easier and avoids building and maintaining custom scripts that can get messy and hard to scale.
So, my suggestion would be:
- Explore and configure Precision Bridge in your instance.
- Define the source and target tables and set up the data flow.
- Use the batching options within Precision Bridge to handle the large data volume safely.
- If there are minor field differences between source and target, configure transformations or mappings within the bridge.

07-04-2025 02:27 AM
Hi Tejas,
This is on a customer instance, so it's unlikely we can use any external tools. I think it'll have to be done using scheduled scripts. Thanks!
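As a hedged sketch of that scheduled-script approach: a Scheduled Script Execution that copies one batch per run and checkpoints its progress in a system property. The property name, table names, and field mappings below are assumptions, not from this thread:

```javascript
// Scheduled Script Execution body (sketch). Run it every few minutes;
// each run copies up to BATCH_SIZE records and records where it stopped.
// 'x_app_a.migration.last_sys_id' is a made-up property name.
var BATCH_SIZE = 2000;
var PROP = 'x_app_a.migration.last_sys_id';
var lastSysId = gs.getProperty(PROP, '');

var source = new GlideRecord('x_app_a_source');
if (lastSysId) {
    source.addQuery('sys_id', '>', lastSysId);
}
source.orderBy('sys_id');
source.setLimit(BATCH_SIZE);
source.query();

var copied = 0;
while (source.next()) {
    var target = new GlideRecord('x_app_b_target');
    target.initialize();
    target.short_description = source.short_description; // map remaining fields as needed
    target.setWorkflow(false);
    target.insert();
    lastSysId = source.getUniqueValue();
    copied++;
}

if (copied > 0) {
    // Note: gs.setProperty flushes the property cache; a small checkpoint table is an alternative.
    gs.setProperty(PROP, lastSysId);
    gs.info('Migration batch copied ' + copied + ' records, last sys_id ' + lastSysId);
} else {
    gs.info('Migration complete: no more source records to copy');
}
```

Once the job logs that it is complete, the schedule can be deactivated.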
07-04-2025 05:11 AM
Hi @Tim D Scott ,
Have you explored the IntegrationHub ETL feature? It provides a user-friendly, UI-based approach to efficiently transform and load data into target tables. One of its key advantages is the ability to easily define field mappings and transformations, and it also supports creating relationships between records during the import process.
Regards,
Srinija