Processing more than 10,000 records with a scheduled job.
05-25-2025 11:18 PM
I would like to update/delete more than 10,000 records with a scheduled job, but I am concerned that this may affect performance.
I would like to know the best practices for processing large volumes of records with a scheduled job.
05-26-2025 06:32 PM
Hi @OlaN-san,
Thank you for your reply.
Sorry, I was wrong about the number. I'm planning to run the process on 200,000 records. Will there be any performance issues?

05-26-2025 10:31 PM
In general terms, I would say it's still an acceptable volume to process.
However, it might affect performance. It's hard to say without knowing more about your environment and what you are planning to do with your script.
It's almost always better to divide the job into smaller chunks if possible, as in the sketch below.
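For example, something along these lines in a Scheduled Script Execution, where each run only touches one chunk. This is just a rough sketch; the table name, query field, and batch size below are made-up placeholders you would swap for your own:

```javascript
// Rough sketch: process one chunk per scheduled run.
// 'u_staging_table' and 'u_processed' are placeholder names.
var BATCH_SIZE = 1000;

var gr = new GlideRecord('u_staging_table');
gr.addQuery('u_processed', false); // only rows not handled yet
gr.setLimit(BATCH_SIZE);           // cap this run to one chunk
gr.query();
gr.setWorkflow(false);             // optional: skip business rules/engines, if that is safe for your data

while (gr.next()) {
    gr.setValue('u_processed', true);
    gr.update();
}
// Let the schedule repeat (e.g. every 10 minutes) until no rows match,
// rather than updating everything in a single execution.
```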
05-26-2025 11:21 PM
@OlaN-san,
Thanks for replying.
I would like to copy records to another table and then delete the copied records from the source. Specifically, I would like to implement the following steps:
1. Copy records from table A to table B
2. Copy records from table C to table D
3. Link the copied records from table B to table D
4. Delete the records from table A (the source of the copy)
5. Delete the records from table C (the source of the copy)
I would like to implement this process for 200,000 records.
Is there a high possibility that this will affect performance?
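To make it concrete, this is roughly what I have in mind for one chunk. All table and field names here are just placeholders, and I am assuming a record in table A holds a reference to its related record in table C:

```javascript
// Rough sketch of steps 1-5 for one small chunk; names are placeholders.
var BATCH = 1000;

var a = new GlideRecord('u_table_a');
a.setLimit(BATCH);
a.query();
while (a.next()) {
    // Step 1: copy the A record into table B
    var b = new GlideRecord('u_table_b');
    b.initialize();
    b.setValue('u_name', a.getValue('u_name')); // copy whichever fields are needed
    var bSysId = b.insert();

    // Step 2: copy the related C record into table D
    var c = new GlideRecord('u_table_c');
    if (c.get(a.getValue('u_related_c'))) {     // assumes A references C
        var d = new GlideRecord('u_table_d');
        d.initialize();
        d.setValue('u_name', c.getValue('u_name'));
        d.setValue('u_table_b', bSysId);        // Step 3: link the new D record to the new B record
        d.insert();
        c.deleteRecord();                       // Step 5: delete the C source
    }
    a.deleteRecord();                           // Step 4: delete the A source
}
```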

05-27-2025 01:09 AM
Why do you need to copy records between tables? What's the purpose behind that?
I still can't say whether it would affect performance, because I don't know the other circumstances in your environment, or whether the record creation and/or deletion has complex processing behind it.
Regardless, I would run this process during the evening/night (or any other time when the load on the instance is low), just to be safe.
And I would probably split it into at least two separate runs: one to copy the data and link the records, then a second one to perform the needed deletes. The delete run could look roughly like the sketch below.
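As an illustration only: if the first run stamps a flag (say 'u_migrated') on each source record once its copy and linking succeeded, the second run can then safely delete just the flagged rows in chunks. All names here are invented for the sketch:

```javascript
// Sketch of the second (delete-only) run; 'u_migrated' is an invented flag
// that the first (copy/link) run would set on each successfully copied row.
var BATCH = 1000;

var src = new GlideRecord('u_table_a');
src.addQuery('u_migrated', true); // only rows already copied and linked
src.setLimit(BATCH);
src.query();
while (src.next()) {
    src.deleteRecord();
}
// Schedule this during off-hours and let it repeat until nothing matches.
```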
05-25-2025 11:48 PM
@MiY please use the latest features added by @servicenow in the Tokyo and later releases.
ServiceNow introduced Update Jobs and Delete Jobs in the Tokyo release, allowing bulk operations without scripting.
These jobs support scheduling, previewing affected records, and rollback capabilities, ensuring safer bulk operations.
There should be no issues with update volumes in the range of 10,000 records.