Processing more than 10,000 records with a scheduled job.
05-25-2025 11:18 PM
I would like to update/delete more than 10,000 records with a scheduled job, but I am concerned that this may affect performance.
What are the best practices for processing large volumes of records with a scheduled job?
05-25-2025 11:26 PM
Divide it into chunks:
1) Create two scheduled jobs that each delete 5,000 records, using setLimit(5000) on the GlideRecord query (a rough script sketch follows below).
2) Make sure you run them during non-business hours.
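For illustration only, here is a minimal sketch of what each job's script could look like; the table name ('incident'), the encoded query ('active=false'), and the log message are placeholders, not part of the original advice, so substitute your own table and filter.
```javascript
// Minimal sketch of one chunked-delete scheduled job (Scheduled Script Execution).
// 'incident' and the encoded query are placeholders for your target table and filter.
(function chunkedDeleteBatch() {
    var BATCH_SIZE = 5000;                 // matches the setLimit(5000) suggestion above

    var gr = new GlideRecord('incident');  // placeholder table
    gr.addEncodedQuery('active=false');    // placeholder filter for the records to remove
    gr.setLimit(BATCH_SIZE);               // only one batch is touched per run
    gr.query();

    var deleted = 0;
    while (gr.next()) {
        gr.deleteRecord();                 // row-by-row delete; respects delete business rules and cascades
        deleted++;
    }

    gs.info('Chunked delete removed ' + deleted + ' records in this run.');
})();
```
Each run removes at most one batch, so the job can simply be repeated (or scheduled on an interval) until the query returns no more rows.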
If my response helped, please mark it correct and close the thread so that it benefits future readers.
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader
05-26-2025 06:32 PM
Hi @Ankur Bawiskar,
Thank you for your reply.
Sorry, I was wrong about the number: I plan to run the process on 200,000 records. Will there be any performance issues?
05-25-2025 11:31 PM
Hi,
For large volumes of deletes, I would recommend looking into Delete Jobs.
For updates, I see no issue with volumes in the range of 10,000 records.
Whether performance suffers will mostly depend on how much processing business rules and similar logic trigger on each update; if that logic is not needed for the bulk change, it can be skipped in the script (a sketch follows below).
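As an illustration of keeping that business-rule overhead down, here is a minimal sketch of a chunked update; the table ('task'), the encoded query, and the field change are placeholders, and setWorkflow(false) should only be used if business rules, notifications, and workflows are genuinely not needed for this particular change.
```javascript
// Minimal sketch of a chunked bulk update in a scheduled job.
// 'task', the encoded query, and the field change below are placeholders.
(function chunkedUpdateBatch() {
    var BATCH_SIZE = 10000;                     // roughly the volume discussed above

    var gr = new GlideRecord('task');           // placeholder table
    gr.addEncodedQuery('state=7^active=true');  // placeholder filter
    gr.setLimit(BATCH_SIZE);                    // one batch per scheduled run
    gr.query();

    while (gr.next()) {
        gr.setValue('active', false);           // placeholder field change
        gr.setWorkflow(false);                  // skip business rules/notifications only if safe for this change
        gr.autoSysFields(false);                // optional: leave sys_updated_* fields untouched
        gr.update();
    }
})();
```
Running the batches during off-hours and skipping engine logic that is not needed should keep the per-run load manageable even at 200,000 records.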