07-09-2025 04:34 AM
Hey guys, I need to update millions of records through a background script, but every time I try, the script either crashes or just stops halfway. I'm guessing it's a memory or transaction limit issue. Has anyone handled this kind of bulk update before? What's the best way to do it without killing the instance?
07-09-2025 04:41 AM
Hey @cherryarora ,
Write a Scheduled Script Execution that:
- Fetches only a batch of records (e.g., 500–5,000) at a time using .setLimit().
- Updates them.
- Ends, then runs again on a schedule (e.g., every 5 minutes) until all records are processed.

Track progress using a sys_property, a sysparm, or a marker field to avoid reprocessing; see the sketch below.
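Something like this could go in the "Run this script" field of the Scheduled Script Execution (repeat interval ~5 minutes). It's a minimal sketch: the table ('incident'), the marker field ('u_migrated'), and the batch size are placeholder assumptions, not from your instance.

```javascript
// Runs as a Scheduled Script Execution every ~5 minutes until nothing is left.
// 'incident' and 'u_migrated' are placeholder names — swap in your own table
// and marker field (or track a cursor in a sys_property instead).
var BATCH_SIZE = 1000; // somewhere in the 500–5000 range

var gr = new GlideRecord('incident');
gr.addQuery('u_migrated', false); // marker field: only unprocessed records
gr.setLimit(BATCH_SIZE);          // keeps each run small and under transaction limits
gr.query();

var count = 0;
while (gr.next()) {
    gr.setValue('u_migrated', true); // flag it so the next run skips this record
    // ... apply your real field changes here ...
    gr.update();
    count++;
}

gs.info('Bulk update batch finished: ' + count + ' records');
```

When a run logs 0 records, everything has been processed and you can deactivate the schedule.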
Please mark this as helpful if it helps.
07-09-2025 11:56 PM
Thank you
07-09-2025 04:42 AM
Divide it into chunks of 50,000 and make sure you use setWorkflow(false) to avoid triggering any business rules on update. A rough sketch is below.
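For reference, one chunk of that update in a background script might look like the sketch below; the table ('task'), the encoded query, and the field ('u_flag') are hypothetical placeholders, not from this thread. Re-run it until it reports 0 updates.

```javascript
// One 50,000-record chunk with business rules suppressed.
// 'task', the encoded query, and 'u_flag' are placeholder names.
var gr = new GlideRecord('task');
gr.addEncodedQuery('active=true^u_flag=false'); // only records not yet touched
gr.setLimit(50000);
gr.query();

var count = 0;
while (gr.next()) {
    gr.setWorkflow(false);       // skip business rules, workflows, notifications
    gr.setValue('u_flag', true); // placeholder for the real update
    gr.update();
    count++;
}
gs.info('Chunk complete: ' + count + ' records updated');
```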
If my response helped, please mark it correct and close the thread so that it benefits future readers.
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader