
cherryarora
Giga Contributor

Hey guys, I need to update millions of records through a background script, but every time I try, the script either crashes or just stops halfway. I’m guessing it’s some memory or transaction limit issue. Has anyone handled this kind of bulk update before? What’s the best way to do it without killing the instance?

1 ACCEPTED SOLUTION

AyushKumarM
Mega Guru

Hey @cherryarora ,

The safest pattern for a very large update is to batch it:

  • Write a Scheduled Script Execution that:

    • Fetches only a batch of records (e.g., 500–5,000) at a time using .setLimit().

    • Updates them.

    • Ends, then runs again on a schedule (e.g., every 5 minutes) until all records are processed.

  • Track progress using a system property or a marker field on the record, so that already-processed records are skipped on the next run (see the sketch below).
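Here is a minimal sketch of the scheduled job's script. The table name (incident), marker field (u_migrated), batch size, and field change are all assumptions — substitute your own:

```javascript
// Minimal sketch — table, marker field, batch size, and the field
// change below are placeholders, not your actual requirement.
var BATCH_SIZE = 2000;

var gr = new GlideRecord('incident');  // placeholder table
gr.addQuery('u_migrated', false);      // only records not yet processed
gr.setLimit(BATCH_SIZE);               // cap this execution at one batch
gr.query();

var count = 0;
while (gr.next()) {
    gr.setValue('state', 7);           // placeholder field change
    gr.setValue('u_migrated', true);   // marker so the next run skips it
    gr.update();
    count++;
}

gs.info('Bulk update batch done: ' + count + ' records processed');
```

Each execution touches at most one batch, so memory and transaction time stay bounded. When a run logs 0 records processed, everything is done and the schedule can be deactivated.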


Please mark this as helpful if it solves your issue.

3 REPLIES


Ankur Bawiskar
Tera Patron

@cherryarora 

Divide it into chunks of 50,000 and ensure you use setWorkflow(false) to avoid triggering any business rules on update (see the sketch below).
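A rough sketch of one such chunk; the table, encoded query, and field change are placeholders. Note that setWorkflow(false) also suppresses workflows, flows, and notifications, so use it only when that is acceptable:

```javascript
// Rough sketch of one 50,000-record chunk — table, query, and the
// field change are placeholders for your real update.
var gr = new GlideRecord('incident');
gr.addEncodedQuery('active=true^priority=5');  // placeholder filter
gr.setLimit(50000);                            // one chunk per run
gr.query();

while (gr.next()) {
    gr.setValue('priority', 4);  // placeholder field change
    gr.setWorkflow(false);       // skip business rules, workflows, notifications
    gr.autoSysFields(false);     // optional: leave sys_updated_* untouched
    gr.update();
}
```

Because the updated records no longer match the filter, re-running the same script picks up the next 50,000 until none remain.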

If my response helped, please mark it correct and close the thread so that it benefits future readers.

Regards,
Ankur
Certified Technical Architect  ||  9x ServiceNow MVP  ||  ServiceNow Community Leader