Process Bulk Data
05-29-2025 10:40 AM
Hi All,
I want to sync shared mailbox data into custom tables in ServiceNow. There is a PowerShell script that pulls the data from the Exchange server, and its output contains all the required data in JSON format.
There are around 11K shared mailboxes in production.
I'm converting the JSON data into an array, processing each element of the array, and creating/updating the mailbox records in a custom table using flow actions. However, the flow settings limit a loop to 1,000 iterations. Can anyone let me know if we can process the data in batches in Flow Designer? Or please suggest if there is any other approach available. A rough sketch of the conversion step I'm doing is shown below.
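For reference, a minimal sketch of how I'm turning the JSON output into an array in a script step (the input name mailboxJson and the field names in the comment are just placeholders):

(function execute(inputs, outputs) {
    var mailboxes = [];
    try {
        // inputs.mailboxJson is the JSON string returned by the PowerShell script
        mailboxes = JSON.parse(inputs.mailboxJson); // e.g. [{ "name": "...", "email": "..." }, ...]
    } catch (e) {
        gs.error('Unable to parse mailbox JSON: ' + e.message);
    }
    // The flow's For Each then iterates this array, but For Each is capped at 1,000 iterations
    outputs.mailboxArray = mailboxes;
})(inputs, outputs);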
05-29-2025 03:11 PM - edited 05-29-2025 03:15 PM
You are going to have to batch the base JSON array in script and then run the for-each over the batches instead. The only way I can think to do this in ServiceNow is with array.slice(). That doesn't let you use the built-in For Each flow logic, though, so to achieve your outcome you would need to move the part of your flow that processes each element (creating/updating mailbox records in your table) into a subflow and call it from a script action. The scripted step below batches the large array and starts a subflow run per batch.
var batchSize = 500; // number of array objects to process per batch
// jsonArray is the full array parsed from the PowerShell script's JSON output
for (var i = 0; i < jsonArray.length; i += batchSize) { // step through the array in batch-sized chunks
    var batch = jsonArray.slice(i, i + batchSize); // the current batch (batchSize or fewer objects)

    // Pass the batch to your subflow as an input parameter.
    var inputs = {};
    inputs['batchArray'] = batch;

    // Starts the subflow to process this batch. You will not have access to any outputs,
    // and it will not wait for this batch to finish before starting the next one.
    sn_fd.FlowAPI.startSubflow('subflow name', inputs);
}
This will distribute the work across several separate subflow executions, allowing you to avoid the iteration limit. With the inputs configured this way, the subflow in this example would need an input named "batchArray", which receives the JSON array containing the batch of objects to be processed.
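If it helps, here is a rough sketch of what the processing inside the subflow (or a script step it calls) might look like. The table name u_shared_mailbox, the field names, and the use of email as the matching key are all assumptions; adjust them to your custom table.

(function execute(inputs, outputs) {
    var batch = inputs.batchArray; // batch passed in from the parent flow's script action
    if (typeof batch === 'string') {
        batch = JSON.parse(batch); // handle the case where the input is defined as a string
    }
    for (var i = 0; i < batch.length; i++) {
        var mbx = batch[i];
        var gr = new GlideRecord('u_shared_mailbox'); // hypothetical custom table
        gr.addQuery('u_email', mbx.email);            // hypothetical unique key field
        gr.query();
        if (gr.next()) {
            gr.setValue('u_display_name', mbx.name);  // update the existing record
            gr.update();
        } else {
            gr.initialize();                          // create a new record
            gr.setValue('u_email', mbx.email);
            gr.setValue('u_display_name', mbx.name);
            gr.insert();
        }
    }
})(inputs, outputs);

You could also build the same create-or-update logic with the subflow's own Look Up Records and Create/Update Record actions; the script version is just to illustrate what happens per batch.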