Flow Designer: Exceptions

Muhammad Salar
Giga Sage

Hello All,

I have a Flow Action in which I’m using a script to retrieve data from an integration and insert it into a ServiceNow table. The total number of records exceeds 40,000.

To handle this, I’m recursively calling a function based on a condition to retrieve data from the REST API in batches. However, I’m encountering the following error:

com.glide.sys.TransactionCancelledLoopException

Alternatively, when I use a while(true) loop with a conditional break (to exit when the condition is met), I get this error:

Error: Transaction cancelled: maximum execution time exceeded.

The data is being inserted into the table correctly, and the logic works for smaller datasets (e.g., around 5,000 records). However, for large volumes, it eventually fails with one of the above errors.

Is there any recommended workaround or best practice for handling large data sets like this in a Flow Action script?
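
For context, a simplified sketch of the pattern I'm currently using (the endpoint, query parameter names, and target table below are placeholders, not the real integration):

function fetchAndInsertPage(offset) {
    var request = new sn_ws.RESTMessageV2();
    request.setEndpoint('https://example.com/api/records?limit=1000&offset=' + offset); // placeholder endpoint
    request.setHttpMethod('GET');

    var response = request.execute();
    var records = JSON.parse(response.getBody()).results || [];

    for (var i = 0; i < records.length; i++) {
        var gr = new GlideRecord('u_imported_data'); // placeholder table
        gr.initialize();
        gr.setValue('u_payload', JSON.stringify(records[i]));
        gr.insert();
    }

    // Keep calling until the API returns an empty page. On large result sets
    // this single transaction eventually gets cancelled with one of the errors above.
    if (records.length > 0) {
        fetchAndInsertPage(offset + records.length);
    }
}

fetchAndInsertPage(0);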

1 ACCEPTED SOLUTION

Ankur Bawiskar
Tera Patron

@Muhammad Salar 

You can use a Data Stream action with pagination.

What Are Data Stream Actions? 

Data Stream Actions - Learn Integrations on the Now Platform 
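
As a rough sketch of the pagination idea (the endpoint, query parameter names, and page size below are assumptions, and in a Data Stream action the paging is configured in the action designer rather than hand-coded like this), each request fetches only one bounded page, so no single transaction has to walk through all 40,000 records:

// Sketch of offset-based pagination. Assumes the remote API accepts
// "limit" and "offset" query parameters; adjust to the real API.
function buildPageRequest(offset, pageSize) {
    var request = new sn_ws.RESTMessageV2();
    request.setEndpoint('https://example.com/api/records'); // placeholder endpoint
    request.setHttpMethod('GET');
    request.setQueryParameter('limit', String(pageSize));
    request.setQueryParameter('offset', String(offset));
    return request;
}

var pageSize = 1000;
var offset = 0; // the Data Stream pagination advances this between pages
var response = buildPageRequest(offset, pageSize).execute();
var page = JSON.parse(response.getBody()).results || [];
// Hand this page's items to the flow (for example a For Each loop over the
// data stream), then request the next page with offset + page.length until
// a page comes back empty.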

If my response helped, please mark it correct and close the thread so that it benefits future readers.

Regards,
Ankur
Certified Technical Architect  ||  9x ServiceNow MVP  ||  ServiceNow Community Leader
