06-16-2025 06:08 AM
Hello All,
I have a Flow Action in which I’m using a script to retrieve data from an integration and insert it into a ServiceNow table. The total number of records exceeds 40,000.
To handle this, I retrieve the data from the REST API in batches, calling a function recursively until a stop condition is met. However, I'm encountering the following error:
com.glide.sys.TransactionCancelledLoopException
Alternatively, when I use a while(true) loop with a conditional break (to exit when the condition is met), I get this error:
Error: Transaction cancelled: maximum execution time exceeded.
The data is being inserted into the table correctly, and the logic works for smaller datasets (e.g., around 5,000 records). However, for large volumes, it eventually fails with one of the above errors.
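A simplified sketch of the pattern (the endpoint, offset/limit paging scheme, and target table below are placeholders, not my actual names):

```javascript
(function fetchAllRecords() {
    var offset = 0;
    var pageSize = 1000;

    while (true) { // every page is fetched and inserted in ONE transaction
        var request = new sn_ws.RESTMessageV2();
        // placeholder endpoint and paging parameters
        request.setEndpoint('https://example.com/api/items?offset=' + offset + '&limit=' + pageSize);
        request.setHttpMethod('GET');

        var response = request.execute();
        var results = JSON.parse(response.getBody()).results || [];

        // Insert each record into the target table
        for (var i = 0; i < results.length; i++) {
            var gr = new GlideRecord('u_imported_data'); // placeholder staging table
            gr.initialize();
            gr.setValue('u_name', results[i].name);
            gr.insert();
        }

        if (results.length < pageSize) {
            break; // last page reached
        }
        offset += pageSize;
    }
})();
```

At 40,000+ records this single transaction runs long enough to hit the platform's execution-time quota, which is what triggers the cancellation errors above.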
Is there any recommended workaround or best practice for handling large data sets like this in a Flow Action script?
06-16-2025 06:15 AM
You can use a Data Stream action with pagination, so the platform fetches the records page by page instead of inside one long-running script transaction:
Data Stream Actions - Learn Integrations on the Now Platform
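As a rough illustration of the per-page logic involved (the endpoint, offset/limit scheme, and table names below are assumptions, not your actual API), each call fetches and inserts exactly one page, so no single transaction grows with the total record count:

```javascript
// Hypothetical per-page import. A Data Stream action gives you this
// behaviour declaratively: configure pagination on its REST step and
// iterate the stream with "For Each" flow logic in the flow.
function importPage(offset, pageSize) {
    var request = new sn_ws.RESTMessageV2();
    // placeholder endpoint and paging parameters
    request.setEndpoint('https://example.com/api/items?offset=' + offset + '&limit=' + pageSize);
    request.setHttpMethod('GET');

    var body = JSON.parse(request.execute().getBody());
    var results = body.results || [];

    for (var i = 0; i < results.length; i++) {
        var gr = new GlideRecord('u_imported_data'); // placeholder staging table
        gr.initialize();
        gr.setValue('u_name', results[i].name);
        gr.insert();
    }

    // true while a full page came back, i.e. more pages likely remain
    return results.length === pageSize;
}
```

In Flow Designer you don't hand-roll this loop: define a Data Stream action, set its pagination to match what the source API supports, and consume the stream with a For Each in the flow. The platform requests the next page only as items are processed, so the 40,000 records never sit inside a single transaction.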
If my response helped, please mark it correct and close the thread so that it benefits future readers.
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader