How to import a huge response body (50k or 1 lakh records) into ServiceNow?

shabbir5
Tera Guru

Hi All,

 

Suppose we are importing data into ServiceNow via an outbound REST API integration.

 

If the response body is huge, say 40k, 50k, or 1 lakh (100k) records, what is the process to import this payload into ServiceNow?

 

If we run all 50k records at once, this will lead to performance issues, right?

 

How can we avoid this issue? What are the recommended approaches to follow?

 

Please help with your inputs.

 

Regards,

Shabbir Shaik

3 REPLIES

Bhimashankar H
Mega Sage

Hi @shabbir5 ,

 

Don't post 40–100k records in one REST call. Use the Import Set API with pagination, process the data asynchronously, and stage it in import set tables before transforming. This pattern avoids long-running transactions and timeouts. I built one integration that pulled 15k records at a time and used an offset to fetch the next set of records.
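A rough sketch of that pagination pattern is below. It is only an illustration: the endpoint, staging table, and field names are placeholders, it assumes the source API accepts "offset" and "limit" query parameters and returns a JSON array under "result", and it stages rows directly with GlideRecord rather than the inbound Import Set API.

// Pull the source data in pages of 1,000 instead of 50k in one call
var PAGE_SIZE = 1000;
var offset = 0;
var hasMore = true;

while (hasMore) {
    var request = new sn_ws.RESTMessageV2();
    request.setHttpMethod('GET');
    request.setEndpoint('https://source.example.com/api/employees'); // placeholder endpoint
    request.setQueryParameter('limit', String(PAGE_SIZE));
    request.setQueryParameter('offset', String(offset));

    var response = request.execute();
    if (response.getStatusCode() != 200) {
        gs.error('Import failed at offset ' + offset + ', status ' + response.getStatusCode());
        break;
    }

    var records = JSON.parse(response.getBody()).result || [];

    // Stage each row in the import set table for the transform map to process afterwards
    for (var i = 0; i < records.length; i++) {
        var stage = new GlideRecord('u_employee_import'); // placeholder staging table
        stage.initialize();
        stage.u_employee_id = records[i].id;
        stage.u_name = records[i].name;
        stage.insert();
    }

    hasMore = (records.length == PAGE_SIZE);
    offset += PAGE_SIZE;
}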

 

Insert the records into a staging table that extends the import set table; once the data lands in the staging table, the transform map inserts it into the target table. Email the import errors to the relevant people.
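For the error emails, one hedged option is to collect the errored staging rows and raise an event that an email notification listens to. The event name below is hypothetical and would need to be registered in the Event Registry with a notification attached; the table name is again a placeholder.

// After the transform, gather rows that errored and raise an event for an email notification
var errors = [];
var row = new GlideRecord('u_employee_import'); // placeholder staging table
row.addQuery('sys_import_state', 'error');
row.query();
while (row.next()) {
    errors.push(row.getValue('sys_import_state_comment'));
}

if (errors.length > 0) {
    // 'import.errors.found' is a hypothetical event; register it and attach an email notification
    gs.eventQueue('import.errors.found', null, errors.length + ' rows failed', errors.join('\n'));
}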

 

Prefer scheduling the import during off-peak windows. Pagination is the best fit for such scenarios.

 

Thanks,
Bhimashankar H

 

-------------------------------------------------------------------------------------------------
If my response points you in the right direction, please consider marking it as 'Helpful' & 'Correct'. Thanks!

ifti122
Giga Guru

Hi Shabbir,

If the response has 50k or more records, loading everything at once can make ServiceNow slow. It is better to process small batches of 500–1,000 records and use a background import so the system does not slow down. You can also fetch the data in pages from the source, which is easier and faster. This way you can safely import large data sets without any performance problems.
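One way to run such a background import is a Scheduled Script Execution that stages only one small batch per run and remembers its position in a system property. The sketch below assumes that setup; the property name, endpoint, and staging table are placeholders.

// Stage one batch of 500 records per scheduled run, resuming from the saved offset
var BATCH_SIZE = 500;
var offset = parseInt(gs.getProperty('x_import.employee.offset', '0'), 10);

var request = new sn_ws.RESTMessageV2();
request.setHttpMethod('GET');
request.setEndpoint('https://source.example.com/api/employees'); // placeholder endpoint
request.setQueryParameter('limit', String(BATCH_SIZE));
request.setQueryParameter('offset', String(offset));

var response = request.execute();
var records = JSON.parse(response.getBody()).result || [];

for (var i = 0; i < records.length; i++) {
    var gr = new GlideRecord('u_employee_import'); // placeholder staging table
    gr.initialize();
    gr.u_employee_id = records[i].id;
    gr.u_name = records[i].name;
    gr.insert();
}

// Remember where to resume on the next run; reset to 0 once the source is exhausted
gs.setProperty('x_import.employee.offset', records.length == BATCH_SIZE ? String(offset + BATCH_SIZE) : '0');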
Thanks & Regards,
Muhammad Iftikhar,
If my response helped, please mark it as the accepted solution so others can benefit as well.

Animesh Das2
Mega Sage

Hi @shabbir5 ,

 

As suggested by others, pagination is the best option in this case: pull the data page by page in smaller chunks. You can of course implement the pagination logic directly in the REST API request.

If you are using a flow to trigger the integration, there is a low-code option: a 'Data Stream' action in the flow will make the paginated REST API calls for you.

You can refer to this nice video from Chuck if you would like to use Data Stream:

https://www.youtube.com/watch?v=F37zZURAw6E

 

If this addresses your question, please don't forget to mark this response correct by clicking Accept as Solution and/or Kudos.

You may mark this helpful as well if it helps you.

Thanks, 

Animesh Das
