Importing large amounts of JSON data

JB1
Kilo Contributor

Hi, when bringing in large amounts of JSON data via a REST API and updating back-end tables, what would be the preferred method? I have had issues in the past when trying to handle large amounts of data as variables in script. Thanks.


JB1
Kilo Contributor

I have created a REST message in ServiceNow which calls the external system. I am therefore consuming data from an external system.

The calling of the API and the handling of JSON data is all done within a script include in ServiceNow. My concern is that the data will be too large to handle in script (I have seen issues before where the JSON data has been > 16MB and the interface has fallen over).

I guess I can retrieve the data in batches, but I wondered if there was a more elegant way of handling large quantities of data via a scripted GET call.
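
For context, my scripted GET looks roughly like the sketch below. The REST message, method, table, and field names are placeholders, and note that the whole payload is parsed into memory at once, which is where the size problem bites:

var request = new sn_ws.RESTMessageV2('External System', 'get'); // placeholder REST message and method
var response = request.execute();
if (response.getStatusCode() == 200) {
    // the entire response body is held and parsed in memory here
    var records = JSON.parse(response.getBody());
    for (var i = 0; i < records.length; i++) {
        var gr = new GlideRecord('u_import_staging'); // placeholder target table
        gr.initialize();
        gr.setValue('u_name', records[i].name); // placeholder field mapping
        gr.insert();
    }
}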

Hi,

It all depends on the third-party application: whether they can send the data in chunks, which would reduce the performance impact, or whether they will only send everything at once.

Example: suppose you consume their endpoint and learn that you will receive 10,000 records. If the endpoint supports offset and limit parameters, you can request records 0 to 4,999 first, then the next chunk, and so on.
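
A rough sketch of that pattern, assuming the endpoint accepts offset and limit query parameters (the actual parameter names vary by provider):

var total = 10000;    // total record count reported by the endpoint
var chunkSize = 5000; // records requested per call
for (var offset = 0; offset < total; offset += chunkSize) {
    var request = new sn_ws.RESTMessageV2('External System', 'get'); // placeholder REST message
    request.setQueryParameter('offset', String(offset));
    request.setQueryParameter('limit', String(chunkSize));
    var response = request.execute();
    if (response.getStatusCode() == 200) {
        var chunk = JSON.parse(response.getBody()); // only one chunk in memory at a time
        // ... insert/update the target table from this chunk here ...
    }
}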

Regards,
Ankur
Certified Technical Architect  ||  9x ServiceNow MVP  ||  ServiceNow Community Leader

Hello Ankur,

In my case there are around 90,000 records, and I'm using offset parameters to limit the records per call.

And I'm successfully receiving the data in chunks, e.g. 0-25,000.

But what is the best practice for this chunk logic in ServiceNow? I mean, when should I fire my next POST call? What should the logic be?

Should I create three different scheduled jobs and three different REST POST messages?

How long should I wait before firing the next POST call? How can I get confirmation from the first chunk call that I've received 25,000 records and they have been successfully updated in the target table, so that the next POST call can fire?

What will happen if I fire the next POST call while the first chunk is still running?

@Tushar Walvekar 

In the same scheduled job.

You know the total count and the chunk size, so you can run a particular function that many times and pass it the JSON; see the sketch below.

How long to wait depends on your design: if you are retrieving and transforming the data asynchronously, then you can invoke the next call each time the script finishes processing a chunk.
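
A minimal sketch of that approach in one Scheduled Script Execution, assuming a hypothetical processChunk() helper that writes one chunk to the target table and returns true on success (message, table, and field names are placeholders):

// hypothetical helper: inserts one chunk into the target table
function processChunk(records) {
    for (var i = 0; i < records.length; i++) {
        var gr = new GlideRecord('u_import_staging'); // placeholder table
        gr.initialize();
        gr.setValue('u_name', records[i].name); // placeholder field mapping
        if (!gr.insert())
            return false;
    }
    return true;
}

var total = 90000;
var chunkSize = 25000;
for (var offset = 0; offset < total; offset += chunkSize) {
    var request = new sn_ws.RESTMessageV2('External System', 'post'); // placeholder REST message
    request.setStringParameterNoEscape('offset', String(offset)); // assumes the message body uses these variables
    request.setStringParameterNoEscape('limit', String(chunkSize));
    var response = request.execute();
    if (response.getStatusCode() != 200) {
        gs.error('Chunk at offset ' + offset + ' failed with status ' + response.getStatusCode());
        break; // stop so chunks are never processed out of order
    }
    if (!processChunk(JSON.parse(response.getBody())))
        break;
    // reaching the next iteration is the confirmation: the next call only
    // fires after the previous chunk has been fully written
}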

Regards,
Ankur
Certified Technical Architect  ||  9x ServiceNow MVP  ||  ServiceNow Community Leader