
Importing large amounts of JSON data

JB1
Kilo Contributor

Hi, when bringing in large amounts of JSON data via a REST API and updating back-end tables, what would be the preferred method? I have had issues in the past when trying to handle large amounts of data as variables in script. Thanks.

8 Replies

Harsh Vardhan
Giga Patron

You can use the Import Set API here to load the data into a staging table, then map it into your target table using a transform map.

 

See the doc link below for further details.

 

https://docs.servicenow.com/bundle/geneva-servicenow-platform/page/integrate/inbound_rest/reference/...
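For example, an external client could post each record (or small batches) straight to the Import Set API endpoint rather than building one huge payload. This is only a rough sketch, assuming a Node.js 18+ client with built-in fetch; the instance URL, credentials, and the u_json_import staging table with its u_* columns are placeholders, not real names.

// Minimal sketch only: push rows into a ServiceNow import set staging table.
// Instance URL, credentials, table name and columns are placeholders.
const instance = 'https://<your-instance>.service-now.com';
const auth = Buffer.from('api.user:password').toString('base64');

async function pushRecord(record) {
  const res = await fetch(instance + '/api/now/import/u_json_import', {
    method: 'POST',
    headers: {
      'Authorization': 'Basic ' + auth,
      'Content-Type': 'application/json',
      'Accept': 'application/json'
    },
    // Property names must match the staging table columns;
    // the transform map then moves them into the target table.
    body: JSON.stringify(record)
  });
  if (!res.ok) throw new Error('Import failed: ' + res.status);
  return res.json();
}

// Send rows one at a time (or in small batches) instead of one giant document.
async function pushAll(records) {
  for (const record of records) {
    await pushRecord(record);
  }
}

Posting in smaller batches keeps any single request, and any single script variable, a manageable size.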

 

 

I forgot to mention that we are doing a GET from ServiceNow - the data is not being pushed into ServiceNow, so I do not think the Import Set API is an option, unless I am mistaken.
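If the flow is ServiceNow itself performing the outbound GET (for example from a scheduled script), one way to avoid holding the whole result in a single variable is to page through the source and insert rows as each page is parsed. A rough sketch only; the endpoint, the limit/offset paging parameters, and the u_imported_item table with its u_* columns are assumptions for illustration:

// Minimal sketch: ServiceNow pulls JSON page by page and inserts rows as it goes,
// so the full dataset is never held in one script variable.
// Endpoint, paging parameters, table and column names are placeholders.
var PAGE_SIZE = 200;
var offset = 0;
var keepGoing = true;

while (keepGoing) {
    var request = new sn_ws.RESTMessageV2();
    request.setEndpoint('https://example.com/api/items');
    request.setHttpMethod('get');
    request.setRequestHeader('Accept', 'application/json');
    request.setQueryParameter('limit', String(PAGE_SIZE));   // assumes the source supports paging
    request.setQueryParameter('offset', String(offset));

    var response = request.execute();
    if (response.getStatusCode() != 200) {
        gs.error('GET failed with status ' + response.getStatusCode());
        break;
    }

    var items = JSON.parse(response.getBody());   // parse one page only
    for (var i = 0; i < items.length; i++) {
        var target = new GlideRecord('u_imported_item');
        target.initialize();
        target.setValue('u_name', items[i].name);
        target.setValue('u_value', items[i].value);
        target.insert();
    }

    keepGoing = (items.length == PAGE_SIZE);
    offset += PAGE_SIZE;
}

If you prefer to keep the transform map approach, the same loop could write each page to a staging table instead and the transform then be run against that import set.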

Hi,

Are you consuming ServiceNow's GET endpoint in your external application?

Regards
Ankur

 


 

What's the flow?

Are you updating an external database?