Importing large amounts of JSON data
04-16-2020 07:13 AM
Hi, when bringing in large amounts of JSON data via a REST API and updating back-end tables, what would be the preferred method? I have had issues in the past when trying to handle large amounts of data as variables in a script. Thanks.
Labels: Integrations

04-16-2020 07:19 AM
You can use the Import Set API to store the data in a staging table, then map it into your target table using a transform map.
See the documentation for further details.
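For example, an external client could POST each record to the staging table via the Import Set API. A minimal sketch in Node.js; the instance URL, credentials, staging table name (u_json_import), and field names are placeholders:

```javascript
// Minimal sketch: push one row into a ServiceNow import set staging table.
// Instance URL, credentials, table, and field names are placeholders.
const instance = "https://myinstance.service-now.com";
const auth = Buffer.from("api.user:password").toString("base64");

async function importRow(row) {
  // POST to /api/now/import/{stagingTable}; the transform map runs
  // automatically and the response reports the target-table outcome.
  const resp = await fetch(`${instance}/api/now/import/u_json_import`, {
    method: "POST",
    headers: {
      Authorization: `Basic ${auth}`,
      "Content-Type": "application/json",
      Accept: "application/json",
    },
    body: JSON.stringify(row),
  });
  if (!resp.ok) throw new Error(`Import failed: HTTP ${resp.status}`);
  return resp.json();
}

importRow({ u_name: "server01", u_ip_address: "10.0.0.5" })
  .then((result) => console.log(result.result));
```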
04-16-2020 07:41 AM
I forgot to mention that we are doing a GET from ServiceNow - the data is not being pushed into ServiceNow, so I do not think the Import Set API is an option, unless I am mistaken.
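(If "a GET from ServiceNow" means ServiceNow itself calling out to an external API, one way to avoid holding the whole payload in a single script variable is to pull the data in pages and insert each page as it arrives. A rough sketch as a server-side script; the endpoint, page size, response shape, and table/field names are all hypothetical:)

```javascript
// Rough sketch: paged outbound GET from ServiceNow, inserting rows page by
// page so the full dataset never sits in one variable. Names are hypothetical.
var pageSize = 200;
var page = 0;
var hasMore = true;

while (hasMore) {
    var req = new sn_ws.RESTMessageV2();
    req.setHttpMethod('GET');
    req.setEndpoint('https://api.example.com/records?limit=' + pageSize +
        '&offset=' + (page * pageSize));
    var body = JSON.parse(req.execute().getBody());

    for (var i = 0; i < body.records.length; i++) {
        var gr = new GlideRecord('u_target_table'); // hypothetical target table
        gr.initialize();
        gr.setValue('u_name', body.records[i].name);
        gr.insert();
    }

    hasMore = body.records.length === pageSize;
    page++;
}
```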
04-16-2020 07:46 AM
Hi,
Are you consuming ServiceNow's GET endpoint in your external application?
Regards
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader

04-16-2020 07:48 AM
What's the flow?
Are you updating an external database?