02-12-2021 03:13 PM
Hey guys,
I am exploring the Batch API.
My requirement is that an internal team will post around 10,000 REST calls per day.
Each REST call has around 300 records that have to be inserted into a target table, so I cannot use the Import Set API, as it only inserts/updates one record per call.
So I was exploring a Scripted REST API to parse the JSON and then insert into the import set table for transformation.
There are some reference fields I have to populate, and I also need to create parent-child relationships, so I would need a transform map to build that logic.
I also wanted to know more about the Batch API.
Let's assume I provide the Batch API endpoint to a different team. If they post REST calls with multiple records, how is the data transformed, and how can I add logic before the data is transformed to the target table?
Is the Batch API an upgraded version of the Table API that handles batches of records?
Thank You
02-17-2021 09:01 PM
Hi,
You need to send an array of JSON objects; each key should be a column name on the import set staging table.
Example, where field1 and field2 are column names on the import set staging table:
[
    {
        "field1": "value1",
        "field2": "value2"
    },
    {
        "field1": "value3",
        "field2": "value3"
    }
]
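If this array lands on a Scripted REST API (as the original post plans), a minimal sketch of staging each object could look like the following. The staging table name u_my_staging is an assumption, and transforming the staged rows to the target table is a separate step.

(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    // Hedged sketch: u_my_staging is an assumed import set staging table name
    var rows = request.body.data; // the array shown above
    for (var i = 0; i < rows.length; i++) {
        var stage = new GlideRecord('u_my_staging');
        stage.initialize();
        stage.field1 = rows[i].field1;
        stage.field2 = rows[i].field2;
        stage.insert(); // moving the staged rows to the target table happens separately
    }
    response.setStatus(201);
})(request, response);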
Regards
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader
02-12-2021 05:05 PM
OK, I follow you. You can absolutely use a Scripted REST API to stringify the JSON and put it all into fields on the first (staging) table, then use that table as a vessel to produce the 300 records on the final destination table.
What I have done is create a huge text field on the first table that takes in the JSON text, then parse it on insert and disperse the values into the other fields on that table I want populated. Then, instead of using a transform map, consider this: why not write a Business Rule on insert (ordered after the parsing is done) on the table that first receives the data, which can then use GlideRecord to spin out the 300 incidents?
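As a rough illustration of that Business Rule idea (the table, the u_raw_json field, and the incident target are only placeholders), an on-insert rule could look something like this:

(function executeRule(current, previous /*null when async*/) {
    // Assumes a large string field u_raw_json on the receiving table holds the JSON array
    var items;
    try {
        items = JSON.parse(current.u_raw_json.toString());
    } catch (e) {
        gs.error('Payload parse failed: ' + e.message);
        return;
    }
    // Spin out one record per entry in the array (the incident table is illustrative)
    for (var i = 0; i < items.length; i++) {
        var inc = new GlideRecord('incident');
        inc.initialize();
        inc.short_description = items[i].field1;
        inc.description = items[i].field2;
        inc.insert();
    }
})(current, previous);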
02-12-2021 07:49 PM
Hey bammar,
Not all fields can be directly mapped.
Some fields are reference fields, so I have to do individual GlideRecord lookups to get the reference values.
Transform maps can hold this logic, and manageability is good.
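For illustration, the kind of per-field lookup I mean in a transform map field script would be something like this (the source field and referenced table are only examples):

// Transform map field script (hypothetical field names): resolve the
// incoming user name to the sys_id the reference field needs
answer = (function transformEntry(source) {
    var usr = new GlideRecord('sys_user');
    if (usr.get('user_name', source.u_caller_id)) {
        return usr.getUniqueValue();
    }
    return ''; // leave the reference empty when no match is found
})(source);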
I really wanted to know about the functionality of the Batch API.
02-12-2021 10:28 PM
This link has the approach from Aman:
Import Set API - multiple records
Since you want a parent-child relationship, you can have two transform maps running in order, so the parent is created first and then the child.
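For example, an onBefore transform script on the child map could link back to the parent created by the first map; the table and correlation field names below are assumptions:

// onBefore transform script on the child transform map (hypothetical names):
// find the parent created by the first map and set the child's reference to it
var parentGr = new GlideRecord('u_parent_table');
if (parentGr.get('u_external_id', source.u_parent_external_id)) {
    target.parent = parentGr.getUniqueValue();
} else {
    ignore = true; // skip the child row if its parent has not been created yet
}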
Regards
Ankur
✨ Certified Technical Architect || ✨ 9x ServiceNow MVP || ✨ ServiceNow Community Leader
02-14-2021 09:00 AM
Hi Ankur,
Yes, I am aware of that.
I want to understand more about the Batch API.
Does the Batch API act like an Import Set API that can process multiple records in one payload?
And to your comment above: since I am getting a payload with parent and child as a whole, I might be able to use a Scripted REST API to parse it, post it to the staging tables, and then transform the parent and child.
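As a rough sketch of what I mean (the payload structure, staging table names, and field names are all assumptions at this point), the Scripted REST resource could stage the parent and its children on separate staging tables:

(function process(/*RESTAPIRequest*/ request, /*RESTAPIResponse*/ response) {
    var payload = request.body.data; // assumed shape: { parent: {...}, children: [ {...}, ... ] }
    var p = new GlideRecord('u_parent_staging');
    p.initialize();
    p.u_name = payload.parent.name;
    p.u_external_id = payload.parent.external_id;
    p.insert();
    for (var i = 0; i < payload.children.length; i++) {
        var c = new GlideRecord('u_child_staging');
        c.initialize();
        c.u_parent_external_id = payload.parent.external_id; // lets the child transform link back to the parent
        c.u_value = payload.children[i].value;
        c.insert();
    }
    response.setStatus(201);
})(request, response);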
Mainly, I just want to understand more about how the Batch API works.