How to import structured JSON into multiple tables
11-16-2023 09:33 AM - edited 11-16-2023 09:39 AM
Is there a step-by-step guide available for importing structured JSON?
We have a process that generates a JSON file in the following form:
[
    {
        "entity1_attr1": "value",
        "entities2": [
            {
                "entity2_attr1": "value",
                "entities3": [
                    "entity3_attr1": "value"
                ]
            },
        ]
    }
]
We need this to update multiple tables, adding the references between them as it is processed.
I have seen references in the "Robust Import Set Transform" documentation to JSON input (such as in Robust Import Set Transformers Overview), where it says "...when the Data source has JSON/XML payloads in a single column", but there is very little information on how to configure all the downstream resources.
Alternatively, is there a way to use the Data Load functionality and trigger a script? I am very familiar with managing JSON in a script and updating tables with GlideRecord, so that aspect feels doable, but we'd rather not require the user to upload the input file as an attachment first. Ideally a "Load Data" step could perform the file upload and then trigger a script with a reference to the file (e.g. an attachment sys_id).
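For illustration, the attachment-reading side of what I have in mind is roughly the sketch below; the helper name is mine, while sys_attachment and GlideSysAttachment are the standard platform pieces:
// Hypothetical helper: given an attachment sys_id, return the parsed JSON payload
function getJsonFromAttachment(attachmentSysId) {
    var attGr = new GlideRecord('sys_attachment');
    if (!attGr.get(attachmentSysId))
        return null;
    // GlideSysAttachment.getContent() returns the file body as a string
    var content = new GlideSysAttachment().getContent(attGr);
    return JSON.parse(content);
}
The missing piece is how to get that attachment sys_id handed to a script as part of a "Load Data" style flow.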
11-16-2023 11:57 AM
Hi, you could use a Scripted REST API with an import set/transform. However, unless you need to return a specific response to your source before the data is fully processed, I would consider an import set/transform with an onBefore transform script to decode your JSON and map your payload.
I don't think your example is valid JSON (you can confirm this with any of the online JSON validators), but here's a background script with a similar example:
// Valid version of the sample payload (trailing comma removed)
var myPayload = '{ "entity1_attr1": "FirstLevelValue", "entities2": [ { "entity2_attr1": "SecondLevelValue", "entities3": [ { "entity3_attr1": "ThirdLevelValueA" }, { "entity3_attr2": "ThirdLevelValueB" } ] } ] }';
// JSON.parse() is the modern replacement for the deprecated global JSON class (new JSON().decode())
var myObj = JSON.parse(myPayload);
gs.info('entity1_attr1 ' + myObj.entity1_attr1);
gs.info('entities2 ' + myObj.entities2[0].entity2_attr1);
gs.info('entities3 ' + myObj.entities2[0].entities3[0].entity3_attr1);
gs.info('entities3 ' + myObj.entities2[0].entities3[1].entity3_attr2);
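Getting from the decoded object to multiple tables with references is then just nested GlideRecord inserts. A rough sketch, assuming placeholder tables u_entity1/u_entity2/u_entity3, each with a u_attr1 column and a u_parent reference column (substitute your own names):
// Insert the top-level record and keep its sys_id for the children
var e1 = new GlideRecord('u_entity1');
e1.initialize();
e1.setValue('u_attr1', myObj.entity1_attr1);
var e1Id = e1.insert();

// Second level: one record per entities2 entry, referencing the parent
myObj.entities2.forEach(function(ent2) {
    var e2 = new GlideRecord('u_entity2');
    e2.initialize();
    e2.setValue('u_attr1', ent2.entity2_attr1);
    e2.setValue('u_parent', e1Id);
    var e2Id = e2.insert();

    // Third level: one record per entities3 entry, referencing its entities2 parent
    (ent2.entities3 || []).forEach(function(ent3) {
        var e3 = new GlideRecord('u_entity3');
        e3.initialize();
        e3.setValue('u_attr1', ent3.entity3_attr1);
        e3.setValue('u_parent', e2Id);
        e3.insert();
    });
});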
11-17-2023 02:50 AM - edited 11-17-2023 04:22 AM
Thanks Tony - no, I don't need to respond back. Are you saying that the import set/transform approach can handle updating multiple tables and linking the items via table references? My reading around this seemed to suggest that might require multiple transforms, and that Robust Import Set Transforms were more appropriate, but I'd love to see an example in either case.
Yep (invalid JSON) - I don't doubt it. I just hand-crafted that in the question; it was meant to indicate the multi-level nature of the data rather than being an array of flat items.
11-23-2023 05:14 AM
Just to flag that I have decided to avoid the Robust Import Set Transform approach and instead build a simple Control Table with a UI Action on the form: the user attaches the file with the standard Attach function, then the UI Action runs a script that issues a `gs.eventQueue()` to run the load in the background.
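For anyone finding this later, the moving parts look roughly like the sketch below. The event name x_import.process_file is illustrative (it would be registered in the Event Registry), and processJson() stands in for the real GlideRecord logic:
// UI Action script on the Control Table record: queue the load to run in the background
gs.eventQueue('x_import.process_file', current, current.sys_id);
gs.addInfoMessage('Import queued - progress will be logged against this record.');
action.setRedirectURL(current);

// Script Action bound to the x_import.process_file event:
// find the most recent attachment on the record, parse it and process it
var attGr = new GlideRecord('sys_attachment');
attGr.addQuery('table_name', current.getTableName());
attGr.addQuery('table_sys_id', event.parm1);
attGr.orderByDesc('sys_created_on');
attGr.setLimit(1);
attGr.query();
if (attGr.next()) {
    var payload = new GlideSysAttachment().getContent(attGr);
    processJson(JSON.parse(payload)); // hypothetical function doing the table updates
}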