Pranith2
ServiceNow Employee

Importing NLU Model JSON using Platform Import Sets:

Suggested Reading: Platform Import Sets, Importing JSON files

Model JSON can come in different formats. Regardless of the format, you can import the whole JSON file as a single record into a staging table and write a transform script to process the data. When reusing the same data source for different files with the same JSON format, make sure the data is not truncated while it is loaded into the staging table. If truncation occurs, manually increase the maximum length of the columns that were auto-created from the JSON data in the import set staging table.
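
If you need to widen those columns, a short background script against sys_dictionary can check and raise the maximum length. This is only a sketch: the staging table name u_nlu_model_import and the column names u_intents and u_utterances are assumptions; substitute the names the Test Load step actually creates.

// Background script sketch: widen the auto-created JSON columns if they are too short.
// Assumption: 'u_nlu_model_import', 'u_intents', and 'u_utterances' are placeholders
// for the staging table and columns generated in your instance.
var dict = new GlideRecord('sys_dictionary');
dict.addQuery('name', 'u_nlu_model_import');
dict.addQuery('element', 'IN', 'u_intents,u_utterances');
dict.query();
while (dict.next()) {
    if (parseInt(dict.getValue('max_length'), 10) < 4000) {
        dict.setValue('max_length', 4000); // large enough to hold the serialized JSON arrays
        dict.update();
    }
}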

In this article, the following JSON format is used. 

{
    "name": "App",
    "language": "en",
    "desc": "",
    "intents": [
        {
            "name": "Book"
        },
        {
            "name": "None"
        }
    ],
    "utterances": [
        {
            "text": "Book a conference room",
            "intent": "Book"
        },
        {
            "text": "Need help resetting PIN",
            "intent": "None"
        }
    ]
}
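
When this file is loaded as a single record, each top-level key becomes a column on the staging row, and the nested intents and utterances arrays are staged as JSON strings that the transform script parses later. A minimal illustration of what the transform script sees (assuming the auto-generated columns are named u_intents and u_utterances):

// Illustration only: the nested arrays arrive as JSON strings on the staged row.
var intents = JSON.parse(source.u_intents);       // [{"name":"Book"},{"name":"None"}]
var utterances = JSON.parse(source.u_utterances); // [{"text":"Book a conference room","intent":"Book"}, ...]
gs.info('Intents: ' + intents.length + ', utterances: ' + utterances.length);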


First, navigate to System Import Sets -> Data Sources and create a new Data Source record. Attach the JSON file to the record and set the parameters as shown in the screenshot below.

[Screenshot: Data Source record with the JSON file attached]

In the Related Links section of the record, click “Test Load 20 Records” to create the import set staging table.

Create a Transform Map with the target table set to NLU Model [sys_nlu_model], and create field maps for the Display name, Description, and Language fields.

 

 

See the sample field map for the Name field in the screenshot below:

[Screenshot: field map for the Name field]

Next, create similar field maps for other columns.
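
If a source value needs massaging on the way in, a field map can also use a source script. For example, here is a sketch that defaults an empty description; u_desc is the assumed auto-created column for the “desc” key, and the fallback text is arbitrary.

// Field map source script sketch (check "Use source script" on the Description field map).
answer = (function transformEntry(source) {
    // u_desc is the assumed staging column for the "desc" key in the JSON.
    var desc = source.getValue('u_desc');
    return desc ? desc : 'Imported NLU model';
})(source);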

Create an “onAfter” transform script to import intents and utterances into the newly imported model.


Here is the sample script:

(function runTransformScript(source, map, log, target /*undefined onStart*/ ) {

    var scopeId = gs.getCurrentApplicationId();
    var modelId = null;
    var intentIds = {};

    // Look up the NLU model created by the field maps; create it if it does not exist yet.
    var gr = new GlideRecord('sys_nlu_model');
    gr.addQuery('display_name', source.u_name);
    gr.addQuery('language', source.u_language);
    gr.addQuery('sys_scope', scopeId);
    gr.query();
    if (!gr.next()) {
        gr.initialize();
        gr.setValue('display_name', source.u_name);
        gr.setValue('language', source.u_language);
        gr.setValue('sys_scope', scopeId);
        modelId = gr.insert();
    } else {
        modelId = gr.getValue('sys_id');
    }

    // The "intents" array is staged as a JSON string; create each intent if it does not already exist.
    var intents = JSON.parse(source.u_intents);
    for (var i = 0; i < intents.length; i++) {
        var intent_gr = new GlideRecord('sys_nlu_intent');
        intent_gr.addQuery('model', modelId);
        intent_gr.addQuery('name', intents[i].name);
        intent_gr.query();
        if (!intent_gr.next()) {
            intent_gr.initialize();
            intent_gr.setValue('name', intents[i].name);
            intent_gr.setValue('model', modelId);
            intentIds[intents[i].name] = intent_gr.insert();
        } else {
            intentIds[intents[i].name] = intent_gr.getValue('sys_id');
        }
    }

    // Create an utterance record for each entry and link it to its intent by sys_id.
    var utterances = JSON.parse(source.u_utterances);
    for (var j = 0; j < utterances.length; j++) {
        var utterance_gr = new GlideRecord('sys_nlu_utterance');
        utterance_gr.initialize();
        utterance_gr.setValue('utterance', utterances[j].text);
        utterance_gr.setValue('intent', intentIds[utterances[j].intent]);
        utterance_gr.insert();
    }

})(source, map, log, target);

 

Now go back to the data source and click “Load All Records” under the Related Links section.

 

Click “Run Transform”.

 

Select the Transform map and click “Transform”.

 

Result: the JSON data is imported into the NLU tables and a Success message appears, as shown in the screenshot below.

[Screenshot: Success message after the transform]
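
To sanity-check the import, a small background script can count what the transform created. This is a sketch only, using “App”, the display name from the sample JSON above:

// Verification sketch: count the intents created for the imported model.
var model = new GlideRecord('sys_nlu_model');
model.addQuery('display_name', 'App');
model.query();
if (model.next()) {
    var intentCount = new GlideAggregate('sys_nlu_intent');
    intentCount.addQuery('model', model.getUniqueValue());
    intentCount.addAggregate('COUNT');
    intentCount.query();
    if (intentCount.next()) {
        gs.info('Intents imported: ' + intentCount.getAggregate('COUNT'));
    }
}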
An update set is attached for reference.
