ServiceNow NLU model movement to another environment
05-17-2023 03:54 AM
Hello All,
I have my intents and entities defined in the ServiceNow dev environment. Now I want to move the entire NLU model to UAT, but when I tried exporting to CSV from dev and importing it into UAT, I got only the intents and not the associated entities. Can someone let me know how I can do that?
I have update sets created, but they contain a lot of entries, including Virtual Agent chat testing records as well. This is leading to confusion, hence I tried exporting and importing instead. Please advise.
Thanks,
Satish
05-30-2023 01:24 AM
Hello Chris,
I am able to publish the model after syncing one of the referenced vocabularies manually.
However, I could see that a few of the referenced vocabularies were not moved to the new environment through the update set. Please see the screenshot below. Here I have used two vocabularies (@EmailDistributionLists and @EmailDL) in my utterance: one I created (@EmailDL) and the other is an existing one that I referenced (@EmailDistributionLists).
Technically, @EmailDistributionLists will not get recorded in the update set since I didn't make any modifications to it and just used it; that is fine. But when I moved the update set to the new environment, I could not see @EmailDistributionLists in the referenced vocabulary sources, even though it is referenced in the utterance that was moved through the update set.
It should show up in the list, right? I'm confused why it was not shown. Could you please help?

05-30-2023 07:34 AM
Sorry, I haven't added any new vocabulary recently, so I don't think I can help much from experience 😞 I don't recall any issues in the past when I added vocabulary or vocabulary sources...
05-31-2023 04:26 AM
Okay, can you also look into the below question that I have?
I want to exclude certain words from hitting my intent. For example, all words starting with RITM.
I don't want to write any regex in a vocabulary, as that would be used commonly across the model. I want to write it specifically in the intent itself. I tried writing the regex directly in the utterance (like this: ^(?!.*\bRITM\w+\b)), but it did not work.
I do see this as a limitation: I can write what I want in an utterance, but what about the words/sentences that I don't want to hit the intent? Is there a way to write an utterance that works as "words do not contain the string RITM"?
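For reference, here is a quick standalone check of that pattern (runnable as a background script). It only shows what the regex itself does against plain strings; as far as I can tell, NLU utterances are treated as literal example phrases for training rather than patterns evaluated at runtime, so the same expression has no special meaning inside an utterance.

```javascript
// Negative-lookahead pattern from above: matches only strings that do NOT
// contain a word starting with "RITM".
var excludeRitm = /^(?!.*\bRITM\w+\b)/;

gs.info(excludeRitm.test('please check RITM0012345')); // false - contains an RITM word
gs.info(excludeRitm.test('please reset my password')); // true  - no RITM word
```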

05-31-2023 06:04 AM - edited 05-31-2023 06:05 AM
Q. I want to exclude certain words from hitting my intent. For example, all words starting with RITM.
A. NLU Models are inclusive, and if properly built, you would not need to exclude certain vocabulary from intents. Negative testing (i.e. testing an utterance that should not return any intents) is much harder to accomplish and requires NLU Model tuning to remove any conflicts. However, I published KB1318436: [NLU] Utterance is returning an unexpected intent, when it should not return any intents in a custom NLU Model, with a workaround for this issue. Our current Test Panel Feedback is not working, for which I also raised a defect; once that is fixed, it should be the best option to ensure you get the correctly predicted intent, or no intent, returned for the test utterance. Once in Production, you should use the NLU Expert Feedback Loop feature to provide feedback on Virtual Agent chat log utterances to help the system continuously learn and better predict user input.
Hope this helps.
Regards,
Brian

05-31-2023 04:30 AM - edited 05-31-2023 04:31 AM
I have raised a defect, as the "Add to Update Set" utility adds records from different scopes into a single Update Set; this will be fixed in Vancouver.
Vocabulary Source Lists are trained when you train the NLU Model that references those Vocabulary Source Lists in your utterances. However, if you train these on the source instance and then move them to the target instance, you also need to capture the associated ML records in the "Global" scope in order to move the ML Model Artifacts [ml_model_artifact] containing the trained Vocabulary Source. This step is often missed, and an error is then thrown when trying to train the NLU Model on the target instance, because the Vocabulary Source List model artifacts are missing. Vocabulary Source Tables don't have this issue, as they can be synced on the target instance to regenerate the model artifacts.
If you open table [sys_nlu_vocabulary] and group by "Type", you will see the "Static Lookup" records, which are all the Vocabulary Source Lists on the instance. The "Solution Name" field links to the solution in tables [ml_solution], [ml_model_artifact] and [ml_capability_definition_base], which are versioned; each time you train any AI Capability, a new version is created.
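A quick way to inspect that mapping is a background script like the sketch below. Note this is only a sketch: the column names 'type' and 'solution_name' are assumptions based on the field labels mentioned above ("Type", "Solution Name"), as is the stored value 'static_lookup', so verify the actual names on your instance first.

```javascript
// List the "Static Lookup" vocabularies (Vocabulary Source Lists) and the
// ML solution each one currently points to.
var voc = new GlideRecord('sys_nlu_vocabulary');
voc.addQuery('type', 'static_lookup'); // assumed value behind the "Static Lookup" label
voc.query();
while (voc.next()) {
    gs.info(voc.getValue('name') + ' -> solution: ' + voc.getDisplayValue('solution_name'));
}
```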
In order to "force" train the Vocabulary Source List, remove the "Solution Name" value in table [sys_nlu_vocabulary] and when you train the NLU Model, it will now train the Vocabulary Source List. Once NLU training has completed, you will see the "Solution Name" being populated again in table [sys_nlu_vocabulary].
I need to create a KB article on this, as many of you are struggling to successfully move NLU Models from one instance to another and are then unable to train the NLU Model on the target instance after the move. I will update this thread once I have published the KB article.
Hope this helps.
Regards,
Brian