Global organisations face a persistent challenge: delivering high-quality support experiences to employees across many languages. In ServiceNow, language support has historically relied on translation layers that convert queries and content into English before processing. While this approach works, it can introduce semantic drift and limit answer quality.
Recent enhancements to Now Assist expand how we handle language. As outlined in Ashley’s recent article on multilingual support for ServiceNow generative AI products, customers now have stronger options for managing language with generative models: Dynamic Translation and direct Native Language Comprehension (via an LLM’s built-in language abilities). You can read more in Ashley’s post:
https://www.servicenow.com/community/now-assist-articles/multilingual-support-for-servicenow-generat...
Disclaimer: I am writing this as a private individual with experience in ServiceNow products, not as an official representative of ServiceNow. For the latest official language support details, please refer to ServiceNow documentation and announcements.
In this article, I will explain the difference between translation strategies, show how to enable native language comprehension in Now Assist, and share configuration guidance and best practices based on testing across multiple languages.
Native language vs dynamic translation
Let’s first cover some background on how language works in ServiceNow Now Assist. ServiceNow offers two approaches to language support:
- Dynamic Translation – Translates user input and source data (e.g., KB articles) to English before LLM processing, then translates the response back to the user’s language.
- Native Language Comprehension aka Native Translation* – direct processing of the user input and source data without translating the content. This relies on the LLM having sufficient support for the language used. The user’s session language is included in the prompt and the LLM is asked to respond in the same language.
A major enhancement now allows direct passthrough of the user’s input to the LLM in any language supported by the model of choice.
Dynamic translation was an excellent choice in the early versions of our products, when LLM support for multiple languages was still limited. And for the record, dynamic translation will still be offered by ServiceNow and remains the official primary choice for unsupported languages. The challenge with dynamic translation, however, is that it can impact the quality of the response. This is visualised here by a simplified view of the process when a Finnish user asks a question:
Any prompt is first translated to English → then goes through LLM processing → and is finally translated back to the user’s language. Each translation hop can introduce semantic drift, subsequently reducing output quality.
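To make the difference between the two strategies concrete, here is a purely conceptual sketch of the two paths. None of the function names are real ServiceNow APIs – translateToEnglish, translateBack and callLLM are simple stand-ins for the translation service and the LLM described above.

```javascript
// Conceptual sketch only – these are stand-ins, not ServiceNow APIs.
function translateToEnglish(text) { return '[EN] ' + text; }             // stub
function translateBack(text, lang) { return '[' + lang + '] ' + text; }  // stub
function callLLM(prompt) { return 'answer to: ' + prompt; }              // stub

// Dynamic Translation: two translation hops around the LLM call.
function dynamicTranslationPath(userQuery, sessionLanguage) {
    var englishQuery = translateToEnglish(userQuery);      // hop 1: query to English
    var englishAnswer = callLLM(englishQuery);             // LLM reasons in English
    return translateBack(englishAnswer, sessionLanguage);  // hop 2: answer back (drift risk)
}

// Native Language Comprehension: the query goes straight to the LLM,
// which is asked to respond in the user's session language.
function nativeComprehensionPath(userQuery, sessionLanguage) {
    return callLLM(userQuery + ' (respond in ' + sessionLanguage + ')');
}
```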
Unlocking Native Language as a primary strategy
With recent updates to Now Assist, the capability to fully rely on Native Language has been unlocked. I would recommend ZP3+ or YP8+, but this should work as far back as YP6. The change is that admins can now add supported languages to the models of choice, allowing the query to be passed directly to the LLM and omitting Dynamic Translation from the process. This works particularly well when you have a model with broad language support; in my testing, Azure has shown very promising results across multiple languages.
The steps to configure are fairly straightforward. Naturally, the very first step is to ensure you activate the language plugin, either by installing a language from the Application Manager or by self-translating a language through the Localisation Framework – this step is necessary to ensure that the underlying structures are in place.
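If you want to verify from a script that a language plugin is already active, a quick check from Scripts - Background could look like the sketch below. The plugin ID shown is only an example assumption; confirm the exact ID for your language in the Application Manager.

```javascript
// Quick check that a language plugin is active.
// The plugin ID below is an example only – verify the exact ID in your instance.
var pluginId = 'com.snc.i18n.norwegian'; // example ID, confirm in the Application Manager
var pm = new GlidePluginManager();
gs.info(pluginId + ' active: ' + pm.isActive(pluginId));
```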
Then, if you are on a newer version, you should be able to go to Now Assist Admin > Settings > Multilingual Service. From here you need to set native translation to “true”, and for your LLM of choice you can edit the configuration to add languages. After this you should either deactivate Dynamic Translation in its entirety or at least deactivate the languages you wish to use with Native Translation. With this, your instance is set up to pass queries directly to the LLM.
The next step I would recommend is to consider your content. By default, ServiceNow will only surface content tagged with the same language as the user’s session language, i.e. their selected language. To override this there is a sys_property – glide.ais.global_searchable_filter.kb_knowledge – where you can add a filter condition for knowledge articles that enables AI Search to surface articles in any language. For example, if you have a global article about benefits written in English, it would still be retrieved when the user searches in Norwegian.
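If you prefer to inspect or set this property from Scripts - Background, a minimal sketch could look like this. The encoded query used as the value is purely illustrative – use a filter condition that fits your own knowledge setup.

```javascript
// Sketch: read and (optionally) set the AI Search knowledge filter property.
var propName = 'glide.ais.global_searchable_filter.kb_knowledge';
gs.info('Current value: ' + gs.getProperty(propName, '(not set)'));

var prop = new GlideRecord('sys_properties');
if (prop.get('name', propName)) {
    prop.setValue('value', 'workflow_state=published'); // example filter condition only
    prop.update();
} else {
    gs.info('Property not found – create it via sys_properties.list if needed');
}
```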
Finally, on some older patch versions and for certain languages you may find that the language you are looking for is missing from the Multilingual Service selection. In this case you will need to add your language in the model configuration tables. There are two tables you need to configure:
sys_generative_ai_model_config.list and sys_generative_ai_native_translation_langs. For both tables, identify your model of choice (e.g. gpt_small and gpt_large for Azure) and add the languages to the configuration. In addition, in the model config you should also add your language to the E5FT model, as this is the model powering semantic search. For example, I needed to manually add Arabic to my models as it was not available by default.
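Because the exact fields in these tables can differ between releases, a read-only background script like the sketch below can help you confirm what is already configured before you add anything. It only prints display values, so it makes no assumptions about column names.

```javascript
// Read-only sketch: print what is currently configured in the two model tables
// so you can confirm your language (e.g. Arabic) is present for the models you
// use (gpt_small / gpt_large for Azure, plus E5FT for semantic search).
['sys_generative_ai_model_config', 'sys_generative_ai_native_translation_langs'].forEach(function (table) {
    var gr = new GlideRecord(table);
    gr.query();
    gs.info('--- ' + table + ' (' + gr.getRowCount() + ' records) ---');
    while (gr.next()) {
        gs.info(gr.getDisplayValue());
    }
});
```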
At this stage you are good to go and can start testing your multilingual Virtual Agent responses through Now Assist. As you’ll notice from my example below, asking a question in Norwegian gives a good reply even when it is based on an English knowledge article.
Here you can see a Norwegian search query in Now Assist with a response from an English language article.
Some notes on search
Now Assist uses Retrieval-Augmented Generation (RAG) when creating responses. This is best practice since it reduces the risk of hallucinations and incorrect answers. What you should pay attention to is the retrieval step, which utilises AI Search. Here it’s good to know that there are two search strategies, keyword and semantic, and these are combined into what we call hybrid search. Natalia had a great explanation of this in her article on making AI Search work, which I want to repeat:
AI Search in ServiceNow is a hybrid model, combining semantic and keyword-based search techniques.
- Semantic search understands the intent and contextual meaning behind user queries, enabling it to return relevant results even if the exact words aren’t present.
- Keyword search focuses on direct matches with the user’s typed input – typically the keywords, or synonyms of the keywords, used in the search phrase.
The effect of this hybrid search strategy is that our search looks for articles that match the question both in terms of keywords and semantically. The semantic index is also able to work across languages – so the question “how many vacation days can I take” is semantically the same as “hvor mange feriedager kan jeg ta”. Be mindful that it is the E5FT model that manages the semantic indexing of our content, so ensuring that the language is added in sys_generative_ai_model_config.list is essential.
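As a purely conceptual illustration (not ServiceNow’s internal ranking logic), hybrid search can be thought of as blending a keyword score with a semantic similarity score, so an article can rank well even when the query and the article share no literal words. The helpers below are simple stand-ins for the real retrieval components.

```javascript
// Conceptual illustration only – not ServiceNow's internal ranking.
function keywordScore(queryTerms, text) {                 // stub: fraction of query terms found
    var hits = queryTerms.filter(function (t) { return text.toLowerCase().indexOf(t) > -1; });
    return hits.length / queryTerms.length;
}
function cosineSimilarity(a, b) {                         // stub: similarity of two embeddings
    var dot = 0, na = 0, nb = 0;
    for (var i = 0; i < a.length; i++) { dot += a[i] * b[i]; na += a[i] * a[i]; nb += b[i] * b[i]; }
    return dot / (Math.sqrt(na) * Math.sqrt(nb));
}
function hybridScore(queryTerms, queryEmbedding, doc, weight) {
    // Weighted blend of keyword and semantic relevance.
    return weight * keywordScore(queryTerms, doc.text) +
        (1 - weight) * cosineSimilarity(queryEmbedding, doc.embedding);
}
```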
Business considerations
Language is a continuous operational effort for most organisations, and even more so for those with a global workforce. Effective communication and easy access to corporate information are essential, yet maintaining translated content across all languages quickly becomes an administrative burden.
The emergence of LLMs with strong multilingual capabilities changes this dynamic. We can now interpret an employee’s query in their native language and retrieve relevant knowledge even when the source content is written in another language. This reduces the need for large-scale translation while still maintaining answer quality.
As multilingual virtual agents mature, organisations can be more selective about what they translate. HR policies may still require full localisation due to regulatory requirements, while IT knowledge can remain in English without loss of clarity. With Native Language Comprehension, both types of content can be delivered seamlessly to employees in their preferred language.
*Although the product feature is called “Native Translation” I prefer “Native Language Comprehension” as it’s not actually translating but relying on the LLM’s innate ability to understand the language. NLU would also be a good term, but this is already used elsewhere 😉