Can we call Now LLM via the Gen AI Controller for custom use cases while building in Flow Designer?
05-10-2024 12:28 AM
Hi,
We are building some Gen AI use cases and want to use Now LLM for them instead of a third-party LLM. I can see that Flow Designer has an action called "Now LLM" under the Gen AI Controller. Can we use that for this purpose?
How do we use it? Please help.
Regards

05-15-2024 06:26 AM
Hello @TDalal ,
In which family release are you running? Vancouver or Washington?

05-15-2024 09:27 PM
Hi Filipe,
Our instance is on Washington DC. We just see the action as "Now LLM".
Just a guess: "Now LLM TB call" might be only for Virtual Agent, and you can call the "Now LLM topic block" in the LLM topics in VA.
Regards
05-15-2024 10:43 AM
Absolutely, you can leverage the Now LLM within ServiceNow for your Gen AI use cases instead of relying on third-party options. The "Now LLM" action you spotted under the Gen AI Controller in Flow Designer is the right tool for the job.
Here's a breakdown of how to use it:
1. Activating Now LLM:
- Now LLM might not be enabled by default. You'll need to activate it through the Now Assist platform configuration. Refer to the ServiceNow documentation for "Now LLM Service updates" to find the steps for configuring API credentials for Now LLM [1].
2. Using the "Now LLM" Action:
- Within Flow Designer, drag the "Now LLM" action into your flow.
- Configure the action by specifying the type of functionality you require from Now LLM. It can handle tasks like summarization, text-to-code generation, and more.
- Provide the input text for Now LLM to process. This could be data retrieved from a ServiceNow record or user input captured earlier in the flow.
3. Processing the Output:
- Now LLM will return its generated text or manipulated data based on your chosen function.
- You can utilize subsequent actions within the flow to process this output further. For instance, you could store the generated code in a variable or display a summary to the user.
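As a sketch of step 3, a Script step placed after the "Now LLM" action might extract the generated text defensively before using it downstream. Note that the field names below (`status`, `response`) are hypothetical placeholders, not confirmed output names; inspect the action's actual outputs in the Flow Designer data panel on your instance and adjust accordingly.

```javascript
// Hypothetical shape of the Now LLM action output -- the real field names
// depend on your release; check the action's outputs in Flow Designer.
var llmOutput = {
  status: "success",
  response: "Summary: the incident was caused by a failed deployment."
};

// Defensive extraction: only trust the text when the call reported success,
// and fall back to an empty string otherwise so downstream steps can branch.
function extractResponse(output) {
  if (output && output.status === "success" && typeof output.response === "string") {
    return output.response.trim();
  }
  return "";
}

var summary = extractResponse(llmOutput);
// `summary` can now be stored in a flow variable or shown to the user;
// an empty string signals that the LLM call failed or returned nothing.
```

Guarding the output like this keeps the flow from passing an undefined or error payload into a record update or user-facing message.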