Eliza
ServiceNow Employee

 


When using Now Assist products, you may encounter a scenario where you wish to change the large language model (LLM) that provides your generative AI capabilities. This guide details the options available for adjusting the provider for both out-of-the-box (OOTB) and custom Now Assist skills and agents.

Please note: The selection of a provider and model is entirely dependent on your use case, and as such, this guide cannot provide guidance on which provider to select. It is important to consider topics such as data sovereignty and handling, performance, and cost when evaluating candidates.

Article Contents

  1. Provider Connection Types
  2. Feature Support for Provider Connection Types
  3. How do I configure each provider connection type?
  4. How do I configure my skills and AI Agents to use my chosen model?

Provider Connection Types

A provider is considered to be the organization delivering the service. This includes Google, Microsoft Azure, AWS, and ServiceNow itself. Each provider offers a range of large language models (LLMs) to help complete actions, with each model having particular benefits and drawbacks.

The table below covers each provider connection type, its definition, and the features it supports:

ServiceNow Managed Connection
Note: You do not need to procure any licenses with third parties to gain access to these options.

NowLLM
ServiceNow's LLM service. Find out more using our model cards.
Supported features:
  • OOTB Now Assist Skills
  • Custom Skills
  • AI Agents
  • Custom AI Agents

ServiceNow integrated model
For instances on at least Yokohama Patch 6, or Zurich Patch 0 and above, a range of integrated model providers is available for use without the need for additional licenses.

Administrators can choose from the following integrated model providers:
  • Microsoft Azure OpenAI
  • Google Gemini
  • AWS Anthropic Claude

Learn more in our model provider flexibility article.
Supported features:
  • OOTB Now Assist Skills
  • Custom Skills
  • AI Agents
  • Custom AI Agents

Bring-your-own-key (BYOK) for ServiceNow integrated models
This option enables Now Assist features to use your organization's own instance of Azure, Google AI Studio, or AWS Bedrock in lieu of the ServiceNow managed connection.

You are required to bring your own connection credentials (such as an API key and endpoint), and ServiceNow will route Now Assist requests through your provider account rather than ServiceNow's managed infrastructure.

The specific models available for selection using this method are listed here.
Supported features:
  • OOTB Now Assist Skills
  • Custom Skills
  • AI Agents
  • Custom AI Agents

Bring-your-own-key (BYOK) for spokes
We offer the option to connect to providers using our spokes: pre-built integrations for providers outside of our integrated providers. You are required to bring your own connection credentials (such as an API key and endpoint).

The list of providers can be found in product documentation.
Supported features:
  • Limited OOTB Skills*
  • Custom Skills

Bring-your-own-LLM (BYOLLM)
This refers to using a model or provider that ServiceNow does not support or certify.

You are required to configure the connection yourself, and will need to provide credentials such as an API key and endpoint. The configuration process also includes some scripting to produce a transformer script that translates the output from your provider into a ServiceNow-recognized format.
Supported features:
  • Limited OOTB Skills*
  • Custom Skills

*Please refer to the section labelled "BYOK for spokes and BYOLLM" (under OOTB Skills) in How do I configure my skills and AI Agents to use my chosen model?

Feature Support for Provider Connection Types

Not all features within Now Assist are able to use all provider connection types. The table below maps the feature type to the supported provider connection types.

OOTB Now Assist Skills
ServiceNow-shipped generative AI features, managed in the Now Assist Admin console.
Examples: incident/case summarization, knowledge article generation
Supported provider connection types:
  • ServiceNow Managed Connection
  • BYOK for ServiceNow integrated models
  • BYOK for Spokes*
  • BYOLLM*

Custom Skills
User-developed skills, created within Now Assist Skill Kit.
Supported provider connection types:
  • ServiceNow Managed Connection
  • BYOK for ServiceNow integrated models
  • BYOK for Spokes
  • BYOLLM

OOTB AI Agents
ServiceNow-shipped AI Agents.
Supported provider connection types:
  • ServiceNow Managed Connection
  • BYOK for ServiceNow integrated models

Custom AI Agents
User-developed AI Agents, created within AI Agent Studio.
Supported provider connection types:
  • ServiceNow Managed Connection
  • BYOK for ServiceNow integrated models

*Please refer to the section labelled "BYOK for spokes and BYOLLM" (under OOTB Skills) in How do I configure my skills and AI Agents to use my chosen model?

How do I configure each provider connection type?

ServiceNow Managed Connection

No configuration required! Connection to either NowLLM or any of the integrated models is automatic – they are all available for use as soon as a Now Assist-related plugin has been installed.

BYOK for ServiceNow integrated models

Please refer to the support article that addresses how to input credentials for each integrated model provider.

We also have a video walkthrough for those looking to connect to Azure OpenAI:

 

BYOK for Spokes

We provide a number of spokes to external LLMs, including (as of December 2025):

  • Azure OpenAI*
  • OpenAI
  • Aleph Alpha
  • IBM's WatsonX
  • Google's Vertex AI
  • Google's Gemini AI Studio*
  • AWS Bedrock*

Important: If you intend to use Google, Azure OpenAI, or AWS Bedrock, we strongly recommend that you instead follow the steps above for BYOK for ServiceNow integrated models.

To connect to a spoke, you need to procure your own license and provide ServiceNow with the key and any additional credentials that the API may need to verify the connection.

For providers that offer multiple LLMs, such as Amazon Bedrock and Vertex AI, please note that you are limited to using only one LLM at this time.

To see how to connect to one of our spokes, you can view the recording below:

 

BYOLLM

Washington DC through Zurich Patch 3 instances

Instances on at least the Washington DC release are able to use the generic LLM connector to connect to any external LLM not listed above, i.e. BYOLLM. This process requires a fair amount of technical acumen. To integrate with a non-spoke-supported LLM, you need:

  • An API key from the provider
  • The endpoint for the LLM
  • Access to the API documentation for that LLM, to assist with writing the transformation script that translates the input and response into an acceptable format (a minimal sketch follows this list)
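
The exact contract for the transformation script depends on your provider's API and on the generic LLM connector documentation for your release, so treat the following as a rough JavaScript sketch only. It assumes a hypothetical provider that accepts an OpenAI-style chat payload and returns the completion under choices[0].message.content; the function names and the exact fields ServiceNow expects are assumptions that you would replace with the contract described in the connector documentation.

// Illustrative only: function names, payload shape, and response shape are assumptions.
// Request side: wrap the prompt produced by Now Assist in the provider's expected body.
function transformRequest(prompt) {
    var body = {
        model: 'my-custom-model',                       // assumed provider model name
        messages: [{ role: 'user', content: prompt }],  // OpenAI-style chat format
        temperature: 0.2
    };
    return JSON.stringify(body);
}

// Response side: pull the generated text out of the provider's raw response
// so that Now Assist receives plain text it can surface in the skill.
function transformResponse(responseBody) {
    var parsed = JSON.parse(responseBody);
    if (parsed && parsed.choices && parsed.choices.length > 0) {
        return parsed.choices[0].message.content;
    }
    return ''; // unexpected shape: return an empty string rather than failing
}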

An example demonstrating the process of connecting to an external LLM using the generic LLM connector can be seen in the video below:

 

Zurich Patch 4 (Now Assist Skill Kit plugin version 7.0.0)

Configuration of BYOLLM can be performed by going to Now Assist Skill Kit > Providers and Models and clicking the Add model button. A guided setup will walk you through the process of connecting your custom model to ServiceNow.

[Screenshot: Providers and Models in Now Assist Skill Kit]



How do I configure my skills and AI Agents to use my chosen model?

OOTB Skills

ServiceNow Managed Connection

You can adjust the provider for OOTB skills by first ensuring the provider is enabled within AI Control Tower, then navigating to Now Assist Admin > Settings > Manage Model Provider.

You can choose to modify the provider for all skills, for a subset, or on an individual, skill-by-skill basis.

These steps are covered in detail in the video below (demo starts at 24:00):

 

BYOK for ServiceNow integrated models

Once configured (refer to the section labelled "How do I configure each provider connection type?"), you then need to perform the following steps:

  1. Navigate to Now Assist Admin Console.
  2. Click on the tab named Settings, then Manage Integration.
  3. For the provider you wish to use, ensure that BYOK has been selected, and click Save.

    [Screenshot: Manage Integration settings in Now Assist Admin]

This means that all requests pointing to that provider will now be routed through your own provider account, rather than the ServiceNow managed (OEM) version.

Once this step is complete, you can then follow the same steps listed above under ServiceNow Managed Connection.

BYOK for spokes and BYOLLM

The process of connecting OOTB skills to BYOK for spokes and BYOLLM is conducted within Now Assist Skill Kit: first clone the skill, then create a new prompt within that skill, which allows you to select a new provider. A walkthrough of this process is available below the table.

Please note: The BYOK for spokes/BYOLLM options are not available for OOTB skills directly; the skill must first be cloned in Now Assist Skill Kit as described above.

To see which of your skills are eligible for this method, navigate to the sn_nowassist_skill_config table and find the skills where is_template = true (a query sketch follows the table below). As of December 2025/Zurich Patch 4, the eligible skills are as follows:

Custom App Record Summarization
  • Custom app record summarization

Flow Generation
  • Flow Generation with Images

GRC Common
  • GenAI Issue Summarization

GRC Shared
  • GenAI Risk Assessment Summarization

Now Assist for Accounts Payable Operations (APO)
  • Invoice data extraction
  • Invoice case summarization

Now Assist for Customer Service Management (CSM)
  • Case summarization
  • KB generation
  • Resolution notes generation

Now Assist for Employee Experience
  • Case summarization for approvals
  • Request summarization for approvals
  • Requested Item summarization for approvals

Now Assist for Enterprise Architecture (EA)
  • Business application insights

Now Assist for Field Service Management (FSM)
  • KB generation
  • Work Order Task Summarization

Now Assist for Financial Services Operations (FSO)
  • Claim summarization

Now Assist for FSC Common
  • Purchase order summarization
  • Supplier summarization

Now Assist for HR Service Delivery (HRSD)
  • KB generation
  • Persona Assistant
  • Case summarization

Now Assist for IT Service Management (ITSM)
  • Resolution notes generation
  • Change request risk explanation
  • KB generation
  • Incident summarization
  • Change request summarization

Now Assist for Legal Service Delivery
  • Legal Request summarization

Now Assist for OTSM
  • OT Incident resolution notes generation
  • OT Incident summarization

Now Assist for RPA Hub
  • RPA bot generation

Now Assist for Security Incident Response (SIR)
  • Security Incident Quality Assessment
  • Security incident recommended actions
  • Generate content for shift handover
  • Resolution notes generation
  • Post incident analysis
  • Correlation insights generation
  • Security operations metrics analysis
  • Security incident summarization

Now Assist for Sourcing and Procurement Operations (SPO)
  • Sourcing request summarization
  • Purchase order summarization
  • Procurement case summarization
  • Purchase requisition summarization
  • Sourcing event summarization
  • Negotiation summarization

Now Assist for Supplier Lifecycle Operations (SLO)
  • Supplier case summarization

Now Assist for Talent
  • Generate talking points

Now Assist for Telecommunications, Media and Technology (TMT)
  • Service Problem Case summarization
  • Customer Service Summary
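
If you would rather pull this list directly from your instance (for example, after a patch changes the set of eligible skills), a short background script along the following lines should work. It is only a sketch: it assumes nothing beyond the sn_nowassist_skill_config table and the is_template flag mentioned above, and uses getDisplayValue() to avoid guessing at column names.

// Background script: list Now Assist skills that can be cloned for BYOK-for-spokes / BYOLLM use.
var skill = new GlideRecord('sn_nowassist_skill_config');
skill.addQuery('is_template', true);
skill.query();
gs.info(skill.getRowCount() + ' eligible skill(s) found:');
while (skill.next()) {
    gs.info(skill.getDisplayValue());
}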


Walkthrough of how to modify OOTB skills:

 

OOTB and Custom AI Agents

ServiceNow Managed Connection

You can adjust the provider for OOTB and custom AI agents by first ensuring the provider is enabled within AI Control Tower, then navigating to Now Assist Admin > Settings > Manage Model Provider.

You can choose to modify the provider for all agents, for a subset, or on an individual, agent-by-agent basis.

These steps are covered in detail in the video below (demo starts at 24:00):

 

BYOK for ServiceNow integrated models

Once configured (refer to the section labelled "How do I configure each provider connection type?"), you then need to perform the following steps:

  1. Navigate to Now Assist Admin Console.
  2. Click on the tab named Settings, then Manage Integration.
  3. For the provider you wish to use, ensure that BYOK has been selected, and click Save.
    [Screenshot: Manage Integration settings in Now Assist Admin]

This means that all requests pointing to that provider will now be routed through your own provider account, rather than the ServiceNow managed (OEM) version.

Once this step is complete, you can then follow the same steps listed above under ServiceNow Managed Connection.


Important:
BYOK for Spokes and BYOLLM are currently not supported for either OOTB or custom AI Agents.


Custom Skills

When you are building your own custom generative AI functionality within Now Assist Skill Kit, you have the option of selecting both a provider and a provider API (i.e., which model from that provider). This list is sourced from the records with external = true on the sys_generative_ai_model_config table.
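
If you want to verify which entries will appear in that provider API list on your instance, a quick background script over the same table can print them. This is a sketch that assumes only the table and the external = true flag mentioned above; getDisplayValue() is used so no further column names need to be assumed.

// Background script: list the model configurations offered as provider APIs in Skill Kit.
var model = new GlideRecord('sys_generative_ai_model_config');
model.addQuery('external', true);
model.query();
gs.info(model.getRowCount() + ' externally selectable model configuration(s):');
while (model.next()) {
    gs.info(model.getDisplayValue());
}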

[Screenshot: provider and provider API selection in Now Assist Skill Kit]

Comments
Prakash53
Tera Contributor

Hi, I am trying to create a Custom LLM, but I am getting this error from the skill:

{ "error": { "sourceErrorMessage": "", "sourceErrorResourceId": "", "errorMessage": "Method failed: (/together/v1/chat/completions) with code: 401 - Invalid username/password combo", "sourceErrorFeatureInvocationId": "" }, "status": "ERROR" }

 

Even though I am using the correct API key for this. Can someone help me here?

Eliza
ServiceNow Employee

@Prakash53,

 

I suggest testing your API key and endpoint using Postman (or even through a curl request in Terminal if using Mac/Linux). If it returns a 200/success code there, then you may have to redo your configuration within ServiceNow.
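
If the key and endpoint work from Postman/curl but the skill still fails, a quick background script can also confirm whether the instance itself can reach the provider. This is only a sketch: the endpoint, model name, and request body below are placeholders for whatever your provider's API actually expects.

// Sketch only: substitute your own endpoint, API key, and request body.
var req = new sn_ws.RESTMessageV2();
req.setEndpoint('https://example-provider.com/v1/chat/completions'); // placeholder endpoint
req.setHttpMethod('POST');
req.setRequestHeader('Authorization', 'Bearer <your API key>');
req.setRequestHeader('Content-Type', 'application/json');
req.setRequestBody(JSON.stringify({
    model: 'example-model',                          // placeholder model name
    messages: [{ role: 'user', content: 'ping' }]
}));
var resp = req.execute();
gs.info('Status: ' + resp.getStatusCode());
gs.info('Body: ' + resp.getBody());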

acraus
Tera Explorer

Hello @Eliza,

 

Thank you for creating these guides.

I am trying to connect an agent created on AWS Bedrock to a ServiceNow instance.
Is it possible to do this? If yes, can you give me some pointers please?
Or can only models be used?
Thank you.

Eliza
ServiceNow Employee

Hi @acraus,

 

You have a couple of options here (and, as a disclaimer, these are my personal musings, so I cannot report on the validity of the solution without actually seeing your instance) - you can set up an AWS Bedrock MCP server to reach out to ServiceNow or use ServiceNow's A2A (Agent to Agent) protocol. I believe we are releasing an A2A connection to Amazon in the near future, so stay tuned if you wish to utilise that protocol!

 

Depending on the use case, you can also potentially just point your agent to utilise an API that reaches into ServiceNow to fetch the information it may need.

Zack Sargent
ServiceNow Employee

@Eliza :

It seems like only one custom model at a time can be supported this way. Given that we are watching a storm of very capable open source models, this year from China (Deepseek, Qwen, Kimi, and last week, Longcat), I can see ServiceNow customers wanting to use multiple custom LLMs at a time. Especially given the "I'm the best this week!" pace of things. Longcat is apparently really good at "tool use" while Qwen3-Coder and Kimi K2 are on par with Claude for many coding tasks. I don't think we're going to keep up with the "spoke per model" pacing of the market in the coming year, so it would be nice to have a streamlined "New Model" guided setup or similar. Heck, we can scrape most of the pieces for the setup from Huggingface with just a model name, most of the time ... which sounds like future agent work. (ahem)

Chris Yang
Tera Sage

Hi, I followed the steps for BYOLLM but I'm getting an error response - has anyone else seen this?

FYI, I'm able to get a response from the GenAI Controller, and we are going through a MID server to reach the provider.

 

{
  "error": {
    "sourceErrorMessage": "",
    "sourceErrorResourceId": "",
    "errorMessage": "Persisting an unterminated plan is not supported for plan with id 8fd644809f00ba103ec75d55b5b517c9 and name: Custom LLM",
    "sourceErrorFeatureInvocationId": ""
  },
  "status": "ERROR"
}

 

bhakti_padte
Tera Explorer

Hello, I am also getting the same error as below:

{
  "error": {
    "sourceErrorMessage": "",
    "sourceErrorResourceId": "",
    "errorMessage": "Persisting an unterminated plan is not supported for plan with id bc1c6e78138cb6503a2720bb2d3514dd and name: Custom LLM",
    "sourceErrorFeatureInvocationId": ""
  },
  "status": "ERROR"
}

@Chris Yang Did you find a solution for this?

Chris Yang
Tera Sage

@bhakti_padte not yet, working with support on this issue

Chris Yang
Tera Sage

@bhakti_padte , after setting the system property "com.glide.oneapi.fdih.async.quick.mode" to "false", we were able to connect and resolve the error.

bhakti_padte
Tera Explorer

Thank you so much @Chris Yang ... it's working now 🙂

KB15
Giga Guru

I'm attempting to connect to Azure OpenAI using a separate external subscription but it seems to be using the OOB Azure connection no matter what I attempt to change. 

 

I have the Azure connection set with the connection endpoint and the key, have set the skill to Azure OpenAI, and even added a custom model entry. I know it's not using the external connection because, if I remove the credentials, it still returns a response.

 

Is there an additional setting that needs to be checked?

Eliza
ServiceNow Employee

Hi @KB15 ,

 

Is this for use in custom skills (created within Now Assist Skill Kit) or OOTB skills?

 

You can navigate to Settings -> Manage Integration within Now Assist Admin and set Azure OpenAI to use your connection, but this will mean that all skills using Azure OpenAI (both custom and OOTB) will use your connection.

 

[Screenshot: Manage Integration settings in Now Assist Admin]
