Comments

Hi, I am trying to create a Custom LLM, but I am getting this error from the skill:

{
  "error": {
    "sourceErrorMessage": "",
    "sourceErrorResourceId": "",
    "errorMessage": "Method failed: (/together/v1/chat/completions) with code: 401 - Invalid username/password combo",
    "sourceErrorFeatureInvocationId": ""
  },
  "status": "ERROR"
}

 

Even though I am using the correct API key for this. Can someone help me here?

@Prakash53,

 

I suggest testing your API key and endpoint using Postman (or even through a curl request in Terminal if using Mac/Linux). If it returns a 200/success code there, the key itself is fine and you may have to redo your configuration within ServiceNow.
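A quick way to test the key outside of ServiceNow is a minimal script like the sketch below. It sends a one-token chat completion request and returns the HTTP status code: 200 means the key works, 401 reproduces the error above. The endpoint URL and model name are assumptions here; substitute the base URL and a model your Together plan actually includes.

```python
import json
import urllib.error
import urllib.request


def check_together_key(api_key: str, base_url: str = "https://api.together.xyz") -> int:
    """Send a minimal chat completion request and return the HTTP status code.

    200 -> the key is valid; 401 -> reproduces the ServiceNow skill error.
    The model name below is an assumption; use one available on your plan.
    """
    body = json.dumps({
        "model": "meta-llama/Llama-3-8b-chat-hf",  # assumed model name
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }).encode()
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    try:
        with urllib.request.urlopen(req) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        # 4xx/5xx responses raise HTTPError; report the code instead
        return err.code
```

If this returns 200 but the skill still fails, the problem is in the ServiceNow-side configuration (credential record, connection alias, or skill setup) rather than the key.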

Hello @Eliza,

 

Thank you for creating these guides.

I am trying to connect an agent created on AWS Bedrock to a ServiceNow instance.
Is it possible to do this? If yes, can you give me some pointers, please?
Or can only models be used?
Thank you.

Hi @acraus,

 

You have a couple of options here (and, as a disclaimer, these are my personal musings, so I cannot vouch for the validity of a solution without actually seeing your instance): you can set up an AWS Bedrock MCP server to reach out to ServiceNow, or use ServiceNow's A2A (Agent to Agent) protocol. I believe we are releasing an A2A connection to Amazon in the near future, so stay tuned if you wish to utilise that protocol!

 

Depending on the use case, you can also potentially just point your agent to utilise an API that reaches into ServiceNow to fetch the information it may need.
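As a rough illustration of that last option, a Bedrock agent's tool wrapper could call the ServiceNow Table API directly. The sketch below fetches recent incident records with basic auth; the instance URL, credentials, and the choice of the `incident` table are all placeholders you would replace with your own, and a real integration would likely use OAuth rather than basic auth.

```python
import base64
import json
import urllib.request


def fetch_incidents(instance_url: str, user: str, password: str, limit: int = 5) -> list:
    """Fetch recent incidents via the ServiceNow Table API.

    A sketch of what an agent's tool wrapper might call; swap basic auth
    for OAuth and adjust the table/query for your actual use case.
    """
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    req = urllib.request.Request(
        f"{instance_url}/api/now/table/incident?sysparm_limit={limit}",
        headers={
            "Authorization": f"Basic {token}",
            "Accept": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        # Table API responses wrap the records in a "result" array
        return json.load(resp)["result"]
```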

@Eliza :

It seems like only one custom model at a time can be supported this way. Given that we are watching a storm of very capable open-source models from China this year (DeepSeek, Qwen, Kimi, and, last week, LongCat), I can see ServiceNow customers wanting to use multiple custom LLMs at a time, especially given the "I'm the best this week!" pace of things. LongCat is apparently really good at tool use, while Qwen3-Coder and Kimi K2 are on par with Claude for many coding tasks. I don't think we're going to keep up with the "spoke per model" pacing of the market in the coming year, so it would be nice to have a streamlined "New Model" guided setup or similar. Heck, we can scrape most of the pieces for the setup from Hugging Face with just a model name, most of the time ... which sounds like future agent work. (ahem)

Hi, I followed the steps for BYOLLM but I'm getting an error response. Has anyone else seen this?

FYI, I'm able to get a response from GenAI Controller, and we are going through a MID Server to reach the provider.

 

{
  "error": {
    "sourceErrorMessage": "",
    "sourceErrorResourceId": "",
    "errorMessage": "Persisting an unterminated plan is not supported for plan with id 8fd644809f00ba103ec75d55b5b517c9 and name: Custom LLM",
    "sourceErrorFeatureInvocationId": ""
  },
  "status": "ERROR"
}

 

Hello, I am also getting the same error, as below:

{
  "error": {
    "sourceErrorMessage": "",
    "sourceErrorResourceId": "",
    "errorMessage": "Persisting an unterminated plan is not supported for plan with id bc1c6e78138cb6503a2720bb2d3514dd and name: Custom LLM",
    "sourceErrorFeatureInvocationId": ""
  },
  "status": "ERROR"
}

@zayang, did you find the solution for this?

@bhakti_padte, not yet; working with support on this issue.

@bhakti_padte , after setting the system property "com.glide.oneapi.fdih.async.quick.mode" to "false", we were able to connect and resolve the error.

Thank you so much, @zayang ... It's working now 🙂

I'm attempting to connect to Azure OpenAI using a separate external subscription, but it seems to be using the OOB Azure connection no matter what I attempt to change.

 

I have the Azure connection set with the connection point and the key, have set the skill to Azure OpenAI, and even added a custom model entry. I know it's not using the external connection because, if I remove the credentials, it still returns a response.

 

Is there an additional setting that needs to be checked?

Hi @KB15 ,

 

Is this for use in custom skills (created within Now Assist Skill Kit) or OOTB skills?

 

You can navigate to Settings -> Manage Integration within Now Assist Admin and set Azure OpenAI to use your connection, but this will mean that all skills using Azure OpenAI (both custom and OOTB) will use your connection.

 

[Screenshot: Eliza_0-1765583263683.png — Manage Integration settings in Now Assist Admin]

 

Version history
Last update:
‎03-04-2026 11:59 AM