Jake Gillespie
Mega Guru

In my previous article, GenAI on a ServiceNow PDI - Part 2, we installed a Linux MID Server on Ubuntu running under WSL2 on my Windows 10 PC. Now it is time to build an integration to call the Ollama API!

 

To keep things simple, when we call the Ollama API we'll call the Generate a Completion endpoint. This endpoint takes just three inputs, and since we'll disable streaming, it will return the full response in one go. Depending on the performance of your hardware, this could feel like more of a delay than streaming each token of the response as it is generated.

 

Here is a sample request payload in JSON format. Note that we have to specify the name of the model we've installed, a boolean to control streaming, and of course, the prompt we wish to send.

{
  "model": "llama3.1",
  "prompt": "Why is the sky blue?",
  "stream": false
}

Below is a sample response in JSON format. While there are many elements in the response, the one we care about is "response".

{
  "model": "llama3.1",
  "created_at": "2023-08-04T19:22:45.499127Z",
  "response": "The sky is blue because it is the color of the sky.",
  "done": true,
  "context": [1, 2, 3],
  "total_duration": 5043500667,
  "load_duration": 5025959,
  "prompt_eval_count": 26,
  "prompt_eval_duration": 325953000,
  "eval_count": 290,
  "eval_duration": 4709213000
}

To build the outbound REST integration we can either use (the legacy) REST Message, or the shiny new (Flow Action) REST Step. Given my previous post on Understanding Connection & Credential Alias, I'm sure you know which one I'll use!

 

If you've worked with Flow Designer before, then you'll know that Flows and Subflows execute Actions and/or Flow Logic. Before we can build our Flow, we'll first need to build a custom Flow Action, which we'll call "DIY Gen AI: Summarise". We'll pass the text we wish to summarise as an input to the Flow Action, have it send the request to the Ollama API, and map the response as an output of the Flow Action. This should give us a lot of flexibility when building our Flow(s). See below for an example.

 

1. Define the Flow Action Input.

This is just a simple String data type, but it should be marked Mandatory to ensure a Flow always passes a value into it.

JakeGillespie_0-1723544411932.png

 

2. Define a Script Step to build the Request in JSON format.

Note that I hardcoded a statement before the Input Text to set the context for what I'm asking the LLM to do. You may wish to adjust this to your liking.

JakeGillespie_1-1723544483405.png
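In Flow Designer, a Script Step receives an inputs object and writes to an outputs object. My actual step is in the screenshot above, but the gist looks something like the sketch below (the variable names input_text and request_body, and the exact instruction wording, are illustrative; it is wrapped as a plain function here so it can be read outside the platform):

```javascript
// Sketch of the request-building Script Step. On the platform this body
// runs with "inputs" and "outputs" provided by the step definition.
function buildRequest(inputs, outputs) {
    // Hardcoded context statement prepended to the input text to tell the
    // LLM what we want it to do; tailor this to your liking.
    var instruction = 'Summarise the following text in a few sentences:\n\n';

    var payload = {
        model: 'llama3.1',                       // must match the model pulled into Ollama
        prompt: instruction + inputs.input_text, // the text passed in from the Flow
        stream: false                            // disable streaming: one complete response
    };

    outputs.request_body = JSON.stringify(payload);
}
```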

 

3. Define a REST Step with the HTTP Method, Header values, and of course, the connection details.

You can either define these details externally via Connection & Credential Alias, or you can define them inline. I opted for the former as it is easier to maintain, should my connection details change later on.

JakeGillespie_0-1723544591543.png

 

4. Don't forget to map the Request payload in JSON format from the previous Script Step.

JakeGillespie_1-1723544893135.png

 

5. Following the REST Step we'll then need to parse the Response payload in JSON format and map the LLM response as an output.

I opted to do this as another Script Step; however, you could use the JSON Parser Step instead. I prefer a Script Step as it provides an opportunity to add some error handling, as well as optional logging.

JakeGillespie_2-1723545001268.png
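The parsing step follows the same shape. Again, the names below (response_body, summary, error_message) are illustrative rather than the exact ones in my screenshot, and it is sketched as a plain function; the point is to parse the REST Step's response body, guard against a malformed reply, and surface only the "response" element:

```javascript
// Sketch of the response-parsing Script Step with basic error handling.
function parseResponse(inputs, outputs) {
    try {
        var body = JSON.parse(inputs.response_body);
        if (!body || typeof body.response !== 'string') {
            throw new Error('No "response" element in the Ollama reply');
        }
        outputs.summary = body.response.trim();
        outputs.error_message = '';
    } catch (e) {
        // On the platform you could also log this, e.g. via gs.error().
        outputs.summary = '';
        outputs.error_message = e.message;
    }
}
```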

 

6. Lastly, we need to map the parsed output as a Flow Action Output.

This is the value that will be passed back to the Flow that called our Flow Action.

JakeGillespie_3-1723545178554.png

 

For the use case, I opted to borrow an idea I saw about a year ago on an episode of Live Coding Happy Hour. In that episode, the team created a Flow with an Incident record trigger which summarised the Incident Description back into the Work notes. Let's try to do the same thing, but using our local LLM!

 

1. Define a Flow with a Record Trigger.

I opted for a simple condition where Description is not empty AND Description changes. Note that this Flow is just for demo purposes and would require a better condition in the real world!

 

2. Call the new Flow Action we created.

Map the field you wish to be summarised. In my case it was Description from the Incident Trigger record.

JakeGillespie_4-1723545565489.png

3. Update the Incident Work notes with the summary returned from the Flow Action.

I opted to prefix the value with a statement telling me it is a summary. As earlier, you might want to tailor this to your liking.

JakeGillespie_5-1723545683038.png

 

Now to test our Flow! First, you'll need to find an Incident with a suitable Description value. Then open the Test Flow feature and select your Incident number.

JakeGillespie_6-1723545990227.png

 

After running the test, your Incident should look something like the one below.

JakeGillespie_7-1723546108757.png

 

Ok, well that wraps it up for part 3. If you've been following along at home, you now have the building blocks of your own DIY Generative AI solution on a ServiceNow PDI. I hope you found this multi-part article interesting! Please let me know in the comments if you'd like to see more on this topic.

 

Links to each article:

GenAI on a ServiceNow PDI - Part 1 - Overview

GenAI on a ServiceNow PDI - Part 2 - MID Server

GenAI on a ServiceNow PDI - Part 3 - Gen AI Integration