
In my previous article, GenAI on a ServiceNow PDI - Part 1, we discussed the DIY concept of hosting our own LLM inside our local network and building a REST integration to consume it. Now it is time to build the connective tissue between these two worlds. Enter the MID Server (where MID stands for Management, Instrumentation, and Discovery).
If you haven't heard of or used a MID Server before, it is a Java application running on a Windows or Linux server inside a local network, which allows communication between your ServiceNow Instance (in the cloud) and the devices inside your local network that would otherwise be out of reach. So why is it needed, you ask? Well, most local networks sit behind a Firewall that monitors, allows, or blocks traffic to and from the Internet. Network Administrators don't like to "punch holes" through their Firewall, so the ServiceNow MID Server has an ace up its sleeve: it simply "phones home" to check for work to complete and to send the results of that work back to the ServiceNow Instance. In other words, no inbound connections are made through the Firewall; only outbound connections are made from the local network to the Internet, and in this case to our ServiceNow Instance. Technically there is a bit more to it, so feel free to read up on the ServiceNow Documentation or watch this MID Server Overview video. In our solution, we simply need the MID Server to make a REST call to our Ollama API on behalf of the ServiceNow PDI (in the cloud). Let's take a look below.
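To make the end goal concrete, here is a minimal Python sketch of the kind of request the MID Server will relay to the Ollama API on behalf of the Instance. The host, port, and model name are assumptions based on Ollama's defaults (port 11434), not something dictated by this series; the actual ServiceNow-side scripting comes in Part 3.

```python
# A minimal sketch of the REST call the MID Server will relay to Ollama.
# Assumptions: Ollama is reachable on its default port 11434 from the MID Server,
# and a model named "llama3" (hypothetical) has already been pulled.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # adjust host to your setup

payload = {
    "model": "llama3",   # use whatever model you actually pulled
    "prompt": "Summarize what a ServiceNow MID Server does in one sentence.",
    "stream": False,     # ask for a single JSON response instead of a stream
}

request = urllib.request.Request(
    OLLAMA_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

with urllib.request.urlopen(request, timeout=120) as response:
    body = json.loads(response.read().decode("utf-8"))
    print(body.get("response", ""))
```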
So let's get started! I am running the Windows 10 Professional operating system; however, this is not one of the supported Windows operating systems! Luckily, I have the Windows Subsystem for Linux (WSL2) installed, which means technically I'm also running Linux, and in this case Ubuntu 22.04.4 LTS is supported for installing a Linux MID Server. On my PDI, I navigated to MID Server -> Downloads and copied the URL for my version.
After pulling that package down to the local drive in Linux, I followed the rest of the installation instructions. Once my MID Server was up and running, I completed the validation step, and everything was ready to roll.
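Before moving on, it is also worth confirming that the MID Server host can actually reach the Ollama API. Below is a small Python check to run on the MID Server host itself; it assumes Ollama is running on that same machine at its default port 11434 and simply lists the models it has available.

```python
# Quick reachability check from the MID Server host: can it see the Ollama API?
# Assumption: Ollama is running locally on its default port 11434.
import json
import urllib.request

OLLAMA_TAGS_URL = "http://localhost:11434/api/tags"  # lists locally available models

with urllib.request.urlopen(OLLAMA_TAGS_URL, timeout=10) as response:
    models = json.loads(response.read().decode("utf-8")).get("models", [])

for model in models:
    print(model.get("name"))
```

If this prints the model you pulled earlier, the network path the integration will use in Part 3 is already in place.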
OK, that wraps it up for Part 2. In the next article, we'll build our integration connector and make a REST call to the Ollama API.
Links to each article:
GenAI on a ServiceNow PDI - Part 1 - Overview
GenAI on a ServiceNow PDI - Part 2 - MID Server
GenAI on a ServiceNow PDI - Part 3 - Gen AI Integration