Martin Rudack

header.png

 

This article is based on Zero Copy Connectors version 1.0.2, Workflow Data Fabric Hub version 1.0.2, and Yokohama Patch 5.

 

Whether you want to fuel your AI agents or save your employees from swivel-chairing between multiple systems, Workflow Data Fabric is your friend. It offers several powerful capabilities for integrating data from across your enterprise, no matter where it is stored, so that everything is available on the ServiceNow AI Platform. One of the newest capabilities is Zero Copy Connectors, which let you configure real-time access to enterprise data without actually storing it on your ServiceNow instance.

In the following article, I’ll take you with me on my journey to set up a MariaDB Zero Copy Connector. Together we will explore what it takes to get the connector up and running, which challenges I ran into so you can hopefully avoid them, and what you should keep in mind when using this data in scripts.

 

If you’re curious about how to set up a Zero Copy connector, this article is for you. Let’s dive in!

 

But before we dive deep into the configuration, I want to mention that there are two different kinds of connectors.

Primary connectors are developed by ServiceNow. A few of them are available for enterprise platforms like Snowflake, Databricks, or BigQuery. In addition to primary connectors, there are also community connectors. These are not developed by ServiceNow itself; they are built by partners or the community. The MariaDB connector this article is about is a community connector.

 

If you haven’t worked with Zero Copy Connectors yet, you might be asking yourself: “What is this new type of connector? How does it work, and how do I set it up?” Or “What does Zero Copy even mean?”

These are all great questions and exactly the kind of things I was wondering myself when I first started exploring this feature.

 

Zero Copy Connectors are a powerful new type of integration that provides real-time access to your enterprise data. Using the Workflow Data Fabric Hub, you establish a connection between your ServiceNow instance and an external data source, such as MariaDB or another enterprise system. Once the connection is established, you can map the schema from that external system to what’s called a Data Fabric Table on the ServiceNow side. A Data Fabric Table is a virtual table: it does not store any data but can be used to access it. The data is never persisted in the ServiceNow database; it lives only in memory, hence the name Zero Copy.

Every time a user queries the Data Fabric Table, the data is fetched on the fly from the external system. This is handled by Trino, a federated query engine running in the data center where your instance is located. Trino translates the query against the Data Fabric Table into a SQL query, which is executed against the external system. The result of that SQL query is sent back to the ServiceNow instance and lives there only as long as the user is working with the data. Every query against the Data Fabric Table triggers a fresh fetch, and data that is no longer used is automatically deleted.
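As a rough sketch of what this means in practice: a server-side query against a Data Fabric Table looks like any other GlideRecord query, while Trino does the translation behind the scenes. The table name u_orders_df and the column names below are hypothetical placeholders, not names from the product:

```javascript
// Hypothetical Data Fabric Table name and columns -- adjust to whatever
// you mapped on your own instance.
var gr = new GlideRecord('u_orders_df');
gr.addQuery('customer_id', '42');
gr.query(); // Trino fetches the matching rows live from the external system

while (gr.next()) {
    // Conceptually, Trino has turned the query above into something like:
    //   SELECT ... FROM orders WHERE customer_id = '42'
    // executed against the external database.
    gs.info(gr.getValue('order_id'));
}
```

Nothing from this result set is written to the instance database; once the script is done with the records, the fetched data is discarded.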

 

Hint: Zero Copy Connectors can only be used on sub-prod or prod instances. Even though it is possible to install the plugins on PDIs or demo instances, you cannot establish a connection to an external system there. When you try to connect to a data source, you will always get the error “There was an issue processing your request. Try again later, or contact support if the issue persists. Additional Details: failed to reserve data fabric engine”.

There is also a KB article explaining this.  https://support.servicenow.com/kb?id=kb_article_view&sysparm_article=KB2291123

 

Once you are on a sub-prod or prod instance, it is easy to establish a connection to a MariaDB data source. There are only a few pieces of information you need to provide (the required information differs from system to system):

  • Username
  • Password
  • JDBC connection string
  • Connection label
  • Connection name
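For MariaDB, the JDBC connection string typically follows the pattern below. Host, port, and database name are placeholders; replace them with your own values (3306 is the default MariaDB port):

```
jdbc:mariadb://db.example.com:3306/orders
```

Also make sure the database user you provide has read access to the schemas you want to map.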

Connection unkenntlich.png

 

If everything is entered correctly, you can click “Connect” to test the connection. If the connection succeeds, you will be redirected to the “Data assets” tab.

 

connection successful.png

 

The Data assets tab is where you can see the available databases and tables of the external system you are connected to. You should also be able to see the columns of the selected table; these are needed because you have to map the columns from the external table to the data fabric table in ServiceNow. But as you can see in the picture below, I struggled to get the columns to show up at all.

 

empty columns.png

 

After checking everything for the 100th time and searching the docs for a while, I found the following limitation for data fabric tables:

  • “When mapping data, ensure that the data source does not use case-sensitive table names. For example, if both MY_TABLE and my_table exist in the database, the schema will not load because the system cannot differentiate between the tables.”

https://www.servicenow.com/docs/bundle/yokohama-integrate-applications/page/administer/workflow-data...
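The quoted limitation can be illustrated with a small piece of plain JavaScript (this is not ServiceNow code, just a sketch of what any schema loader that normalizes identifiers to lowercase runs into; the function name is made up):

```javascript
// Sketch: a schema loader that indexes tables by lowercased name.
// If two tables differ only by case, they collapse onto the same key
// and the loader can no longer tell them apart.
function buildSchemaIndex(tableNames) {
  const index = new Map();
  for (const name of tableNames) {
    const key = name.toLowerCase();
    if (index.has(key)) {
      throw new Error("Ambiguous table name: " + key);
    }
    index.set(key, name);
  }
  return index;
}

buildSchemaIndex(["orders", "customers"]);      // fine
// buildSchemaIndex(["MY_TABLE", "my_table"]);  // throws: ambiguous name
```

Whether the actual schema loader works exactly like this is an assumption on my part, but it matches the behavior described in the docs.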

 

Even though this does not exactly describe my use case, it gave me the idea of using only lowercase letters for everything: database names, table names, and column names. In the screenshot you can see the database “order” with a lowercase “o”, but this database had originally been created with a capital “O”.

 

After this change I immediately saw the column names, so I was able to create a new data fabric table and map the source columns to it.

 

order mit columns.png

 

The hardest thing when testing new features and setting up a demo environment is coming up with a good use case. So I pretended that my MariaDB tracks orders from an external shop system, and that this data needs to be available in ServiceNow because customers can raise cases there if they have questions or issues regarding their orders. I created a new database for this. (Yes, I know nobody would name it like this, but the name resulted in an error that showed me we need to handle data fabric tables in scripts a little differently than normal tables. You will see it in a bit.)

 

The structure of my demo database:

Database name: order

Table name: orders

Columns: date, comment, customer_id, order_id

 

The next step is to click “Create data fabric table” and map the columns from the source to the new data fabric table on the ServiceNow side. This takes only two steps.

 

First: add a table label and table name.

s1.png

 

Second: select which columns from the source should be mapped to the data fabric table.

 

s2.png

That’s all. Our data fabric table is created, and we should now be able to open it and see all the orders from the external shop.

 

But when opening the table, we see the following error message at the top of the table.

 

order query error.png

 

I guess it’s not a good idea to name a database “order” (it is a reserved word in SQL, which is presumably why the generated query fails), but I would still have expected it to work. This made me wonder: when writing a script that uses GlideRecord to query a data fabric table, how do we differentiate between an error like this and a query that runs fine but simply returns an empty result?

When working with data fabric tables, we are not querying the local database; instead, we are fetching the data live from a different system. That introduces several new potential sources of error we did not have to worry about before:

  • The connection could break
  • The password of the user could be changed in the external system
  • The user could be deleted/deactivated in the external system
  • The generated SQL-query could be faulty
  • and so on…

 

Consulting the GlideRecord documentation shows that the query() method does not return a value, and next() only returns true if the move to the next record was successful and false if there are no more records.

The methods we commonly use in scripts don’t tell us whether an error occurred. But there is another method that solves our problem.

getLastErrorMessage(): Retrieves the last error message. If there is no last error message, null is returned.

https://developer.servicenow.com/dev.do#!/reference/api/yokohama/server/no-namespace/c_GlideRecordSc...

 

When working with data fabric tables in a script, we therefore need to call getLastErrorMessage() after executing the query to check whether the live fetch actually succeeded.
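Putting this together, a defensive pattern could look like the following sketch. The table name u_orders_df is a placeholder for whatever you named your data fabric table:

```javascript
// Placeholder table name -- replace with your own data fabric table.
var gr = new GlideRecord('u_orders_df');
gr.query();

// For data fabric tables, an empty result set and a failed live fetch
// look the same to next(), so check for an error explicitly first.
var error = gr.getLastErrorMessage();
if (error !== null) {
    // Connection broken, credentials changed, faulty generated SQL, ...
    gs.error('Data fabric query failed: ' + error);
} else if (!gr.hasNext()) {
    gs.info('Query succeeded, but the external system returned no rows.');
} else {
    while (gr.next()) {
        gs.info(gr.getValue('order_id'));
    }
}
```

This way a broken connection surfaces as a logged error instead of silently looking like “no orders found”.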

Oh, and after renaming the database from “order” to something else, I was able to see my orders in ServiceNow.

 

There is still a lot more to discover and test with Zero Copy Connectors, but that’s for another post.