quentincarton
ServiceNow Employee

Quentin Carton – Sr Advisory Solution Consultant – ServiceNow, Intelligent Automation

 Xu Jiang – Principal Product Manager – Microsoft, Real-Time Intelligence


April 2025

In This Post

  • Why Integrate Real-Time Intelligence in Microsoft Fabric and ServiceNow?
  • Key Technologies and Concepts
  • The Integration Flow
  • Automating Remediation with ServiceNow AI Agents
  • Sending Resolution Data Back to Microsoft
  • Conclusion

Why Integrate Real-Time Intelligence in Microsoft Fabric and ServiceNow?

Organizations today demand real-time observability and rapid response to critical events across their infrastructure. Real-Time Intelligence in Microsoft Fabric and ServiceNow Stream Connect provide a modern, Kafka-native foundation to enable this.

By connecting the Real-Time Intelligence platform with ServiceNow, businesses can:

  • Streamline IT operations with real-time monitoring
  • Automate incident remediation with intelligent workflows on the ServiceNow platform
  • Close the loop with two-way integration
  • Enable AI-driven decision-making

In this guide, we will walk you through our proof of concept, where Real-Time Intelligence routes monitoring events to ServiceNow. ServiceNow then takes intelligent action and sends results back to Real-Time Intelligence: a complete real-time feedback loop between ServiceNow and Microsoft for event-driven scenarios.

Key Technologies and Concepts

Fabric and Real-Time Intelligence Eventstream

 

Microsoft Fabric brings together a broad spectrum of data analytics capabilities into a unified, end-to-end platform for data storage, processing, and analysis. Within this ecosystem, the eventstreams feature plays a critical role in enabling real-time analytics, offering a scalable, Kafka-compatible interface for seamless ingestion and processing of streaming data.

 

Real-Time Intelligence extends these capabilities by empowering organizations to derive actionable insights and visualize data as it moves. It delivers a comprehensive solution for event-driven applications, continuous streaming, and log-based scenarios. The eventstreams feature, represented by Eventstream items within Real-Time Intelligence, enables the ingestion, transformation, and distribution of real-time data from diverse sources to multiple destinations. Designed to handle high-throughput, low-latency workloads, Eventstream provides advanced stream processing capabilities and integrates deeply with other Microsoft Fabric services, enabling organizations to build sophisticated analytics workflows that seamlessly combine real-time and historical data.

 

ServiceNow Stream Connect for Apache Kafka

Stream Connect is the ServiceNow solution that gives customers native access to Kafka endpoints: it enables ServiceNow to consume and produce Kafka events directly, with no middleware required. You can define consumers and producers that map events to actions and responses in the Now Platform. Unlike traditional APIs that rely on frequent polling or complex webhook setups, Kafka lets ServiceNow consume events in real time with low latency and high scalability. Because both Microsoft Eventstream and ServiceNow support the native Kafka protocol, the integration is seamless, efficient, and well suited to high-throughput, event-driven architectures.

 

Kafka-Native Integration

The beauty of this integration lies in the simplicity of Kafka compatibility on both ends: Eventstream and ServiceNow. This allows for:

  • Real-time, scalable data exchange
  • Minimal setup overhead
  • Flexible architecture across clouds
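To make this Kafka compatibility concrete, here is a minimal sketch (in Python, using the confluent-kafka client) of what a client configuration against an Eventstream custom endpoint looks like. The SASL settings shown (username set to the literal string ‘$ConnectionString’, password set to the endpoint’s connection string) follow the SAS key authentication pattern used later in this post; the group id and topic name are placeholders, not values from the POC.

```python
import json

def eventstream_kafka_config(bootstrap_server: str, connection_string: str) -> dict:
    """Kafka client settings for an Eventstream custom endpoint using
    SAS key authentication: the SASL username is the literal string
    "$ConnectionString" and the password is the endpoint's
    "Connection string - primary key"."""
    return {
        "bootstrap.servers": bootstrap_server,   # e.g. "xxx.servicebus.windows.net:9093"
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "$ConnectionString",
        "sasl.password": connection_string,
        "group.id": "servicenow-poc",            # placeholder consumer group
        "auto.offset.reset": "earliest",
    }

def consume_one(bootstrap_server: str, connection_string: str, topic: str) -> None:
    """Poll a single message from the endpoint (requires `pip install confluent-kafka`)."""
    from confluent_kafka import Consumer  # imported here so the sketch loads without the dependency
    consumer = Consumer(eventstream_kafka_config(bootstrap_server, connection_string))
    consumer.subscribe([topic])
    msg = consumer.poll(10.0)
    if msg is not None and msg.error() is None:
        print(json.dumps(json.loads(msg.value()), indent=2))
    consumer.close()
```

In the POC itself no such client code is needed on the ServiceNow side, since Stream Connect speaks Kafka natively; the snippet simply shows what the shared protocol surface looks like.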

The Integration Flow

 

Step 1: Event Ingestion with Real-Time Intelligence

 

From the Real-Time Hub in Real-Time Intelligence, streaming connectors can be used to bring events (e.g. monitoring, telemetry, device status) into an eventstream. Because Eventstream natively supports the Kafka protocol through its destination custom endpoints, no custom adapters are required: ServiceNow can consume events directly from the Eventstream Kafka endpoint.

 

Step 2: Stream Connect Consumer in ServiceNow

Using Stream Connect, we configured the Message Replication feature, which supports topic replication. For the POC we created a destination custom endpoint (a Kafka endpoint with its topic) in Eventstream, then configured Stream Connect Message Replication so that any new message produced to that topic is replicated to a specific topic in ServiceNow.

 

Step 3: Intelligent Remediation

 

When an event is received, ServiceNow can:

  • Evaluate the context using CMDB data, Knowledge Base and/or AI models
  • Trigger flows using Flow Designer to orchestrate the remediation
  • Involve AI Agents for autonomous action or human collaboration
  • Create or update incidents, problems, or change requests
  • Execute scripts or call external systems (such as notifying people via the Microsoft Teams integration)

 

For example, a VM outage alert can result in:

  • Automatic ticket creation
  • AI Agent triggering a remediation runbook
  • Notification sent to stakeholders
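As an illustration of the first two outcomes, a hypothetical mapping from an incoming monitoring event to incident field values might look like the sketch below. The field names mirror the Incident table, but the payload keys and the severity-to-priority mapping are assumptions made for this example; in the POC this logic lives in Flow Designer, not custom code.

```python
# Hypothetical mapping from a monitoring event to ServiceNow incident
# field values. Payload keys and the severity-to-priority table are
# assumptions for this sketch, not part of the POC configuration.
SEVERITY_TO_PRIORITY = {"critical": 1, "major": 2, "minor": 3, "warning": 4}

def event_to_incident(event: dict) -> dict:
    severity = str(event.get("severity", "warning")).lower()
    resource = event.get("resource", "unknown")
    return {
        "short_description": f"{resource}: {event.get('message', 'event received')}",
        "priority": SEVERITY_TO_PRIORITY.get(severity, 4),  # default to lowest priority
        "cmdb_ci": resource,                                # link to the affected CI in the CMDB
        "category": "infrastructure",
    }
```

For the VM outage above, an event such as `{"resource": "vm-prod-01", "severity": "critical", "message": "VM unreachable"}` would yield a priority-1 incident bound to the `vm-prod-01` configuration item.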

 

Step 4: Sending Events Data Back to Real-Time Intelligence

 

Once remediation is complete, ServiceNow sends a message (via Kafka Producer) back to Eventstream, confirming the resolution and status. From there, you can perform end-to-end analysis, reporting, and trigger actions using Real-Time Intelligence.
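The shape of that confirmation message is up to you. As a minimal sketch, the JSON payload ServiceNow might produce back to Eventstream could look like this; the field names are illustrative, not a ServiceNow schema.

```python
import json
from datetime import datetime, timezone

def resolution_message(incident_number: str, state: str, notes: str) -> bytes:
    """Serialize a resolution event for the return-path Kafka topic.
    Field names are illustrative assumptions for this sketch."""
    payload = {
        "incident": incident_number,   # e.g. "INC0010001"
        "state": state,                # e.g. "Resolved"
        "resolution_notes": notes,
        "resolved_at": datetime.now(timezone.utc).isoformat(),
        "source": "servicenow",
    }
    return json.dumps(payload).encode("utf-8")
```

Keeping the payload flat JSON like this makes it straightforward to index and query once it lands in an eventhouse.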

 

This integration enables organizations to monitor incident lifecycles in real time, identify recurring patterns, and continuously optimize operational processes. By leveraging Real-Time Intelligence’s processing and visualization capabilities, teams can gain deeper insights into issue trends, response times, and remediation effectiveness — all within a unified analytics environment.

 

[Screenshot: rti.png]

 

 

Integration Process

 

Prerequisites:

 

Microsoft:

  • A Power BI or Microsoft Fabric account

 

ServiceNow:

  • Stream Connect for Apache Kafka plugin enabled (refer to official documentation)

 

 

Step 1: Set up an Eventstream in Real-Time Intelligence

 

Eventstream can capture real-time events or messages from various sources. In this step, you will set up an eventstream with a real-time event source added, so that events or messages can flow to ServiceNow via the eventstream’s destination custom endpoint.

  1. Log in to the Microsoft Fabric portal (https://app.fabric.microsoft.com) and select ‘Real-Time Intelligence’ from the homepage.
  2. Create a new Fabric workspace if you do not have one: Create a workspace
  3. Create an eventstream (named ‘ES-RTI-to-ServiceNow’) by following the guide: Create an eventstream in Microsoft Fabric
  4. Add a source to your eventstream by selecting one of the three source categories:
    1. ‘Connect data sources’: provides various sources that an eventstream can connect to and pull data from. Add and manage eventstream sources
    2. ‘Use custom endpoint’: creates the eventstream broker endpoint that can receive data from your application. Add a custom endpoint to an eventstream
    3. ‘Use sample data’: if you do not have any existing sources available, you can use the pre-installed sample datasets to get started quickly. Add a sample data source to an eventstream


 

For this POC, the MQTT broker source is used: device data from the data center is collected in this MQTT broker, and Eventstream then pulls the data into Real-Time Intelligence. If you have an MQTT broker, you can follow this guide to add the source: Add MQTT source to an eventstream

 


 

  5. Prepare the Kafka endpoint in this eventstream for ServiceNow to pull data from, by adding the ‘Custom endpoint’ destination: Add a custom endpoint destination to an eventstream
    1. Click ‘Edit’ in the ribbon or the placeholder card on the right side of the canvas


 

    2. Select ‘Custom endpoint’ in the Destinations section. Give it a name and then publish the eventstream.


 

 

    3. Select the destination and, in the Details pane, select ‘Kafka’, then ‘SAS Key Authentication’ in its bottom pane.


 

    4. Note the ‘Bootstrap server’, ‘Topic name’ and ‘Connection string – primary key’ values for configuring the Kafka topic later in ServiceNow.
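Before configuring ServiceNow, you can sanity-check the endpoint details you just noted with a small producer. This sketch assumes the confluent-kafka Python client and SAS key authentication; the device payload simply mimics an MQTT-style reading and is entirely made up for the test.

```python
import json

def make_test_event(device_id: str) -> bytes:
    """An illustrative test payload mimicking an MQTT-style device reading."""
    return json.dumps({"deviceId": device_id, "temperature": 71.5, "status": "ok"}).encode("utf-8")

def send_test_event(bootstrap_server: str, connection_string: str, topic: str) -> None:
    """Produce one test message to the Eventstream custom endpoint
    (requires `pip install confluent-kafka`)."""
    from confluent_kafka import Producer  # imported here so the sketch loads without the dependency
    producer = Producer({
        "bootstrap.servers": bootstrap_server,
        "security.protocol": "SASL_SSL",
        "sasl.mechanism": "PLAIN",
        "sasl.username": "$ConnectionString",  # literal username for SAS key authentication
        "sasl.password": connection_string,    # the noted "Connection string - primary key"
    })
    producer.produce(topic, value=make_test_event("test-device-01"))
    producer.flush(10)
```

If the message shows up in the eventstream’s data preview, the endpoint and credentials are good to reuse in ServiceNow.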

 

Step 2: Create a Kafka Topic in ServiceNow

 

That topic will be used to receive messages coming from your eventstream in Real-Time Intelligence. The name of the topic does not matter; we used ‘Microsoft’ as the topic name to identify its purpose.

 


 

 

Step 3: Create the Connection & Alias Record

 

You need to create a connection and alias record of type Kafka in ServiceNow to securely store your eventstream’s Kafka endpoint details and credentials, allowing Stream Connect to authenticate and establish a reusable link for message replication. Notice we use a connection type ‘Kafka’:

 


 

 

Associated with that Connection record, we created the ‘Credential’ record with the information provided by your eventstream, which allows ServiceNow to consume messages from Eventstream via Stream Connect Message Replication.

 


 

 

Step 4: Configure the Message Replication in ServiceNow.

 

ServiceNow natively provides a Message Replication feature that allows Kafka topic replication between a remote Kafka cluster and the ServiceNow cluster (bi-directionally). More detail can be found in the official documentation. In this scenario, we want to replicate events between ServiceNow and Eventstream in both directions.

We created a configuration record in Stream Connect Message Replication, using the connection information created in the previous step.

 


 

Then we defined which topic from the Fabric eventstream should be replicated to which topic in ServiceNow. Notice that we can also choose the replication direction.


 

 

 

 

 

 

At this point the replication was running, and we used the ‘Topic Inspector’ feature on the ServiceNow instance to validate that messages were arriving in real time from the Fabric eventstream.

 

 

 


 

 

Step 5: Create the workflow trigger and logic in ServiceNow

 

Now that the “plumbing” was in place — with real-time events streaming from Eventstream into ServiceNow — we could focus on acting based on those events.

In the example below, we show how a ServiceNow Workflow is configured to trigger whenever new messages are produced on the Kafka topic.

 

 

[Screenshot: flow trigger.jpg]

 

[Screenshot: flow.jpg]

 

 

 

From there, we defined the necessary business logic. ServiceNow can now consume Kafka messages coming from Real-Time Intelligence and respond intelligently, leveraging both the event payload and critical contextual information already available in the ServiceNow platform. In this example, we’re creating an Incident record in ServiceNow based on the incoming event, while also sending a Microsoft Teams notification with details about the incident — including the assigned agent, SLA, and other key information.

 

Similar workflows were defined to trigger and orchestrate specific remediation actions.

 

Step 6: Sending Event Data Back to Real-Time Intelligence from ServiceNow

 

The Incident records from ServiceNow can be sent back to Real-Time Intelligence via Eventstream’s Kafka endpoint for further analysis or reporting. To do this, another eventstream with a source custom endpoint needs to be prepared to receive the messages sent from ServiceNow.

 

  1. Follow the same process as in Step 1 to create a new eventstream (named ‘ES-ServiceNow-to-RTI’), then select ‘Use custom endpoint’ on the homepage to add the source custom endpoint. Add a custom endpoint to an eventstream

    [Screenshot: design a flow to ingest.jpg]

     

  2. Once it is added and published, select this source and select ‘Kafka’, then ‘SAS Key Authentication’ in its bottom pane.

[Screenshot: ES-Servicenow.jpg]

 

 

  3. Also note the ‘Bootstrap server’, ‘Topic name’ and ‘Connection string – primary key’ for later use.

 


 

Now that we have created an Eventstream Kafka endpoint to receive data from ServiceNow, we need to configure the replication to it from ServiceNow.

 

 

Create a Kafka Topic in ServiceNow

 

That topic will be used to send messages (Incident record data) from the ServiceNow instance, and we will configure Message Replication so they get replicated to the Eventstream endpoint we created earlier. The name of the topic does not matter; we used ‘ToMicrosoft’ as the topic name so we can identify its purpose.

 

  1. Create the Topic in ServiceNow

 

From the filter navigator, open the Stream Connect home page, then go to the Topic section to create a new topic. In this example we created a topic called ‘ToMicrosoft’.

 

 

[Screenshot: createnew.jpg]

 

 

 

Create the Connection & Alias Record

 

You need to create a connection and alias record of type Kafka in ServiceNow to securely store your eventstream’s Kafka endpoint details and credentials in Real-Time Intelligence, allowing Stream Connect to authenticate and establish a reusable link for message replication. Notice we use a connection type ‘Kafka’.

In the filter navigator, we went to Connection & Alias and created a new record as shown.

 

[Screenshot: connect.jpg]

 

Then we went to the ‘Connection’ section of that record and clicked New to add the Eventstream endpoint information.

 

[Screenshot: connectrti.jpg]

 

 

This is where you enter the Bootstrap server you noted when creating the Eventstream endpoint that will receive data from ServiceNow.

 

[Screenshot: kafkaconnect.jpg]

 

 

Associated with that Connection record, we created the ‘Credential’ record with the information provided by your eventstream (this allows ServiceNow to produce messages to Eventstream via Stream Connect Message Replication):

 

[Screenshot: cert.jpg]

 

 

Configure the Message Replication for the Return Path

 

In this step we want to replicate data from ServiceNow to Eventstream, so we created a configuration record in Stream Connect Message Replication, using the connection information we created in the previous step.

 

[Screenshot: mess.jpg]

 

Then we defined which topic in ServiceNow should be replicated to which topic in the Fabric eventstream. Notice that we can also choose the replication direction.

 

 

[Screenshot: repdirection.jpg]

 

 

 

 

Now that we have configured the replication from ServiceNow to Eventstream, it is easy to send information back to Real-Time Intelligence upon remediation of an alert.

 

In ServiceNow, we use the no-code Flow Designer to build this logic. With Stream Connect enabled, the Kafka Producer action is available out of the box.

We use this no-code Kafka Producer step to specify the target topic, define the message format, and select the information to include in the message.

 

 

[Screenshot: step.jpg]

 

 

In the screenshot below, you can see how we send resolution information back to Real-Time Intelligence. We defined a workflow that is triggered by an update (in ServiceNow) to the Incident record’s status. The workflow extracts the relevant information from the incident and uses it to produce a Kafka message that is sent to Real-Time Intelligence.

 

[Screenshot: res.jpg]

 

 

Once the connection is created and a message is produced in ServiceNow, you should be able to check the message in Eventstream by previewing the middle node:

[Screenshot: sche.jpg]

 

Processing, Analyzing and Visualizing ServiceNow Messages with Real-Time Intelligence

 

As ServiceNow messages stream into Microsoft Fabric, an eventhouse efficiently manages this diverse, time-sensitive data. Purpose-built for handling structured, semi-structured, and unstructured data, eventhouse is well-suited for the wide range of message formats typically generated by ServiceNow. Incoming data is automatically indexed and partitioned by ingestion time, enabling fast, efficient querying and analysis. Within an eventhouse, you can create KQL databases to store and query your ServiceNow data. These databases offer a flexible environment for data exploration and management.

 

Additionally, you can export KQL queries as visuals to a Real-Time Dashboard within Real-Time Intelligence, where you can further refine the queries and customize their formatting. This integrated dashboard experience enhances data exploration, improves query performance, and makes it easier to visualize streaming insights in real time.
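Inside the eventhouse, metrics like response time would normally be computed in KQL. As a language-neutral illustration of the kind of analysis this enables, here is the equivalent calculation in Python, assuming each resolution event carries `opened_at` and `resolved_at` ISO 8601 timestamps (field names are illustrative, matching no particular schema):

```python
from datetime import datetime
from statistics import mean

def mean_time_to_resolve(events: list) -> float:
    """Average minutes between opened_at and resolved_at across events.
    Field names are illustrative assumptions for this sketch."""
    minutes = [
        (datetime.fromisoformat(e["resolved_at"])
         - datetime.fromisoformat(e["opened_at"])).total_seconds() / 60
        for e in events
    ]
    return mean(minutes)
```

Running this over the resolution messages streamed back from ServiceNow gives a simple remediation-effectiveness number that a Real-Time Dashboard tile could track continuously.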

Conclusion

 

This integration demonstrates how two enterprise-grade platforms — Real-Time Intelligence in Microsoft Fabric and ServiceNow — can work together through a shared Kafka-native protocol to power real-time operations.

We were able to:

  • Seamlessly stream events from Microsoft Fabric into ServiceNow
  • Trigger intelligent remediation with automation and AI
  • Send closure and resolution status back to Microsoft Fabric
  • Create real-time actions and real-time dashboards
  • Build a resilient, closed-loop observability and automation framework

 

As more organizations adopt event-driven architectures, this pattern of integrating Kafka-compatible platforms will become foundational for digital operations.

 

            

Quentin Carton is a Senior Advisory Solution Consultant at ServiceNow, where he helps enterprises harness the power of real-time data with Workflow Data Fabric and Stream Connect for Apache Kafka. With a strong focus on event-driven automation and intelligent workflows, Quentin works closely with customers to integrate ServiceNow into their broader data ecosystems, enabling high-velocity decision-making and autonomous operations.

 

 

Xu Jiang is a Principal Product Manager at Microsoft with decades of experience in the software industry. Xu currently works on product management for Microsoft Fabric eventstreams and messaging connectors, empowering customers to capture, transform, and route real-time data efficiently within Microsoft Fabric.

 

 

 

 

 

 

 

 

 

 

Version history: last update 05-05-2025 08:39 AM