Intro
Telemetry pours out of cloud platforms. Inventory updates, order events, and configuration changes flow between business systems. Modern enterprises run on streaming data — and they want ServiceNow in the middle of it, reacting to events as they happen instead of waiting for the next batch sync.
That's the job Stream Connect was built for: making ServiceNow a live participant in the customer's streaming data backbone, able to consume or produce Kafka messages in real time.
This article focuses on the Direct Kafka path: what it is, how it's built, when to choose it over Hermes, and the pitfalls to sidestep on the way in. Whether you're new to Stream Connect or already running Hermes and weighing your options, you'll have a clear picture by the end.
Executive Summary
Stream Connect for Direct Kafka is a direct streaming integration that connects ServiceNow with the customer's Kafka environment — whether local, Confluent Cloud, Azure Event Hub, or Amazon MSK — without any intermediary infrastructure. Data flows straight between ServiceNow and the customer's Kafka cluster in a single-cluster architecture.
- Single-cluster architecture: Only the customer's Kafka cluster is involved — no intermediate Hermes (ServiceNow's Kafka) cluster.
- Direct connection: ServiceNow communicates with the customer's Kafka brokers over a direct network path, acting as the Producer or Consumer, reducing operational overhead.
- Same consumers and producers: All existing Stream Connect consumer types (Kafka Flow Trigger, Script, RTE, Transform Map) and producer types (Producer Step, Producer Script API) work identically.
- Built for on-prem, works in the cloud: Designed for on-premises ServiceNow instances but equally usable with cloud-hosted instances connecting to cloud-managed Kafka services.
1. Understanding Stream Connect
Stream Connect connects ServiceNow to streaming data: infrastructure events, security alerts, telemetry from cloud platforms, changes from business systems. It makes ServiceNow a live participant in the customer's streaming data backbone.
There are two ways to implement it, depending on the use case or the needs of the customer.
Stream Connect with Hermes Path
In this path, all traffic is routed through the Hermes Cluster — ServiceNow's own managed Apache Kafka infrastructure. Every message is replicated between the customer's streaming platform (Kafka supported) and Hermes using Message Replication (Mirror Maker, Custom, MID Server).
Stream Connect for Direct Kafka
A direct streaming integration that connects ServiceNow with the customer's Kafka environment (local, Confluent Cloud, Azure Event Hub, Amazon MSK) without any intermediary infrastructure. Data flows straight between ServiceNow and the customer's Kafka cluster, giving a simpler, single-cluster architecture.
2. Architecture at a Glance
The two Stream Connect paths sit side by side architecturally — but the data flow tells two very different stories.
Figure 1: Stream Connect Architecture — Stream Connect for Apache Kafka (left) vs. Stream Connect for Direct Kafka (right)
Quick Comparison
| Aspect | Stream Connect for Apache Kafka | Stream Connect for Direct Kafka |
|---|---|---|
| Intermediate cluster | Yes — Hermes (ServiceNow-managed Kafka Cluster) | No — single-cluster architecture |
| Message Replication | Required — Between Customer's Streaming platform & Hermes | N/A |
| MID Server support | Yes — used for connectivity to on-prem Kafka behind a firewall | Not supported |
| Network requirement | Replicator/MID Server → both clusters (Hermes & Customer) | ServiceNow instance → customer's Streaming platform |
| Consumers & Producers | All Stream Connect types | All Stream Connect types |
| Topic Limit | 960 topics (one partition per topic) | N/A — Depends on customer's streaming platform |
| Retention period | 36 hours | N/A — Depends on customer's streaming platform |
Operational Responsibility between Hermes and Direct Kafka
| Responsibility | Stream Connect for Apache Kafka | Stream Connect for Direct Kafka |
|---|---|---|
| Streaming Platform (Kafka cluster) uptime & scaling | ServiceNow (managed Hermes) | Customer's Streaming Platform Team |
| Topic lifecycle management | ServiceNow Platform Owner (via ServiceNow UI) | Customer's Streaming Platform Team |
| Access control & ACLs | ServiceNow Platform Owner — must manage who has access to Hermes topics and ensure proper certificate distribution to external clients | Customer's Streaming Platform Team — managed entirely within customer's Streaming platform (e.g., Confluent RBAC, MSK IAM, native ACLs) |
| Certificate management | ServiceNow Platform Owner — certificates generated via Instance PKI Certificate Generator; responsible for rotation and distribution to external clients | Customer's Streaming Platform Team — managed per their Kafka security model |
| Broker/network availability | ServiceNow (Hermes side) | Customer's Streaming Platform Team |
Key takeaway: In the Hermes path, the ServiceNow Platform Owner is the accountable party for topic governance, access control, and certificate lifecycle on the ServiceNow side. In the Direct Kafka path, the Customer's Streaming Platform Team owns the end-to-end infrastructure responsibility — availability, security, and observability are entirely within their domain.
3. When to Use Which Product
So which path is right? It comes down to three questions: Where does ServiceNow live? Where does Kafka live? And can they reach each other directly?
The right product depends on the customer's ServiceNow hosting model, where their Kafka cluster runs, and whether direct network connectivity is feasible.
Key takeaway: When the customer's Kafka sits behind a firewall that cannot accept inbound connections from ServiceNow's hosted infrastructure, Stream Connect with Hermes is the only option. When the Kafka endpoint is directly reachable from the ServiceNow instance (cloud-managed Kafka services, self-hosted instances), Stream Connect for Direct Kafka is preferred for its simpler, single-cluster architecture. In hybrid scenarios, both products can coexist on the same instance.
4. Pros and Cons
Where both products are technically viable (Stream Connect with Hermes also works when the streaming platform is not behind a firewall, though it still requires message replication), the trade-offs below help determine which is the better fit for a given deployment.
Stream Connect with Hermes
Advantages
- Strong SLAs for performance, throughput, and uptime — ServiceNow manages the Hermes cluster infrastructure.
- Lower latency and fewer network issues for hosted customers where the Hermes cluster is co-located with the instance.
- Supports custom integrations and non-streaming data sources through the Replicator layer.
- High availability built into the managed Hermes infrastructure.
Disadvantages
- Not supported for on-prem (self-hosted) customers.
- Added complexity when integrating with Azure Event Hub, Confluent Cloud, and Amazon MSK — requires MID Server (or other replicators) for Message Replication.
- Shared infrastructure imposes constraints on throughput, message size, retention, topics, and naming.
Stream Connect for Direct Kafka
Advantages
- Built for on-prem and equally usable in hosted scenarios — broadest deployment flexibility.
- Customers have full control over the network performance, retention period, and reliability of their Kafka infrastructure.
- Simpler setup for integrating with Azure Event Hub, Confluent Cloud, and Amazon MSK — no MID Server or Replicator needed.
- Single-cluster architecture — no replication, no intermediate hops, no data duplication.
Disadvantages
- Availability and performance depend on the customer's infrastructure.
- WAN connectivity issues can surface as application failures within ServiceNow.
- More retries, error handling, and support effort required due to the customer-managed nature of the connection.
Key takeaway: With the Direct Kafka path, ServiceNow acts as the client to consume or produce messages; the restrictions that came with Hermes no longer apply, and characteristics such as retention period and encryption at rest now depend on the customer's streaming platform.
5. Direct Kafka — Component Architecture
The diagram below details the components involved in Stream Connect with Hermes and Stream Connect for Direct Kafka deployments.
Figure 2: Stream Connect for Direct Kafka — Architecture Change
ServiceNow Instance Components
Configuration Objects
Three configuration records anchor the setup. The Kafka Credential stores authentication details (see the Security & Authentication Guide for supported types). The Connection & Credential Alias abstracts the credential and connection into a reusable reference. The Direct Kafka Cluster record ties the alias to a named cluster and provides the connection-test and topic-sync entry points.
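The relationship between the three records can be sketched conceptually. This is a minimal, hypothetical model — the record and field names below are illustrative, not the actual ServiceNow schema — but it shows why the alias indirection matters: consumers and producers reference the cluster, which resolves brokers and credentials through the alias.

```python
# Hypothetical sketch of how the three configuration records relate.
# Record and field names are illustrative, not the real ServiceNow schema.
from dataclasses import dataclass

@dataclass
class KafkaCredential:
    name: str
    auth_type: str          # e.g. "SASL_SSL" or "mTLS"

@dataclass
class ConnectionCredentialAlias:
    name: str
    bootstrap_servers: str  # broker endpoints the connection points at
    credential: KafkaCredential

@dataclass
class DirectKafkaCluster:
    name: str
    alias: ConnectionCredentialAlias  # swap the alias without touching consumers

cred = KafkaCredential("prod-kafka-cred", "SASL_SSL")
alias = ConnectionCredentialAlias("prod-kafka-alias",
                                  "broker1.example.com:9093,broker2.example.com:9093",
                                  cred)
cluster = DirectKafkaCluster("prod-cluster", alias)
print(cluster.alias.credential.auth_type)  # SASL_SSL
```

Because producers and consumers only hold a reference to the cluster, rotating a credential or repointing brokers is a change to one record, not to every integration.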
Producers
Two producer mechanisms push messages from ServiceNow to the customer's Kafka topics. The Producer Step is used inside IntegrationHub Flows for low-code/no-code publishing. The Producer Script API is available for server-side scripts that need programmatic control over message production.
Consumers
Four consumer types process inbound Kafka messages:
- Kafka Message Flow Trigger — initiates IntegrationHub Flows when a message arrives.
- Script Consumer — allows custom server-side script logic.
- RTE (Real-Time Events) Consumer — feeds into the ServiceNow eventing framework.
- Transform Map Consumer — maps incoming message fields to ServiceNow table columns using standard Transform Maps.
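Conceptually, the Transform Map Consumer performs a field-to-column mapping on each inbound message. The sketch below illustrates that idea in plain Python — the field names, column names, and payload are all made up for illustration and are not part of any ServiceNow API.

```python
import json

# Illustrative field map: message field -> target table column (names invented).
FIELD_MAP = {
    "asset_id": "u_asset",
    "status":   "u_status",
    "ts":       "u_event_time",
}

def transform(message_value: bytes) -> dict:
    """Map an inbound Kafka message's fields onto table columns."""
    payload = json.loads(message_value)
    return {col: payload[field] for field, col in FIELD_MAP.items() if field in payload}

row = transform(b'{"asset_id": "SRV-042", "status": "down", "ts": "2024-05-01T12:00:00Z"}')
print(row)  # {'u_asset': 'SRV-042', 'u_status': 'down', 'u_event_time': '2024-05-01T12:00:00Z'}
```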
Monitoring & Alerting
The Stream Connect Designer Dashboard (optional, installed from the ServiceNow Store) provides visibility into topic details, producer/consumer activity, and throughput. Stream Connect Alert & Notifications surfaces issues such as connection failures, consumer lag, and synchronization errors.
Customer Systems
On the customer side, the architecture is refreshingly straightforward: Apache Kafka with its local topics. No additional middleware, agents, or replicator processes are required. The customer's applications produce to and consume from their Kafka topics as they normally would; ServiceNow is simply another Kafka client.
6. Pitfalls and Best Practices
This is the section to bookmark. Most failed Direct Kafka deployments don't fail because of the architecture — they fail because of skipped pre-flight checks.
Common Mistakes
- Skipping the network connectivity check — the most frequent wrong-path decision. If the Kafka cluster is behind a firewall the ServiceNow instance cannot reach, Direct Kafka will not work. Validate connectivity first, before any installation or configuration.
- Waiting for topic sync without testing the connection first — use the Test Connection UI action immediately after saving the Direct Kafka Cluster record. Don't assume the connection is healthy and wait 5 minutes for a sync that will silently fail.
- Not validating plugin installation — after activating the installer plugin, confirm all nine dependent plugins are activated successfully before proceeding to configuration.
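A pre-flight connectivity check like the first mistake above can be scripted outside ServiceNow before any configuration begins. A minimal sketch, assuming the broker list is known (hostnames below are placeholders) — note this validates only TCP reachability, not TLS or SASL authentication:

```python
import socket

def parse_bootstrap(servers: str):
    """Split 'host1:9092,host2:9092' into (host, port) pairs."""
    pairs = []
    for entry in servers.split(","):
        host, _, port = entry.strip().rpartition(":")
        pairs.append((host, int(port)))
    return pairs

def reachable(host: str, port: int, timeout: float = 3.0) -> bool:
    """TCP-level reachability check; does not validate TLS or SASL."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder brokers — substitute the customer's real bootstrap servers.
for host, port in parse_bootstrap("broker1.example.com:9093,broker2.example.com:9093"):
    print(host, port, reachable(host, port))
```

If every broker fails this check from a network position comparable to the ServiceNow instance, Direct Kafka is not viable and the Hermes path should be evaluated instead.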
Recommendations
- Treat the connectivity check and the connection test as mandatory gates — nothing should proceed past either until they pass.
- Use Topic Aliases for all consumer and producer configurations, even for simple deployments — it future-proofs configurations against Kafka topic name changes.
- Install the Stream Connect Designer Dashboard from the ServiceNow Store as part of every implementation, not as an afterthought.
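The Topic Alias recommendation above is an indirection pattern: configurations hold a stable alias while the physical topic name can change underneath. A minimal sketch of the idea (alias and topic names are invented):

```python
# Alias -> physical topic name. If the platform team renames a topic,
# only this mapping changes; consumer/producer configs keep the alias.
TOPIC_ALIASES = {
    "incident-events": "corp.itsm.incident.v2",   # was ...v1 before a rename
    "asset-telemetry": "corp.iot.asset.telemetry",
}

def resolve_topic(alias: str) -> str:
    try:
        return TOPIC_ALIASES[alias]
    except KeyError:
        raise ValueError(f"No topic registered for alias '{alias}'")

print(resolve_topic("incident-events"))  # corp.itsm.incident.v2
```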
Troubleshooting
- If topics do not appear after 5 minutes, the connection test is the first diagnostic — if it fails, the issue is network or authentication, not the sync job.
- Authentication configuration should mirror the customer's existing Kafka cluster security setup exactly. When in doubt, start with SSL/mTLS as the baseline.
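As a hedged illustration of the SSL/mTLS baseline, the settings below use librdkafka-style property names (as consumed by clients such as confluent-kafka-python) — these are generic Kafka client properties, not ServiceNow's configuration fields, and all paths are placeholders:

```python
# librdkafka-style client properties for mutual TLS (paths are placeholders).
mtls_config = {
    "bootstrap.servers": "broker1.example.com:9093",
    "security.protocol": "SSL",
    "ssl.ca.location": "/etc/kafka/ca.pem",               # CA that signed the broker certs
    "ssl.certificate.location": "/etc/kafka/client.pem",  # client certificate (mTLS)
    "ssl.key.location": "/etc/kafka/client.key",          # client private key
}
print(sorted(mtls_config))
```

Whatever the customer's cluster enforces — SASL mechanisms, certificate chains, protocol version — the ServiceNow-side credential must mirror it exactly; a mismatch here surfaces as the connection-test failure described above.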
Wrapping up
Stream Connect for Direct Kafka isn't a replacement for Hermes — it's the right tool when the network allows it. One cluster, no replication, the same producers and consumers you already know, and full control handed back to the customer's streaming platform team.
If you're standing up a new Stream Connect deployment, start with the connectivity question. The answer tells you which path you're on — and saves you a long detour.
Have questions? Drop them in the comments — would love to hear what worked, what didn't, and what you wish you'd known on day one.
