Joe Wilmoth
ServiceNow Employee

Apache Kafka is a distributed event-streaming platform that provides a unified way to exchange data across multiple systems. Stream Connect for Apache Kafka links your Kafka environment to your ServiceNow instance, enabling you to stream data between your instance and your external systems.

 

Below you will find a list of frequently asked questions to help you on your journey to learning more about Stream Connect for Apache Kafka.

 

Note: Stream Connect for Apache Kafka requires an Automation Engine Enterprise subscription. For more information about Automation Engine, see https://www.servicenow.com/products/automation-engine.html

 

FAQ

Please reach out with any further questions: streamconnect@servicenow.com

 

General Questions

  • How is Stream Connect for Apache Kafka licensed?
    • Stream Connect for Apache Kafka is available as an add-on to Automation Engine Enterprise
  • How do I integrate and replicate data between my ServiceNow instances and my other core business systems at scale?
    • Produce and consume data to and from your ServiceNow instance by building flows that use the included producer and consumer features to process Kafka events
  • What use cases does Stream Connect support?
    • Populate the CMDB from external systems

    • Real-time integration of customer data

    • Streamline integrations

    • Near-real-time updates to customer requests

    • Generate incidents from events and publish updates

    • Simplify pushing requests to an integration platform

    • Export data for reporting

    • Push data to a data lake

  • What do you get with Stream Connect for Apache Kafka?

    • Kafka Producer Step in Flow Designer

    • Kafka Message trigger in Flow Designer

    • ProducerV2 API

    • Robust Transform Engine (RTE) Consumer

    • Transform Map Consumer

    • Script Consumer

     

  • Why is Stream Connect for Apache Kafka valuable?

    • Build Flows that produce and consume Kafka events

    • Import data from Kafka environments and process that data using existing Robust Transform Engine or Transform Map configurations

    • Monitor your consumers’ performance with detailed reporting of statistics and performance metrics

  • What is the Hermes Messaging Service and how does it relate to Stream Connect for Apache Kafka?

    • The Hermes Messaging Service is a ServiceNow platform capability that enables Stream Connect for Apache Kafka customers to connect their Apache Kafka environments to their ServiceNow instance.

    • The Hermes Messaging Service is a multi-tenant, multi-cluster, data transport, and queuing service built on Apache Kafka that enables your ServiceNow instance to produce and consume large volumes of Apache Kafka events.

    • Unlike other services, such as SnowMirror, Perspectium or other point integrations like REST, Hermes provides a flexible and tightly integrated service with workflows and applications, allowing customers to manage the health and performance of message delivery using Hermes diagnostic tools.

  • What does the Flow Trigger offer that is included with Stream Connect for Apache Kafka?

    • The ability to consume messages using low-code

    • The ability to read from the first/last message in the Kafka message queue

    • The ability to automatically optimize the number of messages consumed

     

  • Where are the Hermes Kafka Clusters located?

    • In the data centers where the ServiceNow instance resides

  • What sort of statistics can be viewed and tracked?

    • Each consumer can be tracked, providing insight into the following:

      • Topic Input Rate: the average number of records added to the topic per second

      • Consumer Processing Rate: the average number of records processed per second

      • Topic Queue Depth: the average number of records remaining to be processed in the topic
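These three metrics are related: whenever the topic input rate exceeds the consumer processing rate, queue depth grows. A minimal sketch of that relationship, using hypothetical function names and example numbers (not ServiceNow APIs):

```python
# Illustrative only: how the three reported metrics relate.
# Function name and all rates below are hypothetical examples.

def queue_depth_after(initial_depth, input_rate, processing_rate, seconds):
    """Estimate topic queue depth after `seconds`, given average rates
    in records per second. Depth never goes below zero."""
    return max(0, initial_depth + (input_rate - processing_rate) * seconds)

# A topic receiving 250 records/s consumed at 200 records/s
# builds a backlog of 50 records per second:
print(queue_depth_after(0, 250, 200, 60))     # 3000 records after one minute
print(queue_depth_after(3000, 100, 200, 60))  # 0 - the backlog drains
```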

  • Are domain separated instances supported?

    • Yes, and you can link specific topics to the appropriate domain by using a namespace.

  • What options are available to produce Kafka events/messages?

    • You can use the Kafka Producer Step within Flow Designer, which allows you to build an action that writes data to a given topic.

    • You can also use scripting and produce events/messages using the ProducerV2 API.

     

  • What options are available for consuming Kafka events/messages?

    • The Flow Trigger is a simple way to consume data. You can select a topic from a drop-down menu in Flow Designer, then use the data pills in the flow to handle the logic for the messages.

    • Other consumers are the Robust Transform Engine Consumer, Transform Map Consumer, and the Script Consumer.

    • These allow you to use existing RTE definitions or existing transform map configurations to consume data, as well as the ability to perform more complex data parsing with scripting logic.

     

Technical Questions

  • What is the maximum message size for Kafka messages?

    • 2 MB
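Since messages over the limit will be rejected, it can be useful to check payload size before producing. A minimal sketch, assuming you serialize the payload to bytes first; the constant comes from the FAQ, the helper name is our own:

```python
# Sketch: validating payload size against the 2 MB message limit
# before producing. The helper name is hypothetical.

MAX_MESSAGE_BYTES = 2 * 1024 * 1024  # 2 MB limit from the FAQ

def fits_kafka_limit(payload: bytes) -> bool:
    """Return True if the serialized payload is within the 2 MB limit."""
    return len(payload) <= MAX_MESSAGE_BYTES

print(fits_kafka_limit(b"x" * 1024))               # True: 1 KB is fine
print(fits_kafka_limit(b"x" * (3 * 1024 * 1024)))  # False: 3 MB is too large
```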

  • How many topics can a ServiceNow instance have?

    • 30 Topics

  • What is the maximum number of partitions?

    • There can be 32 partitions per topic. If you’re not familiar with partitions, imagine a partition as a lane on a highway: you could configure the highway to have trucks in one lane, blue cars in another lane, and all other cars in a third lane.
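The "lanes" analogy can be sketched in code: key-based partitioning sends every message with the same key to the same partition, preserving ordering within that partition. Kafka's default partitioner uses a murmur2 hash; the CRC32 hash below is just a stand-in for illustration, and the function name is our own:

```python
# Sketch of key-based partition routing ("lanes").
# Kafka's real default partitioner uses murmur2; zlib.crc32 here
# is only a stand-in to show the hash-then-modulo idea.
import zlib

NUM_PARTITIONS = 32  # the per-topic maximum noted above

def partition_for(key: str, num_partitions: int = NUM_PARTITIONS) -> int:
    """Map a message key to a partition index in [0, num_partitions)."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

# Messages with the same key always land in the same partition,
# so their relative order is preserved:
print(partition_for("incident-123") == partition_for("incident-123"))  # True
```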

  • What is the retention policy on messages?

    • Messages are retained on the Hermes cluster for 36 hours.

  • How many messages can Stream Connect for Apache Kafka produce per second?

    • Stream Connect can produce messages at a rate of up to 2 MB per second. The number of messages per second depends on message size: at 1 KB per message, you would expect around 2,000 messages per second.
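The bandwidth-to-message-rate arithmetic above can be written out as a quick sizing helper; the 2 MB/s figure comes from the FAQ, the function name is our own:

```python
# Back-of-envelope: message rate implied by a 2 MB/s produce budget.
BANDWIDTH_BYTES_PER_SEC = 2 * 1024 * 1024  # 2 MB/s from the FAQ

def messages_per_second(avg_message_bytes: int) -> int:
    """Approximate messages/second for a given average message size."""
    return BANDWIDTH_BYTES_PER_SEC // avg_message_bytes

print(messages_per_second(1024))       # 2048 - roughly 2,000/s for 1 KB messages
print(messages_per_second(10 * 1024))  # 204 for 10 KB messages
```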

  • How many messages can Stream Connect for Apache Kafka consume and write to a table within a ServiceNow instance?

    • An estimated 150-300 messages per second per thread. For topics with multiple partitions, multiple threads can be used by increasing the “Max Concurrency” setting.
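Putting the consume-side numbers together, estimated throughput scales with the thread count up to the number of partitions. A rough sizing sketch; the 150-300/s per-thread range comes from the FAQ, and the assumption that concurrency is capped by partition count is ours:

```python
# Rough sizing helper: consume throughput vs. "Max Concurrency".
# The 150-300 records/s per-thread figure is from the FAQ; the cap at
# the partition count is an assumption (one thread per partition).

def estimated_consume_rate(max_concurrency: int, partitions: int,
                           per_thread_rate: int = 150) -> int:
    """Estimate total records/second for a consumer configuration."""
    effective_threads = min(max_concurrency, partitions)
    return effective_threads * per_thread_rate

print(estimated_consume_rate(4, 32))        # 600 records/s at the low end
print(estimated_consume_rate(64, 32, 300))  # 9600 - capped at 32 threads
```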

 

For More Information

Stream Connect for Apache Kafka Documentation

 

#automationengine #integrationhub #streamconnectforapachekafka #integrations #streamconnect #flowdesigner
