Securing the Kafka Server with SSL
- Please follow along with my YouTube Video 03 ServiceNow Kafka Integration: MID Server Replication - Securing Kafka
- Make a directory in which we will create a self-signed CA, keystore, and truststore
mkdir ~/demo
cd ~/demo
- Create a self-signed certificate authority
openssl req -new -x509 -keyout ca-key -out ca-cert -days 3650
- Get the private IP of the Kafka Linux server and the server name
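For example, with the standard Linux utilities below (assumed to be available; adjust for your environment):
hostname
hostname -I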
- Create the keystore (You will need to modify this command with your server name and private IP Address)
keytool -keystore kafka.server-r.keystore.jks -alias server-r -validity 3650 -genkey -keyalg RSA -ext SAN=dns:kafka,IP:10.0.0.7
- Create a certificate signing request (CSR)
keytool -keystore kafka.server-r.keystore.jks -alias server-r -certreq -file ca-request-server-r.csr
- Sign the request using CA keys
openssl x509 -req -CA ca-cert -CAkey ca-key -in ca-request-server-r.csr -out ca-signed-server-r.crt -days 3650 -CAcreateserial
- Import the root certificate to the new keystore
keytool -keystore kafka.server-r.keystore.jks -alias ca-cert -import -file ca-cert
- Import the signed cert to the new keystore
keytool -keystore kafka.server-r.keystore.jks -alias server-r -import -file ca-signed-server-r.crt
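- Create the truststore by importing the CA certificate. The later steps copy and base64-encode a truststore, so a minimal sketch, assuming the file name those steps expect, is:
keytool -keystore kafka.server-r.truststore.jks -alias ca-cert -import -file ca-cert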
- Make the following directory
mkdir /usr/local/kafka/config/ssl
- Copy the keystore and the truststore to the new ssl directory
cp *.jks /usr/local/kafka/config/ssl
- Modify the server.properties file to point to the keystore and the truststore and add the relevant SSL listeners. A sample file is available here (You will need to modify it)
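A minimal sketch of the SSL-related server.properties entries, assuming the demo's host name, ports, and paths, with placeholder passwords; ssl.client.auth=required assumes clients authenticate with their own keystore, so relax it if that is not needed:
listeners=PLAINTEXT://:9092,SSL://:9093
advertised.listeners=PLAINTEXT://kafka:9092,SSL://kafka:9093
ssl.keystore.location=/usr/local/kafka/config/ssl/kafka.server-r.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>
ssl.truststore.location=/usr/local/kafka/config/ssl/kafka.server-r.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.client.auth=required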
- Make a new directory to test the SSL connection
mkdir ~/demo/testssl
cd ~/demo/testssl
- Create the client-ssl.properties file to point to the keystore and the truststore and set the security protocol to SSL. A sample file is available here (You will need to modify it)
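A minimal sketch of client-ssl.properties, assuming the keystore and truststore created above are still in ~/demo (properties files do not expand ~, so use the full path) and placeholder passwords:
security.protocol=SSL
ssl.truststore.location=/home/<your-user>/demo/kafka.server-r.truststore.jks
ssl.truststore.password=<truststore-password>
ssl.keystore.location=/home/<your-user>/demo/kafka.server-r.keystore.jks
ssl.keystore.password=<keystore-password>
ssl.key.password=<key-password>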
- Start Zookeeper and Kafka, then verify their status.
sudo systemctl start zookeeper
sudo systemctl start kafka
sudo systemctl status zookeeper
sudo systemctl status kafka
- Use SSL to produce messages:
/usr/local/kafka/bin/kafka-console-producer.sh --broker-list kafka:9093 --topic test --producer.config client-ssl.properties
- Use SSL to consume the produced messages:
/usr/local/kafka/bin/kafka-console-consumer.sh --bootstrap-server kafka:9093 --topic test --consumer.config client-ssl.properties --from-beginning
Setting up MID Server Replication
- Please follow along with my YouTube Video 04 ServiceNow Kafka Integration: MID Server Replication - Setup
Reference: Stream Connect Message Replication
Pre-requisites:
- A MID server that has access to the Kafka Server (In my demo video both the MID and the Kafka server are on the same Azure Virtual Network)
- The same pre-requisites as described in my YouTube Video 02 ServiceNow Kafka Integration: StreamConnect Produce and Consume
- Nice to have:
- Install the ServiceNow IntegrationHub Flow Trigger - Kafka plugin (com.glide.hub.flow_trigger.kafka)
- Update the Workflow Studio plugin (sn_workflow_studio)
- Verify that the MID host can connect to the Kafka server
ping kafka
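- Optionally, verify that the SSL listener port is reachable from the MID host (assuming netcat is installed):
nc -vz kafka 9093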
- On the instance: Navigate to Connection & Credential Alias and create a new Kafka Server Alias. Select the Kafka connection type and save the record
- In the alias record, create a new connection named "Kafka Server Connection".
- Create a new Kafka SSL credential record for the connection
- On the Kafka server, navigate to the location of the keystore and the truststore and use base64 to encode the files.
cat kafka.server-r.keystore.jks | base64 -w 0
cat kafka.server-r.truststore.jks | base64 -w 0
- Copy the encoded text into the corresponding credential record fields, with the correct passwords.
- Once the credential is saved on the connection record, enter the Kafka server's SSL bootstrap connection
kafka:9093
- On the Kafka server, create a source topic (make sure that the server is running):
/usr/local/kafka/bin/kafka-topics.sh --create --bootstrap-server localhost:9092 --replication-factor 1 --partitions 1 --topic KafkaServerSource
- On the instance, create the destination Hermes topic
Hermes Destination
- Create a new message replication record with the connection alias created in step 3.
- Create a new Kafka Topic Replication record with the source topic (Step 8), the destination topic (Step 9), and the direction "To ServiceNow"
- Observe that the replication is initiated and running without errors
- Produce messages on the SSL port targeting the KafkaServerSource topic
cd ~/demo/testssl
/usr/local/kafka/bin/kafka-console-producer.sh --broker-list kafka:9093 --topic KafkaServerSource --producer.config client-ssl.properties
- Observe that the messages were replicated by the MID server by inspecting the Hermes Destination topic
- Open the Kafka server connection (the Kafka connection record created in Step 3) and observe the Consumer group id field
- On the Kafka server, the following command can be used to list the consumer group IDs
/usr/local/kafka/bin/kafka-consumer-groups.sh --command-config client-ssl.properties --bootstrap-server kafka:9093 --list
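- To inspect the offsets and lag of a specific replication consumer group, --describe can be used (<group-id> is a placeholder for an id returned by the previous command):
/usr/local/kafka/bin/kafka-consumer-groups.sh --command-config client-ssl.properties --bootstrap-server kafka:9093 --describe --group <group-id>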