How to send messages in AVRO format from ServiceNow to Kafka using Stream Connect and a 'Producer' step in a flow

Animesh Das2
Mega Sage

Hi experts,

 

We need to send messages (data) from ServiceNow to a client-side Kafka topic using a Stream Connect subscription. The data should be in AVRO format, produced by a flow that contains a 'Producer' step.

Can someone please help me configure the 'Producer' step in the flow's action so that ServiceNow records are sent as AVRO-formatted data to a third-party Kafka topic?

I would really appreciate any urgent help. Thanks in advance.

3 REPLIES

Omkar Kumbhar
Mega Sage

Hello @Animesh Das2 ,

 

First, make sure to define the AVRO schema in the Schema Management section of Stream Connect. This schema must match the structure of the message you want to send.

Below is a sample AVRO schema for a user record:

{
  "type": "record",
  "name": "User",
  "namespace": "com.example",
  "fields": [
    { "name": "id", "type": "string" },
    { "name": "name", "type": "string" },
    { "name": "email", "type": ["null", "string"], "default": null },
    { "name": "age", "type": ["null", "int"], "default": null },
    { "name": "is_active", "type": "boolean", "default": true }
  ]
}

 

Then, in the Producer step, add the Message and Schema accordingly.

Message - Construct the message using data pills or JSON. This must match the AVRO schema (see the sample message below).

Schema - Select the schema you created in Stream Connect.
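
For example, a message payload matching the sample schema above could look like the following sketch (the field values are placeholders; in practice you would map data pills from your ServiceNow record into these fields):

{
  "id": "a1b2c3d4e5f6",
  "name": "Jane Doe",
  "email": "jane.doe@example.com",
  "age": 34,
  "is_active": true
}

Note that if the step expects Avro's canonical JSON encoding, nullable (union) fields may need to be wrapped with their type, e.g. "email": {"string": "jane.doe@example.com"}; check how your Producer step serializes unions.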

 

Regards,

Omkar

If I was able to help you with your case, please click the Thumb Icon and mark as Correct.

Thanks @Omkar Kumbhar ,

We did that, but the third-party Kafka consumer is still receiving corrupted data. By the way, what does "namespace": "com.example" signify, and what value should it hold? Is it mandatory? We have included it in the schema anyway.

@Animesh Das2 The namespace is like a package name; it helps uniquely identify the schema. It is optional! Can you share a few screenshots of the corrupted data?
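
For reference, the namespace combines with the record name to form the schema's fully qualified name, so any reverse-domain value unique to your organization works. A minimal sketch (the namespace value "com.yourcompany.servicenow" is just illustrative):

{
  "type": "record",
  "name": "User",
  "namespace": "com.yourcompany.servicenow",
  "fields": [ { "name": "id", "type": "string" } ]
}

Here the fully qualified schema name becomes com.yourcompany.servicenow.User.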

If I was able to help you with your case, please click the Thumb Icon and mark as Correct.