Intro

This is a quick guide to implementing Kafka in Spring Boot against Azure’s Event Hubs. I found the tutorials online did not provide exactly what I was looking for, and after we finished implementing Kafka for some of our new services, my colleague Nick recommended I write an article on the topic…(ahem). Without further ado, here is a quick guide on how to send and receive Java objects as JSON to and from Azure Event Hubs. I am in no way an expert, nor do I think this is the best way to do this, so advice, recommendations, and constructive criticism are welcome and encouraged.

We’re deploying our services on PCF and using Azure’s Event Hubs service as our pub/sub infrastructure. Azure offers a Kafka interface on top of Event Hubs; topics, partitions, and consumer groups are all supported. Microsoft does provide an SDK, but we’re going to ignore it to avoid the potential legal and bureaucratic overhead of bringing a third-party library or SDK into an organization.

Special Cases

When it comes to sending data over kafka, there are lots of things to consider. We won’t go into detail here, but understand that you may want to use a different implementation depending on your use case. For example:

You don’t have control of both the producer and consumer

Multiple schemas are being sent to the same topic

Minimizing latency is absolutely critical

Covering those cases is outside the scope of this guide. If you would like more information on how to handle those cases, I recommend visiting Spring Reference, Spring Docs, & Spring Kafka’s Gitter.

Setup

Set up your Azure Event Hubs. Here is a tutorial on how to set up a Kafka-interfaced Event Hubs. Store the “Connection String-primary key” somewhere safe; we will use it later. Add the dependencies to your Spring Boot app (and don’t forget to include Lombok :D ).
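Assuming a Maven build, the dependency section might look something like this (versions are managed by the Spring Boot parent, and the starter choice depends on your app):

```xml
<!-- pom.xml fragment: a sketch, not a complete build file -->
<dependencies>
    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-web</artifactId>
    </dependency>
    <dependency>
        <groupId>org.springframework.kafka</groupId>
        <artifactId>spring-kafka</artifactId>
    </dependency>
    <dependency>
        <groupId>org.projectlombok</groupId>
        <artifactId>lombok</artifactId>
        <scope>provided</scope>
    </dependency>
</dependencies>
```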

Implementing the Producer

The Spring Kafka library includes a built-in JSON serializer and deserializer. You can send and receive entire Java objects by serializing and deserializing them to and from JSON, and wiring the serializer and deserializer into the Kafka producer and consumer.

Let’s first start with the producer. You’ll need to inject the pre-configured KafkaTemplate so you can publish the messages.
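A minimal producer might look like the following sketch. The class, topic, and method names here are placeholders, not the original article’s code:

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

// Hypothetical producer service; "my-topic" is a placeholder topic name.
@Service
public class MessageProducer {

    private final KafkaTemplate<String, ProducerMessage> kafkaTemplate;

    public MessageProducer(KafkaTemplate<String, ProducerMessage> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    public void publish(ProducerMessage message) {
        // The configured JSON serializer turns the object into JSON on the wire.
        kafkaTemplate.send("my-topic", message);
    }
}
```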

And your producer data transfer object. This object will be serialized to JSON by the configured serializer.
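For example, a Lombok-based DTO might look like this (the field names are illustrative):

```java
import lombok.AllArgsConstructor;
import lombok.Data;
import lombok.NoArgsConstructor;

// Illustrative DTO; Lombok generates the getters, setters, and constructors.
@Data
@NoArgsConstructor
@AllArgsConstructor
public class ProducerMessage {
    private String id;
    private String payload;
}
```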

Spring Kafka provides serializer and deserializer implementations. You’ll find JsonSerializer & JsonDeserializer in org.springframework.kafka.support.serializer . Extending the class like so allows you to reference the serializer in configuration while providing a type to deserialize the object to. I found this was the only way I could reference the class from configuration while still specifying the input type requirement. If you decide to programmatically configure your publisher, this may not be necessary, as you can just instantiate a new JsonSerializer&lt;ProducerMessage&gt;() .
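The subclasses are trivial; fixing the generic type is the whole point. The class names below are placeholders:

```java
import org.springframework.kafka.support.serializer.JsonDeserializer;
import org.springframework.kafka.support.serializer.JsonSerializer;

// ProducerMessageSerializer.java -- referable by name from application properties.
public class ProducerMessageSerializer extends JsonSerializer<ProducerMessage> {
}

// ConsumerMessageDeserializer.java (its own file) -- same idea for the consumer side.
public class ConsumerMessageDeserializer extends JsonDeserializer<ConsumerMessage> {
}
```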

And your producer configuration. bootstrap-servers is the URL of the Kafka broker. client-id isn’t necessary, but it helps identify which services are doing the publishing and subscribing. The properties under kafka are specific to Azure Event Hubs and are required. You have flexibility here: you can programmatically configure your beans if you’d like by creating them yourself, specifically the producerConfigs , ProducerFactory , and KafkaTemplate . Look for KafkaAutoConfiguration in org.springframework.boot.autoconfigure.kafka for reference.
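A sketch of the application.yml, assuming a hypothetical namespace and a typed JsonSerializer subclass named ProducerMessageSerializer; the literal $ConnectionString is the actual username Event Hubs expects, while the password is your “Connection String-primary key”:

```yaml
spring:
  kafka:
    bootstrap-servers: my-namespace.servicebus.windows.net:9093  # Event Hubs Kafka endpoint
    client-id: my-service
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: com.example.ProducerMessageSerializer    # hypothetical typed subclass
    properties:
      security.protocol: SASL_SSL
      sasl.mechanism: PLAIN
      sasl.jaas.config: >-
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="$ConnectionString"
        password="<your Connection String-primary key>";
```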

Implementing the Consumer

Your consumer. This is where you’d make your service call.
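A minimal listener could be sketched like this; the topic, group, and class names are placeholders:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

// Hypothetical consumer; the configured deserializer hands us a typed object.
@Service
public class MessageConsumer {

    @KafkaListener(topics = "my-topic", groupId = "my-consumer-group")
    public void listen(ConsumerMessage message) {
        // This is where you'd make your service call.
        System.out.println("Received: " + message);
    }
}
```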

Your consumer DTO. Make sure you have a @NoArgsConstructor , as the deserializer will be creating an instance of the object.
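Spelled out without Lombok for clarity (@NoArgsConstructor, @Getter, and @Setter would generate the same thing), a matching DTO might look like this; the field names are illustrative:

```java
// Illustrative consumer DTO.
public class ConsumerMessage {
    private String id;
    private String payload;

    // The deserializer instantiates the object through this constructor and
    // then populates the fields -- hence the no-args constructor requirement.
    public ConsumerMessage() {
    }

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }

    public String getPayload() { return payload; }
    public void setPayload(String payload) { this.payload = payload; }
}
```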

And the consumer configuration. As with the producer configuration, you can set these programmatically. By default, the built-in deserializer requires you to declare the original package the JSON was serialized from. This creates tight coupling. The spring.json.use.type.headers config allows you to deserialize without explicitly declaring the JSON as an object from a specific package. The spring.json.value.default.type config is the Java type the JSON deserializes to.
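A consumer-side sketch of the application.yml, with the same hypothetical namespace and placeholder class names as before:

```yaml
spring:
  kafka:
    bootstrap-servers: my-namespace.servicebus.windows.net:9093
    consumer:
      group-id: my-consumer-group
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.JsonDeserializer
    properties:
      security.protocol: SASL_SSL
      sasl.mechanism: PLAIN
      sasl.jaas.config: >-
        org.apache.kafka.common.security.plain.PlainLoginModule required
        username="$ConnectionString"
        password="<your Connection String-primary key>";
      spring.json.use.type.headers: false                          # ignore type headers from the producer
      spring.json.value.default.type: com.example.ConsumerMessage  # hypothetical target type
```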

Testing

Write a test controller, call it via Postman, and check your consumer service for the log.
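Such a throwaway controller might look like this, assuming a hypothetical MessageProducer bean that wraps KafkaTemplate.send; the route and names are placeholders:

```java
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RestController;

// Manual-testing controller: POST a JSON body and it gets published to the topic.
@RestController
public class TestController {

    private final MessageProducer producer;

    public TestController(MessageProducer producer) {
        this.producer = producer;
    }

    @PostMapping("/test/publish")
    public void publish(@RequestBody ProducerMessage message) {
        producer.publish(message);
    }
}
```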

And that is it :)

Tweet me at @KarakiDev