
Java Spring Boot - REST API and Kafka Message Publishing

🕓 40 minutes

What you’ll learn:

How to set up your application for:

  • connecting to Kafka and publishing messages to a topic,
  • getting data from a REST API,
  • providing data through a REST API.

In this tutorial, we will create a simple Java component with the Java Spring Boot scaffolder. We want to expose a single REST endpoint for getting client data. Client data is provided by another REST component, client-data-db, so we need to configure a Spring REST call to it. Every access to client data should be logged to a Kafka topic, so we need a Kafka client configuration as well.


Project source

This example project can be cloned from:


Prerequisites


  • Prepare your local development environment for CodeNOW with Java Spring Boot.
  • Run Apache Kafka locally.
    • You can run Kafka directly or via Docker Compose.
    • The docker-compose configuration file can be downloaded from the link in the section Docker compose and third-party tools of the Java Spring Boot Local Development tutorial.
  • Create a new component


Open your IDE, import the created component, and start coding:

  • Define the message payload. Here is an example of the Client class, a simple POJO with basic client data:

    • generate getters and setters with your IDE


      import java.time.LocalDate;

      public class Client {

          private String username;
          private String firstname;
          private String surname;
          private LocalDate birthdate;

          // getters and setters generated by your IDE are omitted for brevity
      }
  • Next, prepare the configuration for the Kafka logging client:

    • Go to the Kafka administration console (http://localhost:9000 if you are using Kafdrop from our Java Spring Boot Local Development manual) and create a new topic named client-logging.
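    • Alternatively, you can create the topic programmatically. The following is only a minimal sketch using the Kafka AdminClient in a throwaway main class (the class name is illustrative); it assumes the broker listens on localhost:29092, matching the application.yaml used later, and that one partition with replication factor 1 is enough for local development.

      import java.util.Collections;
      import java.util.Properties;

      import org.apache.kafka.clients.admin.AdminClient;
      import org.apache.kafka.clients.admin.AdminClientConfig;
      import org.apache.kafka.clients.admin.NewTopic;

      public class CreateClientLoggingTopic {

          public static void main(String[] args) throws Exception {
              Properties props = new Properties();
              // assumption: the local broker listens on localhost:29092 (see application.yaml below)
              props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:29092");

              try (AdminClient admin = AdminClient.create(props)) {
                  // one partition and replication factor 1 are enough for local development
                  admin.createTopics(Collections.singletonList(new NewTopic("client-logging", 1, (short) 1)))
                       .all().get();
              }
          }
      }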

    • Add the spring-kafka Maven dependency to your pom.xml, for example as shown below.
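    • The exact entry depends on your project setup; with the Spring Boot parent managing dependency versions, the spring-kafka dependency typically looks like this:

      <dependency>
          <groupId>org.springframework.kafka</groupId>
          <artifactId>spring-kafka</artifactId>
      </dependency>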

  • For more details about spring-kafka, see:

  • Now add the configuration for the Kafka template to your application, for example in a dedicated configuration class (the class and package naming is up to you):

    @Value("${}") private String kafkaBrokerUrl;

    public ProducerFactory<String, String> producerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());

    public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBrokerUrl);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    // See for more properties
    return props;

    public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<String, String>(producerFactory());
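    • The KafkaTemplate bean defined here is injected into the controller in the next step; every request for client data will be published as a message to the client-logging topic through this template.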
  • Next, create a new controller and put all the parts together

    • For more details about the spring REST controller, see:


      import org.slf4j.Logger;
      import org.slf4j.LoggerFactory;
      import org.springframework.beans.factory.annotation.Value;
      import org.springframework.kafka.core.KafkaTemplate;
      import org.springframework.web.bind.annotation.GetMapping;
      import org.springframework.web.bind.annotation.PathVariable;
      import org.springframework.web.bind.annotation.RequestMapping;
      import org.springframework.web.bind.annotation.RestController;
      import org.springframework.web.reactive.function.client.WebClient;

      import reactor.core.publisher.Flux;

      // the request mapping paths below are illustrative; adjust them to your component's API
      @RestController
      @RequestMapping("/clients")
      public class ClientDataController {

          private static final Logger LOG = LoggerFactory.getLogger(ClientDataController.class);

          private String clientDataDBURL;
          private String kafkaTopicName;
          private String kafkaTopicKey;
          private KafkaTemplate<String, String> kafkaTemplate;

          // property keys are assumed to match codenow/config/application.yaml below
          public ClientDataController(@Value("${client.data.db}") String clientDataDBURL,
                                      @Value("${kafka.topic.name}") String kafkaTopicName,
                                      KafkaTemplate<String, String> kafkaTemplate,
                                      @Value("${kafka.topic.key}") String kafkaTopicKey) {
              this.clientDataDBURL = clientDataDBURL;
              this.kafkaTopicName = kafkaTopicName;
              this.kafkaTemplate = kafkaTemplate;
              this.kafkaTopicKey = kafkaTopicKey;
          }

          @GetMapping("/{username}")
          public Flux<Client> getClientData(@PathVariable String username) {
              LOG.info("Get data for username: {}", username);
              // log the access to the Kafka topic
              kafkaTemplate.send(kafkaTopicName, kafkaTopicKey, username);

              // fetch the client data from the client-data-db REST component
              Flux<Client> clientFlux = WebClient.create().get()
                      .uri(clientDataDBURL + "/db/clients/" + username)
                      .retrieve()
                      .bodyToFlux(Client.class);

              clientFlux.subscribe(client -> LOG.info("Retrieved client: {}", client));
              return clientFlux;
          }
      }
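    • When the endpoint is called with a username, the controller first publishes the username to the Kafka topic and then streams the client data retrieved from client-data-db back to the caller.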

  • Last but not least, append the configuration for Kafka to codenow/config/application.yaml

    • Note that this configuration depends on your local development setup for Kafka and can differ case by case.

    • Make sure you follow YAML syntax (especially whitespace).

      # property keys are assumed; they must match the @Value placeholders used above
      client:
        data:
          db: http://client-data-db
      kafka:
        url: localhost:29092
        topic:
          name: client-logging
          key: client-data-service
  • Try to build and run the application in your IDE. After startup, you should be able to access your new controller's Swagger UI: http://localhost:8080/swagger/index.html
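    • You can also open the client-logging topic in Kafdrop (http://localhost:9000) and check that a message is published every time you call the endpoint.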


Deploy to CodeNOW

If your code works in local development, you are ready to push your changes to Git and build and deploy the new component version to the CodeNOW environment.

What's next?

See our other developer guide: