Java Spring Boot - REST API and Kafka Message Publishing
🕓 40 minutes
What you’ll learn:
How to set up your application for:
- connecting to Kafka and publishing messages to a Kafka topic,
- consuming data from a REST API,
- exposing data through a REST API.
In this tutorial, we will create a simple Java component with the Java Spring Boot scaffolder. We want to expose a single REST endpoint for getting client data. Client data is provided by another REST component, client-data-db, so we need to configure a Spring REST call to it. Any access to client data should be logged to a Kafka topic, so we need a Kafka client configuration as well.

Project source
This example project can be cloned from: http://gitlab.cloud.codenow.com/public-docs/client-authorization-demo/client-data-service.git
Prerequisites
- Prepare your local development environment for CodeNOW with Java Spring Boot.
  - Follow the tutorial instructions in Java Spring Boot Local Development.
- Run Apache Kafka locally.
  - You can run Kafka directly or via Docker Compose.
  - The docker-compose configuration file can be downloaded from the link in the section Docker compose and third-party tools of the Java Spring Boot Local Development tutorial.
- Create a new component.
  - For details, see the section Prerequisites of the Java Spring Boot Local Development tutorial.
Steps
Open your IDE, import the created component and start coding:
Define the message payload. Here is an example of the Client class, which is a simple POJO with basic client data (generate getters and setters with your IDE):
package io.codenow.client.data.service.model;

import java.time.LocalDate;

public class Client {

    private String username;
    private String firstname;
    private String surname;
    private LocalDate birthdate;

    // Getters and setters omitted for brevity; generate them with your IDE.
}
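Since the controller created later in this tutorial logs each fetched client, you may also want to generate a toString() override. A minimal sketch, assuming the four fields above:
    // Optional: without a toString() override, the controller's log statement
    // prints the default Object.toString() output instead of the field values.
    @Override
    public String toString() {
        return String.format("Client{username='%s', firstname='%s', surname='%s', birthdate=%s}",
                username, firstname, surname, birthdate);
    }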
Next, prepare the configuration for the Kafka logging client:
Go to the Kafka administration console (http://localhost:9000 if you are using Kafdrop from our Java Spring Boot Local Development manual) and create a new topic named client-logging. Alternatively, the topic can be declared in code; see the sketch after the Maven dependency below.
Add the Maven dependency to your pom.xml:
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
For more details about spring-kafka, see: https://spring.io/projects/spring-kafka
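If you prefer creating the topic in code rather than manually in Kafdrop, spring-kafka can create it at application startup. This is a minimal sketch to add to Application.java, assuming the kafka.broker.url property introduced below; the bean names are illustrative, and the partition and replica counts are assumptions suitable for a single-broker local setup:
import java.util.Map;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.kafka.config.TopicBuilder;
import org.springframework.kafka.core.KafkaAdmin;

// KafkaAdmin connects to the broker and creates all NewTopic beans at startup.
@Bean
public KafkaAdmin kafkaAdmin() {
    return new KafkaAdmin(Map.<String, Object>of(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBrokerUrl));
}

// Declares the client-logging topic used by this tutorial.
@Bean
public NewTopic clientLoggingTopic() {
    return TopicBuilder.name("client-logging").partitions(1).replicas(1).build();
}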
Now add the configuration for the Kafka template to your Application.java (package io.codenow.client.data.service):
// Add these imports to Application.java:
import java.util.HashMap;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@Value("${kafka.broker.url}")
private String kafkaBrokerUrl;
@Bean
public ProducerFactory<String, String> producerFactory() {
    return new DefaultKafkaProducerFactory<>(producerConfigs());
}

@Bean
public Map<String, Object> producerConfigs() {
    Map<String, Object> props = new HashMap<>();
    props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBrokerUrl);
    props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
    // See https://kafka.apache.org/documentation/#producerconfigs for more properties
    return props;
}

@Bean
public KafkaTemplate<String, String> kafkaTemplate() {
    return new KafkaTemplate<>(producerFactory());
}
Next, create a new controller and put all the parts together.
For more details about the spring REST controller, see: https://spring.io/guides/gs/rest-service/
package io.codenow.client.data.service.controller;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.springframework.web.reactive.function.client.WebClient;

import io.codenow.client.data.service.model.Client;
import reactor.core.publisher.Flux;

@RestController
@RequestMapping("/data")
public class ClientDataController {

    private static final Logger LOG = LoggerFactory.getLogger(ClientDataController.class);

    private final String clientDataDBURL;
    private final String kafkaTopicName;
    private final String kafkaTopicKey;
    private final KafkaTemplate<String, String> kafkaTemplate;

    public ClientDataController(@Value("${endpoint.client.data.db}") String clientDataDBURL,
            @Value("${kafka.topic.name}") String kafkaTopicName,
            KafkaTemplate<String, String> kafkaTemplate,
            @Value("${kafka.topic.key}") String kafkaTopicKey) {
        this.clientDataDBURL = clientDataDBURL;
        this.kafkaTopicName = kafkaTopicName;
        this.kafkaTemplate = kafkaTemplate;
        this.kafkaTopicKey = kafkaTopicKey;
    }

    // Handler methods should be public; private mapped methods can break once proxies are involved.
    @GetMapping("/{username}")
    public Flux<Client> getClientData(@PathVariable String username) {
        LOG.info("Get data for username: {}", username);
        // Log the access to the Kafka topic before fetching the data.
        kafkaTemplate.send(kafkaTopicName, kafkaTopicKey, username);
        // Fetch the client data from client-data-db. Logging is done with doOnNext
        // instead of a separate subscribe() call, which would trigger the HTTP
        // request a second time.
        return WebClient.create().get().uri(clientDataDBURL + "/db/clients/" + username).retrieve()
                .bodyToFlux(Client.class)
                .doOnNext(client -> LOG.info(client.toString()));
    }
}
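One caveat: kafkaTemplate.send() is asynchronous, so a failure to publish the audit message would currently go unnoticed. If you want to at least log such failures, here is a hedged sketch assuming spring-kafka 3.x, where send() returns a CompletableFuture (2.x returns a ListenableFuture with a similar callback API):
        // Log publish failures without blocking the request.
        kafkaTemplate.send(kafkaTopicName, kafkaTopicKey, username)
                .whenComplete((result, ex) -> {
                    if (ex != null) {
                        LOG.warn("Failed to log access for username {} to Kafka", username, ex);
                    }
                });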
Last but not least, append the configuration for Kafka to codenow/config/application.yaml.
Note that this configuration depends on your local development setup for Kafka and can differ case by case. Make sure you follow YAML syntax (especially the whitespace indentation):
endpoint:
  client:
    data:
      db: http://client-data-db
kafka:
  broker:
    url: localhost:29092
  topic:
    name: client-logging
    key: client-data-service
Try to build and run the application in your IDE. After startup, you should be able to access the Swagger UI of your new controller: http://localhost:8080/swagger/index.html
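To verify that the audit messages are being published, check the client-logging topic in Kafdrop, or consume it directly. Below is a minimal standalone sketch, assuming the localhost:29092 broker and client-logging topic from the configuration above; the class name and consumer group id are illustrative. Run it after calling the REST endpoint a few times:
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class ClientLoggingTopicCheck {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:29092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "client-logging-check");
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);

        // Read everything published to the audit topic so far and print it.
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("client-logging"));
            ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(5));
            records.forEach(r -> System.out.printf("key=%s value=%s%n", r.key(), r.value()));
        }
    }
}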

Deploy to CodeNOW
If your code works in local development, you are ready to push your changes to Git and try to build and deploy the new component version to the CodeNOW environment.
- Check Get New Apache Kafka for setup in the CodeNOW environment.
- Make sure to change the application.yaml properties from the local to the production setup.
- For more information about application deployment, see the Application Deployment and Deployment Monitoring tutorials.
What’s next?
See our other developer guide: