Java Spring Boot REST Server with Redis and Kafka

🕓 40 minutes

What you’ll learn

How to set up your application for:

  • connecting to Redis,
  • connecting to Kafka and publishing messages to a topic,
  • getting data from a REST API,
  • providing data through a REST API.

In this tutorial, we will create a simple Java component with the Java Spring Boot scaffolder. We want to expose a single REST endpoint for getting user authorization roles. As the roles are stored in the Redis key-value store, we need a client configuration for our component. Any access to information about user roles should be logged to a Kafka topic, so we need a Kafka client configuration too.


Project source

This example project can be cloned from:


  • Prepare your local development environment for CodeNOW with Spring Boot.
  • Run Kafka and Redis locally.
    • You can run them directly or using docker compose.
    • The docker-compose configuration file is linked in the section Docker compose and third-party tools of the Java Spring Boot Local Development tutorial.
  • Create a new component


Open your IDE, import the created component and start coding:

  • Define the message payload. Here is an example of UserAuthorizationResponse, which is a simple POJO with user roles:

    • generate getters and setters with your IDE

      package io.codenow.client.authorization.service.model;

      import java.util.Set;

      public class UserAuthorizationResponse {

          private Set<String> roles;

          // getters and setters...
      }
  • Next prepare the configuration for the Kafka logging client:

    • Go to the Kafka administration console (http://localhost:9000 if using kafdrop from our Local development manual) and create a new topic named client-logging.

    • Add the Maven dependency to your pom.xml.
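The exact coordinates are not shown above; for Spring Boot’s Kafka support this is typically the spring-kafka dependency, whose version is managed by the Spring Boot parent POM:

```xml
<dependency>
    <groupId>org.springframework.kafka</groupId>
    <artifactId>spring-kafka</artifactId>
</dependency>
```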

  • For more details about Spring Boot Kafka, see:

  • Now add the code below to your Application class to create the bean configuration for Kafka:

    // In your Application class; requires imports for ProducerConfig, StringSerializer,
    // KafkaTemplate, RouterFunctions.route, RequestPredicates.GET, etc.

    @Value("${kafka.servers}") // property path is an assumption; it must match your application.yaml
    private String kafkaBrokerUrl;

    @Bean
    public RouterFunction<ServerResponse> routerFunction() {
        // Handler body was not shown in the original; adjust it to serve your Swagger UI
        return route(GET("/swagger/index.html"), req -> ServerResponse.ok().build());
    }

    @Bean
    public WebFilter traceResponseFilter() {
        return new TraceResponseFilter();
    }

    @Bean
    public ProducerFactory<String, String> producerFactory() {
        return new DefaultKafkaProducerFactory<>(producerConfigs());
    }

    @Bean
    public Map<String, Object> producerConfigs() {
        Map<String, Object> props = new HashMap<>();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, kafkaBrokerUrl);
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // See ProducerConfig for more properties
        return props;
    }

    @Bean
    public KafkaTemplate<String, String> kafkaTemplate() {
        return new KafkaTemplate<>(producerFactory());
    }
  • Next prepare the configuration for the Redis client:

    • For more details about Spring Boot Redis, see:

    • Add the Maven dependency to your pom.xml.

    • These dependencies are the starter for using the Redis key-value data store with Spring Data Redis and the Lettuce client.
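The coordinates are not spelled out above; a typical choice (version managed by Spring Boot) is the Data Redis starter, which already pulls in the Lettuce client (lettuce-core) transitively:

```xml
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-redis</artifactId>
</dependency>
```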

  • Create a new configuration class for Redis, RedisConfig:

    import io.lettuce.core.RedisClient;
    import io.lettuce.core.RedisURI;
    import io.lettuce.core.api.StatefulRedisConnection;
    import io.lettuce.core.api.sync.RedisCommands;
    import org.springframework.beans.factory.annotation.Value;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    @Configuration
    public class RedisConfig {

        @Value("${redis.url}") // property path is an assumption; it must match your application.yaml
        private String uri;

        @Bean
        public RedisCommands<String, String> connectionFactory() {
            RedisURI redisURI = RedisURI.create(uri);
            RedisClient redisClient = RedisClient.create(redisURI);
            StatefulRedisConnection<String, String> redisConnection = redisClient.connect();
            return redisConnection.sync();
        }
    }
  • Create a new controller and put all the parts together:

    • For more details about Spring Boot controller, see:

      import io.lettuce.core.api.sync.RedisCommands;
      import io.codenow.client.authorization.service.model.UserAuthorizationResponse;
      import org.slf4j.Logger;
      import org.slf4j.LoggerFactory;
      import org.springframework.beans.factory.annotation.Autowired;
      import org.springframework.beans.factory.annotation.Value;
      import org.springframework.kafka.core.KafkaTemplate;
      import org.springframework.web.bind.annotation.*;
      import java.util.List;
      import java.util.TreeSet;
      @RestController
      public class UserAuthorizationController {

          private static final Logger log = LoggerFactory.getLogger(UserAuthorizationController.class);

          @Autowired
          private RedisCommands<String, String> commands;

          private final String kafkaTopicName;
          private final String kafkaTopicKey;
          private final KafkaTemplate<String, String> kafkaTemplate;

          public UserAuthorizationController(@Value("${kafka.topic.name}") String kafkaTopicName, // property path assumed
                                             KafkaTemplate<String, String> kafkaTemplate,
                                             @Value("${kafka.topic.key}") String kafkaTopicKey) {
              this.kafkaTopicName = kafkaTopicName;
              this.kafkaTopicKey = kafkaTopicKey;
              this.kafkaTemplate = kafkaTemplate;
          }

          @GetMapping("/user/{username}") // route is an assumption; choose your own path
          public UserAuthorizationResponse user(@PathVariable String username) {
              log.info("Get data for username: {}", username);
              kafkaTemplate.send(kafkaTopicName, kafkaTopicKey, username);
              final UserAuthorizationResponse response = new UserAuthorizationResponse();
              List<String> privileges = commands.lrange(username, 0L, 1000L);
              response.setRoles(new TreeSet<>(privileges));
              return response;
          }
      }
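The controller collapses the Redis list of privileges into a sorted, de-duplicated role set via `TreeSet`. A minimal plain-Java sketch of that step (the `RoleMapping` class and `toRoles` helper are hypothetical, introduced only for illustration):

```java
import java.util.List;
import java.util.Set;
import java.util.TreeSet;

public class RoleMapping {

    // Mirrors response.setRoles(new TreeSet<>(privileges)) from the controller:
    // duplicates are dropped and roles come back in natural (alphabetical) order.
    static Set<String> toRoles(List<String> privileges) {
        return new TreeSet<>(privileges);
    }

    public static void main(String[] args) {
        // Redis LRANGE may return duplicates if a role was pushed more than once
        List<String> privileges = List.of("USER", "ADMIN", "USER");
        System.out.println(toRoles(privileges)); // prints [ADMIN, USER]
    }
}
```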
  • Last but not least, append the configuration for Kafka and Redis to config/application.yaml

    • Note that this configuration depends on your local development setup for Kafka and Redis, so the values may differ from those shown here.

    • Make sure you follow yaml syntax (especially whitespaces)

      # Nesting is an assumption; the property paths must match the @Value placeholders in your code.
      redis:
        url: redis://localhost:6379
      kafka:
        servers: localhost:29092
        topic:
          name: client-logging
          key: client-authorization-service
  • Try to build and run the application in your IDE. After startup, you should be able to access your new controller’s Swagger UI at http://localhost:8080/swagger/index.html


Deploy to CodeNOW

If your code works in local development, you are ready to push your changes to Git and then build and deploy your new component version to the CodeNOW environment.

What’s next?

See our other developer tutorials: