Java Micronaut REST Server with Redis and Kafka
🕓 40 minutes
What you’ll learn
How to set up your application for:
- connecting to Redis,
- connecting to Kafka and publishing messages to its topic,
- getting data from a REST API,
- providing data to a REST API.
In this tutorial, we will create a simple Java component with the Java Micronaut scaffolder. We want to expose a single REST endpoint for getting user authorization roles. As the roles are stored in the Redis key-value store, we need a Redis client configuration for our component. Any access to information about user roles should be logged to a Kafka topic, so we need a Kafka client configuration too.
Project source
This example project can be cloned from: http://gitlab.cloud.codenow.com/public-docs/client-authorization-demo/client-authorization-service.git
Prerequisites
- Prepare your local development environment for CodeNOW with Micronaut.
- Follow the tutorial instructions in the Java Micronaut Local Development tutorial.
- Run Kafka and Redis locally.
- You can run Kafka and Redis directly or via Docker Compose.
- Docker Compose configuration files for both Kafka and Redis can be downloaded from the link in the section Docker compose and third-party tools of the Java Micronaut Local Development tutorial.
- Create a new component
- For details see the section Prerequisites of the Java Micronaut Local Development tutorial.
Steps
Open your IDE, import the created component and start coding:
Define the message payload. Here is an example of UserAuthorizationResponse, a simple POJO with user roles (generate the getters and setters with your IDE):
package io.codenow.client.authorization.service.model;

import java.util.Set;

public class UserAuthorizationResponse {
    private Set<String> roles;
}
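For reference, once the accessors are generated, the class should look roughly like this (the controller below calls setRoles, and the getter is used when the response is serialized to JSON):

package io.codenow.client.authorization.service.model;

import java.util.Set;

public class UserAuthorizationResponse {

    private Set<String> roles;

    // Accessors typically generated by the IDE; needed for JSON serialization
    // and used by the controller via setRoles(...)
    public Set<String> getRoles() {
        return roles;
    }

    public void setRoles(Set<String> roles) {
        this.roles = roles;
    }
}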
Next, prepare the configuration for the Kafka logging client:
Go to the Kafka administration console (http://localhost:9000 if you are using Kafdrop from our local development manual) and create a new topic named client-logging.
Add the Maven dependency to your pom.xml:
<dependency>
    <groupId>io.micronaut.kafka</groupId>
    <artifactId>micronaut-kafka</artifactId>
    <version>2.0.0</version>
</dependency>
For more details about micronaut-kafka, see: https://micronaut-projects.github.io/micronaut-kafka/latest/guide/
Now use the code below to create a logging client:
package io.codenow.client.authorization.service.logging;

import io.micronaut.configuration.kafka.annotation.KafkaClient;
import io.micronaut.configuration.kafka.annotation.KafkaKey;
import io.micronaut.configuration.kafka.annotation.Topic;

@KafkaClient
public interface LoggingClient {
    void log(@Topic String topic, @KafkaKey String key, String msg);
}
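If you want to verify locally that the log messages really reach the client-logging topic, you can temporarily add a consumer. This is an optional sketch and not part of the example project; the ClientLoggingListener class name is made up for illustration:

package io.codenow.client.authorization.service.logging;

import io.micronaut.configuration.kafka.annotation.KafkaKey;
import io.micronaut.configuration.kafka.annotation.KafkaListener;
import io.micronaut.configuration.kafka.annotation.OffsetReset;
import io.micronaut.configuration.kafka.annotation.Topic;

/**
 * Optional consumer for local debugging: prints every message
 * published to the client-logging topic.
 */
@KafkaListener(offsetReset = OffsetReset.EARLIEST)
public class ClientLoggingListener {

    @Topic("client-logging")
    public void receive(@KafkaKey String key, String message) {
        System.out.println("Access logged: key=" + key + ", user=" + message);
    }
}

Remove this class (or keep it out of the production build) once you are done debugging.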
Next, prepare the configuration for the Redis client. For more details about Micronaut Redis, see: https://micronaut-projects.github.io/micronaut-redis/snapshot/guide/
Add the Maven dependency to your pom.xml:
<dependency>
    <groupId>io.micronaut.redis</groupId>
    <artifactId>micronaut-redis-lettuce</artifactId>
    <version>2.0.0</version>
</dependency>
Create a new controller and put all the parts together:
For more details about Micronaut controller, see: https://docs.micronaut.io/latest/guide/index.html#httpServer
package io.codenow.client.authorization.service.controller;

import java.util.List;
import java.util.TreeSet;

import javax.inject.Inject;

import io.codenow.client.authorization.service.logging.LoggingClient;
import io.codenow.client.authorization.service.model.UserAuthorizationResponse;
import io.lettuce.core.api.StatefulRedisConnection;
import io.lettuce.core.api.sync.RedisCommands;
import io.micronaut.context.annotation.Value;
import io.micronaut.http.annotation.Consumes;
import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;
import io.micronaut.http.annotation.PathVariable;
import io.micronaut.http.annotation.Produces;
import io.micronaut.validation.Validated;
import io.reactivex.Single;

/**
 * UserAuthorizationController.
 */
@Validated
@Controller("/user")
public class UserAuthorizationController {

    @Inject private StatefulRedisConnection<String, String> connection;
    @Inject private LoggingClient loggingClient;

    @Value("${kafka.topic.name}") private String kafkaTopicName;
    @Value("${kafka.topic.key}") private String kafkaTopicKey;

    @Get("/{username}")
    @Produces
    @Consumes
    public Single<UserAuthorizationResponse> greeting(@PathVariable String username) {
        // Log the access to the Kafka topic before reading the roles
        loggingClient.log(kafkaTopicName, kafkaTopicKey, username);
        final UserAuthorizationResponse response = new UserAuthorizationResponse();
        // Read the roles stored for this user as a Redis list
        RedisCommands<String, String> commands = connection.sync();
        List<String> privileges = commands.lrange(username, 0L, 1000L);
        response.setRoles(new TreeSet<>(privileges));
        return Single.just(response);
    }
}
Last but not least, append the configuration for Kafka and Redis to codenow/config/application.yaml.
Note that this configuration depends on your local development setup for Kafka and Redis and can differ case by case. The redis.uri and kafka.bootstrap.servers properties are consumed by the Micronaut Redis and Kafka integrations, while kafka.topic.name and kafka.topic.key are the custom properties injected into the controller via @Value.
Make sure you follow YAML syntax (especially the whitespace):
redis:
  uri: redis://localhost:6379
kafka:
  bootstrap:
    servers: localhost:29092
  topic:
    name: client-logging
    key: client-authorization-service
Do not forget to change the swagger.yaml. Check it in the example project: src/main/resources/META-INF/swagger/swagger.yaml
Try to build and run the application in your IDE. After startup, you should be able to access your new controller’s Swagger UI: http://localhost:8080/swagger/index.html
- For the correct setup, check the README.md in the project root or see the Java Micronaut Local Development tutorial, section Prepare local development IDE.
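The endpoint only returns roles that already exist in Redis; the controller reads them with lrange, so the roles of a user are stored as a Redis list keyed by the username. Here is a minimal sketch for seeding a test user locally, assuming Redis runs on redis://localhost:6379 (the SeedTestData class name, the test.user key and the role values are purely illustrative and not part of the example project):

package io.codenow.client.authorization.service;

import io.lettuce.core.RedisClient;
import io.lettuce.core.api.StatefulRedisConnection;
import io.lettuce.core.api.sync.RedisCommands;

/**
 * Local-testing helper only: stores a few roles for a test user
 * as a Redis list, matching the lrange read in the controller.
 */
public class SeedTestData {

    public static void main(String[] args) {
        RedisClient client = RedisClient.create("redis://localhost:6379");
        try (StatefulRedisConnection<String, String> connection = client.connect()) {
            RedisCommands<String, String> commands = connection.sync();
            commands.del("test.user");
            commands.rpush("test.user", "ROLE_USER", "ROLE_ADMIN");
        } finally {
            client.shutdown();
        }
    }
}

With the data in place, calling GET http://localhost:8080/user/test.user should return the seeded roles and publish a log message to the client-logging topic.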
Deploy to CodeNOW
If your code works in local development, you are ready to push your changes to Git and try to build and deploy your new component version to the CodeNOW environment.
- Check Get New Redis and Get New Apache Kafka for setup in the CodeNOW environment.
- Make sure to change the application.yaml properties from the local to the production setup.
- For more information about application deployment see Application Deployment and Deployment Monitoring tutorials.
What's next?
See our other developer tutorials: