Using Quarkus applications with Kafka instances in OpenShift Streams for Apache Kafka

As a developer of applications and services, you can connect Quarkus applications to Kafka instances in OpenShift Streams for Apache Kafka. Quarkus is a Kubernetes-native Java framework made for Java virtual machines (JVMs) and native compilation, and optimized for serverless, cloud, and Kubernetes environments. Quarkus is designed to work with popular Java standards, frameworks, and libraries such as Eclipse MicroProfile and Spring, as well as Apache Kafka, RESTEasy (JAX-RS), Hibernate ORM (JPA), Infinispan, Camel, and many more.

In this quick start, you use the Streams for Apache Kafka web console to collect connection information for a Kafka instance. Then you manually configure a connection from an example Quarkus application to the Kafka instance and start producing and consuming messages.

Note
When you’ve completed this quick start and understand the required connection configuration for a Kafka instance, you can use the OpenShift Application Services command-line interface (CLI) to generate this type of configuration in a more automated way. To learn more, see Connecting client applications to OpenShift Application Services using the rhoas CLI.
Prerequisites

Importing the Quarkus sample code

For this quick start, you use the Quarkus sample code from the OpenShift Streams for Apache Kafka Guides and Samples repository in GitHub.

Procedure
  1. On the command line, clone the Streams for Apache Kafka Guides and Samples repository from GitHub.

    git clone https://github.com/redhat-developer/app-services-guides app-services-guides
  2. In your IDE, open the code-examples/quarkus-kafka-quickstart directory from the repository that you cloned.

Configuring the Quarkus example application to connect to a Kafka instance

To enable your Quarkus application to access a Kafka instance, configure the connection using the bootstrap server endpoint, the generated credentials for your OpenShift Streams for Apache Kafka service account, and the SASL/OAUTHBEARER token endpoint for the Kafka instance. For Quarkus, you can configure connection information by using the application.properties configuration file. The example in this task sets environment variables and then references them in the application.properties file.

Quarkus applications use MicroProfile Reactive Messaging to produce messages to and consume messages from your Kafka instances in Streams for Apache Kafka. For more information about Quarkus configuration options for Kafka and Reactive Messaging, see Using Apache Kafka with Reactive Messaging in the Quarkus documentation.

Prerequisites
  • You have the bootstrap server endpoint and the SASL/OAUTHBEARER token endpoint for your Kafka instance. To get this information, find your Kafka instance in the Streams for Apache Kafka web console, click the options icon (three vertical dots), and click Connection. Copy the Bootstrap server and Token endpoint URL values.

  • You have the generated credentials for your service account. To reset the credentials, use the Service Accounts page in the Application Services section of the Red Hat Hybrid Cloud Console. Copy the Client ID and Client secret values.

  • You’ve set the permissions for your service account to access the Kafka instance resources. To verify the current permissions, click your Kafka instance in the Streams for Apache Kafka web console and use the Access page to find your service account permission settings.

Procedure
  1. On the command line, set the Kafka instance bootstrap server and client credentials as environment variables to be used by Quarkus or other applications.

    Replace the values in angle brackets (< >) with your own server and credential information, as follows:

    • The <bootstrap_server> value is the Bootstrap server endpoint for your Kafka instance.

    • The <oauth_token_endpoint_url> value is the SASL/OAUTHBEARER Token endpoint URL for your Kafka instance.

    • The <client_id> and <client_secret> values are the generated credentials for your service account.

      Setting environment variables for server and credentials
      $ export KAFKA_HOST=<bootstrap_server>
      $ export RHOAS_SERVICE_ACCOUNT_CLIENT_ID=<client_id>
      $ export RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET=<client_secret>
      $ export RHOAS_SERVICE_ACCOUNT_OAUTH_TOKEN_URL=<oauth_token_endpoint_url>
  2. In the Quarkus example application, review the src/main/resources/application.properties file to understand how the environment variables you set in the previous step are used in your application. This example uses the dev configuration profile in the application.properties file.
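
For reference, the following sketch shows what a dev-profile configuration for a SASL/OAUTHBEARER connection typically looks like. The values come from the environment variables that you exported in step 1. The exact property and channel names in the sample application might differ, so treat this as an illustration of how the environment variables are referenced and use the file in the repository as the source of truth.

  Example dev-profile entries in application.properties (illustrative)
  # Kafka broker connection for the dev profile; values are resolved from the
  # environment variables exported in the previous step.
  %dev.kafka.bootstrap.servers=${KAFKA_HOST}
  %dev.kafka.security.protocol=SASL_SSL
  %dev.kafka.sasl.mechanism=OAUTHBEARER
  %dev.kafka.sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required oauth.client.id="${RHOAS_SERVICE_ACCOUNT_CLIENT_ID}" oauth.client.secret="${RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET}" oauth.token.endpoint.uri="${RHOAS_SERVICE_ACCOUNT_OAUTH_TOKEN_URL}";
  # Strimzi OAuth callback handler that exchanges the client credentials for access tokens.
  %dev.kafka.sasl.login.callback.handler.class=io.strimzi.kafka.oauth.client.JaasClientOauthLoginCallbackHandler
  # Reactive Messaging channel-to-topic mapping (channel names here are illustrative).
  mp.messaging.outgoing.generated-price.connector=smallrye-kafka
  mp.messaging.outgoing.generated-price.topic=prices
  mp.messaging.incoming.prices.connector=smallrye-kafka
  mp.messaging.incoming.prices.topic=prices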

Creating a Kafka topic in Streams for Apache Kafka

The Quarkus application in this quick start uses a Kafka topic called prices to produce and consume messages. In this task, you create the prices topic in your Kafka instance.

Prerequisites
  • You have a running Kafka instance in OpenShift Streams for Apache Kafka.

Procedure
  1. In the Streams for Apache Kafka web console, click Kafka Instances and then click the name of the Kafka instance that you want to add a topic to.

  2. Click the Topics tab.

  3. Click Create topic and specify the following topic properties:

    1. Topic name: For this quick start, enter prices as the topic name. Click Next.

    2. Partitions: Set the number of partitions for the topic. For this quick start, set the value to 1. Click Next.

    3. Message retention: Set the message retention time and size. For this quick start, set the retention time to A week and the retention size to Unlimited. Click Next.

    4. Replicas: For this release of Streams for Apache Kafka, the replica values are preconfigured. The number of partition replicas for the topic is set to 3 and the minimum number of follower replicas that must be in sync with a partition leader is set to 2. For a trial Kafka instance, the number of replicas and the minimum in-sync replica factor are both set to 1. Click Finish.

After you complete the setup, the new topic appears on the Topics page. You can now run the Quarkus application to start producing and consuming messages to and from this topic.

Verification
  • Verify that the prices topic is listed on the Topics page.
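
The web console is how this quick start creates the topic. If you prefer to script this step instead, the following is a minimal, hypothetical sketch (not part of the sample application) that creates the prices topic with the Kafka AdminClient. It assumes that the environment variables from the previous task are exported and that the kafka-clients and Strimzi kafka-oauth-client libraries are on the classpath.

  Creating the prices topic programmatically (hypothetical sketch)
  import java.util.List;
  import java.util.Properties;

  import org.apache.kafka.clients.CommonClientConfigs;
  import org.apache.kafka.clients.admin.AdminClient;
  import org.apache.kafka.clients.admin.NewTopic;
  import org.apache.kafka.common.config.SaslConfigs;

  public class CreatePricesTopic {
      public static void main(String[] args) throws Exception {
          // Reuse the same SASL/OAUTHBEARER settings as the Quarkus configuration.
          Properties props = new Properties();
          props.put(CommonClientConfigs.BOOTSTRAP_SERVERS_CONFIG, System.getenv("KAFKA_HOST"));
          props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
          props.put(SaslConfigs.SASL_MECHANISM, "OAUTHBEARER");
          props.put(SaslConfigs.SASL_JAAS_CONFIG,
                  "org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required"
                          + " oauth.client.id=\"" + System.getenv("RHOAS_SERVICE_ACCOUNT_CLIENT_ID") + "\""
                          + " oauth.client.secret=\"" + System.getenv("RHOAS_SERVICE_ACCOUNT_CLIENT_SECRET") + "\""
                          + " oauth.token.endpoint.uri=\"" + System.getenv("RHOAS_SERVICE_ACCOUNT_OAUTH_TOKEN_URL") + "\";");
          props.put(SaslConfigs.SASL_LOGIN_CALLBACK_HANDLER_CLASS,
                  "io.strimzi.kafka.oauth.client.JaasClientOauthLoginCallbackHandler");

          try (AdminClient admin = AdminClient.create(props)) {
              // One partition; replication factor 3 (use 1 for a trial Kafka instance).
              admin.createTopics(List.of(new NewTopic("prices", 1, (short) 3))).all().get();
          }
      }
  }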

Running the Quarkus example application

After you configure your Quarkus application to connect to a Kafka instance and you create the Kafka topic, you can run the Quarkus application to start producing and consuming messages to and from the topic.

The Quarkus example application in this quick start has the following application-scoped Java classes:

  • A class that generates a random number between 0 and 100 and produces it to a Kafka topic.

  • Another class that consumes the number from the Kafka topic.

  • A final class that exposes the numbers over a REST endpoint using Server-Sent Events (SSE), which the example web page displays, as sketched below.
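
The class, channel, and endpoint names below are illustrative rather than the sample's actual names, and the jakarta.* imports assume a recent Quarkus release (older releases use javax.*). Under those assumptions, a condensed sketch of this producer, consumer, and Server-Sent Events pattern with MicroProfile Reactive Messaging looks roughly like this:

  Sketch of the three application-scoped classes (names are illustrative)
  // PriceGenerator.java -- produces a random number to the Kafka topic every few seconds.
  import java.time.Duration;
  import java.util.Random;
  import jakarta.enterprise.context.ApplicationScoped;
  import org.eclipse.microprofile.reactive.messaging.Outgoing;
  import io.smallrye.mutiny.Multi;

  @ApplicationScoped
  public class PriceGenerator {
      private final Random random = new Random();

      @Outgoing("generated-price")   // channel mapped to the "prices" topic in application.properties
      public Multi<Integer> generate() {
          return Multi.createFrom().ticks().every(Duration.ofSeconds(5))
                  .onOverflow().drop()
                  .map(tick -> random.nextInt(100));
      }
  }

  // PriceConsumer.java -- consumes the numbers from the Kafka topic and forwards them in memory.
  import jakarta.enterprise.context.ApplicationScoped;
  import org.eclipse.microprofile.reactive.messaging.Incoming;
  import org.eclipse.microprofile.reactive.messaging.Outgoing;
  import io.smallrye.reactive.messaging.annotations.Broadcast;

  @ApplicationScoped
  public class PriceConsumer {
      @Incoming("prices")            // channel mapped to the "prices" topic
      @Outgoing("price-stream")      // in-memory channel read by the REST resource
      @Broadcast
      public double process(int price) {
          return price;
      }
  }

  // PriceResource.java -- exposes the consumed values as Server-Sent Events for the web page.
  import jakarta.inject.Inject;
  import jakarta.ws.rs.GET;
  import jakarta.ws.rs.Path;
  import jakarta.ws.rs.Produces;
  import jakarta.ws.rs.core.MediaType;
  import org.eclipse.microprofile.reactive.messaging.Channel;
  import io.smallrye.mutiny.Multi;

  @Path("/prices")
  public class PriceResource {
      @Inject
      @Channel("price-stream")
      Multi<Double> prices;

      @GET
      @Path("/stream")
      @Produces(MediaType.SERVER_SENT_EVENTS)
      public Multi<Double> stream() {
          return prices;
      }
  }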

Prerequisites
  • You’ve configured the Quarkus example application to connect to the Kafka instance.

  • You’ve created the prices topic.

Procedure
  1. On the command line, navigate to the code-examples/quarkus-kafka-quickstart directory that you imported and run the Quarkus example application in developer mode.

    Running the Quarkus example application
    $ cd app-services-guides/code-examples/quarkus-kafka-quickstart
    $ ./mvnw quarkus:dev
  2. When the application is running, perform the following actions:

    1. In a web browser, go to http://localhost:8080/prices.html.

    2. Verify that the Last price value is updated.

      Note
      You can also use the OpenShift Streams for Apache Kafka web console to browse messages in the Kafka topic. For more information, see Browsing messages in the OpenShift Streams for Apache Kafka web console.

      If the Quarkus application fails to run, review the error log in the terminal and address any problems. Also review the steps in this quick start to ensure that the Quarkus application and Kafka topic are configured correctly.