Configuring and connecting Kafka scripts with OpenShift Streams for Apache Kafka

As a developer of applications and services, you can use Kafka scripts to produce and consume messages for Kafka instances in OpenShift Streams for Apache Kafka. This is a useful way to test and debug your Kafka instances. The Kafka scripts are a set of shell scripts that are included with the Apache Kafka distribution.

When you download and extract the Apache Kafka distribution, the bin/ directory of the distribution (or the bin\windows\ directory if you’re using Windows) contains a set of shell scripts that enable you to interact with your Kafka instance. With the scripts, you can produce and consume messages using your Kafka instances. You can also perform various other operations against the Kafka APIs to administer topics, consumer groups, and other resources.
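
For example, after you create the connection configuration described later in this quick start, you can list the consumer groups in your Kafka instance from the bin/ directory, as shown in the following sketch. The <bootstrap_server> placeholder and the app-services.properties file used here are explained in the tasks that follow.

    Example: listing consumer groups with the kafka-consumer-groups script
    $ ./kafka-consumer-groups.sh --list --bootstrap-server <bootstrap_server> --command-config ../config/app-services.properties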

Note
The command examples in this quick start show how to use the Kafka scripts on Linux and macOS. If you’re using Windows, use the Windows versions of the scripts. For example, instead of the <Kafka-distribution-dir>/bin/kafka-console-producer.sh script, use the <Kafka-distribution-dir>\bin\windows\kafka-console-producer.bat script.
Prerequisites
  • You have a running Kafka instance in Streams for Apache Kafka (see Getting started with OpenShift Streams for Apache Kafka).

  • You have a command-line terminal application.

  • JDK 11 or later is installed. (The latest LTS version of OpenJDK is recommended.) You can check your Java version by using the command shown after this list.

  • You’ve downloaded the latest supported binary version of the Apache Kafka distribution. You can check the version of a downloaded distribution by running the following command from the bin/ directory:

    $ ./kafka-console-producer.sh --version
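
You can also confirm the JDK prerequisite by checking your installed Java version:

    Checking the Java version
    $ java -version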

Configuring the Kafka scripts to connect to a Kafka instance

To enable the Kafka scripts to access a Kafka instance, you must configure a connection using the generated credentials for your OpenShift Application Services service account. In this task, you create a configuration file that specifies these credential values.

Prerequisites
  • You have the generated credentials for your service account. To reset the credentials, use the Service Accounts page.

  • You’ve set permissions for your service account to access resources in the Kafka instance. To verify the current permissions, select your Kafka instance in the OpenShift Streams for Apache Kafka web console and click the Access tab. To learn more about setting permissions, see Managing account access in OpenShift Streams for Apache Kafka.

Procedure
  1. In your Kafka distribution, navigate to the config/ directory.

  2. Create a file called app-services.properties.

  3. In the app-services.properties file, set the SASL connection mechanism and the Kafka instance client credentials as shown in the following configuration. Replace the placeholder values with your own credential information. Streams for Apache Kafka supports SASL/OAUTHBEARER, which is the recommended authentication mechanism.

    Setting server and credential values
    sasl.mechanism=OAUTHBEARER
    security.protocol=SASL_SSL
    
    sasl.oauthbearer.token.endpoint.url=https://sso.redhat.com/auth/realms/redhat-external/protocol/openid-connect/token
    
    sasl.jaas.config=org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginModule required \
      scope="openid" \
      clientId="<client-id>" \
      clientSecret="<client-secret>" ;
    
    sasl.login.callback.handler.class=org.apache.kafka.common.security.oauthbearer.secured.OAuthBearerLoginCallbackHandler
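    # NOTE: Depending on your Apache Kafka version, the login callback handler class may
    # instead be org.apache.kafka.common.security.oauthbearer.OAuthBearerLoginCallbackHandler
    # (without the "secured" package). Use the class name that exists in your distribution.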
  4. Save the file. You’ll use this file in the next task to connect to your Kafka instance and produce messages.
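
Optionally, you can confirm that the configuration file works by listing the topics in your Kafka instance. Run the following command from the bin/ directory of your Kafka distribution, replacing <bootstrap_server> with the bootstrap server endpoint for your Kafka instance (you get this endpoint in the next task).

    Verifying the connection configuration (optional)
    $ ./kafka-topics.sh --list --bootstrap-server <bootstrap_server> --command-config ../config/app-services.properties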

Producing messages using Kafka scripts

In this task, you use the kafka-console-producer script to produce messages to a Kafka topic.

Prerequisites
  • You have a running Kafka instance in OpenShift Streams for Apache Kafka (see Getting started with OpenShift Streams for Apache Kafka).

  • You have the bootstrap server endpoint for your Kafka instance. To get the server endpoint, select your Kafka instance in the OpenShift Streams for Apache Kafka web console, select the options icon (three vertical dots), and click Connection.

  • You’ve created the app-services.properties file to store your service account credentials.

Procedure
  1. On the command line, navigate to the bin/ directory of your Kafka distribution.

  2. Use the kafka-topics script to create a Kafka topic, as shown in the following example. The example creates a topic called my-other-topic with the default settings. Replace <bootstrap_server> with the bootstrap server endpoint for your own Kafka instance.

    Note
    For trial Kafka instances, the default replication factor is 1. If you’re using a trial instance, replace --replication-factor 3 with --replication-factor 1 in the following command.
    Using the kafka-topics script to create a Kafka topic
    $ ./kafka-topics.sh --create --topic my-other-topic --partitions 1 --replication-factor 3 --bootstrap-server <bootstrap_server> --command-config ../config/app-services.properties
  3. Start the kafka-console-producer script, as shown in the following example. The example uses the SASL/OAUTHBEARER authentication mechanism with the credentials that you saved in the app-services.properties file. The command prepares a producer to send messages to the my-other-topic topic that you previously created. (A variation of this command that produces keyed messages is shown after this procedure.)

    Starting the kafka-console-producer script
    $ ./kafka-console-producer.sh --topic my-other-topic --bootstrap-server "<bootstrap_server>" --producer.config ../config/app-services.properties
  4. With the kafka-console-producer script running, enter messages that you want to produce to the Kafka topic.

    Example messages to produce to the Kafka topic
    >First message
    >Second message
    >Third message
  5. Keep the producer running so that you can use it again later, when you create a consumer.
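
As a variation on step 3, the kafka-console-producer script can also produce keyed messages. The following sketch uses the script’s parse.key and key.separator properties; while it runs, enter each message as <key>:<value>.

    Starting the kafka-console-producer script with message keys (optional)
    $ ./kafka-console-producer.sh --topic my-other-topic --bootstrap-server "<bootstrap_server>" --producer.config ../config/app-services.properties --property parse.key=true --property key.separator=":"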

Verification
  • Verify that the kafka-console-producer script is still running without any errors in the terminal.
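
  • Optionally, verify that the my-other-topic topic was created with the settings that you specified. The following command prints the topic’s partition count, replication factor, and partition leaders.

    Describing the Kafka topic
    $ ./kafka-topics.sh --describe --topic my-other-topic --bootstrap-server <bootstrap_server> --command-config ../config/app-services.properties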

Consuming messages using Kafka scripts

In this task, you use the kafka-console-consumer script to consume the messages that you previously produced with the kafka-console-producer script.

Prerequisites
  • You used the kafka-console-producer script to produce example messages to a topic.

Procedure
  1. Open a second terminal window or tab, separate from the producer.

  2. On the command line, navigate to the bin/ directory of your Kafka distribution.

  3. Start the kafka-console-consumer script, as shown in the following example. The example uses the SASL/OAUTHBEARER authentication mechanism with the credentials that you saved in the app-services.properties file. The command consumes and displays messages from the my-other-topic topic. (A variation that also prints message keys and timestamps is shown after this procedure.)

    Starting the kafka-console-consumer script
    $ ./kafka-console-consumer.sh --topic my-other-topic --bootstrap-server "<bootstrap_server>" --from-beginning --consumer.config ../config/app-services.properties

    You see output like the following example:

    First message
    Second message
    Third message
  4. If your producer is still running in a separate terminal, continue entering messages in the producer terminal and observe the messages being consumed in the consumer terminal.
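
As a variation on step 3, the kafka-console-consumer script can also print the key and timestamp of each message by using the script’s print.key and print.timestamp properties, as shown in the following sketch.

    Starting the kafka-console-consumer script with keys and timestamps (optional)
    $ ./kafka-console-consumer.sh --topic my-other-topic --bootstrap-server "<bootstrap_server>" --from-beginning --consumer.config ../config/app-services.properties --property print.key=true --property print.timestamp=true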

Note
You can also use the OpenShift Streams for Apache Kafka web console to browse messages in the Kafka topic. For more information, see Browsing messages in the OpenShift Streams for Apache Kafka web console.
Verification
  1. Verify that the kafka-console-consumer script is running without any errors in the terminal.

  2. Verify that the kafka-console-consumer script displays the messages from the my-other-topic example topic.
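
If you start the consumer with a named consumer group (by adding the --group option to the kafka-console-consumer command), you can also inspect that group’s current offsets and lag. In the following sketch, my-group is an example group name, not a value used elsewhere in this quick start.

    Describing a consumer group (optional)
    $ ./kafka-consumer-groups.sh --describe --group my-group --bootstrap-server <bootstrap_server> --command-config ../config/app-services.properties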