
Kafka Connect - Error Handling

Overview

This is an environment for testing Kafka Connect's error handling behaviour. It's vanilla Confluent Platform 5.1 plus:

  • Three test data producers

    • json-corrupt-producer / json-producer - use kafkacat to send corrupt and well-formed JSON respectively to test_topic_json (a representative command is sketched after this list)

    • avro-producer - uses kafka-avro-console-producer to send Avro-serialised records to test_topic_avro

  • jmxtrans / InfluxDB / Grafana - for extracting/storing/visualising JMX metrics from Kafka Connect

  • Elasticsearch for testing Sink behaviour
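
For reference, the JSON producers boil down to piping a payload into kafkacat in produce mode. This is a minimal sketch; the broker address and example payloads are assumptions rather than the repo's exact commands:

echo '{"id": 1, "first_name": "Alice"}' | kafkacat -b kafka:29092 -t test_topic_json -P
echo 'this is not valid json' | kafkacat -b kafka:29092 -t test_topic_json -P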

Launching the environment

docker-compose up -d
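
Once the stack is up, a quick status check (standard Docker Compose, nothing repo-specific) confirms the containers started:

docker-compose ps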

Connector definitions

Each connector's definition is available in the main text of the supporting article.
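
As a hedged illustration of the shape of those definitions (the connector name, connection URL, and dead letter queue topic below are assumptions, not the article's exact values), an Elasticsearch sink using Kafka Connect's error-handling settings can be registered against the Connect REST API like this:

curl -X POST http://localhost:8083/connectors -H "Content-Type: application/json" -d '{
  "name": "sink_elastic_test",
  "config": {
    "connector.class": "io.confluent.connect.elasticsearch.ElasticsearchSinkConnector",
    "topics": "test_topic_json",
    "connection.url": "http://elasticsearch:9200",
    "type.name": "_doc",
    "key.ignore": "true",
    "schema.ignore": "true",
    "errors.tolerance": "all",
    "errors.log.enable": "true",
    "errors.log.include.messages": "true",
    "errors.deadletterqueue.topic.name": "dlq_test_topic_json",
    "errors.deadletterqueue.topic.replication.factor": "1"
  }
}'

errors.tolerance=all keeps the task running past bad records, errors.log.* writes the failures to the Connect worker log, and errors.deadletterqueue.* routes the failed records to a side topic instead of killing the task.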

Generating data

Each producer is a separate Docker container that runs once and exits. To run a producer again:

docker-compose start json-producer
docker-compose start json-corrupt-producer
docker-compose start avro-producer
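
To verify what actually landed on a topic, consume it with kafkacat; the broker address here is an assumption, so adjust it to match the compose file:

kafkacat -b localhost:9092 -t test_topic_json -C -e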

Importing Grafana dashboard

This command should create the InfluxDB data source in Grafana:

curl --user admin:admin -X POST http://localhost:3000/api/datasources -H "Content-Type: application/json" -d '{"orgId":1,"name":"InfluxDB","type":"influxdb","typeLogoUrl":"","access":"proxy","url":"http://influxdb:8086","password":"","user":"","database":"influx","basicAuth":false,"basicAuthUser":"","basicAuthPassword":"","withCredentials":false,"isDefault":true,"jsonData":{"keepCookies":[]},"secureJsonFields":{},"version":2,"readOnly":false}'

Manually import the JSON dashboard definition: config/grafana-dashboard_by_connector.json
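
Alternatively, the dashboard can be pushed through Grafana's HTTP API by wrapping the file in the payload that the /api/dashboards/db endpoint expects. This is a sketch, assuming the file contains a bare dashboard object:

curl --user admin:admin -X POST http://localhost:3000/api/dashboards/db -H "Content-Type: application/json" -d "{\"dashboard\": $(cat config/grafana-dashboard_by_connector.json), \"overwrite\": true}"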