Transactional Events (MySQL to Kafka)

This example shows how to use the SQL Subscriber from the SQL Pub/Sub.

Background

When producing domain events, you may stumble upon a dilemma: should you first persist the aggregate to storage and then publish the domain event, or the other way around? Whichever order you choose, one of the operations can fail, leaving you with an inconsistent state.

For a more detailed description, see When an SQL database makes a great Pub/Sub.

Solution

This example presents a solution to this problem: saving domain events in the same database transaction as the aggregate, then publishing them asynchronously.

The SQL subscriber listens for new records in a MySQL table. Each new record results in a new event published on the Kafka topic. The Kafka Publisher is used just as an example; any other publisher can be used instead.

The example uses DefaultMySQLSchema as the schema adapter, but you can define your own table definition and queries. See SQL Pub/Sub documentation for details.

Requirements

To run this example you will need Docker and docker-compose installed. See the installation guide at https://docs.docker.com/compose/install/

Running

docker-compose up

Observe the log output: you will notice new events generated by the example.

In another terminal, run the following command to consume the events produced on the Kafka topic:

docker-compose exec kafka kafka-console-consumer --bootstrap-server kafka:9092 --topic events