
Faust-Docker-Compose


An example showing how to run a faust project as a service with docker-compose, alongside Kafka, Zookeeper and Schema Registry.

Notice that everything runs with docker-compose, including the faust example application. For local development it is preferable to run the Kafka cluster separately from the faust app.

If you want to generate a faust project from scratch, please use the cookiecutter-faust

Read more about Faust here: https://github.com/robinhood/faust

Project

The project skeleton is laid out as a medium/large project according to the faust layout.

The setup.py defines the console entrypoint, which solves the entrypoint problem.
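
A minimal sketch of what that looks like (the package and entrypoint names below are illustrative, not necessarily the ones used in this repo):

```python
# setup.py -- minimal sketch; package/entrypoint names are illustrative
from setuptools import find_packages, setup

setup(
    name="example-faust-app",
    version="0.1.0",
    packages=find_packages(),
    entry_points={
        "console_scripts": [
            # exposes an `example` command that delegates to the faust app,
            # so the worker can be started with `example worker`
            "example = example.app:main",
        ],
    },
)
```

Here example/app.py is assumed to define a main() that calls app.main() on the faust application, which exposes the full faust CLI under the console command.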

Applications

Faust Project Dockerfile

The Dockerfile is based on python:3.7-slim. The most important detail here is that the entrypoint waits for Kafka to be ready and only then executes the script run.sh.
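
In outline, such a Dockerfile could look like the following (a hedged sketch; file names like entrypoint.sh follow the description above, but the details may differ from the repo):

```dockerfile
# Dockerfile -- illustrative sketch of the image described above
FROM python:3.7-slim

WORKDIR /app

COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# entrypoint.sh blocks until Kafka accepts connections,
# then execs the command (run.sh) that starts the faust worker
ENTRYPOINT ["./entrypoint.sh"]
CMD ["./run.sh"]
```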

Docker compose

docker-compose.yaml includes zookeeper, kafka and schema-registry services based on the Confluent Inc. images. For more information you can go to confluentinc and see their docker compose example here
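
A condensed sketch of that service layout (image tags, ports and settings here are illustrative; see the linked Confluent example for a complete file):

```yaml
# docker-compose.yaml -- condensed sketch, not the complete file
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka
    depends_on:
      - zookeeper
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  schema-registry:
    image: confluentinc/cp-schema-registry
    depends_on:
      - kafka
    environment:
      SCHEMA_REGISTRY_HOST_NAME: schema-registry
      SCHEMA_REGISTRY_KAFKASTORE_BOOTSTRAP_SERVERS: PLAINTEXT://kafka:9092
```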

Useful ENVIRONMENT variables that you may change:

| Variable | Description | Example |
| --- | --- | --- |
| WORKER | Entrypoint in setup.py | example |
| WORKER_PORT | Worker port | 6066 |
| KAFKA_BOOSTRAP_SERVER | Kafka servers | kafka://kafka:9092 |
| KAFKA_BOOSTRAP_SERVER_NAME | Kafka server name | kafka |
| KAFKA_BOOSTRAP_SERVER_PORT | Kafka server port | 9092 |
| SCHEMA_REGISTRY_SERVER | Schema Registry server name | schema-registry |
| SCHEMA_REGISTRY_SERVER_PORT | Schema Registry server port | 8081 |
| SCHEMA_REGISTRY_URL | Schema Registry server URL | http://schema-registry:8081 |
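
To see how these variables fit together, a run.sh along the following lines would start the worker (a hedged sketch; the actual script in the repo may differ):

```sh
#!/bin/sh
# run.sh -- illustrative sketch: $WORKER names a console_scripts
# entrypoint from setup.py, so this starts the faust worker
# serving its web endpoint on $WORKER_PORT
exec "$WORKER" worker --web-port="$WORKER_PORT" -l info
```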

Commands

  • Start the application: make run-dev. This command starts both the Page Views and Leader Election applications
  • Stop and remove containers: make clean
  • List topics: make list-topics
  • Send events to the page_view topic/agent: make send-page-view-event payload='{"id": "foo", "user": "bar"}' (see the sketch below)
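
As a rough idea of what a target like send-page-view-event can do under the hood (a hypothetical sketch; the repo's Makefile may be implemented differently), it can pipe the payload into the console producer running inside the kafka container:

```makefile
# Makefile -- hypothetical sketch of the send-page-view-event target;
# the topic name is assumed from the docs above
send-page-view-event:
	docker-compose exec kafka bash -c \
		"echo '$(payload)' | kafka-console-producer --broker-list localhost:9092 --topic page_view"
```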

Avro Schemas, Custom Codecs and Serializers

Because we want to be sure that the messages we encode are valid, we use Avro schemas. Avro is used to define the data schema for a record's value. This schema describes the fields allowed in the value, along with their data types.

For our demonstration, the Users application uses the following schema:

```json
{
    "type": "record",
    "namespace": "com.example",
    "name": "AvroUsers",
    "fields": [
        {"name": "first_name", "type": "string"},
        {"name": "last_name", "type": "string"}
    ]
}
```

In order to use avro schemas with Faust we need to define a custom codec and a custom serializer, and be able to talk to the schema-registry. You can find the custom codec called avro_users registered using the codec registration approach described by faust. The AvroSerializer is in charge of encoding and decoding messages using the schema registry client.
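
In outline, the registration can look like this (a hedged sketch: codecs.register is faust's real API, but the AvroSerializer import path and constructor arguments below are simplified placeholders for the implementation in this repo):

```python
# users/codecs.py -- simplified sketch of registering the avro_users codec
from faust.serializers import codecs

from .serializers import AvroSerializer  # placeholder import path

# the serializer encodes/decodes payloads via the Schema Registry client;
# the constructor arguments shown here are illustrative
avro_users_serializer = AvroSerializer(
    schema_registry_url="http://schema-registry:8081",
    schema_subject="avro_users",
)

# make the codec available to faust models under the name "avro_users"
codecs.register("avro_users", avro_users_serializer)
```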

Now the final step is to integrate the faust model with the AvroSerializer.

```python
# users.models
import faust


class UserModel(faust.Record, serializer='avro_users'):
    first_name: str
    last_name: str
```

Now our application is able to send and receive messages using avro schemas! :-)
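
For example, a topic declared with value_type=UserModel will go through the avro_users codec transparently (a hypothetical usage sketch; the app and topic names are illustrative):

```python
# users/agents.py -- hypothetical usage sketch
import faust

from .models import UserModel

app = faust.App("users-app", broker="kafka://kafka:9092")

# values on this topic are (de)serialized with the avro_users codec,
# because UserModel declares serializer='avro_users'
users_topic = app.topic("users", value_type=UserModel)


@app.agent(users_topic)
async def users(stream):
    async for user in stream:
        print(f"Received user: {user.first_name} {user.last_name}")
```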

Tests

Run the tests with tox. Make sure you have it installed:

```sh
tox
```

Achievements

  • Application examples
  • Integration with Schema Registry
  • Schema Registry Client
  • Custom codecs
  • Custom Serializers
  • Avro Schemas
  • Make Schema Registry Client and Serializers a python package