
Airflow DAG development with tests + CI workflow


This code complements the article How to develop data pipeline in Airflow through TDD (test-driven development). I suggest reading it first to better understand the code and the way I approached setting up the project.

Step-by-step: How to develop a DAG using TDD (English version)
Passo-a-passo: Como desenvolver uma DAG usando TDD (Portuguese version)

The project

Below is a summary of what will be accomplished in this project. We'll simulate the transfer of fake transaction data from an e-commerce store: a simple task transferring data from the oltp-db database to the olap-db database.

[Diagram: data pipeline moving data from oltp-db through Airflow to olap-db]
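To make the idea concrete, here is a minimal sketch of what such a transfer task could look like. The DAG id, task id, connection ids, and table name below are illustrative assumptions, not necessarily the names used in this repository.

```python
# Illustrative sketch only: dag_id, task_id, connection ids and table name
# are assumed values, not necessarily the ones used in this repo.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator
from airflow.providers.postgres.hooks.postgres import PostgresHook


def transfer_products(**_):
    # Read rows from the OLTP (transactional) database...
    oltp = PostgresHook(postgres_conn_id="oltp_db")
    rows = oltp.get_records("SELECT * FROM products")
    # ...and load them into the OLAP (analytical) database.
    olap = PostgresHook(postgres_conn_id="olap_db")
    olap.insert_rows(table="products", rows=rows)


with DAG(
    dag_id="products_sales_pipeline",
    start_date=datetime(2021, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    load_products = PythonOperator(
        task_id="load_products",
        python_callable=transfer_products,
    )
```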

To help with development, we use a local development environment to build the pipeline with tests, plus a Continuous Integration pipeline with GitHub Actions to ensure the tests run on every change.

Containers

  • airflow: container running the local Airflow setup used for development;
  • oltp-db and olap-db: containers that simulate production databases and receive the fake data.

In this tutorial we won't develop the dashboard part, only the pipeline.

Dependencies?

Docker, docker-compose, and make.

How to run?

The command below sets up the environment using docker-compose. Wait a few minutes (240s, yeah, omg, right?) for Airflow to initialize its internal configuration; after that, the command creates the credentials and connections.

make setup
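For reference, creating an Airflow connection programmatically looks roughly like the sketch below. The connection id, host, and credentials are assumptions, and the actual make target may use the Airflow CLI instead.

```python
# Rough sketch of registering a connection in Airflow's metadata database.
# conn_id, host and credentials are assumed values for illustration.
from airflow.models import Connection
from airflow.settings import Session

oltp_conn = Connection(
    conn_id="oltp_db",      # assumed id pointing at the oltp-db container
    conn_type="postgres",
    host="oltp-db",
    schema="oltp",
    login="root",
    password="root",
    port=5432,
)

session = Session()
session.add(oltp_conn)
session.commit()
```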

After running make setup, you can access Airflow at localhost:8080. A test user is created (user: admin / password: admin). At this stage you can develop your DAGs and test them as you modify them. Finally, the command below calls pytest to run the tests.

make testing
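As a rough idea of the style of test pytest runs here, the sketch below loads the DAG and checks that the transfer actually lands rows in the OLAP database. The DAG id, task id, connection id, and table name are again illustrative assumptions.

```python
# Illustrative test sketch; dag_id, task_id, connection id and table name
# are assumed values matching the DAG sketch above.
from airflow.models import DagBag
from airflow.providers.postgres.hooks.postgres import PostgresHook


def test_dags_load_without_import_errors():
    dagbag = DagBag(include_examples=False)
    assert dagbag.import_errors == {}


def test_products_are_transferred_to_olap():
    dag = DagBag(include_examples=False).get_dag("products_sales_pipeline")
    task = dag.get_task("load_products")

    # Execute the task's callable directly against the test databases.
    task.python_callable()

    olap = PostgresHook(postgres_conn_id="olap_db")
    assert olap.get_first("SELECT COUNT(*) FROM products")[0] > 0
```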


Some resources about Airflow testing and DataOps: