Your resources and their corresponding endpoints will be defined as part of this microservice, but requests will be handled by the apigateway
microservice. To accomplish that, the following steps are needed:
- Make sure the container is mounting your resource file at the `APIGATEWAY_RESOURCES_DIR/<your-resource>.py` location (i.e. `${APIGATEWAY_RESOURCES_DIR}/generic.py`).
- Include the configuration for your endpoints/resources in the configuration file, according to the example below:
```yaml
apis:
  your_api:
    prefix: /your/api/prefix
    generic: # This should be the name of your module, available in the `app.resources` package
      generic_models_list:
        resource: GenericModelsList
        urls: /generics
      generic_models_details:
        resource: GenericModelsDetails
        urls: /generics/<string:uuids>
```
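To see how this configuration composes into routes, here is a minimal sketch of how a gateway could turn the `apis` section above into a mapping from full URL rules to resource classes. The `build_routes` function and the dict literal are illustrative only; the actual loading code in the apigateway microservice may differ.

```python
# Illustrative config, mirroring the YAML example above as a Python dict.
config = {
    "apis": {
        "your_api": {
            "prefix": "/your/api/prefix",
            "generic": {  # module name in the `app.resources` package
                "generic_models_list": {
                    "resource": "GenericModelsList",
                    "urls": "/generics",
                },
                "generic_models_details": {
                    "resource": "GenericModelsDetails",
                    "urls": "/generics/<string:uuids>",
                },
            },
        }
    }
}

def build_routes(config):
    """Combine each API prefix with its endpoint URLs into full routes."""
    routes = {}
    for api in config["apis"].values():
        prefix = api["prefix"]
        for module_name, endpoints in api.items():
            if module_name == "prefix":  # every other key is a resource module
                continue
            for endpoint in endpoints.values():
                routes[prefix + endpoint["urls"]] = (module_name, endpoint["resource"])
    return routes

routes = build_routes(config)
# routes["/your/api/prefix/generics"] -> ("generic", "GenericModelsList")
```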
- Define the names of the Celery queues where your workers will be listening to dispatch task requests (this is usually an `ENV` variable or a field in your config file).
- Finally, register your Celery tasks in `APIGATEWAY_RESOURCES_DIR/tasks.py`. Note that your tasks can be named anything, as long as a Celery worker can match the `name` property with a real Celery task. Refer to the Celery docs for more detailed information.
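The name-matching contract above can be sketched with plain Python, without Celery itself: the gateway only sends a task *name*, and a worker can execute it only if it has registered a task under that exact name. In the real service this is Celery's `@app.task(name=...)` plus sending tasks by name; the registry and function names below are hypothetical.

```python
# Stdlib sketch of name-based task registration and dispatch.
# In Celery, @app.task(name="...") plays the role of this decorator.
TASK_REGISTRY = {}

def task(name):
    """Register a function under an explicit task name."""
    def decorator(func):
        TASK_REGISTRY[name] = func
        return func
    return decorator

@task(name="generic.create_model")  # hypothetical task name
def create_model(payload):
    return {"created": payload}

def dispatch(name, payload):
    """A worker can only run the task if the name matches a registered one."""
    if name not in TASK_REGISTRY:
        raise KeyError(f"no worker task registered under {name!r}")
    return TASK_REGISTRY[name](payload)
```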
Migration scripts allow you to:
- Create the database (if it doesn't exist yet).
- Create the tables needed for your model.
- Update any tables that need to change after the model changes.

Note that migrations are not tracked by this repository; you might want to change that behavior.
- To initialize the migrations, create the database, etc., run the commands below:

```shell
docker-compose run generic bash -c "cd /generic/src/database && python manage.py db init"
docker-compose run generic bash -c "cd /generic/src/database && python manage.py db migrate"
```
- If you are using `sqlalchemy-utils` as part of your model, you need to import the library in the migration version created by the `migrate` command: `import sqlalchemy_utils`.
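For reference, the top of the generated version file (under the migrations directory) would then look roughly like this. This is a sketch: the revision identifiers, table name, and column type are placeholders, and only the `import sqlalchemy_utils` line is the required manual edit.

```python
"""empty message

Revision ID: <generated>
"""
from alembic import op
import sqlalchemy as sa
import sqlalchemy_utils  # added manually after running `db migrate`


def upgrade():
    op.create_table(
        "generic_models",  # placeholder table name
        sa.Column("uuid", sqlalchemy_utils.types.uuid.UUIDType(), nullable=False),
        # ...
    )
```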
- Apply the migration by running:

```shell
docker-compose run generic bash -c "cd /generic/src/database && python manage.py db upgrade"
```