
nicolkill/dbb



General description

dbb (the name means nothing, it's just a sticky name) is a basic CRUD API that provides a single datasource driven by schemas. You don't need to build your own API controller by controller: just run a container of this project (or clone and set up this repo), configure your config.json file, and the system is up. If you need more fields or want to change a field's data type, edit your config file, restart, and it's done.

The point of dbb is that, if you don't want to spend much time on a project, you can use it as a prototype/beta backend; if you later see that your backend needs more work, you can spend that time building the microservice or other backend that will contain your custom processes.

The limits of this concept still have to be tested to know how far it can be used in production, but the main idea of dbb is to serve prototypes and small/medium projects.

config.json file example:

{
  "ui": {
    "title": "some system title"
  },
  "schemas": [
    {
      "name": "users",
      "fields": {
        "name": "string",
        "age": "number",
        "male": "boolean",
        "flags": ["string"],
        "sku": "string",
        "product_id": "string"
      },
      "generate": {
        "sku": "$str(4)$-$num(4)$-$sym(4)$-$str_num(4)$-$any(4)$"
      },
      "relations": {
        "product_id:mandatory": "products"
      },
      "hooks": [
        {
          "events": ["index"],
          "url": "",
          "method": "get",
          "headers": {
            "key": "value"
          }
        }
      ]
    },
    {
      "name": "products",
      "fields": {
        "name": "string",
        "expiration": "datetime"
      }
    }
  ]
}

Available data types

  • number
  • boolean
  • string
  • datetime
  • map
  • list
  • uuid
  • autogenerated (string, number, symbol, string and number, any)
  • relation to another record

More will be added in future updates.

Details:

  • Relations must contain the postfix _id
  • To mark a relation as mandatory, add :mandatory at the end of the key
  • Autogenerated values are ignored if you send a value for that field in the params
    • The generator syntax is $func(length)$, where the available functions are str, num, sym, str_num and any (see the example below)
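
For instance, the sku pattern from the config above could produce a value like the following (purely illustrative output):

"sku": "aQkd-4821-%!#&-x7L2-9f@Z"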

Hooks

On every event it's possible to call any URL; the intended use is to keep dbb as the data layer and delegate specific actions to external services.

{
  "schemas": [
    {
      "name": "users",
      "fields": {
        "field": "string",
      },
      "hooks": [
        {
          "events": ["index"],
          "url": "",
          "method": "get",
          "headers": {
            "key": "value"
          }
        }
      ]
    }
  ]
}

Payload

The payload is a JSON with two fields, params and responses:

{
  "params": {},
  "responses": {
    "data": {
      "field": "value"
    },
    "deleted_at": null,
    "id": "68522739-10dd-4f74-ab9c-cd7a31eb59c5",
    "inserted_at": "2024-04-30T07:21:43",
    "reference": null,
    "schema": "users",
    "updated_at": "2024-04-30T07:21:43"
  }
}

Multiple hooks can be added to a single event. The available events are: "index", "show", "create", "update" and "delete".
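
A sketch of a schema fragment with two hooks that both react to the create event (illustrative; the URLs and header values are placeholders):

"hooks": [
  {
    "events": ["create"],
    "url": "https://example.com/on-create",
    "method": "get",
    "headers": {
      "x-api-key": "value"
    }
  },
  {
    "events": ["create", "delete"],
    "url": "https://example.com/audit",
    "method": "get"
  }
]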

API Usage

Like any API, there's a basic usage; the basic routes and operations are:

  • GET /:schema - get the list of records
  • GET /:schema/:id - get an individual record by id
  • POST /:schema - save the record to database, using the body
  • PUT /:schema/:id - updates the record data using the body (overrides the whole data)
  • DELETE /:schema/:id - deletes the record data
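
For example, fetching a single record of the users schema (using the /api/v1 prefix shown in the search examples below) could look like this; the fields mirror the record shown in the hook payload above, and the exact response envelope may differ:

GET /api/v1/users/68522739-10dd-4f74-ab9c-cd7a31eb59c5

{
  "data": {
    "name": "some name",
    "age": 30,
    "male": true
  },
  "id": "68522739-10dd-4f74-ab9c-cd7a31eb59c5",
  "schema": "users",
  "inserted_at": "2024-04-30T07:21:43",
  "updated_at": "2024-04-30T07:21:43",
  "deleted_at": null,
  "reference": null
}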

Search (GET)

It's possible to query using your own data.

The syntax is the following:

GET /api/v1/:schema?q=field:value                  # contains text
GET /api/v1/:schema?q=field:value;field2:value2    # multiple fields
GET /api/v1/:schema?q=field:null                   # is null or not exists
GET /api/v1/:schema?q=field:not_null               # not null or exists

Pagination (GET)

The params to paginate are simple: page and count.

  • page is the number of the page
  • count is the number of elements per page

GET /api/v1/:schema?page=0&count=20

Load relations

To load relations when listing/getting records of your schema, add the relations query param as a comma-separated list. Each entry is the relation field name without the _id postfix (so product_id is loaded as product), and the relation will be loaded.

GET /api/v1/:schema?relations=relation_1,relation_2
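
With the users schema from the config above, which relates product_id to products, that would be:

GET /api/v1/users?relations=product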

Body (POST/PUT)

{
    "data": {                                                // the rule
        "your_field": "value"
    }
}
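
Putting it together, creating a record of the users schema from the config above could look like this (illustrative values; the product_id is a placeholder UUID for the mandatory relation, and sku is omitted because it's autogenerated):

POST /api/v1/users

{
    "data": {
        "name": "Jane Doe",
        "age": 30,
        "male": false,
        "flags": ["beta"],
        "product_id": "3f2c8a14-9d1e-4b6a-8f0d-2a7c5e9b1d43"
    }
}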

Swagger

To open the Swagger UI, just go to the path api_docs/v1 or click the link in the UI, and it's ready to use.
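
In a local dev setup (default port 4000) that would be:

http://localhost:4000/api_docs/v1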

How to run:

Using docker image

Use the public Docker image nicolkill/dbb:latest and add the env vars listed below.

docker-compose.yml example

version: '3.4'
services:
  prod:
    image: nicolkill/dbb:latest
    depends_on:
      - postgres
    ports:
      - 4001:443
    volumes:
      - ./prod_test.json:/app/prod_test.json    # important: mount the config file as a volume
    environment:
      PORT: 443
      ALLOWED_SITES: "*"
      CONFIG_SCHEMA: prod_test.json             # must match with volume
      PHX_SERVER: true
      SECRET_KEY_BASE: example_SECRET_KEY_BASE

      # db config
      POSTGRES_USERNAME: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DATABASE: dbb_test_prod
      POSTGRES_HOSTNAME: postgres

  postgres:
    image: postgres:13.3-alpine
    environment:
      POSTGRES_DB: postgres
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_HOST_AUTH_METHOD: trust
    ports:
      - 5432:5432

IMPORTANT! Your database must already exist on the database server; migrations are run once the server starts.
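
One way to create it, assuming the docker-compose file above is already running (a sketch; use docker-compose instead of docker compose depending on your setup):

docker compose exec postgres psql -U postgres -c "CREATE DATABASE dbb_test_prod;"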

Cloning the repo

Requirements

  • Docker with Compose
  • Make (the make tool)

Steps

  1. Clone the repo
  2. Run make to download deps and build the Docker images (run it again whenever you need to reload deps or download new ones)
  3. Run make up to create the container and serve the app on http://localhost:4000 (see the commands below)
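
In short, assuming the repo lives at github.com/nicolkill/dbb:

git clone https://github.com/nicolkill/dbb.git
cd dbb
make        # download deps and build the docker images
make up     # start the containers on http://localhost:4000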

Configure

The schema config lives in its own file, but the server is configured via env vars, so here's the list.

Mandatory

Env var             Value    Default    Description
CONFIG_SCHEMA*      string              Your definition schema file name
POSTGRES_USERNAME   string   postgres   PostgreSQL username
POSTGRES_PASSWORD   string   postgres   PostgreSQL password
POSTGRES_DATABASE   string   postgres   PostgreSQL database name
POSTGRES_HOSTNAME   string   postgres   PostgreSQL hostname; in a docker-compose config it must be the name of the container

Mandatory on prod

Env var         Value     Default                  Description
ALLOWED_SITES   string    http://localhost:4000    CORS allowed sites
PHX_SERVER      boolean                            Tells Phoenix to run the HTTP server; otherwise the routes are not served
PHX_HOST        string    example.com              Domain name

Optional

Env var           Value     Default   Description
PORT              integer   4000      Port number, just for dev
ALLOWED_API_KEY   string              Static API key to secure the CRUD API calls

In a prod deployment the port is 443 (HTTPS); standard (HTTP) calls are redirected to it.

Links