How to reproduce this project.


## Git repo
-----------------------------------------------

```sh
# clone the repo
git clone git@github.com:kaboomshebang/kbsb-demodash.git
```

## Prerequisites
-----------------------------------------------

- Docker
- Python3 (3.9.16)
- `pip`
- Node (18.15.0)
- `pnpm`
- AWS CLI v2
- Rclone (optional)

```sh
# install the prerequisites with the Nix package manager
nix-env -iA \
nixpkgs.python39 \
nixpkgs.python39Packages.pip \
nixpkgs.nodejs-18_x \
nixpkgs.nodePackages_latest.pnpm \
nixpkgs.awscli2 \
nixpkgs.rclone
```

> Optional: use the Nix package manager with `direnv` for an automated dev environment.

## Project settings
-----------------------------------------------

- Airtable
  - create a new base
  - create a `todos` table with the following columns
    - `id` (Autonumber)
    - `description` (Long text)
    - `label` (Single select)
    - `done` (Checkbox)
  - create a [personal access token](https://airtable.com/create/tokens)
    - scope: `data.records:read`, `data.records:write`
    - access: only the current base
  - create a `.env` file in `lambdas/todos/`
    - refer to the `.env.example` for more info (see the sketch after this list)
  - create a shared view link and copy the URL
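
A minimal sketch of `lambdas/todos/.env`; the variable names are illustrative, treat `.env.example` as authoritative:

```sh
# lambdas/todos/.env — names are illustrative; use the ones from .env.example
AIRTABLE_API_KEY=your_personal_access_token
AIRTABLE_BASE_ID=your_base_id
AIRTABLE_TABLE_NAME=todos
```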

> Optional: create a new AWS account with `AWS Organizations`
- AWS
  - login to the AWS console
  - create an IAM user for deployment and Github CI/CD
    - for example: `projectname-deploy`
  - attach the following policies:
    - Lambda: `AWSLambdaBasicExecutionRole`
    - Lambda: `AWSXRayDaemonWriteAccess`
    - ECR: `AmazonElasticContainerRegistryPublicPowerUser`
  - create an access key for the new user (see the CLI sketch after this list)
  - store the key in `lambdas/todos/.aws/.env` and/or Github Actions
    - refer to the `.env.template` for more info
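
The access key can also be created from the terminal; a minimal sketch, assuming the example user name above:

```sh
# create an access key for the deploy user; the user name is illustrative
aws iam create-access-key --user-name projectname-deploy
# copy AccessKeyId and SecretAccessKey into lambdas/todos/.aws/.env
```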

- frontend `.env.development`
  - set the `VITE_ENDPOINT`
    - for example: `http://localhost:8000/new_todo`
  - set the `VITE_AIRTABLE_BASE`
    - copy-paste the Airtable shared view URL (see the sketch after this list)
- frontend `.env.production`
  - set the production environment later in this guide
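
A minimal sketch of `.env.development`, with a placeholder shared view URL:

```sh
# .env.development — endpoint matches the local dev server; the URL is a placeholder
VITE_ENDPOINT=http://localhost:8000/new_todo
VITE_AIRTABLE_BASE=https://airtable.com/shrXXXXXXXXXXXXX
```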

- CORS domain
  - if you want to deploy to production later:
    - set `CORS_DOMAIN` in `lambdas/todos/.env`

- Pytest config
  - set `url_local` in `lambdas/todos/.env.make.pytest` (see the sketch after this list)
    - for example: `http://localhost:8000`
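
A minimal sketch of `lambdas/todos/.env.make.pytest`, assuming the key is spelled exactly as above:

```sh
# lambdas/todos/.env.make.pytest — points pytest at the local dev server
url_local=http://localhost:8000
```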

> The project is now ready for local testing without Docker

## Local development
-----------------------------------------------

### Backend

Start with the backend.

```sh
# create Python environment
cd lambdas/todos
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

# run the development server
make api
# test the endpoint
make pytest-local
# verify in Airtable
```

### Frontend

```sh
# install node_modules and run the development server
make dev
# open the Vite development url
```

> Verify the endpoint with the todo widget.

## Deploy to production
-----------------------------------------------

> Make sure Docker is installed.

### Create a private AWS ECR container repository

Configure `lambdas/todos/.env.make.docker`:
- set `region`
- set `docker-repo` (or keep the default value)
- set `image_name` (or keep the default value)

```sh
# create ECR repository
make repo
# save the output
```
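
For reference, the `repo` target presumably wraps the standard ECR call from the AWS docs linked below; a sketch with illustrative names:

```sh
# roughly what `make repo` runs; repository name and region are illustrative
aws ecr create-repository \
    --repository-name kbsb-demodash \
    --image-scanning-configuration scanOnPush=true \
    --region eu-central-1
# note the repositoryUri in the output:
# aws_account_id.dkr.ecr.region.amazonaws.com/your-repo-name
```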

Configure `lambdas/todos/.env.make.docker` again:
- set the `ecr` value to the output of the Make command
  - `aws_account_id.dkr.ecr.region.amazonaws.com`

[AWS Docs](https://docs.aws.amazon.com/AmazonECR/latest/userguide/getting-started-cli.html#cli-create-repository)

#### Test Docker locally

```sh
# build the image
make build
# run the container
make run
# test the function
make pytest-docker
```

Config the environment
- Set all your values in `.env.make`
### Deploy backend to production

- run the following Make commands to deploy the backend
  - `make auth` for container registry login
  - `make update` for serverless function deployment

Copy-paste the function endpoint URL and set the `url_lambda` value in:
- `lambdas/todos/.env.make.pytest` for the backend
- `.env.production` for the frontend
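
If you need to look the endpoint up from the terminal, here is a minimal sketch, assuming the function is exposed through a Lambda function URL (the function name is illustrative):

```sh
# print the deployed function's URL; the function name is illustrative
aws lambda get-function-url-config \
    --function-name kbsb-demodash-todos \
    --query FunctionUrl --output text
```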

```sh
# login in to the registry
make auth
# deploy container
make update
# test production endpoint
make pytest-prod
```
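
For reference, `make auth` presumably wraps the standard ECR login flow; a sketch with placeholder account id and region:

```sh
# roughly what `make auth` does; account id and region are placeholders
aws ecr get-login-password --region eu-central-1 \
    | docker login --username AWS --password-stdin \
      aws_account_id.dkr.ecr.eu-central-1.amazonaws.com
```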

> Verify the deployment in Airtable

### Deploy frontend to production

Set the correct `.env` files:
- `.env.development` for local development
- `.env.production` for production (the endpoint was set earlier in this guide)

Then test the build process.

```sh
# from the frontend directory
# build the app
make build
# test build
make preview
# note: the production URL blocks CORS from localhost
```

- AWS S3 (or similar)
  - copy the `dist` folder to your bucket
  - enable static website hosting on the bucket
  - test the production URL

> Remember to set the correct endpoint in `.env.production`
#### Rclone

Final step: run:
- `make build`
You can use Rclone to automate the synchronization with S3.
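
A minimal sketch, assuming an `s3` remote already configured via `rclone config` and an illustrative bucket name:

```sh
# one-way sync of the build output to the bucket; names are illustrative
rclone sync dist/ s3:my-bucket-name --progress
```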

- [ ] create Make targets to deploy the frontend to S3 with Rclone

## Update CI/CD
-----------------------------------------------

Automate the production deployment with Github Actions:
- fork the repository
- edit `.github/workflows/deploy.yml`
- set all the correct environment variables (see the sketch below)
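
One way to provide those values is as repository secrets; a minimal sketch using the Github CLI, with illustrative secret names:

```sh
# set the repository secrets the workflow reads; names are illustrative
gh secret set AWS_ACCESS_KEY_ID
gh secret set AWS_SECRET_ACCESS_KEY
gh secret set AIRTABLE_API_KEY
```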
