How to reproduce this project.
Note: review the `Makefile`s to determine which AWS and Docker commands are executed.
```bash
# clone the repo with SSH
git clone git@github.com:kaboomshebang/kbsb-demodash.git

# without SSH
git clone https://github.com/kaboomshebang/kbsb-demodash.git
```
The project depends on the following packages:

- Docker
- Python3 (3.9.16)
  - `pip`
- Node (18.15.0)
  - `pnpm`
- GNU Make
- AWS CLI v2
- Rclone (optional)
Install instructions for the Nix package manager:

```bash
nix-env -iA \
  nixpkgs.python39 \
  nixpkgs.python39Packages.pip \
  nixpkgs.nodejs-18_x \
  nixpkgs.nodePackages_latest.pnpm \
  nixpkgs.awscli2 \
  nixpkgs.rclone
```
Optional: use the Nix package manager with `direnv` for an automated dev environment.
First, set up all the necessary configuration.
- Airtable database
  - create a new base
  - create a `todos` table with the following columns:
    - `id` (Autonumber)
    - `description` (Long text)
    - `label` (Single select)
    - `done` (Checkbox)
  - create a personal access token
    - scope: `data.records:read`, `data.records:write`
    - access: only the current base
  - create a `.env` file in `lambdas/todos/` (see the sketch after this list)
    - store the credentials
    - refer to the `.env.example` for more info
  - create a shared view link and copy the URL
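For reference, a minimal sketch of what `lambdas/todos/.env` could contain. The variable names here are assumptions; `.env.example` is authoritative.

```bash
# lambdas/todos/.env — hypothetical variable names, check .env.example
AIRTABLE_TOKEN=patXXXXXXXXXXXXXX    # the personal access token
AIRTABLE_BASE_ID=appXXXXXXXXXXXXXX  # the base that holds the todos table
```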
- AWS credentials
  - optional: create a new AWS account with AWS Organizations
  - log in to the AWS console
  - create an IAM user for deployment and GitHub CI/CD
    - for example: `projectname-deploy`
  - add the inline policies from the code block below
  - create an access key for the new user
    - store the key in `lambdas/todos/.aws/.env` and/or GitHub Actions
    - refer to the `.env.template` for more info
Note: updating policies can take some time!
```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "demodash",
      "Effect": "Allow",
      "Action": [
        "lambda:UpdateFunctionCode",
        "lambda:GetFunction",
        "lambda:GetFunctionConfiguration",
        "lambda:GetFunctionUrlConfig",
        "lambda:InvokeFunction",
        "lambda:CreateFunction",
        "lambda:CreateFunctionUrlConfig",
        "lambda:AddPermission",
        "lambda:UpdateFunctionConfiguration",
        "iam:PassRole",
        "iam:CreateRole",
        "iam:AttachRolePolicy",
        "ecr:CreateRepository",
        "ecr:SetRepositoryPolicy",
        "ecr:GetDownloadUrlForLayer",
        "ecr:BatchGetImage",
        "ecr:CompleteLayerUpload",
        "ecr:DescribeImages",
        "ecr:DescribeRepositories",
        "ecr:UploadLayerPart",
        "ecr:ListImages",
        "ecr:InitiateLayerUpload",
        "ecr:BatchCheckLayerAvailability",
        "ecr:GetRepositoryPolicy",
        "ecr:PutImage",
        "ecr:GetAuthorizationToken"
      ],
      "Resource": [
        "*"
      ]
    }
  ]
}
```
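The Makefiles `source` this credentials file, so a minimal sketch of `lambdas/todos/.aws/.env` could look like the block below. The variable names are assumptions; `.env.template` is authoritative.

```bash
# lambdas/todos/.aws/.env — a sketch, see .env.template for the real variable names
export AWS_ACCESS_KEY_ID=AKIAXXXXXXXXXXXXXXXX
export AWS_SECRET_ACCESS_KEY=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
export AWS_DEFAULT_REGION=eu-central-1
```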
Make sure that the key works:
```bash
# verify AWS credentials
source lambdas/todos/.aws/.env && aws sts get-caller-identity
```
- frontend `app/.env.development` (example below)
  - set the `VITE_ENDPOINT`
    - for example: `http://localhost:8000/new_todo`
    - or replace `localhost` with your VM/dev-server IP
  - set the `VITE_AIRTABLE_BASE`
    - copy-paste the Airtable shared view URL
- frontend `app/.env.production`
  - set the production environment later in this guide
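Putting the two variables together, `app/.env.development` could look like this (the shared view ID is a placeholder):

```bash
# app/.env.development — placeholder values
VITE_ENDPOINT=http://localhost:8000/new_todo
VITE_AIRTABLE_BASE=https://airtable.com/shrXXXXXXXXXXXXXX
```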
- CORS domain
  - if you also want to deploy to production later:
    - set `CORS_DOMAIN` in `lambdas/todos/.env`
- Pytest config
  - set `PYTEST_URL_LOCAL` in `lambdas/todos/.make.pytest` (sketch below)
    - default: `http://localhost:8000`
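A sketch of `lambdas/todos/.make.pytest`; the exact format is an assumption, and the production URL is filled in after the deployment later in this guide.

```bash
# lambdas/todos/.make.pytest — example values
PYTEST_URL_LOCAL=http://localhost:8000
# PYTEST_URL_LAMBDA is set after the production deployment
```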
The project is now ready for local testing without Docker.

Before deploying, let's test the configuration with a local setup. Start with the backend: run a FastAPI dev server.
```bash
# create a Python environment
cd lambdas/todos
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt

# run the development server
make api

# test the endpoint (make sure the .venv is activated)
make pytest-local

# verify in Airtable
```
Next, start a React/Vite development server.
```bash
# install node_modules and run the development server
make dev

# open the Vite development URL
```
Verify the endpoint with the todo widget.
Make sure Docker is installed.
We need to store the Docker container image in AWS. Let's create a repository.
- config `lambdas/todos/.make.docker`:
  - set `AWS_ARN`
  - set `AWS_REGION`
  - set `AWS_ACCOUNT_ID` (same as the ARN)
  - set `ECR_DOCKER_REPO_NAME` (or use the default value)
  - set `ECR_DOCKER_IMAGE_NAME` (or use the default value)
You can find your ARN and account ID in the `Arn` and `Account` fields of the `aws sts get-caller-identity` output.
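As a sketch, the config could look like the block below; every value is a placeholder for illustration, not a project default.

```bash
# lambdas/todos/.make.docker — placeholder values
AWS_ARN=arn:aws:iam::123456789012:user/projectname-deploy
AWS_REGION=eu-central-1
AWS_ACCOUNT_ID=123456789012
ECR_DOCKER_REPO_NAME=kbsb-demodash
ECR_DOCKER_IMAGE_NAME=todos
```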
```bash
# create ECR repository
make repo
```
Now we can test the Docker build commands and the container locally.
```bash
# build the image
make build

# run the container
make run

# test the function
make pytest-docker

# verify the Airtable shared view
```
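If you want to poke the container directly, the AWS Lambda base images ship with the Runtime Interface Emulator. Assuming `make run` publishes it on the emulator's default port 9000, you can invoke the handler with curl:

```bash
# invoke the local Lambda runtime emulator (the port is an assumption; check the Makefile)
curl -X POST "http://localhost:9000/2015-03-31/functions/function/invocations" \
  -d '{}'
```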
The project is ready to deploy.

If everything works, let's continue with the production deployment. We need an IAM role for the Lambda execution.
```bash
# create an IAM role
make iamrole
```
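Under the hood, `make iamrole` presumably wraps something along these lines; the role name is a placeholder, so check the Makefile for the real one.

```bash
# create a role that the Lambda service is allowed to assume (role name is a placeholder)
aws iam create-role \
  --role-name demodash-lambda-role \
  --assume-role-policy-document '{
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": { "Service": "lambda.amazonaws.com" },
      "Action": "sts:AssumeRole"
    }]
  }'

# allow the function to write CloudWatch logs
aws iam attach-role-policy \
  --role-name demodash-lambda-role \
  --policy-arn arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole
```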
Before we can deploy the image/container, we need to create the function.
```bash
# create the function
make lambda-create

# test
make invoke

# create a function url
make lambda-url

# allow public access
make lambda-public

# set the Airtable environment variables
make lambda-env
```
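For reference, these targets likely map to AWS CLI calls along these lines; all names and URIs are placeholders.

```bash
# create a container-based function
aws lambda create-function \
  --function-name demodash-todos \
  --package-type Image \
  --code ImageUri=123456789012.dkr.ecr.eu-central-1.amazonaws.com/kbsb-demodash:latest \
  --role arn:aws:iam::123456789012:role/demodash-lambda-role

# expose the function over HTTPS
aws lambda create-function-url-config \
  --function-name demodash-todos \
  --auth-type NONE

# grant the public permission to call the function URL
aws lambda add-permission \
  --function-name demodash-todos \
  --action lambda:InvokeFunctionUrl \
  --principal "*" \
  --function-url-auth-type NONE \
  --statement-id FunctionURLAllowPublicAccess
```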
Now we're finally ready to push the Docker image and update the function.
```bash
# log in to the registry
make auth

# deploy the container
make update
```
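These two targets presumably boil down to the standard ECR login and an image-based function update; all identifiers are placeholders.

```bash
# authenticate Docker against the private ECR registry
aws ecr get-login-password --region eu-central-1 \
  | docker login --username AWS --password-stdin 123456789012.dkr.ecr.eu-central-1.amazonaws.com

# push the image and point the function at it
docker push 123456789012.dkr.ecr.eu-central-1.amazonaws.com/kbsb-demodash:latest
aws lambda update-function-code \
  --function-name demodash-todos \
  --image-uri 123456789012.dkr.ecr.eu-central-1.amazonaws.com/kbsb-demodash:latest
```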
- copy the `FunctionUrl` and set the value to the `PYTEST_URL_LAMBDA` variable in:
  - `lambdas/todos/.make.pytest` for the backend
  - and `app/.env.production` for the frontend
```bash
# test the production endpoint
make pytest-prod
```
Verify the deployment in Airtable.
Last step: deploy the React app to an S3 bucket. First, test the build process.
```bash
# get the function URL
aws lambda get-function-url-config --function-name <your_function_name>

# or:
make lambda-get-url
```
Remember to set the correct endpoint in `app/.env.production`. Currently, in the frontend, the VITE URL has to end with `/new_todo`, as in the example below.
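A hypothetical `app/.env.production`; Lambda function URLs have the `*.lambda-url.<region>.on.aws` shape, and both values here are placeholders.

```bash
# app/.env.production — placeholder values
VITE_ENDPOINT=https://xxxxxxxxxxxxxxxxxxxxxxxxxx.lambda-url.eu-central-1.on.aws/new_todo
VITE_AIRTABLE_BASE=https://airtable.com/shrXXXXXXXXXXXXXX
```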
```bash
# from the frontend directory
# build the app
make build

# test the build
make preview

# note: the production URL blocks CORS from localhost
```
- AWS S3 (or similar; a scripted variant follows this list)
  - open the AWS console
  - open the S3 page
  - create a bucket
  - copy the `dist` folder to your bucket
  - enable bucket website hosting
  - test the production URL
  - create a CNAME record in your DNS provider
    - point it to the S3 bucket website
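The console steps above can also be done with the AWS CLI; a sketch, using a placeholder bucket name.

```bash
# create the bucket, upload the build, and enable static website hosting
aws s3 mb s3://my-demodash-bucket
aws s3 sync app/dist s3://my-demodash-bucket
aws s3 website s3://my-demodash-bucket --index-document index.html
# note: website access also requires a public-read bucket policy
```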
You can use Rclone to automate the synchronization with S3. Review the CI/CD config in the `.github` directory for some reference.
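For example, assuming an Rclone remote named `s3` configured for AWS S3, and the placeholder bucket from the sketch above:

```bash
# mirror the production build into the bucket
rclone sync app/dist s3:my-demodash-bucket --progress
```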
- Automate the deployment with GitHub Actions:
  - fork the repository
  - first follow all the above steps to create a manual deployment
    - so that all the environment variables and policies are set
  - edit `.github/workflows/deploy.yml`
    - set all the correct environment variables
  - set all the correct variables in GitHub Actions
  - modify code
  - push to or merge with `main`