Commit 8545452
Implement persistent storage for PSQL (#746)
* Docker: Persist PSQL Data in Volumes
* Docker: env loaded in compose + docs
* Docker: Persistent Data Final Review
1 parent e0efed7 commit 8545452

5 files changed: +97 −139 lines changed

README.md (+3 −3)

@@ -19,8 +19,6 @@ Chayn's Bloom service offers several key features designed to support individuals

 ## Bloom Backend Technical Documentation

-Read our [Bloom Backend Tech Wiki Docs](https://github.com/chaynHQ/bloom-backend/wiki) for overviews of key concepts and data & database architecture.
-
 Technologies Used:

 - [NestJS](https://nestjs.com/) - NodeJs framework for building scalable and reliable server-side applications

@@ -38,7 +36,9 @@ Technologies Used:

 - [GitHub Actions](https://github.com/features/actions) - CI pipeline
 - [ESLint](https://eslint.org/) and [Prettier](https://prettier.io/) for linting and formatting.

-## Local Development
+Read our [Bloom Backend Tech Wiki Docs](https://github.com/chaynHQ/bloom-backend/wiki) for overviews of key concepts and data & database architecture.
+
+## Local Development Directions

 By making an open-source contribution, you agree to our [Code of Conduct](/CODE_OF_CONDUCT.md).

docker-compose.yml (+8)

@@ -6,7 +6,10 @@ services:

     build:
       context: .
       dockerfile: Dockerfile
+    env_file: '.env'
     environment:
+      # To connect to your local psql db, replace DATABASE_URL with:
+      # postgres://postgres:[email protected]:5432/bloom
       DATABASE_URL: postgres://postgres:postgres@db:5432/bloom
     ports:
       - 35001:35001

@@ -25,3 +28,8 @@ services:

       POSTGRES_USER: postgres
       POSTGRES_PASSWORD: postgres
       POSTGRES_DB: bloom
+    volumes:
+      - postgres-data:/var/lib/postgresql/data # path for named volume
+
+volumes:
+  postgres-data: # named volume for persistent postgres data
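The named volume added above can be inspected and managed from the Docker CLI. A minimal sketch; the full volume name is prefixed with the Compose project name, so `bloom-backend_postgres-data` is an assumption based on the repository's directory name:

```shell
# List all volumes; the postgres data volume should appear here
docker volume ls

# Inspect the volume's driver and host mountpoint
# (name assumes the project directory is "bloom-backend")
docker volume inspect bloom-backend_postgres-data

# Data survives a plain "docker compose down"; to wipe it deliberately,
# remove the named volumes as well:
docker compose down --volumes
```

Because the data lives in the named volume rather than the container's writable layer, recreating the `db` container no longer resets the database.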

docs/database-guide.md (+60 −33)

@@ -2,65 +2,92 @@

 ## How to Populate the Database

-**Prerequisites:**
+### Prerequisites

-- [Postgres 16](https://www.postgresql.org/download/) *technically not required if running in Docker
-- Running Bloom’s backend
+- Bloom's backend is running

 ### Summary

-Most open-source contributions (like running Cypress integration tests from the frontend) require adding test data to your local database. To do this, download Bloom's test data dump file, connect to the database server, then populate the database with the backup data.
+Populating your database with test data is essential for a fully functional development environment, making full-stack contributions, and running Cypress integration tests from the frontend. However, it is not necessary for smaller, isolated contributions.
+
+First, download Bloom's test data dump file. Next, connect to the database server, restore your database with the dump file, then verify with a query.

 ### Download Test Data Dump File

-Download the test data dump file [linked here from our Google Drive](https://drive.google.com/file/d/1y6KJpAlpozEg3GqhK8JRdmK-s7uzJXtB/view?usp=drive_link).
+First, download the test data dump file [linked here from our Google Drive](https://drive.google.com/file/d/1y6KJpAlpozEg3GqhK8JRdmK-s7uzJXtB/view?usp=drive_link). Then place the dump file in the project directory.

 ### Connect to Server and Add Data

-There are multiple methods you can use here. For simplicity, these directions assume you are using Docker.
+Next, connect to the database server and add test data from the dump file, using the appropriate commands based on how you are running the app - fully containerized, containerized app with local database, or manually without Docker.

-Run the following command to restore the database from the dump file using pg_restore:
+1. Restore the database from the dump file by running these pg_restore commands.

-```
-docker exec -i <container_name> pg_restore -U <username> -d <database_name> --clean --if-exists < </path/to/dumpfile.dump>
-```
+**Fully Containerized App Command:**

-`container_name`, `username`, and `database_name` are defined in the `docker-compose.yml` file under ‘db’.
+```
+docker exec -i <container_name> pg_restore -U <username> -d <database_name> --clean --if-exists < /path/to/dumpfile.dump
+```

-Start the bloom psql database server:
+`container_name`, `username`, and `database_name` are defined in the `docker-compose.yml` under the ‘db’ service. Here is the same command with the default values:

-```
-docker exec -it <container_name> psql -U <username> -d <database_name>
-```
+```
+docker exec -i bloom-local-db pg_restore -U postgres -d bloom --clean --if-exists < /path/to/dumpfile.dump
+```

-This will open the psql server for bloom, where you can run queries to verify the restore.
+**Docker with Local DB or Running Manually Command:**

-You can verify the restore by running a SQL query to test if one of our test user's data has been properly populated into the database:
+```
+pg_restore -U postgres -d bloom --clean --if-exists /path/to/dumpfile.dump
+```

-```
-SELECT * FROM public."user" users WHERE users."email" = '[email protected]';
-```
+2. Next, start the bloom psql database server.

-If the user exists, the database has successfully been seeded!
+**Fully Containerized App Command:**
+
+```
+docker exec -it <container_name> psql -U <username> -d <database_name>
+
+# same command with default values added:
+docker exec -it bloom-local-db psql -U postgres -d bloom
+```
+
+**Docker with Local DB or Running Manually Command:**
+
+```
+psql -U <username> -h localhost -p 5432 -d <database_name>
+```
+
+3. Verify the restore by running queries in the psql server.
+
+```
+SELECT * FROM public."user" users WHERE users."email" = '[email protected]';
+```
+
+If the user exists, your database has successfully been populated with test data!
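The verification can also be run non-interactively with `psql -c`, avoiding an interactive session. A sketch using the default container, user, and database names from `docker-compose.yml` (adjust them if you have customized your setup); it counts rows rather than querying a specific test email:

```shell
# Count rows in the user table via the containerized database
docker exec -i bloom-local-db psql -U postgres -d bloom \
  -c 'SELECT count(*) FROM public."user";'

# Or, against a local (non-Docker) PostgreSQL server:
psql -U postgres -h localhost -p 5432 -d bloom \
  -c 'SELECT count(*) FROM public."user";'
```

A non-zero count indicates the restore populated the table.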
 ### Troubleshooting

+- Persistent storage is configured in the `docker-compose.yml` file using [named volumes](https://docs.docker.com/engine/storage/volumes/). This maintains your data even if you delete your container. If you have issues accessing persistent db storage, try replacing the volume path with an absolute path, or update your firewall settings if using WSL (especially if running integration tests). If issues with volumes persist, remove the named volumes from `docker-compose.yml` and populate your database manually as needed.
+- Ensure both the 'db' and 'api' containers are running.
+- Hard reset Docker containers with `docker-compose up -d --force-recreate`.
 - If you remove **`--clean`** from the restore command but encounter duplicate object errors, the existing schema may conflict with the restore. In that case, clean the specific objects manually or use **`DROP SCHEMA public CASCADE`** before restoring.
-- Verify that the dump file is valid by running `pg_restore --list yourfile.dump`. If it fails to list contents, the dump file may be corrupted or incomplete.
-- In the psql server, verify the tables and columns exist with `\dt`, `\dt public.*`, and `\d public."user"`.
+- Verify that the dump file is valid by running `pg_restore --list yourfile.dump`. If it fails to list contents, the dump file may be corrupted or incomplete. Please notify our team if this happens.
+- Verify the tables and columns exist within the psql server with `\dt`, `\dt public.*`, and `\d public."user"`.
 - Run a **`DROP SCHEMA`** or truncate tables before running **`pg_restore`**:

 ```
 DROP SCHEMA public CASCADE;
 CREATE SCHEMA public;
 ```

-- Try the following: delete the existing db, create a new db with the same name, and try the restore on this new db. The db drop may throw an error, if so run the following command first.
-
-  `SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'bloom';`
+- To hard reset the database in the psql server, first delete the existing db, then create a new db with the same name, and try the restore on this new db. The db drop may throw an error; if so, run the following command first:
+
+  ```
+  SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'bloom';
+  ```
+
+  Then drop the database using:
+
+  ```
+  DROP DATABASE bloom;
+  ```

 - If the SQL dump file is outdated, you can update it by running `docker compose down` then `docker compose up` again, as this is configured to run migrations.

 ### Chayn Staff - Heroku Directions

@@ -72,11 +99,11 @@ Chayn staff with access to Heroku, you also have the option to seed the database

 3. Replace <HEROKU_APP_NAME> with the correct Heroku app name in the `seed-local-db.sh` file
 4. Run `chmod +x ./seed-local-db.sh` in your terminal to make the file executable

 After the above has been confirmed, run

 ```bash
 bash seed-local-db.sh
 ```

 ## Database Migrations
docs/local-development.md (+25 −30)

@@ -2,35 +2,26 @@

 ## Summary

-**The develop branch is our source of truth.** Fork from develop, create a new feature branch, then when your PR is merged, develop will automatically merge into the main branch for deployment to production.
-
 To run Bloom's backend:

 1. Install prerequisites
 2. Configure environment variables
 3. Install dependencies
-4. Run in a Dev Container, with Docker, or manually.
-5. Populate the database (required for most fullstack contributions and running integration tests from the frontend)
+4. Run the app using Docker, Dev Containers, or manually
+5. Populate the database

 To test the backend:

 - Run unit tests
-- Run e2e integration tests from the frontend for fullstack contributions *requires populating the database with data first
+- Run e2e integration tests from the frontend for full-stack contributions

 ## Prerequisites

 - NodeJS v20.x
 - Yarn v1.x
-- Docker
-- PostgreSQL 16
-
-#### Recommended Minimum System Requirements:
+- Docker and/or PostgreSQL

-- CPU: Quad-core 2.5 GHz (i5/Ryzen 5)
-- Memory: 16 GB RAM
-- Storage: 512 GB
-- OS: Linux, macOS, Windows, or WSL2 (latest versions)
-- Internet Connection: For accessing dependencies and external APIs/services
+_Recommended Minimum System Requirements: CPU: Quad-core 2.5 GHz (i5/Ryzen 5), Memory: 16 GB RAM, Storage: 512 GB, OS: Linux, macOS, Windows, or WSL2 (latest versions), Internet Connection: for accessing dependencies and external APIs/services._

 ## Configure Environment Variables

@@ -46,15 +37,17 @@ yarn

 There are 3 methods you can use to run Bloom’s backend locally:

-1. **Using Docker (recommended)** - runs postgres in a container.
-2. **Visual Studio Code Dev Container (recommended for Visual Studio users)** - installs all dependencies and the postgres database container automatically.
-3. **Manually** - manage postgres locally.
+1. **Using Docker (recommended)** - the backend app is fully containerized; installing PostgreSQL is optional.
+2. **Visual Studio Code Dev Container (recommended for Visual Studio Code users)** - installs all dependencies and the PostgreSQL database container automatically.
+3. **Manually (recommended for PostgreSQL users)** - run the app with yarn and manage PostgreSQL locally.

-### With Docker - Recommended
+### Run with Docker - Recommended

-Bloom's backend is containerized and can be run solely in Docker - both the PostgreSQL database and NestJS app. This uses the least resources on your computer. To run the backend locally, make sure your system has Docker installed - you may need Docker Desktop if using a Mac or Windows.
+Prerequisites: Docker (we recommend [Docker Desktop](https://docs.docker.com/desktop/)), PostgreSQL (optional).

-First, make sure the Docker app is running (just open the app). Then run
+Bloom's backend is fully containerized - both the PostgreSQL database and the NestJS app. This does not require PostgreSQL to be installed locally. To connect to a local PostgreSQL database instead, modify the `DATABASE_URL` in the `docker-compose.yml` file. This will enable communication between Docker and your local database.
+
+To start the Docker container, run:

 ```bash
 docker-compose up
 ```

@@ -66,9 +59,7 @@ You should see this in the shell output:

 ```
 Listening on localhost:35001, CTRL+C to stop
 ```

-_Note: you can use an application like Postman to test the apis locally_
-
-### Run in Dev Container - Recommended for Visual Studio Users
+### Run with Dev Container - Recommended for Visual Studio Code Users

 This method will automatically install all dependencies, IDE settings, and the postgres container in a Dev Container (Docker container) within Visual Studio Code.

@@ -88,9 +79,13 @@ The Dev Container is configured in the `.devcontainer` directory:

 See [Visual Studio Code Docs: Developing Inside a Dev Container](https://code.visualstudio.com/docs/devcontainers/containers) for more info.

-### Run Manually
+### Run Manually - Recommended for PostgreSQL Users

-Manage postgres locally to [populate the database](#populate-database), then run:
+Prerequisites: PostgreSQL
+
+Log into PostgreSQL and create a database called "bloom". Ensure it is running on port `35000` (or your desired port). Finally, start the PostgreSQL server on your machine.
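The manual database setup described above can be sketched with the standard PostgreSQL client tools (the `postgres` superuser role is an assumption; substitute your local role and port):

```shell
# Create the database the backend expects (assumes a local "postgres" role)
psql -U postgres -c 'CREATE DATABASE bloom;'

# Confirm the database exists in the server's catalog listing
psql -U postgres -lqt | cut -d '|' -f 1 | grep -w bloom
```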
 With the psql server running, start the app:

 ```bash
 yarn start:dev
 ```

@@ -148,16 +143,16 @@ See the [database-guide.md](database-guide.md) for instructions.

 # Git Flow and Deployment

-**The develop branch is our source of truth, not main.**
+**The develop branch is our source of truth, not main.** Fork from `develop`, create a new feature branch, then when your PR is merged, `develop` will automatically merge into the main branch for deployment to production. Keep your branch updated by rebasing and merging feature/bug branches into `develop` as you code.

-Create new branches from the `develop` base branch. There is no need to run the build command before pushing changes to GitHub, simply push and create a pull request for the new branch. GitHub Actions will run build and linting tasks automatically. Rebase and merge feature/bug branches into `develop`.
-
-This will trigger an automatic deployment to the staging app by Heroku.
+Once your PR is merged to `develop`, this will trigger an automatic deployment to the staging app by Heroku.

 When changes have been tested in staging, merge `develop` into `main`. This will trigger an automatic deployment to the production app by Heroku.

-# Swagger
+# APIs

 Swagger automatically reflects all of the endpoints in the app, showing their urls and example request and response objects.

 To access Swagger, simply run the project and visit http://localhost:35001/swagger
+
+For testing APIs, we recommend using tools like Postman.
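A quick command-line smoke test is also possible without Postman (assumes the backend is running locally on port 35001, per the docs above):

```shell
# The Swagger UI should respond with HTTP 200 when the app is up
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:35001/swagger
```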
