docs/database-guide.md
## How to Populate the Database
### Prerequisites
- Bloom's backend is running
### Summary
Populating your database with test data is essential for a fully functional development environment, making full-stack contributions, and running Cypress integration tests from the frontend. However, it is not necessary for smaller, isolated contributions.
First, download Bloom's test data dump file. Next, connect to the database server, restore your database with the dump file, then verify with a query.
### Download Test Data Dump File
First, download the test data dump file [linked here from our Google Drive](https://drive.google.com/file/d/1y6KJpAlpozEg3GqhK8JRdmK-s7uzJXtB/view?usp=drive_link). Then place this dump file in the project directory.
### Connect to Server and Add Data
Next, connect to the database server and add test data from the dump file, using the appropriate commands based on how you are running the app: fully containerized, a containerized app with a local database, or manually without Docker.
1. Restore the database from the dump file by running these `pg_restore` commands. `container_name`, `username`, and `database_name` are defined in the `docker-compose.yml` under the `db` service, so substitute the default values from that file.
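As a minimal sketch (the container name, username, database name, and dump file name below are placeholders, not the project's actual defaults; check `docker-compose.yml` for those), the restore looks something like this:

```
# Fully containerized: stream the dump file into pg_restore inside the db container
docker exec -i <container_name> pg_restore -U <username> -d <database_name> --clean --if-exists < yourfile.dump

# Containerized app with a local database, or no Docker: run pg_restore directly
pg_restore -U <username> -d <database_name> -h localhost --clean --if-exists yourfile.dump
```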
3. Verify the restore by running queries in the psql server.
```
SELECT * FROM public."user" users WHERE users."email" = '[email protected]';
```
If the user exists, your database has successfully been populated with test data!
### Troubleshooting
- Persistent storage is configured in the `docker-compose.yml` file using [named volumes](https://docs.docker.com/engine/storage/volumes/). This maintains your data even if you delete your container. If you have issues accessing persistent db storage, try replacing the volume path with an absolute path, or update your firewall settings if using WSL (especially if running integration tests). If issues with volumes persist, remove the named volumes from `docker-compose.yml` and populate your database manually as needed (see the volume commands after this list).
- Ensure both the `db` and `api` containers are running.
- Hard reset Docker containers with `docker-compose up -d --force-recreate`.
- If you remove **`--clean`** from the restore command but encounter duplicate object errors, the existing schema may conflict with the restore. In that case, clean the specific objects manually or use **`DROP SCHEMA public CASCADE`** before restoring.
- Verify that the dump file is valid by running `pg_restore --list yourfile.dump`. If it fails to list contents, the dump file may be corrupted or incomplete. Please notify our team if this happens.
- Verify the tables and columns exist within the psql server with `\dt`, `\dt public.*`, and `\d public."user";`
- Run a **`DROP SCHEMA`** or truncate tables before running **`pg_restore`:**
```
DROP SCHEMA public CASCADE;
CREATE SCHEMA public;
```
- To hard reset the database in the psql server, first delete the existing db, then create a new db with the same name, and try the restore on this new db. The db drop may throw an error; if so, run the following command first:
```
SELECT pg_terminate_backend(pid) FROM pg_stat_activity WHERE datname = 'bloom';
```
Then drop the database using:
```
DROP DATABASE bloom;
```
- If the SQL dump file is outdated, you can update it by running `docker compose down` then `docker compose up` again, as this is configured to run migrations.
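If you suspect the named volume itself is the problem (see the persistent storage note at the top of this list), these standard Docker commands can help you inspect and reset it; the volume name below is a placeholder for whatever is defined in `docker-compose.yml`:

```
# List volumes and find the one declared in docker-compose.yml
docker volume ls

# Stop containers and remove the named volumes declared in docker-compose.yml
docker-compose down -v

# Or remove a single volume by name
docker volume rm <volume_name>
```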
### Chayn Staff - Heroku Directions
As Chayn staff with access to Heroku, you also have the option to seed the database.
3. Replace <HEROKU_APP_NAME> with the correct Heroku app name in the `seed-local-db.sh` file
4. Run `chmod +x ./seed-local-db.sh` in your terminal to make the file executable
docs/local-development.md
## Summary
To run Bloom's backend:
1. Install prerequisites
2. Configure environment variables
3. Install dependencies
4. Run the app using Docker, Dev Containers, or manually
5. Populate the database
To test the backend:
- Run unit tests
- Run e2e integration tests from the frontend for full-stack contributions
## Prerequisites
- NodeJS v20.x
- Yarn v1.x
- Docker and/or PostgreSQL
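You can quickly confirm the toolchain versions from your terminal:

```bash
# Check installed versions against the prerequisites above
node -v          # expect v20.x
yarn -v          # expect 1.x
docker --version
psql --version
```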
_Recommended Minimum System Requirements: CPU: Quad-core 2.5 GHz (i5/Ryzen 5), Memory: 16 GB RAM, Storage: 512 GB, OS: Linux, macOS, Windows, or WSL2 (latest versions), Internet Connection: For accessing dependencies and external APIs/services._
## Configure Environment Variables
There are 3 methods you can use to run Bloom’s backend locally:
1. **Using Docker (recommended)** - the backend app is fully containerized; installing PostgreSQL is optional.
2. **Visual Studio Code Dev Container (recommended for Visual Studio Code users)** - installs all dependencies and the PostgreSQL database container automatically.
3. **Manually (recommended for PostgreSQL users)** - run the app with yarn and manage PostgreSQL locally.
### Run with Docker - Recommended
Bloom's backend is fully containerized - both the PostgreSQL database and the NestJS app. This does not require PostgreSQL to be installed locally. To connect to a local PostgreSQL database instead, modify the `DATABASE_URL` in the `docker-compose.yml` file. This will enable communication between Docker and your local database.
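As a rough sketch only (the credentials below are placeholders; use whatever your local PostgreSQL setup requires), a connection string pointing the containerized app at a database on your host machine might look like this, where `host.docker.internal` resolves to your host when using Docker Desktop:

```bash
# Illustrative only: placeholder username/password, database "bloom" on host port 35000
DATABASE_URL=postgres://<username>:<password>@host.docker.internal:35000/bloom
```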
To start the Docker containers, run:
```bash
docker-compose up
```

You should see this in the shell output:

```
Listening on localhost:35001, CTRL+C to stop
```
### Run with Dev Container - Recommended for Visual Studio Code Users
This method will automatically install all dependencies, IDE settings, and the postgres container in a Dev Container (Docker container) within Visual Studio Code.
The Dev Container is configured in the `.devcontainer` directory.
See [Visual Studio Code Docs: Developing Inside a Dev Container](https://code.visualstudio.com/docs/devcontainers/containers) for more info.
### Run Manually - Recommended for PostgreSQL Users
Prerequisites: PostgreSQL
Start the PostgreSQL server on your machine and ensure it is running on port `35000` (or your desired port). Then log into PostgreSQL and create a database called "bloom".
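As a minimal sketch (the `postgres` role is just an example; use whichever role you set up locally):

```bash
# Create the bloom database on the local server running on port 35000
psql -U postgres -p 35000 -c 'CREATE DATABASE bloom;'
```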
With the psql server running, start the app:
```bash
yarn start:dev
```

See the [database-guide.md](database-guide.md) for instructions.
# Git Flow and Deployment
**The develop branch is our source of truth, not main.** Fork from `develop`, create a new feature branch, then when your PR is merged, `develop` will automatically merge into the `main` branch for deployment to production. Keep your branch updated by rebasing, and merge feature/bug branches into `develop` as you code.
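For example, a typical flow might look like this (the branch and remote names are illustrative; adjust them to your fork setup):

```bash
# Start from an up-to-date develop branch
git checkout develop
git pull

# Create a feature branch for your change
git checkout -b feature/my-change

# ...commit your work, then keep the branch current by rebasing on develop
git fetch origin
git rebase origin/develop
```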
Once your PR is merged into `develop`, this will trigger an automatic deployment to the staging app by Heroku.
When changes have been tested in staging, merge `develop` into `main`. This will trigger an automatic deployment to the production app by Heroku.
# APIs
Swagger automatically reflects all of the endpoints in the app, showing their URLs and example request and response objects.
To access Swagger, simply run the project and visit http://localhost:35001/swagger
For testing APIs, we recommend using tools like Postman.
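If you prefer the command line over Postman, a quick smoke test against the locally running backend could be as simple as fetching the Swagger page mentioned above:

```bash
# Confirm the API is up by requesting the Swagger UI
curl -i http://localhost:35001/swagger
```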