
Commit d30de98

Merge pull request #52 from adamcohenhillel/supabase-migrations
Add Supabase migrations, and docs
2 parents 9c313da + e87f5d0 commit d30de98

3 files changed: +97 -17 lines changed

docs/getting_started.md (+6 -9)

@@ -46,19 +46,16 @@ We will use Supabase as our database (with vector search, pgvector), authenticat
 
    <img src="../images/supabase_new_user.png" width="200">
 
-6. From there, go to the SQL Editor tab (<img src="../images/supabase_sql_editor.png" width="100">) and paste the [schema.sql](/supabase/schema.sql) from this repo, and execute. This will enable all the relevant extensions (pgvector) and create the two tables:
+6. By now, you should have 4 things: `email` & `password` for your Supabase user, and the `Supabase URL` and `API Anon Key`.
 
-   <img src="../images/supabase_tables.png" width="150">
+7. If so, go to your terminal and cd into the supabase folder: `cd ./supabase`
 
-7. By now, you should have 4 things: `email` & `password` for your supabase user, and the `Supabase URL` and `API Anon Key`.
-
-8. If so, go to your terminal, and cd to the supabase folder: `cd ./supabase`
-
-9. Install Supabase and set up the CLI. You should follow thier [guide here](https://supabase.com/../guides/cli/getting-started?platform=macos#installing-the-supabase-cli), but in short:
+8. Install Supabase and set up the CLI. You should follow their [guide here](https://supabase.com/../guides/cli/getting-started?platform=macos#installing-the-supabase-cli), but in short:
    - run `brew install supabase/tap/supabase` to install the CLI (or [check other options](https://supabase.com/../guides/cli/getting-started))
    - Install [Docker Desktop](https://www.docker.com/products/docker-desktop/) on your computer (we won't use it, we just need docker daemon to run in the background for deploying supabase functions)
-10. Now when we have the CLI, we need to login with oour Supabase account, running `supabase login` - this should pop up a browser window, which should prompt you through the auth
-11. And link our Supabase CLI to a specific project, our newly created one, by running `supabase link --project-ref <your-project-id>` (you can check what the project id is from the Supabase web UI, or by running `supabase projects list`, and it will be under "reference id") - you can skip (enter) the database password, it's not needed.
+9. Now that we have the CLI, we need to log in to our Supabase account by running `supabase login` - this should pop up a browser window that prompts you through the auth.
+10. Then link the Supabase CLI to our newly created project by running `supabase link --project-ref <your-project-id>` (you can find the project id in the Supabase web UI, or by running `supabase projects list`, where it appears under "reference id") - you can skip (enter) the database password, it's not needed.
+11. Now we need to apply the Adeus DB schema to our newly created, empty database. We can do this by simply running `supabase db push`. We can verify it worked by going to the Supabase project -> Tables and seeing that the new tables were created.
 12. Now let's deploy our functions! ([see guide for more details](https://supabase.com/../guides/functions/deploy)) `supabase functions deploy --no-verify-jwt` (see [issue re:security](https://github.com/adamcohenhillel/AdDeus/issues/3))
 13. If you're planning to first use OpenAI as your Foundation model provider, then you'd need to also run the following command, to make sure the functions have everything they need to run properly: `supabase secrets set OPENAI_API_KEY=<your-openai-api-key>` (Ollama setup guide is coming out soon)
 14. If you want access to tons of AI Models, both Open & Closed Source, set up your OpenRouter API Key. Go to [OpenRouter](https://openrouter.ai/) to get your API Key, then run `supabase secrets set OPENROUTER_API_KEY=<your-openrouter-api-key>`.
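Taken together, the reworked steps amount to the following terminal sequence. This is only a condensed sketch of the commands in the updated doc; `<your-project-id>` and the API keys are placeholders, and you only set the secrets that match the model provider you use.

```bash
# From the repo root: set up the Supabase CLI and deploy Adeus (macOS install shown).
cd ./supabase
brew install supabase/tap/supabase                  # install the Supabase CLI
supabase login                                      # opens a browser window for auth
supabase link --project-ref <your-project-id>       # link the CLI to your project
supabase db push                                    # apply the Adeus DB schema migrations
supabase functions deploy --no-verify-jwt           # deploy the Supabase functions
supabase secrets set OPENAI_API_KEY=<your-openai-api-key>           # if using OpenAI
supabase secrets set OPENROUTER_API_KEY=<your-openrouter-api-key>   # if using OpenRouter
```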

docs/guides/make_db_migration.md (new file, +79)

@@ -0,0 +1,79 @@
+---
+title: Make a DB Migration
+description: How to create and apply Supabase database migrations for Adeus
+layout: default
+parent: How to Guides
+---
+
+# Make a DB Migration
+{: .no_toc }
+
+## Table of contents
+{: .no_toc .text-delta }
+
+1. TOC
+{:toc}
+
+---
+
+## Intro
+If you're working on a new feature that requires changes to the database, you need to generate a migration file for those changes, so that when your feature is merged to the main branch and starts being used by other people, they will be able to update their databases accordingly.
+
+This guide provides step-by-step instructions for how to make a migration file from your Supabase database changes.
+
+
+## Create the migration
+
+Let's say you edited the database in your Supabase project: you added the column "new_data" to a table.
+
+Now you need to make sure others will get that column as well.
+
+
+1. Go to the supabase folder in your local cloned repo:
+   ```bash
+   cd supabase
+   ```
+
+2. Make sure you're linked to the right Supabase project:
+   ```bash
+   supabase link --project-ref <YOUR_REMOTE_SUPABASE_PROJECT_ID>
+   ```
+
+3. Create a new migration from the remote Supabase instance:
+   ```bash
+   supabase db pull
+   ```
+
+This will generate a new file in the `supabase/migrations` folder named `<timestamp>_remote_commit.sql`.
+
+
+Add it to your branch, and push it with the rest of the feature code to your PR.
+
+
+## Sync your database with all existing migrations
+
+In case there are new migrations for Adeus and you need to sync your own database with the latest migrations, follow these instructions:
+
+
+1. Go to the supabase folder in your local cloned repo:
+   ```bash
+   cd supabase
+   ```
+
+2. Make sure you're linked to the right Supabase project:
+   ```bash
+   supabase link --project-ref <YOUR_REMOTE_SUPABASE_PROJECT_ID>
+   ```
+
+3. Do a dry run:
+
+   ```bash
+   supabase db push --dry-run
+   ```
+   This will tell you which migrations would run, without executing them. This is a useful way to see upfront what the migration changes are.
+
+4. Push to prod!
+   ```bash
+   supabase db push
+   ```
+
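For quick reference, the two workflows this new guide documents condense to the sequences below. This is only a sketch of the commands shown in the guide; `<YOUR_REMOTE_SUPABASE_PROJECT_ID>` is a placeholder for your own project ref.

```bash
# Creating a migration from changes made in your Supabase project:
cd supabase
supabase link --project-ref <YOUR_REMOTE_SUPABASE_PROJECT_ID>
supabase db pull             # writes supabase/migrations/<timestamp>_remote_commit.sql

# Syncing your own database with migrations merged by others:
cd supabase
supabase link --project-ref <YOUR_REMOTE_SUPABASE_PROJECT_ID>
supabase db push --dry-run   # preview which migrations would run
supabase db push             # apply them
```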

supabase/schema.sql → supabase/migrations/20240214211830_remote_schema.sql (renamed, +12 -8)

@@ -26,7 +26,7 @@ CREATE EXTENSION IF NOT EXISTS "uuid-ossp" WITH SCHEMA "extensions";
 
 CREATE EXTENSION IF NOT EXISTS "vector" WITH SCHEMA "extensions";
 
-CREATE OR REPLACE FUNCTION "public"."match_records_embeddings_similarity"("query_embedding" "extensions"."vector", "match_threshold" double precision, "match_count" integer) RETURNS TABLE("id" integer, "raw_text" "text", "similarity" double precision)
+CREATE OR REPLACE FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) RETURNS TABLE(id integer, raw_text text, similarity double precision)
     LANGUAGE "sql" STABLE
     AS $$
   select
@@ -39,16 +39,16 @@ CREATE OR REPLACE FUNCTION "public"."match_records_embeddings_similarity"("query
   limit match_count;
 $$;
 
-ALTER FUNCTION "public"."match_records_embeddings_similarity"("query_embedding" "extensions"."vector", "match_threshold" double precision, "match_count" integer) OWNER TO "postgres";
+ALTER FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) OWNER TO "postgres";
 
 SET default_tablespace = '';
 
 SET default_table_access_method = "heap";
 
 CREATE TABLE IF NOT EXISTS "public"."conversations" (
     "id" bigint NOT NULL,
-    "created_at" timestamp with time zone DEFAULT "now"() NOT NULL,
-    "context" "json" DEFAULT '[]'::"json"
+    "created_at" timestamp with time zone DEFAULT now() NOT NULL,
+    "context" json DEFAULT '[]'::json
 );
 
 ALTER TABLE "public"."conversations" OWNER TO "postgres";
@@ -64,9 +64,9 @@ ALTER TABLE "public"."conversations" ALTER COLUMN "id" ADD GENERATED BY DEFAULT
 
 CREATE TABLE IF NOT EXISTS "public"."records" (
     "id" bigint NOT NULL,
-    "created_at" timestamp with time zone DEFAULT "now"() NOT NULL,
-    "raw_text" "text",
-    "embeddings" "extensions"."vector"
+    "created_at" timestamp with time zone DEFAULT now() NOT NULL,
+    "raw_text" text,
+    "embeddings" extensions.vector
 );
 
 ALTER TABLE "public"."records" OWNER TO "postgres";
@@ -86,7 +86,7 @@ ALTER TABLE ONLY "public"."conversations"
 ALTER TABLE ONLY "public"."records"
     ADD CONSTRAINT "records_pkey" PRIMARY KEY ("id");
 
-CREATE POLICY "Enable access for all authed" ON "public"."conversations" TO "authenticated" USING (true);
+CREATE POLICY "Enable access for all authed" ON "public"."conversations" TO authenticated USING (true);
 
 ALTER TABLE "public"."conversations" ENABLE ROW LEVEL SECURITY;
 
@@ -95,6 +95,10 @@ GRANT USAGE ON SCHEMA "public" TO "anon";
 GRANT USAGE ON SCHEMA "public" TO "authenticated";
 GRANT USAGE ON SCHEMA "public" TO "service_role";
 
+GRANT ALL ON FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) TO "anon";
+GRANT ALL ON FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) TO "authenticated";
+GRANT ALL ON FUNCTION "public"."match_records_embeddings_similarity"(query_embedding extensions.vector, match_threshold double precision, match_count integer) TO "service_role";
+
 GRANT ALL ON TABLE "public"."conversations" TO "anon";
 GRANT ALL ON TABLE "public"."conversations" TO "authenticated";
 GRANT ALL ON TABLE "public"."conversations" TO "service_role";
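The function whose signature this migration unquotes can be exercised like any set-returning SQL function. The snippet below is only a hypothetical smoke test, assuming direct Postgres access to your Supabase project through a `SUPABASE_DB_URL` connection string and at least one row in `public.records` with a non-null embedding; it is not part of the committed migration.

```bash
# Query the nearest records to an existing embedding, using the function defined above.
psql "$SUPABASE_DB_URL" -c "
  select id, raw_text, similarity
  from public.match_records_embeddings_similarity(
    (select embeddings from public.records where embeddings is not null limit 1),
    0.0,  -- match_threshold: accept everything for this smoke test
    5     -- match_count: return at most 5 rows
  );"
```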
