diff --git a/.env.example b/.env.example
index 7d4434c..9da9da8 100644
--- a/.env.example
+++ b/.env.example
@@ -13,6 +13,3 @@ DIRECT_URL= ''
# url for background task
NEXT_BG_TASK_URL='http://localhost:3001'
-# redist host
-REDIS_HOST='localhost'
-
diff --git a/README.md b/README.md
index c3908a1..ba3a13d 100644
--- a/README.md
+++ b/README.md
@@ -1,4 +1,3 @@
-
## Important
@@ -13,7 +12,6 @@ I started this open-source project because I found that most existing solutions,
I realized that leveraging AI to read and extract text from credit card PDF statements is a more efficient way to store and categorize expenses. By storing this data in a database, we can streamline the entire process. My goal is to create a self-hosted, local-first solution—much like Firefly or ActualBudget—where users have more control over their data.
-
## Features
The main feature to upload credit card PDFs or zip files containing credit card PDFs will work as follows:
@@ -45,21 +43,18 @@ The main feature to upload credit card PDFs or zip files containing credit card
1. **Background Job Processing**
PDF statements are sent to the background job server for processing.
-2. **Parsing Workflow**
+1. **Parsing Workflow**
+
- **Convert PDFs to Images**: The PDF files are converted into images.
- **OCR (Optical Character Recognition)**: OCR is run on the images to extract text from the statement.
- **OpenAI API**: The extracted text is sent to the OpenAI API, which returns parsed expenses, categorized according to custom rules.
-3. **Temporary Storage in Redis**
- The parsed expenses are temporarily stored in Redis.
-
-4. **Frontend Options**
+1. **Frontend Options**
On the frontend, users can:
- Store the parsed data in the database
- Download it as a CSV file
- Edit the data before taking any further action
-
## Database and OpenAI key
### Open AI Key
@@ -76,51 +71,52 @@ This [medium](https://medium.com/@lorenzozar/how-to-get-your-own-openai-api-key-
Make sure you have sufficient credit in your OpenAI account for billing purposes. It costs approximately $0.03 USD per credit card statement. Detailed cost calculations will be provided here.
-
### Postgresql DB Connection
You can setup postgresql locally, but here is the guide on setting up in supabase
1. Create a Supabase Account
- - Go to the [Supabase website](https://supabase.com/) and sign up for an account.
- - If you already have an account, simply log in.
+
+ - Go to the [Supabase website](https://supabase.com/) and sign up for an account.
+ - If you already have an account, simply log in.
1. Create a New Project
- - After logging in, click the **New Project** button on your dashboard.
- - Enter your project details:
- - **Name**: Choose a unique name for your project.
- - **Organization**: Select your organization or create a new one if necessary.
- - **Database Password**: Create a secure password for your database. Keep this password safe since you'll need it to connect to your database later.
- - **Region**: Choose the closest server location to optimize performance.
- Once done, click **Create new project**.
+ - After logging in, click the **New Project** button on your dashboard.
+ - Enter your project details:
+ - **Name**: Choose a unique name for your project.
+ - **Organization**: Select your organization or create a new one if necessary.
+ - **Database Password**: Create a secure password for your database. Keep this password safe since you'll need it to connect to your database later.
+ - **Region**: Choose the closest server location to optimize performance.
-3. Wait for the Database to be Set Up
- - It may take a few minutes for Supabase to set up your PostgreSQL database. Once ready, you’ll be taken to your project dashboard.
+ Once done, click **Create new project**.
-4. Retrieve Your Database Connection String
- - From your project dashboard, navigate to the **Settings** tab.
- - Under **Database**, you’ll see details for connecting to your database.
- - Look for **Connection String** or **Database URL**. It will look something like this:
+1. Wait for the Database to be Set Up
- ```plaintext
- postgres://username:password@host:port/database
- ```
+ - It may take a few minutes for Supabase to set up your PostgreSQL database. Once ready, you’ll be taken to your project dashboard.
-You will need this connection string to connect your application to the Supabase database.
+1. Retrieve Your Database Connection String
-5. Save Your Connection String
- - Make sure to copy and securely store the connection string. You will need it to set up the database in your app.
+ - From your project dashboard, navigate to the **Settings** tab.
+ - Under **Database**, you’ll see details for connecting to your database.
+ - Look for **Connection String** or **Database URL**. It will look something like this:
+ ```plaintext
+ postgres://username:password@host:port/database
+ ```
+   You will need this connection string to connect your application to the Supabase database.
+
+1. Save Your Connection String
+   - Make sure to copy and securely store the connection string. You will need it to set up the database in your app.
## Docker Setup
### Prerequisite
1. **Docker**
- - Install docker
- - Run Docker
+   - Install Docker
+   - Run Docker
### Installation
@@ -135,8 +131,8 @@ You will need this connection string to connect your application to the Supabase
```sh
cd self-hosted-expense-tracker
```
-
-1. Run npm install
+
+1. Run npm install
```sh
npm install
@@ -151,7 +147,7 @@ You will need this connection string to connect your application to the Supabase
1. Run prisma migrate for the first time to migrate database.
```sh
- npx prisma migrate deploy --schema apps/web/prisma/schema.prisma
+ npx prisma migrate deploy --schema apps/web/prisma/schema.prisma
```
1. Run Docker Compose and build
@@ -160,17 +156,14 @@ You will need this connection string to connect your application to the Supabase
docker compose up --build
```
-
## Development Setup
-
### Prerequisite
+
1. NodeJs
1. Ghostscript and graphicsmagick
- - Both Ghostscript and Graphicsmagick are require to PDF parsing on the background.
- - Follow this [guide](https://github.com/yakovmeister/pdf2image/blob/HEAD/docs/gm-installation.md) to install both of them
-1. Redis
- - Setup and run redis on port 6379. Guide [here](https://redis.io/docs/latest/operate/oss_and_stack/install/install-redis/).
+   - Both Ghostscript and GraphicsMagick are required for PDF parsing in the background.
+   - Follow this [guide](https://github.com/yakovmeister/pdf2image/blob/HEAD/docs/gm-installation.md) to install both of them.
### Installation
@@ -185,8 +178,8 @@ You will need this connection string to connect your application to the Supabase
```sh
cd self-hosted-expense-tracker
```
-
-1. Run npm install
+
+1. Run npm install
```sh
npm install
@@ -201,7 +194,7 @@ You will need this connection string to connect your application to the Supabase
1. Run prisma migrate for the first time to migrate database.
```sh
- npx prisma migrate deploy --schema apps/web/prisma/schema.prisma
+ npx prisma migrate deploy --schema apps/web/prisma/schema.prisma
```
1. Run in development mode
@@ -209,14 +202,12 @@ You will need this connection string to connect your application to the Supabase
```sh
npm run dev
```
-
-
- ## Start Up Guide
-
- 1. Go to `/sign-up` to create account with `username` and `password` after everything has set up.
-
- 1. The categories will be seeded from `/apps/web/prisma/categoryList.json`.
-
+
+## Start Up Guide
+
+1. Go to `/sign-up` to create an account with `username` and `password` after everything has been set up.
+
+1. The categories will be seeded from `/apps/web/prisma/categoryList.json`.
## Roadmap
diff --git a/apps/background-job/package.json b/apps/background-job/package.json
index e4caf26..9acfe87 100644
--- a/apps/background-job/package.json
+++ b/apps/background-job/package.json
@@ -9,12 +9,12 @@
"@hono/node-server": "^1.13.1",
"bullmq": "^5.14.0",
"dotenv": "^16.4.5",
+ "fastq": "^1.17.1",
"file-type": "^19.6.0",
"hono": "^4.6.3",
- "ioredis": "^5.4.1",
+ "node-cache": "^5.1.2",
"openai": "^4.67.0",
"pdf2pic": "^3.1.3",
- "redis": "^4.7.0",
"tesseract.js": "^5.1.1",
"yauzl": "^3.1.3"
},
diff --git a/apps/background-job/src/index.ts b/apps/background-job/src/index.ts
index 7a80ddc..72d8692 100644
--- a/apps/background-job/src/index.ts
+++ b/apps/background-job/src/index.ts
@@ -1,22 +1,25 @@
-import { fromBuffer } from "pdf2pic";
-import { createWorker } from "tesseract.js";
+import crypto from "crypto";
+import dotenv from "dotenv";
import { serve } from "@hono/node-server";
-import Redis from "ioredis";
import { cors } from "hono/cors";
-import { Queue, Worker } from "bullmq";
-import yauzl from "yauzl";
import { Hono } from "hono";
+import NodeCache from "node-cache";
import OpenAI from "openai";
import { fileTypeFromBuffer } from "file-type";
-import { generateParsingPrompt } from "./generate-parsing-prompt.js";
-import { trimPdfText } from "./trim-pdf-text.js";
+import * as fastq from "fastq";
+import type { queueAsPromised } from "fastq";
+import { fromBuffer } from "pdf2pic";
+import { createWorker } from "tesseract.js";
+import yauzl from "yauzl";
-import dotenv from "dotenv";
import { calculateTokenPricing } from "./calculate-token-pricing.js";
+import { generateParsingPrompt } from "./generate-parsing-prompt.js";
+import { trimPdfText } from "./trim-pdf-text.js";
dotenv.config({ path: "../../.env" });
type StatementTask = {
+ id: string;
name: string;
buffer: Buffer;
userId: string;
@@ -34,13 +37,8 @@ type CompletedTask = {
pricing: number;
};
-const redis = new Redis({
- host: process.env.REDIS_HOST, // The Redis service name defined in Docker Compose
- port: 6379,
-});
-
+const taskCache = new NodeCache();
const app = new Hono();
-
const client = new OpenAI({
apiKey: process.env.OPENAI_API_KEY,
});
@@ -54,19 +52,11 @@ app.use("*", async (c, next) => {
await next();
});
-// Create a new connection in every instance
-const statementQueue = new Queue("process_file", {
- connection: {
- host: process.env.REDIS_HOST,
- port: 6379,
- },
-});
-
-const myWorker = new Worker(
- "process_file",
- async (task) => {
+async function pdfProcessWorker(task: StatementTask): Promise<void> {
+ try {
let pdfText = "";
- const fileBuffer = Buffer.from(task.data.buffer);
+ taskCache.set("active-task", task);
+ const fileBuffer = Buffer.from(task.buffer);
const bufferResponses = await fromBuffer(fileBuffer, {
density: 200,
format: "png",
@@ -74,10 +64,10 @@ const myWorker = new Worker(
height: 2000,
}).bulk(-1, { responseType: "buffer" });
- const categoryStr = await redis.get(`category:${task.data.userId}`);
- const category = JSON.parse(categoryStr || "");
+    const categoryStr = taskCache.get<string>(`category:${task.userId}`);
+    const category = JSON.parse(categoryStr || "[]");
- console.log(task.data.name, ": start OCR");
+ console.log(task.name, ": start OCR");
for (let i = 0; i < bufferResponses.length; i++) {
let image = bufferResponses[i].buffer;
@@ -95,7 +85,7 @@ const myWorker = new Worker(
// generate prompt
const prompt = generateParsingPrompt(pdfText, category);
- console.log(task.data.name, ": start prompting");
+ console.log(task.name, ": start prompting");
// prompt open AI
const chatCompletion = await client.chat.completions.create({
@@ -104,7 +94,7 @@ const myWorker = new Worker(
});
let promptValue = chatCompletion.choices[0].message.content || "";
- let fileValue = task.data.buffer;
+ let fileValue = task.buffer;
const completionTokens = chatCompletion.usage?.completion_tokens || 0;
const promptTokens = chatCompletion.usage?.prompt_tokens || 0;
@@ -113,8 +103,8 @@ const myWorker = new Worker(
state: "completed",
completion: promptValue,
file: fileValue,
- name: task.data.name,
- userId: task.data.userId,
+ name: task.name,
+ userId: task.userId,
completedAt: new Date(),
completionTokens,
promptTokens,
@@ -122,31 +112,34 @@ const myWorker = new Worker(
calculateTokenPricing(completionTokens, promptTokens, "gpt-4o") || 0,
};
- await redis.set(
- `done:${task.data.userId}:${task.id}`,
- JSON.stringify(value)
- );
- return promptValue;
- },
- {
- connection: {
- host: process.env.REDIS_HOST,
- port: 6379,
- },
+ taskCache.set(`done:${task.userId}:${task.id}`, value);
+
+    const completedTask =
+      taskCache.get<string[]>(`completed-task:${task.userId}`) || [];
+    completedTask.push(task.id);
+
+ taskCache.set(`completed-task:${task.userId}`, completedTask);
+
+ taskCache.del("active-task");
+ } catch (e) {
+ console.log(e);
}
-);
+}
-myWorker.on("failed", (reason) => {
- console.log(reason?.failedReason);
-});
+const taskQueue: queueAsPromised<StatementTask> = fastq.promise(
+  pdfProcessWorker,
+  1
+);
app.use("/*", cors());
app.get("/tasks/:userid", async (c) => {
const userId = c.req.param("userid");
- const activeTasks = await statementQueue.getActive();
- const waitingTasks = await statementQueue.getWaiting();
- const completedTasks = await statementQueue.getCompleted();
+
+ const waitingTasks = taskQueue.getQueue();
+  const completedTasks =
+    taskCache.get<string[]>(`completed-task:${userId}`) || [];
+  const activeTask = taskCache.get<StatementTask>(`active-task`);
let tasks: {
status: string;
@@ -158,65 +151,73 @@ app.get("/tasks/:userid", async (c) => {
pricing?: number;
}[] = [];
- activeTasks.forEach((task) => {
- if (task.data.userId === userId) {
- tasks.push({
- status: "active",
- title: task.data.name,
- key: task.id || "",
- });
- }
- });
+ if (activeTask && activeTask.userId === userId) {
+ tasks.push({
+ status: "active",
+ title: activeTask.name,
+ key: activeTask.id || "",
+ });
+ }
waitingTasks.forEach((task) => {
- if (task.data.userId === userId) {
+ if (task.userId === userId) {
tasks.push({
status: "waiting",
- title: task.data.name,
+ title: task.name,
key: task.id || "",
});
}
});
for (let i = 0; i < completedTasks.length; i++) {
- if (completedTasks[i].data.userId === userId) {
- const filterTask = await redis.get(
- `done:${userId}:${completedTasks[i].id}`
- );
- if (filterTask) {
- const completedTask: CompletedTask = JSON.parse(filterTask);
-
- tasks.push({
- status: "completed",
- title: completedTasks[i].data.name,
- key: completedTasks[i].id || "",
- completedAt: completedTask.completedAt,
- promptTokens: completedTask.promptTokens,
- completionTokens: completedTask.completionTokens,
- pricing: completedTask.pricing,
- });
- }
+    const completed = taskCache.get<CompletedTask>(
+      `done:${userId}:${completedTasks[i]}`
+    );
+ if (completed) {
+ tasks.push({
+ status: "completed",
+ title: completed.name,
+ key: completedTasks[i] || "",
+ completedAt: completed.completedAt,
+ promptTokens: completed.promptTokens,
+ completionTokens: completed.completionTokens,
+ pricing: completed.pricing,
+ });
}
}
return c.json(tasks);
});
-app.get("/tasks/:userid/done", async (c) => {
+app.get("/tasks/:userid/done", (c) => {
const userId = c.req.param("userid");
const taskIds = c.req.query("ids")?.split(",") || [];
const taskKey = taskIds.map((id: string) => `done:${userId}:${id}`) || [];
- const completedTasks = await redis.mget(taskKey);
- return c.json(completedTasks);
+  const completedTasks = taskCache.mget<CompletedTask>(taskKey);
+ const completedTasksArray = Object.values(completedTasks);
+
+ return c.json(completedTasksArray);
});
-app.delete("/tasks/:userid/done", async (c) => {
+app.delete("/tasks/:userid/done", (c) => {
const userId = c.req.param("userid");
const taskIds = c.req.query("ids")?.split(",") || [];
const taskKey = taskIds.map((id: string) => `done:${userId}:${id}`) || [];
- const deleteKey = await redis.del(taskKey);
+
+ const deleteKey = taskCache.del(taskKey);
+
+  // remove the deleted ids from the completed-task list as well
+  const completedTask =
+    taskCache.get<string[]>(`completed-task:${userId}`) || [];
+
+  const newCompletedTask = completedTask.filter(
+    (taskId) => !taskIds.includes(taskId)
+  );
+
+ taskCache.set(`completed-task:${userId}`, newCompletedTask);
+
return c.json(deleteKey);
});
@@ -231,18 +232,15 @@ app.post("/upload", async (c) => {
const fileBuffer = await file.arrayBuffer();
const fileType = await fileTypeFromBuffer(fileBuffer);
- await redis.set(`category:${userId}`, categoryStr);
+ taskCache.set(`category:${userId}`, categoryStr);
if (fileType?.mime === "application/pdf") {
- await statementQueue.add(
- "process_file",
- {
- name: fileName,
- buffer: Buffer.from(fileBuffer),
- userId: userId,
- },
- { removeOnComplete: 30 }
- );
+ taskQueue.push({
+ id: crypto.randomUUID(),
+ name: fileName,
+ buffer: Buffer.from(fileBuffer),
+ userId: userId,
+ });
}
if (fileType?.mime === "application/zip") {
@@ -278,20 +276,16 @@ app.post("/upload", async (c) => {
entry.fileName.split("/").pop() || entry.fileName;
try {
- // Store the buffer in Redis (using the file name as the key)
if (fileName[0] !== ".") {
- await statementQueue.add(
- "process_file",
- {
- name: fileName,
- buffer: fileBuffer,
- userId: userId,
- },
- { removeOnComplete: 30 }
- );
+ taskQueue.push({
+ id: crypto.randomUUID(),
+ name: fileName,
+ buffer: Buffer.from(fileBuffer),
+ userId: userId,
+ });
}
- } catch (redisErr) {
- console.error(`Failed to store file in Redis: ${redisErr}`);
+ } catch (queueError) {
+ console.error(`Failed to push file to queue: ${queueError}`);
}
zipfile.readEntry(); // Proceed to the next entry
diff --git a/apps/web/pages/api/task.ts b/apps/web/pages/api/task.ts
index 36761e4..1b12c8b 100644
--- a/apps/web/pages/api/task.ts
+++ b/apps/web/pages/api/task.ts
@@ -83,7 +83,7 @@ async function handler(req: NextApiRequest & Request, res: NextApiResponse & Res
for (let i = 0; i < tasks.length; i++) {
const task = tasks[i];
if (task) {
- const { completion, file, name } = JSON.parse(task);
+ const { completion, file, name } = task;
if (file && completion) {
const [parsedStatment, parsedExpenses] = completionToParsedDate(completion, categoryResult);
diff --git a/apps/web/pages/api/task/[id].ts b/apps/web/pages/api/task/[id].ts
index 592ea61..9843f48 100644
--- a/apps/web/pages/api/task/[id].ts
+++ b/apps/web/pages/api/task/[id].ts
@@ -17,13 +17,8 @@ export default async function handler(req: NextApiRequest, res: NextApiResponse)
if (req.method === "GET") {
const response = await fetchCompletedTasks(userId, id as string);
const completedTaskJson = await response.json();
-
if (completedTaskJson) {
- const resJson = completedTaskJson.map((task: any) => {
- return JSON.parse(task);
- });
-
- res.status(200).json(resJson);
+ res.status(200).json(completedTaskJson);
} else {
res.status(404).json({ message: "Task not found" });
}
diff --git a/docker-compose.yml b/docker-compose.yml
index 1a0f67a..afaf17b 100644
--- a/docker-compose.yml
+++ b/docker-compose.yml
@@ -23,22 +23,13 @@ services:
context: .
dockerfile: ./apps/background-job/Dockerfile
restart: always
- depends_on:
- - redis
ports:
- 3001:3001
networks:
- app_network
environment:
- OPENAI_API_KEY=${OPENAI_API_KEY}
- - REDIS_HOST=${REDIS_HOST}
- NEXT_BG_TASK_KEY=${NEXT_BG_TASK_KEY}
- redis:
- image: "redis:alpine"
- ports:
- - "6379"
- networks:
- - app_network
# Define a network, which allows containers to communicate
# with each other, by using their container name as a hostname
diff --git a/package-lock.json b/package-lock.json
index 93af4df..075f5d0 100644
--- a/package-lock.json
+++ b/package-lock.json
@@ -23,12 +23,12 @@
"@hono/node-server": "^1.13.1",
"bullmq": "^5.14.0",
"dotenv": "^16.4.5",
+ "fastq": "^1.17.1",
"file-type": "^19.6.0",
"hono": "^4.6.3",
- "ioredis": "^5.4.1",
+ "node-cache": "^5.1.2",
"openai": "^4.67.0",
"pdf2pic": "^3.1.3",
- "redis": "^4.7.0",
"tesseract.js": "^5.1.1",
"yauzl": "^3.1.3"
},
@@ -3591,71 +3591,6 @@
"integrity": "sha512-A9+lCBZoaMJlVKcRBz2YByCG+Cp2t6nAnMnNba+XiWxnj6r4JUFqfsgwocMBZU9LPtdxC6wB56ySYpc7LQIoJg==",
"license": "MIT"
},
- "node_modules/@redis/bloom": {
- "version": "1.2.0",
- "resolved": "https://registry.npmjs.org/@redis/bloom/-/bloom-1.2.0.tgz",
- "integrity": "sha512-HG2DFjYKbpNmVXsa0keLHp/3leGJz1mjh09f2RLGGLQZzSHpkmZWuwJbAvo3QcRY8p80m5+ZdXZdYOSBLlp7Cg==",
- "license": "MIT",
- "peerDependencies": {
- "@redis/client": "^1.0.0"
- }
- },
- "node_modules/@redis/client": {
- "version": "1.6.0",
- "resolved": "https://registry.npmjs.org/@redis/client/-/client-1.6.0.tgz",
- "integrity": "sha512-aR0uffYI700OEEH4gYnitAnv3vzVGXCFvYfdpu/CJKvk4pHfLPEy/JSZyrpQ+15WhXe1yJRXLtfQ84s4mEXnPg==",
- "license": "MIT",
- "dependencies": {
- "cluster-key-slot": "1.1.2",
- "generic-pool": "3.9.0",
- "yallist": "4.0.0"
- },
- "engines": {
- "node": ">=14"
- }
- },
- "node_modules/@redis/client/node_modules/yallist": {
- "version": "4.0.0",
- "resolved": "https://registry.npmjs.org/yallist/-/yallist-4.0.0.tgz",
- "integrity": "sha512-3wdGidZyq5PB084XLES5TpOSRA3wjXAlIWMhum2kRcv/41Sn2emQ0dycQW4uZXLejwKvg6EsvbdlVL+FYEct7A==",
- "license": "ISC"
- },
- "node_modules/@redis/graph": {
- "version": "1.1.1",
- "resolved": "https://registry.npmjs.org/@redis/graph/-/graph-1.1.1.tgz",
- "integrity": "sha512-FEMTcTHZozZciLRl6GiiIB4zGm5z5F3F6a6FZCyrfxdKOhFlGkiAqlexWMBzCi4DcRoyiOsuLfW+cjlGWyExOw==",
- "license": "MIT",
- "peerDependencies": {
- "@redis/client": "^1.0.0"
- }
- },
- "node_modules/@redis/json": {
- "version": "1.0.7",
- "resolved": "https://registry.npmjs.org/@redis/json/-/json-1.0.7.tgz",
- "integrity": "sha512-6UyXfjVaTBTJtKNG4/9Z8PSpKE6XgSyEb8iwaqDcy+uKrd/DGYHTWkUdnQDyzm727V7p21WUMhsqz5oy65kPcQ==",
- "license": "MIT",
- "peerDependencies": {
- "@redis/client": "^1.0.0"
- }
- },
- "node_modules/@redis/search": {
- "version": "1.2.0",
- "resolved": "https://registry.npmjs.org/@redis/search/-/search-1.2.0.tgz",
- "integrity": "sha512-tYoDBbtqOVigEDMAcTGsRlMycIIjwMCgD8eR2t0NANeQmgK/lvxNAvYyb6bZDD4frHRhIHkJu2TBRvB0ERkOmw==",
- "license": "MIT",
- "peerDependencies": {
- "@redis/client": "^1.0.0"
- }
- },
- "node_modules/@redis/time-series": {
- "version": "1.1.0",
- "resolved": "https://registry.npmjs.org/@redis/time-series/-/time-series-1.1.0.tgz",
- "integrity": "sha512-c1Q99M5ljsIuc4YdaCwfUEXsofakb9c8+Zse2qxTadu8TalLXuAESzLvFAvNVbkmSlvlzIQOLpBCmWI9wTOt+g==",
- "license": "MIT",
- "peerDependencies": {
- "@redis/client": "^1.0.0"
- }
- },
"node_modules/@repo/eslint-config": {
"resolved": "packages/eslint-config",
"link": true
@@ -5742,6 +5677,15 @@
"integrity": "sha512-IV3Ou0jSMzZrd3pZ48nLkT9DA7Ag1pnPzaiQhpW7c3RbcqqzvzzVu+L8gfqMp/8IM2MQtSiqaCxrrcfu8I8rMA==",
"license": "MIT"
},
+ "node_modules/clone": {
+ "version": "2.1.2",
+ "resolved": "https://registry.npmjs.org/clone/-/clone-2.1.2.tgz",
+ "integrity": "sha512-3Pe/CF1Nn94hyhIYpjtiLhdCoEoz0DqQ+988E9gmeEdQZlojxnOb74wctFyuwWQHzqyf9X7C7MG8juUpqBJT8w==",
+ "license": "MIT",
+ "engines": {
+ "node": ">=0.8"
+ }
+ },
"node_modules/clsx": {
"version": "2.1.1",
"resolved": "https://registry.npmjs.org/clsx/-/clsx-2.1.1.tgz",
@@ -7986,15 +7930,6 @@
"url": "https://github.com/sponsors/ljharb"
}
},
- "node_modules/generic-pool": {
- "version": "3.9.0",
- "resolved": "https://registry.npmjs.org/generic-pool/-/generic-pool-3.9.0.tgz",
- "integrity": "sha512-hymDOu5B53XvN4QT9dBmZxPX4CWhBPPLguTZ9MMFeFa/Kg0xWVfylOVNlJji/E7yTZWFd/q9GO5TxDLq156D7g==",
- "license": "MIT",
- "engines": {
- "node": ">= 4"
- }
- },
"node_modules/gensync": {
"version": "1.0.0-beta.2",
"resolved": "https://registry.npmjs.org/gensync/-/gensync-1.0.0-beta.2.tgz",
@@ -9904,6 +9839,18 @@
"integrity": "sha512-AGK2yQKIjRuqnc6VkX2Xj5d+QW8xZ87pa1UK6yA6ouUyuxfHuMP6umE5QK7UmTeOAymo+Zx1Fxiuw9rVx8taHQ==",
"license": "MIT"
},
+ "node_modules/node-cache": {
+ "version": "5.1.2",
+ "resolved": "https://registry.npmjs.org/node-cache/-/node-cache-5.1.2.tgz",
+ "integrity": "sha512-t1QzWwnk4sjLWaQAS8CHgOJ+RAfmHpxFWmc36IWTiWHQfs0w5JDMBS1b1ZxQteo0vVVuWJvIUKHDkkeK7vIGCg==",
+ "license": "MIT",
+ "dependencies": {
+ "clone": "2.x"
+ },
+ "engines": {
+ "node": ">= 8.0.0"
+ }
+ },
"node_modules/node-domexception": {
"version": "1.0.0",
"resolved": "https://registry.npmjs.org/node-domexception/-/node-domexception-1.0.0.tgz",
@@ -11196,23 +11143,6 @@
"node": ">=8.10.0"
}
},
- "node_modules/redis": {
- "version": "4.7.0",
- "resolved": "https://registry.npmjs.org/redis/-/redis-4.7.0.tgz",
- "integrity": "sha512-zvmkHEAdGMn+hMRXuMBtu4Vo5P6rHQjLoHftu+lBqq8ZTA3RCVC/WzD790bkKKiNFp7d5/9PcSD19fJyyRvOdQ==",
- "license": "MIT",
- "workspaces": [
- "./packages/*"
- ],
- "dependencies": {
- "@redis/bloom": "1.2.0",
- "@redis/client": "1.6.0",
- "@redis/graph": "1.1.1",
- "@redis/json": "1.0.7",
- "@redis/search": "1.2.0",
- "@redis/time-series": "1.1.0"
- }
- },
"node_modules/redis-errors": {
"version": "1.2.0",
"resolved": "https://registry.npmjs.org/redis-errors/-/redis-errors-1.2.0.tgz",
@@ -13206,141 +13136,6 @@
"eslint": "^8.57.0",
"typescript": "^5.3.3"
}
- },
- "node_modules/@next/swc-darwin-arm64": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-darwin-arm64/-/swc-darwin-arm64-14.2.15.tgz",
- "integrity": "sha512-Rvh7KU9hOUBnZ9TJ28n2Oa7dD9cvDBKua9IKx7cfQQ0GoYUwg9ig31O2oMwH3wm+pE3IkAQ67ZobPfEgurPZIA==",
- "cpu": [
- "arm64"
- ],
- "optional": true,
- "os": [
- "darwin"
- ],
- "engines": {
- "node": ">= 10"
- }
- },
- "node_modules/@next/swc-darwin-x64": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-darwin-x64/-/swc-darwin-x64-14.2.15.tgz",
- "integrity": "sha512-5TGyjFcf8ampZP3e+FyCax5zFVHi+Oe7sZyaKOngsqyaNEpOgkKB3sqmymkZfowy3ufGA/tUgDPPxpQx931lHg==",
- "cpu": [
- "x64"
- ],
- "optional": true,
- "os": [
- "darwin"
- ],
- "engines": {
- "node": ">= 10"
- }
- },
- "node_modules/@next/swc-linux-arm64-gnu": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-gnu/-/swc-linux-arm64-gnu-14.2.15.tgz",
- "integrity": "sha512-3Bwv4oc08ONiQ3FiOLKT72Q+ndEMyLNsc/D3qnLMbtUYTQAmkx9E/JRu0DBpHxNddBmNT5hxz1mYBphJ3mfrrw==",
- "cpu": [
- "arm64"
- ],
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">= 10"
- }
- },
- "node_modules/@next/swc-linux-arm64-musl": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-linux-arm64-musl/-/swc-linux-arm64-musl-14.2.15.tgz",
- "integrity": "sha512-k5xf/tg1FBv/M4CMd8S+JL3uV9BnnRmoe7F+GWC3DxkTCD9aewFRH1s5rJ1zkzDa+Do4zyN8qD0N8c84Hu96FQ==",
- "cpu": [
- "arm64"
- ],
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">= 10"
- }
- },
- "node_modules/@next/swc-linux-x64-gnu": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-gnu/-/swc-linux-x64-gnu-14.2.15.tgz",
- "integrity": "sha512-kE6q38hbrRbKEkkVn62reLXhThLRh6/TvgSP56GkFNhU22TbIrQDEMrO7j0IcQHcew2wfykq8lZyHFabz0oBrA==",
- "cpu": [
- "x64"
- ],
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">= 10"
- }
- },
- "node_modules/@next/swc-linux-x64-musl": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-linux-x64-musl/-/swc-linux-x64-musl-14.2.15.tgz",
- "integrity": "sha512-PZ5YE9ouy/IdO7QVJeIcyLn/Rc4ml9M2G4y3kCM9MNf1YKvFY4heg3pVa/jQbMro+tP6yc4G2o9LjAz1zxD7tQ==",
- "cpu": [
- "x64"
- ],
- "optional": true,
- "os": [
- "linux"
- ],
- "engines": {
- "node": ">= 10"
- }
- },
- "node_modules/@next/swc-win32-arm64-msvc": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-win32-arm64-msvc/-/swc-win32-arm64-msvc-14.2.15.tgz",
- "integrity": "sha512-2raR16703kBvYEQD9HNLyb0/394yfqzmIeyp2nDzcPV4yPjqNUG3ohX6jX00WryXz6s1FXpVhsCo3i+g4RUX+g==",
- "cpu": [
- "arm64"
- ],
- "optional": true,
- "os": [
- "win32"
- ],
- "engines": {
- "node": ">= 10"
- }
- },
- "node_modules/@next/swc-win32-ia32-msvc": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-win32-ia32-msvc/-/swc-win32-ia32-msvc-14.2.15.tgz",
- "integrity": "sha512-fyTE8cklgkyR1p03kJa5zXEaZ9El+kDNM5A+66+8evQS5e/6v0Gk28LqA0Jet8gKSOyP+OTm/tJHzMlGdQerdQ==",
- "cpu": [
- "ia32"
- ],
- "optional": true,
- "os": [
- "win32"
- ],
- "engines": {
- "node": ">= 10"
- }
- },
- "node_modules/@next/swc-win32-x64-msvc": {
- "version": "14.2.15",
- "resolved": "https://registry.npmjs.org/@next/swc-win32-x64-msvc/-/swc-win32-x64-msvc-14.2.15.tgz",
- "integrity": "sha512-SzqGbsLsP9OwKNUG9nekShTwhj6JSB9ZLMWQ8g1gG6hdE5gQLncbnbymrwy2yVmH9nikSLYRYxYMFu78Ggp7/g==",
- "cpu": [
- "x64"
- ],
- "optional": true,
- "os": [
- "win32"
- ],
- "engines": {
- "node": ">= 10"
- }
}
}
}
diff --git a/turbo.json b/turbo.json
index c2902f0..563e93c 100644
--- a/turbo.json
+++ b/turbo.json
@@ -10,8 +10,7 @@
"DIRECT_URL",
"OPENAI_API_KEY",
"NEXTAUTH_SECRET",
- "NEXT_BG_TASK_URL",
- "REDIS_HOST"
+ "NEXT_BG_TASK_URL"
],
"outputs": [".next/**", "!.next/cache/**"]
},