16 changes: 9 additions & 7 deletions README.md
@@ -84,16 +84,18 @@ There are mainly 2 ways of installing Perplexica - With Docker, Without Docker.

3. After cloning, navigate to the directory containing the project files.

4. Rename the `sample.config.toml` file to `config.toml`. For Docker setups, you need only fill in the following fields:
4. Update the environment variables in the `docker-compose.yaml` file to configure `config.toml`.

- `OPENAI`: Your OpenAI API key. **You only need to fill this if you wish to use OpenAI's models**.
- `OLLAMA`: Your Ollama API URL. You should enter it as `http://host.docker.internal:PORT_NUMBER`. If you installed Ollama on port 11434, use `http://host.docker.internal:11434`. For other ports, adjust accordingly. **You need to fill this if you wish to use Ollama's models instead of OpenAI's**.
- `GROQ`: Your Groq API key. **You only need to fill this if you wish to use Groq's hosted models**.
- `ANTHROPIC`: Your Anthropic API key. **You only need to fill this if you wish to use Anthropic models**.
Example:

**Note**: You can change these after starting Perplexica from the settings dialog.
The section of `config.toml` shown below can be configured using the variables `MODELS_CUSTOM_OPENAI_API_KEY="sk-123456"`, `MODELS_CUSTOM_OPENAI_API_URL="http://localopenai:11134"`, and `MODELS_CUSTOM_OPENAI_MODEL_NAME="meta-llama/llama-4"` (the matching compose entries are sketched after the example):

- `SIMILARITY_MEASURE`: The similarity measure to use (This is filled by default; you can leave it as is if you are unsure about it.)
```toml
[MODELS.CUSTOM_OPENAI]
API_KEY = "sk-123456"
API_URL = "http://localopenai:11134"
MODEL_NAME = "meta-llama/llama-4"
```
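
These variables map one-to-one onto entries in the service's `environment:` list in `docker-compose.yaml`; the container's entrypoint script copies the values (quotes included) into `config.toml` on startup. A minimal sketch using the same example values as above, assuming the entries sit under the Perplexica app service definition as in the compose file shipped with this change:

```yaml
    environment:
      - MODELS_CUSTOM_OPENAI_API_KEY="sk-123456"
      - MODELS_CUSTOM_OPENAI_API_URL="http://localopenai:11134"
      - MODELS_CUSTOM_OPENAI_MODEL_NAME="meta-llama/llama-4"
```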

5. Ensure you are in the directory containing the `docker-compose.yaml` file and execute:

4 changes: 3 additions & 1 deletion app.dockerfile
@@ -24,12 +24,14 @@ COPY --from=builder /home/perplexica/.next/static ./public/_next/static

COPY --from=builder /home/perplexica/.next/standalone ./
COPY --from=builder /home/perplexica/data ./data

COPY drizzle ./drizzle
COPY --from=builder /home/perplexica/migrator/build ./build
COPY --from=builder /home/perplexica/migrator/index.js ./migrate.js

RUN mkdir /home/perplexica/uploads

COPY sample.config.toml /home/perplexica/config.toml
COPY entrypoint.sh ./entrypoint.sh
RUN chmod +x ./entrypoint.sh
CMD ["./entrypoint.sh"]
CMD ["bash", "./entrypoint.sh"]
14 changes: 13 additions & 1 deletion docker-compose.yaml
@@ -16,6 +16,19 @@ services:
      dockerfile: app.dockerfile
    environment:
      - SEARXNG_API_URL=http://searxng:8080
      - GENERAL_SIMILARITY_MEASURE="cosine" # "cosine" or "dot"
      - GENERAL_KEEP_ALIVE="5m" # How long to keep Ollama models loaded into memory (use "-1m" instead of -1)
      - MODELS_OPENAI_API_KEY=""
      - MODELS_GROQ_API_KEY=""
      - MODELS_ANTHROPIC_API_KEY=""
      - MODELS_GEMINI_API_KEY=""
      - MODELS_CUSTOM_OPENAI_API_KEY=""
      - MODELS_CUSTOM_OPENAI_API_URL=""
      - MODELS_CUSTOM_OPENAI_MODEL_NAME=""
      - MODELS_OLLAMA_API_URL="" # Ollama API URL - http://host.docker.internal:11434
      - MODELS_DEEPSEEK_API_KEY=""
      - MODELS_LM_STUDIO_API_URL="" # LM Studio API URL - http://host.docker.internal:1234
      - API_ENDPOINTS_SEARXNG="" # SearxNG API URL - http://localhost:32768
      - DATA_DIR=/home/perplexica
    ports:
      - 3000:3000
@@ -24,7 +37,6 @@ services:
    volumes:
      - backend-dbstore:/home/perplexica/data
      - uploads:/home/perplexica/uploads
      - ./config.toml:/home/perplexica/config.toml
    restart: unless-stopped

networks:
48 changes: 47 additions & 1 deletion entrypoint.sh
@@ -1,6 +1,52 @@
#!/bin/sh
#!/usr/bin/env bash

set -e
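
# Rewrite config.toml in place from environment variables named SECTION_KEY, where dots in
# TOML section names become underscores (e.g. MODELS_GEMINI_API_KEY overrides API_KEY under
# [MODELS.GEMINI]); keys whose variable is unset or empty keep their existing values.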

CONFIG_TOML_FILE=/home/perplexica/config.toml

TMP_FILE=${CONFIG_TOML_FILE}.tmp
touch $TMP_FILE

while IFS= read -r line; do
  # Check if line is a section header (e.g., "[GENERAL]")
  if [[ "$line" =~ ^\[([^]]+)\] ]]; then
    current_section="${BASH_REMATCH[1]}"
    echo "$line" >> "$TMP_FILE"
    continue
  fi

  # Skip empty lines and comments
  if [[ -z "$line" || "$line" =~ ^[[:space:]]*\# ]]; then
    echo "$line" >> "$TMP_FILE"
    continue
  fi

  # Extract key and value (handling quoted values)
  key=$(echo "$line" | cut -d '=' -f 1 | xargs)
  value=$(echo "$line" | cut -d '=' -f 2- | xargs)

  # Construct the environment variable name in the form SECTION_KEY
  # (e.g., GENERAL_SIMILARITY_MEASURE, MODELS_GEMINI_API_KEY)
  current_section=$(echo "$current_section" | sed 's/\./_/')
  env_var_name="${current_section}_${key}"

  # Check if the environment variable exists
  env_var_value="${!env_var_name}"
  if [ -n "$env_var_value" ]; then
    new_value="$env_var_value"
    echo "$key = $new_value" >> "$TMP_FILE"
  else
    # Keep original line if no env var exists
    echo "$line" >> "$TMP_FILE"
  fi

done < "$CONFIG_TOML_FILE"
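
# Example: MODELS_GEMINI_API_KEY="abc" set via docker-compose.yaml (the quotes become part
# of the value) turns the API_KEY line under [MODELS.GEMINI] into: API_KEY = "abc"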

# Replace the original file
mv "$TMP_FILE" "$CONFIG_TOML_FILE"

echo "Config file updated successfully."

node migrate.js

exec node server.js