# AnythingLLM Extension

All-in-one AI productivity tool with RAG for Dream Server.

## What It Is

AnythingLLM lets you chat with your documents using AI:
- Upload PDFs, Word docs, text files, code
- Automatic chunking and embedding
- Built-in vector database (LanceDB)
- Multiple LLM provider support
- Fully local, privacy-first

## Features

- **Document chat**: Upload and chat with any document
- **Multi-LLM**: Use Ollama, OpenAI, Anthropic, or local models
- **Built-in embeddings**: Automatic document vectorization
- **Workspaces**: Organize documents into projects
- **Agent support**: Automated workflows and tasks
- **Web browsing**: Optional web search integration
- **Multi-user**: Built-in authentication

## Configuration

### Environment Variables

| Variable | Description | Default |
|----------|-------------|---------|
| `ANYTHINGLLM_PORT` | External port | `3001` |
| `ANYTHINGLLM_JWT_SECRET` | JWT signing secret | (required, 32+ chars) |
| `ANYTHINGLLM_LLM_PROVIDER` | LLM backend | `ollama` |
| `OLLAMA_BASE_PATH` | Ollama API URL | `http://ollama:11434` |
| `OLLAMA_MODEL_PREF` | Default model | `llama3.2` |
| `ANYTHINGLLM_EMBEDDING_ENGINE` | Embedding provider | `ollama` |
| `EMBEDDING_MODEL_PREF` | Embedding model | `nomic-embed-text:latest` |
| `ANYTHINGLLM_VECTOR_DB` | Vector database | `lancedb` |

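Taken together, a working `.env` for this extension might look like the sketch below. The values mirror the defaults from the table above; the JWT secret is a placeholder you must replace with your own random string.

```bash
# Example .env for the AnythingLLM extension (defaults from the table above;
# the JWT secret below is a placeholder, not a real secret)
ANYTHINGLLM_PORT=3001
ANYTHINGLLM_JWT_SECRET=replace-with-a-random-string-of-at-least-32-chars
ANYTHINGLLM_LLM_PROVIDER=ollama
OLLAMA_BASE_PATH=http://ollama:11434
OLLAMA_MODEL_PREF=llama3.2
ANYTHINGLLM_EMBEDDING_ENGINE=ollama
EMBEDDING_MODEL_PREF=nomic-embed-text:latest
ANYTHINGLLM_VECTOR_DB=lancedb
```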
### LLM Providers

Set `ANYTHINGLLM_LLM_PROVIDER` to one of:
- `ollama` - Local models via Ollama
- `openai` - OpenAI API
- `anthropic` - Claude API
- `azure` - Azure OpenAI
- `localai` - LocalAI endpoint

## Usage

```bash
# Enable the extension
dream extensions enable anythingllm

# Start the service
docker compose up -d anythingllm

# Access at http://localhost:3001
# First run: Create admin account
```

## Setup Steps

1. **Enable**: `dream extensions enable anythingllm`
2. **Start**: `docker compose up -d anythingllm`
3. **Open**: Visit http://localhost:3001
4. **Create workspace**: Click "New Workspace"
5. **Upload documents**: Drag & drop files
6. **Chat**: Ask questions about your documents

## Data Persistence

All data is stored in:
- `./data/anythingllm/` - Documents, embeddings, settings

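Because everything lives under one directory, a backup is a single archive. A minimal sketch (in a real deployment, stop the container first so LanceDB files are not written mid-archive; the `mkdir -p` is a harmless no-op on an existing install):

```bash
# Archive the AnythingLLM data directory into a dated tarball.
# In production, run `docker compose stop anythingllm` first.
BACKUP="anythingllm-backup-$(date +%Y%m%d).tar.gz"
mkdir -p ./data/anythingllm   # no-op on an existing install
tar -czf "$BACKUP" ./data/anythingllm/
echo "wrote $BACKUP"
```

Restoring is the reverse: stop the service, extract the tarball into place, and start it again.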
## Integration with Dream Server

By default, AnythingLLM uses Dream Server's Ollama extension:
- Set `OLLAMA_BASE_PATH=http://ollama:11434`
- Models are auto-detected from Ollama

To use llama-server instead:
1. Set `ANYTHINGLLM_LLM_PROVIDER=openai`
2. Set the custom endpoint in the UI to `${LLM_API_URL}`

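As a sketch, the llama-server route needs only one `.env` change; the endpoint itself (`${LLM_API_URL}` from your Dream Server config) is entered in AnythingLLM's provider settings in the UI, not in `.env`:

```bash
# Route chat through an OpenAI-compatible endpoint such as llama-server;
# the base URL is then set in the AnythingLLM UI, not here
ANYTHINGLLM_LLM_PROVIDER=openai
```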
## Security Note

⚠️ **Change the JWT secret before production use:**
```bash
# In your .env
ANYTHINGLLM_JWT_SECRET=your-64-character-random-string-here-please-change-me
```
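
One way to generate a strong secret, assuming `openssl` is available (any source of 32+ random characters works):

```bash
# Generate a 64-hex-character random secret and append it to .env
SECRET="$(openssl rand -hex 32)"
echo "ANYTHINGLLM_JWT_SECRET=${SECRET}" >> .env
echo "secret length: ${#SECRET}"   # prints: secret length: 64
```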

## Resources

- [AnythingLLM Docs](https://docs.anythingllm.com/)
- [GitHub Repository](https://github.com/Mintplex-Labs/anything-llm)