Commit 15a179c (parent ef72b8d)

Update README for improved setup instructions

Clarifies local setup instructions, corrects the Docker port mapping, and updates the LLM router description to reflect server-side smart routing. These changes improve the accuracy and usability of the documentation.

1 file changed: README.md (+3 −3 lines)
@@ -53,7 +53,7 @@ npm install
 npm run dev -- --open
 ```
 
-You now have Chat UI running against the Hugging Face router without needing to host MongoDB yourself.
+You now have Chat UI running locally. Open the browser and start chatting.
 
 ## Database Options
 
@@ -95,7 +95,7 @@ Prefer containerized setup? You can run everything in one container as long as y
 
 ```bash
 docker run \
-  -p 3000 \
+  -p 3000:3000 \
   -e MONGODB_URL=mongodb://host.docker.internal:27017 \
  -e OPENAI_BASE_URL=https://router.huggingface.co/v1 \
  -e OPENAI_API_KEY=hf_*** \
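The corrected flag matters: `-p 3000:3000` publishes the container's port 3000 on host port 3000, whereas a bare `-p 3000` only exposes the container port and lets Docker assign a random high host port. As a sketch of the corrected invocation (the image name `chat-ui` is a placeholder, not from this commit; the environment variables are the ones shown in the hunk above):

```shell
# -p HOST:CONTAINER — publish container port 3000 on host port 3000.
# "chat-ui" is a placeholder image name; substitute your built or pulled tag.
docker run \
  -p 3000:3000 \
  -e MONGODB_URL=mongodb://host.docker.internal:27017 \
  -e OPENAI_BASE_URL=https://router.huggingface.co/v1 \
  -e OPENAI_API_KEY=hf_*** \
  chat-ui

# If you do use a bare "-p 3000", look up the randomly assigned host port with:
#   docker port <container-name> 3000
```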
@@ -128,7 +128,7 @@ This build does not use the `MODELS` env var or GGUF discovery. Configure models
 
 ### LLM Router (Optional)
 
-Chat UI can perform client-side routing [katanemo/Arch-Router-1.5B](https://huggingface.co/katanemo/Arch-Router-1.5B) as the routing model without running a separate router service. The UI exposes a virtual model alias called "Omni" (configurable) that, when selected, chooses the best route/model for each message.
+Chat UI can perform server-side smart routing using [katanemo/Arch-Router-1.5B](https://huggingface.co/katanemo/Arch-Router-1.5B) as the routing model without running a separate router service. The UI exposes a virtual model alias called "Omni" (configurable) that, when selected, chooses the best route/model for each message.
 
 - Provide a routes policy JSON via `LLM_ROUTER_ROUTES_PATH`. No sample file ships with this branch, so you must point the variable to a JSON array you create yourself (for example, commit one in your project like `config/routes.chat.json`). Each route entry needs `name`, `description`, `primary_model`, and optional `fallback_models`.
 - Configure the Arch router selection endpoint with `LLM_ROUTER_ARCH_BASE_URL` (OpenAI-compatible `/chat/completions`) and `LLM_ROUTER_ARCH_MODEL` (e.g. `router/omni`). The Arch call reuses `OPENAI_API_KEY` for auth.
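The bullets above name the required route fields. A minimal sketch of such a routes policy file follows; the route names, descriptions, and model IDs are illustrative assumptions, not shipped by this commit:

```json
[
  {
    "name": "coding",
    "description": "Programming help, debugging, and code generation",
    "primary_model": "Qwen/Qwen2.5-Coder-32B-Instruct",
    "fallback_models": ["meta-llama/Llama-3.3-70B-Instruct"]
  },
  {
    "name": "general",
    "description": "Everyday conversation and general questions",
    "primary_model": "meta-llama/Llama-3.3-70B-Instruct"
  }
]
```

Point `LLM_ROUTER_ROUTES_PATH` at a file like this; the router model picks a route by matching each message against the `description` fields, then the UI sends the request to that route's `primary_model`, falling back to `fallback_models` if present.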
