release: v2.3.6 — configurable ComfyUI host + OpenAI-compat local CORS fix
Add
- Configurable ComfyUI host (Settings → ComfyUI → Host). Point LU at a
remote ComfyUI (Docker, LAN, homelab). Local controls stay visible
when the host resolves to localhost/127.0.0.1/::1/0.0.0.0; remote
hosts hide Start/Stop/Restart/Install/Path since LU can't manage a
remote Python process. Mobile Remote proxy also honors the host.
Requested in Discussion #1 by @ShoaibSajid. +17 regression tests.
Fix
- ComfyUI port now persists across restarts. `set_comfyui_port` wrote
  to `config.json`, but `AppState::new()` never read it back.
- OpenAI-compat local backends (LM Studio, vLLM, llama.cpp server,
KoboldCpp, Jan, GPT4All, oobabooga, Aphrodite, SGLang, TGI, LocalAI,
TabbyAPI) can actually be reached from LU's Tauri webview.
openai-provider.ts used plain fetch() which CORS-blocks localhost
inside the WebView2; Settings "Test" always showed Failed and
models never appeared in the dropdown. Each HTTP call now picks
localFetch/localFetchStream when the baseUrl hostname is local.
Cloud providers (OpenAI, Anthropic, Groq, etc.) still use plain fetch.
Surfaced during live E2E against a real LM Studio server on :1234.
Test suite 2166 → 2183 green.
CHANGELOG.md (15 additions, 0 deletions)
All notable changes to Locally Uncensored are documented here.

## [2.3.6] - 2026-04-21

### Added

- **Configurable ComfyUI host (remote ComfyUI support)** — Settings → ComfyUI → Host. Previously only the port was configurable; the host was hardcoded `localhost`, which meant users running ComfyUI in Docker, on a LAN machine, or on a headless homelab server couldn't point LU at it. The Host field accepts any hostname or IP. When the host resolves to the local machine (`localhost`, `127.0.0.1`, `::1`, `0.0.0.0`) the Start/Stop/Restart/Install/Path controls stay visible; when it's remote, LU hides those controls and shows an amber hint that you manage the Python process on the server yourself. Requested in GitHub Discussion #1 by @ShoaibSajid (desktop LU + Ollama on server-1 + ComfyUI on server-1 Docker). The mobile Remote proxy also honors the new host, so mobile-dispatched ComfyUI calls reach the configured backend. 17 regression tests in `backend-urls.test.ts`.
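The host check and URL helpers named in the regression tests could look roughly like this; a minimal sketch, assuming these signatures and the IPv6 bracket handling (the release notes only name `isComfyLocal`, `comfyuiUrl`, and `comfyuiWsUrl`, not their shapes):

```typescript
// Hosts the release notes treat as "the local machine".
const LOCAL_HOSTS = new Set(["localhost", "127.0.0.1", "::1", "0.0.0.0"]);

export function isComfyLocal(host: string): boolean {
  // Strip IPv6 brackets so "[::1]" also counts as local.
  return LOCAL_HOSTS.has(host.replace(/^\[|\]$/g, ""));
}

export function comfyuiUrl(host: string, port: number): string {
  // Bracket raw IPv6 literals so the resulting URL parses.
  const h = host.includes(":") && !host.startsWith("[") ? `[${host}]` : host;
  return `http://${h}:${port}`;
}

export function comfyuiWsUrl(host: string, port: number): string {
  return comfyuiUrl(host, port).replace(/^http/, "ws") + "/ws";
}
```

With helpers like these, the Start/Stop/Restart controls can be gated on `isComfyLocal(host)` while both local and remote hosts share the same URL builders.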

### Fixed

- **ComfyUI port now actually persists across restarts** — pre-existing bug: `set_comfyui_port` wrote `comfyui_port` to `%APPDATA%/locally-uncensored/config.json`, but `AppState::new()` never read it back on startup. Users who set a custom port (e.g. because 8188 was taken) got their change reverted to 8188 on the next launch. New `load_comfy_config_values()` helper runs at startup and applies persisted port + host. Bundled with the host feature since they share the same config-load path.
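The bug is the classic write-without-read-back pattern. A minimal sketch of the startup read-back, in TypeScript for illustration (the actual helper is the Rust `load_comfy_config_values()`, so the names, shape, and defaults here are assumptions):

```typescript
import * as fs from "node:fs";

interface ComfyConfig {
  comfyui_port?: number;
  comfyui_host?: string;
}

// Run once at startup: apply persisted values, falling back to the
// shipped defaults when the file or a key is missing.
export function loadComfyConfigValues(path: string): Required<ComfyConfig> {
  let saved: ComfyConfig = {};
  try {
    saved = JSON.parse(fs.readFileSync(path, "utf8"));
  } catch {
    // First run: no config file yet, keep defaults.
  }
  return {
    comfyui_port: saved.comfyui_port ?? 8188,
    comfyui_host: saved.comfyui_host ?? "localhost",
  };
}
```

The point of the fix is that the same code path that writes settings now has a matching loader invoked during state construction, so a custom port survives a restart.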
- **OpenAI-compat local backends (LM Studio, vLLM, llama.cpp server, KoboldCpp, oobabooga, Jan, GPT4All, Aphrodite, SGLang, TGI, LocalAI, TabbyAPI) can actually be reached from LU's Tauri webview** — `openai-provider.ts` used plain `fetch()` for `/v1/models`, `/v1/chat/completions`, and `checkConnection`, which CORS-blocks localhost requests inside the Tauri WebView (only Ollama had CORS open). The "Test" button in Settings → Providers always showed **Failed** and models never appeared in the dropdown even when the backend was obviously reachable via curl. Fix: each HTTP call now picks `localFetch`/`localFetchStream` when the provider baseUrl hostname is local (`localhost`/`127.0.0.1`/`::1`/`0.0.0.0`), which routes through the Rust proxy with a direct-fetch fallback. Cloud endpoints (OpenAI proper, OpenRouter, Groq, Together, DeepSeek, Mistral, etc.) skip the proxy since they don't have the localhost CORS issue. Surfaced during v2.3.6 live E2E against a real LM Studio server on :1234; the Djoks auto-detection fix (v2.3.5) was detecting + pre-enabling the provider correctly, but the actual /v1/* calls were silently CORS-rejected.
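The local-vs-cloud routing described above can be sketched as a small selector. `localFetch`/`localFetchStream` are named in the release notes, but this helper, its name, and the exact hostname handling are assumptions:

```typescript
const LOCAL_HOSTNAMES = new Set(["localhost", "127.0.0.1", "::1", "0.0.0.0"]);

type FetchLike = (input: string, init?: RequestInit) => Promise<Response>;

export function isLocalBaseUrl(baseUrl: string): boolean {
  try {
    // WHATWG URL keeps IPv6 brackets in hostname, so strip them first.
    const host = new URL(baseUrl).hostname.replace(/^\[|\]$/g, "");
    return LOCAL_HOSTNAMES.has(host);
  } catch {
    return false; // unparsable baseUrl: treat as remote, use plain fetch
  }
}

export function pickFetch(baseUrl: string, localFetch: FetchLike): FetchLike {
  // Local backends (LM Studio, vLLM, ...) go through the proxying fetch
  // to dodge the WebView CORS block; cloud providers keep plain fetch.
  return isLocalBaseUrl(baseUrl) ? localFetch : fetch;
}
```

Each of `/v1/models`, `/v1/chat/completions`, and the connection check would then call whatever `pickFetch` returns for the configured baseUrl, so cloud providers are untouched.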

### Changed

- Test suite 2166 → 2183 green (+17 regression tests for `setComfyHost` / `isComfyLocal` / `comfyuiUrl` / `comfyuiWsUrl` with custom hosts).

### Notes

- Drop-in upgrade from v2.3.5. No breaking changes. The default host is still `localhost` — existing users see zero behavior change unless they explicitly switch to a remote host.