Commit 65d2587

release: v2.3.6 — configurable ComfyUI host + OpenAI-compat local CORS fix
Add
- Configurable ComfyUI host (Settings → ComfyUI → Host). Point LU at a remote ComfyUI (Docker, LAN, homelab). Local controls stay visible when the host resolves to localhost/127.0.0.1/::1/0.0.0.0; remote hosts hide Start/Stop/Restart/Install/Path, since LU can't manage a remote Python process. The Mobile Remote proxy also honors the host. Requested in Discussion #1 by @ShoaibSajid. +17 regression tests.

Fix
- ComfyUI port now persists across restarts: `set_comfyui_port` wrote to config.json, but `AppState::new()` never read it back.
- OpenAI-compat local backends (LM Studio, vLLM, llama.cpp server, KoboldCpp, Jan, GPT4All, oobabooga, Aphrodite, SGLang, TGI, LocalAI, TabbyAPI) can now actually be reached from LU's Tauri webview. `openai-provider.ts` used plain `fetch()`, which CORS-blocks localhost inside WebView2, so the Settings "Test" button always showed Failed and models never appeared in the dropdown. Each HTTP call now picks `localFetch`/`localFetchStream` when the baseUrl hostname is local; cloud providers (OpenAI, Anthropic, Groq, etc.) still use plain `fetch`. Surfaced during live E2E against a real LM Studio server on :1234.

Test suite: 2166 → 2183 green.
1 parent dbb4e96 commit 65d2587

13 files changed

Lines changed: 394 additions & 33 deletions

CHANGELOG.md

Lines changed: 15 additions & 0 deletions
```diff
@@ -2,6 +2,21 @@
 
 All notable changes to Locally Uncensored are documented here.
 
+## [2.3.6] - 2026-04-21
+
+### Added
+- **Configurable ComfyUI host (remote ComfyUI support)** — Settings → ComfyUI → Host. Previously only the port was configurable; the host was hardcoded `localhost`, which meant users running ComfyUI in Docker, on a LAN machine, or on a headless homelab server couldn't point LU at it. The Host field accepts any hostname or IP. When the host resolves to the local machine (`localhost`, `127.0.0.1`, `::1`, `0.0.0.0`) the Start/Stop/Restart/Install/Path controls stay visible; when it's remote, LU hides those controls and shows an amber hint that you manage the Python process on the server yourself. Requested in GitHub Discussion #1 by @ShoaibSajid (desktop LU + Ollama on server-1 + ComfyUI on server-1 docker). The mobile Remote proxy also honors the new host so mobile-dispatched ComfyUI calls reach the configured backend. 17 regression tests in `backend-urls.test.ts`.
+
+### Fixed
+- **ComfyUI port now actually persists across restarts** — pre-existing bug: `set_comfyui_port` wrote `comfyui_port` to `%APPDATA%/locally-uncensored/config.json`, but `AppState::new()` never read it back on startup. Users who set a custom port (e.g. because 8188 was taken) got their change reverted to 8188 on the next launch. A new `load_comfy_config_values()` helper runs at startup and applies the persisted port + host. Bundled with the host feature since they share the same config-load path.
+- **OpenAI-compat local backends (LM Studio, vLLM, llama.cpp server, KoboldCpp, oobabooga, Jan, GPT4All, Aphrodite, SGLang, TGI, LocalAI, TabbyAPI) can actually be reached from LU's Tauri webview** — `openai-provider.ts` used plain `fetch()` for `/v1/models`, `/v1/chat/completions`, and `checkConnection`, which CORS-blocks localhost requests inside the Tauri WebView (only Ollama had CORS open). The "Test" button in Settings → Providers always showed **Failed** and models never appeared in the dropdown, even when the backend was obviously reachable via curl. Fix: each HTTP call now picks `localFetch`/`localFetchStream` when the provider baseUrl hostname is local (`localhost`/`127.0.0.1`/`::1`/`0.0.0.0`), which routes through the Rust proxy with a direct-fetch fallback. Cloud endpoints (OpenAI proper, OpenRouter, Groq, Together, DeepSeek, Mistral, etc.) skip the proxy since they don't have the localhost CORS issue. Surfaced during v2.3.6 live E2E against a real LM Studio server on :1234; the Djoks auto-detection fix (v2.3.5) was detecting + pre-enabling the provider correctly, but the actual /v1/* calls were silently CORS-rejected.
+
+### Changed
+- Test suite 2166 → 2183 green (+17 regression tests for `setComfyHost` / `isComfyLocal` / `comfyuiUrl` / `comfyuiWsUrl` with custom hosts).
+
+### Notes
+- Drop-in upgrade from v2.3.5. No breaking changes. The default host is still `localhost` — existing users see zero behavior change unless they explicitly switch to a remote host.
+
 ## [2.3.5] - 2026-04-21
 
 ### Fixed
```
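The changelog entry above names `localFetch`/`localFetchStream`, but the `openai-provider.ts` diff is not rendered in this commit view. A minimal TypeScript sketch of the routing decision it describes — the helper names, the `typeof fetch` transport shape, and the fallback-to-plain-`fetch` behavior here are assumptions for illustration, not the actual LU code:

```typescript
// Hypothetical sketch — mirrors the hostname set used by the Rust is_local_host.
const LOCAL_HOSTNAMES = new Set(["localhost", "127.0.0.1", "::1", "0.0.0.0"]);

/** True when a provider baseUrl points at the local machine. */
export function isLocalBaseUrl(baseUrl: string): boolean {
  try {
    // URL keeps IPv6 hosts bracketed (e.g. "[::1]") — strip for comparison.
    const host = new URL(baseUrl).hostname.replace(/^\[|\]$/g, "");
    return LOCAL_HOSTNAMES.has(host.toLowerCase());
  } catch {
    return false; // unparsable baseUrl: treat as remote, use plain fetch
  }
}

/** Pick the CORS-safe transport for local backends, plain fetch otherwise. */
export function pickFetch(
  baseUrl: string,
  localFetch: typeof fetch, // assumed fetch-compatible proxied transport
): typeof fetch {
  return isLocalBaseUrl(baseUrl) ? localFetch : fetch.bind(globalThis);
}
```

Treating an unparsable baseUrl as remote is the conservative choice: plain `fetch` fails loudly, while mis-routing a cloud URL through the local proxy would be harder to debug.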

package.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 {
   "name": "locally-uncensored",
-  "version": "2.3.5",
+  "version": "2.3.6",
   "private": false,
   "description": "Generate anything — text, images, video. Locally. Uncensored.",
   "license": "AGPL-3.0",
```

src-tauri/Cargo.lock

Lines changed: 1 addition & 1 deletion
Some generated files are not rendered by default.

src-tauri/Cargo.toml

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,6 +1,6 @@
 [package]
 name = "locally-uncensored"
-version = "2.3.5"
+version = "2.3.6"
 description = "Private, local AI chat & image/video generation"
 authors = ["purpledoubled"]
 edition = "2021"
```

src-tauri/src/commands/process.rs

Lines changed: 94 additions & 6 deletions
```diff
@@ -235,6 +235,19 @@ pub fn start_ollama(_state: State<'_, AppState>) -> Result<serde_json::Value, St
 
 #[tauri::command]
 pub fn start_comfyui(state: State<'_, AppState>) -> Result<serde_json::Value, String> {
+    // If user pointed LU at a remote ComfyUI, we have no local process to spawn.
+    // Just report status — the remote side is responsible for running ComfyUI.
+    {
+        let host = state.comfy_host.lock().unwrap().clone();
+        if !is_local_host(&host) {
+            return Ok(serde_json::json!({
+                "status": "remote",
+                "host": host,
+                "message": "Remote ComfyUI — manage the Python process on the server itself"
+            }));
+        }
+    }
+
     let port = *state.comfy_port.lock().unwrap();
 
     if is_comfyui_running_on_port(port) {
@@ -339,11 +352,22 @@ pub fn stop_comfyui(state: State<'_, AppState>) -> Result<serde_json::Value, Str
 #[tauri::command]
 pub async fn comfyui_status(state: State<'_, AppState>) -> Result<serde_json::Value, String> {
     let port = *state.comfy_port.lock().unwrap();
-
-    let running = reqwest::get(format!("http://localhost:{}/system_stats", port))
-        .await
-        .map(|r| r.status().is_success())
-        .unwrap_or(false);
+    let host = state.comfy_host.lock().unwrap().clone();
+    let is_local = is_local_host(&host);
+
+    // Probe the configured host (not just localhost). Remote ComfyUI
+    // still reports running: true if the /system_stats endpoint responds.
+    let running = match reqwest::Client::builder()
+        .timeout(std::time::Duration::from_secs(3))
+        .build()
+    {
+        Ok(client) => client
+            .get(format!("http://{}:{}/system_stats", host, port))
+            .send()
+            .await
+            .map(|r| r.status().is_success())
+            .unwrap_or(false),
+        Err(_) => false,
+    };
 
     let process_alive = {
         let proc = state.comfy_process.lock().unwrap();
@@ -355,18 +379,32 @@ pub async fn comfyui_status(state: State<'_, AppState>) -> Result<serde_json::Va
         p.clone()
     };
 
-    let found = path.is_some() || find_comfyui_path().is_some();
+    // For remote hosts we don't care whether a local install path exists.
+    let found = if is_local {
+        path.is_some() || find_comfyui_path().is_some()
+    } else {
+        true // the remote side handles its own install
+    };
 
     Ok(serde_json::json!({
         "running": running,
         "starting": process_alive && !running,
         "found": found,
         "path": path,
         "port": port,
+        "host": host,
+        "isLocal": is_local,
         "processAlive": process_alive,
     }))
 }
 
+/// Returns true when `host` refers to the local machine.
+/// Anything else = remote and LU won't try to manage the process.
+pub fn is_local_host(host: &str) -> bool {
+    let h = host.trim().to_ascii_lowercase();
+    matches!(h.as_str(), "localhost" | "127.0.0.1" | "::1" | "0.0.0.0" | "")
+}
+
 #[tauri::command]
 pub fn find_comfyui() -> Result<serde_json::Value, String> {
     match find_comfyui_path() {
@@ -410,6 +448,47 @@ pub fn set_comfyui_path(path: String, state: State<'_, AppState>) -> Result<serd
     Ok(serde_json::json!({"status": "saved", "path": path}))
 }
 
+#[tauri::command]
+pub fn set_comfyui_host(host: String, state: State<'_, AppState>) -> Result<serde_json::Value, String> {
+    let trimmed = host.trim();
+    if trimmed.is_empty() {
+        return Err("Host must not be empty".to_string());
+    }
+    // Reject obviously invalid chars — helps avoid URL-injection style typos.
+    if trimmed.contains('/') || trimmed.contains(' ') || trimmed.contains('?') {
+        return Err("Host must be a plain hostname or IP, no slashes/spaces".to_string());
+    }
+    let final_host = trimmed.to_string();
+
+    {
+        let mut h = state.comfy_host.lock().unwrap();
+        *h = final_host.clone();
+    }
+
+    // Persist to config file
+    if let Some(config_dir) = dirs::config_dir() {
+        let app_config = config_dir.join("locally-uncensored");
+        let _ = fs::create_dir_all(&app_config);
+        let config_file = app_config.join("config.json");
+
+        let mut config: serde_json::Value = if config_file.exists() {
+            fs::read_to_string(&config_file)
+                .ok()
+                .and_then(|s| serde_json::from_str(&s).ok())
+                .unwrap_or_else(|| serde_json::json!({}))
+        } else {
+            serde_json::json!({})
+        };
+
+        config["comfyui_host"] = serde_json::json!(final_host);
+        let _ = fs::write(&config_file, serde_json::to_string_pretty(&config).unwrap());
+    }
+
+    let is_local = is_local_host(&final_host);
+    println!("[ComfyUI] Host set to {} (local={})", final_host, is_local);
+    Ok(serde_json::json!({"status": "saved", "host": final_host, "isLocal": is_local}))
+}
+
 #[tauri::command]
 pub fn set_comfyui_port(port: u16, state: State<'_, AppState>) -> Result<serde_json::Value, String> {
     if port == 0 {
@@ -477,6 +556,15 @@ pub fn auto_start_ollama(_state: &AppState) {
 
 /// Auto-start ComfyUI on app launch (called from setup)
 pub fn auto_start_comfyui(state: &AppState) {
+    // If user configured a remote host, don't try to auto-start anything locally.
+    {
+        let host = state.comfy_host.lock().unwrap().clone();
+        if !is_local_host(&host) {
+            println!("[ComfyUI] Remote host configured ({}), skipping local auto-start", host);
+            return;
+        }
+    }
+
     // Always try to find and store the ComfyUI path (needed for downloads)
     if state.comfy_path.lock().unwrap().is_none() {
         if let Some(path) = find_comfyui_path() {
```
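The Rust `is_local_host` above has frontend counterparts that the changelog's 17 regression tests exercise (`setComfyHost`, `isComfyLocal`, `comfyuiUrl`, `comfyuiWsUrl`). A hedged TypeScript sketch of what such helpers might look like — the real `backend-urls.test.ts` module's state handling and signatures may differ:

```typescript
// Illustrative module-level state; LU may store these in a settings store instead.
let comfyHost = "localhost";
const comfyPort = 8188;

export function setComfyHost(host: string): void {
  comfyHost = host.trim() || "localhost"; // empty input falls back to the default
}

/** Mirrors the Rust is_local_host check. */
export function isComfyLocal(): boolean {
  const h = comfyHost.toLowerCase();
  return h === "localhost" || h === "127.0.0.1" || h === "::1" || h === "0.0.0.0";
}

export function comfyuiUrl(path = ""): string {
  // IPv6 literals must be bracketed inside a URL authority.
  const host = comfyHost.includes(":") ? `[${comfyHost}]` : comfyHost;
  return `http://${host}:${comfyPort}${path}`;
}

export function comfyuiWsUrl(): string {
  const host = comfyHost.includes(":") ? `[${comfyHost}]` : comfyHost;
  return `ws://${host}:${comfyPort}/ws`;
}
```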

src-tauri/src/commands/remote.rs

Lines changed: 11 additions & 3 deletions
```diff
@@ -61,6 +61,10 @@ struct RemoteState {
     passcode: Arc<TokioMutex<PasscodeState>>,
     ollama_port: u16,
     comfy_port: u16,
+    /// Configurable ComfyUI host — mirrors AppState.comfy_host so the mobile
+    /// proxy forwards to the right machine when the user pointed LU at a
+    /// remote ComfyUI instance.
+    comfy_host: String,
     permissions: Arc<TokioMutex<RemotePermissions>>,
     connected_devices: Arc<TokioMutex<Vec<ConnectedDevice>>>,
     tunnel_url: Arc<TokioMutex<Option<String>>>,
@@ -688,7 +692,7 @@ async fn proxy_comfyui(
     }
 
     let query = req.uri().query().map(|q| format!("?{}", q)).unwrap_or_default();
-    let target = format!("http://127.0.0.1:{}{}{}", state.comfy_port, stripped_owned, query);
+    let target = format!("http://{}:{}{}{}", state.comfy_host, state.comfy_port, stripped_owned, query);
     proxy_to_target(&target, req).await
 }
 
@@ -758,10 +762,11 @@ async fn proxy_comfyui_ws(
         }
     }
     let comfy_port = state.comfy_port;
+    let comfy_host = state.comfy_host.clone();
     ws.on_upgrade(move |client_socket| async move {
         use futures_util::{SinkExt, StreamExt};
 
-        let ws_url = format!("ws://127.0.0.1:{}/ws", comfy_port);
+        let ws_url = format!("ws://{}:{}/ws", comfy_host, comfy_port);
         let upstream = match tokio_tungstenite::connect_async(&ws_url).await {
             Ok((stream, _)) => stream,
             Err(e) => {
@@ -3307,7 +3312,7 @@ pub async fn start_remote_server(
     system_prompt: Option<String>,
 ) -> Result<serde_json::Value, String> {
     // Clone Arcs from std::sync::Mutex, then drop it before any .await
-    let (jwt_secret_arc, passcode_arc, permissions_arc, devices_arc, tunnel_url_arc, dispatched_model_arc, dispatched_system_prompt_arc, port, comfy_port) = {
+    let (jwt_secret_arc, passcode_arc, permissions_arc, devices_arc, tunnel_url_arc, dispatched_model_arc, dispatched_system_prompt_arc, port, comfy_port, comfy_host) = {
        let remote = state.remote.lock().map_err(|e| e.to_string())?;
        if remote.handle.is_some() {
            return Err("Remote server already running".into());
@@ -3316,6 +3321,7 @@ pub async fn start_remote_server(
        // unwrap on a poisoned mutex would terminate the entire app. Treat
        // a missing comfy_port as a non-fatal "no comfy yet" (port 0).
        let comfy_port = state.comfy_port.lock().map(|g| *g).unwrap_or(0);
+       let comfy_host = state.comfy_host.lock().map(|g| g.clone()).unwrap_or_else(|_| "localhost".to_string());
 
        (
            remote.jwt_secret.clone(),
@@ -3327,6 +3333,7 @@ pub async fn start_remote_server(
            remote.dispatched_system_prompt.clone(),
            remote.port,
            comfy_port,
+           comfy_host,
        )
    }; // std::sync::MutexGuard dropped here
 
@@ -3368,6 +3375,7 @@ pub async fn start_remote_server(
         passcode: passcode_arc,
         ollama_port: 11434,
         comfy_port,
+        comfy_host,
         permissions: permissions_arc,
         connected_devices: devices_arc,
         tunnel_url: tunnel_url_arc,
```

src-tauri/src/main.rs

Lines changed: 1 addition & 0 deletions
```diff
@@ -29,6 +29,7 @@ fn main() {
         commands::process::find_comfyui,
         commands::process::set_comfyui_path,
         commands::process::set_comfyui_port,
+        commands::process::set_comfyui_host,
         // Installation
         commands::install::install_comfyui,
         commands::install::install_comfyui_status,
```

src-tauri/src/state.rs

Lines changed: 46 additions & 1 deletion
```diff
@@ -41,10 +41,43 @@ impl Default for InstallState {
     }
 }
 
+/// Read persisted ComfyUI port + host from %APPDATA%/locally-uncensored/config.json.
+/// Returns (port, host) with sensible defaults (8188, "localhost") on any error.
+/// Called at startup so user-configured values survive app restarts.
+pub(crate) fn load_comfy_config_values() -> (u16, String) {
+    let mut port = 8188u16;
+    let mut host = "localhost".to_string();
+
+    if let Some(config_dir) = dirs::config_dir() {
+        let config_file = config_dir.join("locally-uncensored").join("config.json");
+        if let Ok(raw) = std::fs::read_to_string(&config_file) {
+            if let Ok(v) = serde_json::from_str::<serde_json::Value>(&raw) {
+                if let Some(p) = v.get("comfyui_port").and_then(|x| x.as_u64()) {
+                    if p > 0 && p < 65536 {
+                        port = p as u16;
+                    }
+                }
+                if let Some(h) = v.get("comfyui_host").and_then(|x| x.as_str()) {
+                    let trimmed = h.trim();
+                    if !trimmed.is_empty() {
+                        host = trimmed.to_string();
+                    }
+                }
+            }
+        }
+    }
+
+    (port, host)
+}
+
 pub struct AppState {
     pub comfy_process: Mutex<Option<Child>>,
     pub comfy_path: Mutex<Option<String>>,
     pub comfy_port: Mutex<u16>,
+    /// Configurable ComfyUI host. Default "localhost". Setting this to a
+    /// remote hostname/IP lets users point LU at a ComfyUI running on
+    /// another machine (homelab, Docker, LAN). Persisted in config.json.
+    pub comfy_host: Mutex<String>,
     pub whisper: Arc<Mutex<WhisperServer>>,
     pub downloads: Arc<Mutex<HashMap<String, DownloadProgress>>>,
     pub download_tokens: Arc<Mutex<HashMap<String, CancellationToken>>>,
@@ -66,10 +99,22 @@ impl AppState {
         let python_bin = get_python_bin();
         println!("[Python] Resolved: {}", python_bin);
 
+        // Load persisted ComfyUI port+host from config.json if available.
+        // Fixes a pre-existing bug where `set_comfyui_port` wrote to disk but
+        // startup never read it back. Same loader now handles the new host field.
+        let (initial_port, initial_host) = load_comfy_config_values();
+        if initial_port != 8188 {
+            println!("[ComfyUI] Loaded persisted port: {}", initial_port);
+        }
+        if initial_host != "localhost" {
+            println!("[ComfyUI] Loaded persisted host: {}", initial_host);
+        }
+
         Self {
             comfy_process: Mutex::new(None),
             comfy_path: Mutex::new(None),
-            comfy_port: Mutex::new(8188),
+            comfy_port: Mutex::new(initial_port),
+            comfy_host: Mutex::new(initial_host),
             whisper: Arc::new(Mutex::new(WhisperServer::new())),
             downloads: Arc::new(Mutex::new(HashMap::new())),
             download_tokens: Arc::new(Mutex::new(HashMap::new())),
```
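With both fixes in place, the file `load_comfy_config_values()` reads at startup might contain a fragment like this (illustrative values; `comfyui_port` and `comfyui_host` are the only keys the loader shown above consumes):

```json
{
  "comfyui_port": 8288,
  "comfyui_host": "192.168.1.50"
}
```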

src-tauri/tauri.conf.json

Lines changed: 1 addition & 1 deletion
```diff
@@ -1,7 +1,7 @@
 {
   "$schema": "https://raw.githubusercontent.com/tauri-apps/tauri/dev/crates/tauri-cli/schema.json",
   "productName": "Locally Uncensored",
-  "version": "2.3.5",
+  "version": "2.3.6",
   "identifier": "com.purpledoubled.locally-uncensored",
   "build": {
     "beforeBuildCommand": "npm run build",
```
