chore: prevent setup.bat from luring end-users into dev mode (#30)
Reported as GitHub issue #30 by @EnotikSergo: they ran setup.bat expecting
it to install the LU desktop app, ended up with `npm run dev` in
Opera/Chrome, and saw `[vite] http proxy error: /system_stats ECONNREFUSED`
from a ComfyUI that was never installed. The README described setup.bat
as "Windows One-Click Setup", which reads as an end-user install.
Fixed across three scripts plus the README:
- setup.bat, setup.ps1, setup.sh now start with a prominent yellow
banner explaining this is dev-mode (Vite + browser, fewer features)
and linking to the GitHub Releases installer. Each script prompts
Y/N before continuing so an accidental end-user can bail.
- README's "Windows One-Click Setup" section was reframed as
"For Contributors — Dev-Mode Setup" with an explicit "Just want to
use the app? Download the installer above" blurb.
Folded into v2.3.5 release alongside the LM Studio auto-detection fix
and the Windows subprocess CREATE_NO_WINDOW cleanup.
Co-Authored-By: Claude Opus 4.7 (1M context) <noreply@anthropic.com>
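The LM Studio auto-detection fix folded into this release boils down to one rule change in `AppShell`'s post-onboarding detection. A minimal TypeScript sketch, with assumed names and types (`Backend` and `pickBackendToPreEnable` are illustrative, not the real AppShell code):

```typescript
// Illustrative types; the real AppShell detection code differs.
type Backend = { id: string; isOllama: boolean };

// Old rule: pre-enable only when detected.length === 1, so the common
// Ollama + LM Studio pair left nothing enabled behind the selector modal.
// Fixed rule: always pre-enable the first non-Ollama backend; Ollama is
// left untouched because it has its own provider slot.
function pickBackendToPreEnable(detected: Backend[]): string | null {
  const firstNonOllama = detected.find((b) => !b.isOllama);
  return firstNonOllama ? firstNonOllama.id : null;
}
```

With `[ollama, lmstudio]` detected, the fixed rule pre-enables `lmstudio`, so dismissing the selector no longer leaves the model dropdown empty.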
CHANGELOG.md (1 addition, 0 deletions)
@@ -7,6 +7,7 @@ All notable changes to Locally Uncensored are documented here.
 ### Fixed
 - **LM Studio (and other openai-compat backends) now show up when Ollama is also running** — `AppShell`'s post-onboarding detection only auto-enabled a backend when exactly one was detected. With two or more (the very common Ollama + LM Studio setup) it showed the `BackendSelector` modal but pre-enabled nothing. Users who dismissed the modal saw zero LM Studio models in the chat dropdown even though LM Studio was clearly running — looked from the outside like "LU doesn't recognize my models". Reported via Discord `#help-chat` on 2026-04-21. Fix: the first non-Ollama detected backend is always pre-enabled (Ollama is left untouched since it has its own provider slot); the selector stays as an educational picker so you can change which openai-compat backend is primary. Reproduced live with a mock LM Studio endpoint on port 1234 with Ollama also running, verified the fix against the same setup on the release binary. Five regression tests in `AppShell-backend-autoenable.test.ts`.
 - **No more terminal flashes on Windows when LU kills subprocesses** — two Windows-branch `Command::new` spawns were missing `CREATE_NO_WINDOW`: the `taskkill` calls in `AppState::Drop` that tear down ComfyUI + Claude Code process trees on LU shutdown, and the `docker pull` / `docker run` in `search.rs` that installs SearXNG. Both briefly flashed a console window at the user. Now 100% of Windows-branch subprocess spawns carry the flag. LU itself never spawns LM Studio (only talks HTTP to a user-run instance), so the "no terminal when using LM Studio" guarantee was already true on that path; this tightens the peripheral surface.
+- **`setup.bat` / `setup.ps1` / `setup.sh` no longer mislead end-users into dev mode** — the scripts launched `npm run dev` (Vite + browser), which has fewer features than the installed Tauri app and produced confusing `[vite] http proxy error: /system_stats ECONNREFUSED` when ComfyUI wasn't installed yet. Reported via GitHub issue #30. Fix: all three setup scripts now start with a clear dev-mode banner, a link to the installer in Releases, and a one-key prompt to continue or exit. The README's "Windows One-Click Setup" section was also reframed as "For Contributors — Dev-Mode Setup" with an explicit pointer to the installer for end-users.

 ### Changed
 - Test suite 2161 → 2166 green (+5 regression tests for the backend-autoenable fix).
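The CREATE_NO_WINDOW cleanup described in the changelog entry above can be sketched as a small helper. This is a hedged illustration, not LU's actual code; `quiet_command` is an assumed name, and per the entry the real fix applies the flag inline at each Windows-branch spawn site:

```rust
use std::process::Command;

/// Build a `Command` that won't flash a console window on Windows.
/// Hypothetical helper sketching the fix, not the actual LU code.
fn quiet_command(program: &str) -> Command {
    let mut cmd = Command::new(program);
    #[cfg(windows)]
    {
        use std::os::windows::process::CommandExt;
        // CREATE_NO_WINDOW: the child process gets no console,
        // so nothing flashes on screen when it spawns.
        const CREATE_NO_WINDOW: u32 = 0x0800_0000;
        cmd.creation_flags(CREATE_NO_WINDOW);
    }
    cmd
}

fn main() {
    // e.g. the shutdown path: taskkill /PID <pid> /T /F for a process tree
    let cmd = quiet_command("taskkill");
    println!("program = {:?}", cmd.get_program());
}
```

On non-Windows targets the helper is a plain `Command::new`, since there is no console window to suppress there.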
README.md (8 additions, 4 deletions)
@@ -37,11 +37,12 @@ No cloud. No data collection. No API keys. Auto-detects 12 local backends. Your
 ## v2.3.5 — Current Release

-**LM Studio auto-detection fix + Windows terminal-popup cleanup, 2166 Tests**
+**LM Studio auto-detection fix + Windows terminal-popup cleanup + clearer setup scripts, 2166 Tests**

 ### Critical Fixes (why you want this update)
 - **LU now recognizes your LM Studio models when Ollama is also running** — if the first-launch detection found 2+ local backends (the very common "Ollama + LM Studio" setup), the backend selector modal opened but no provider got auto-enabled. Users who dismissed the modal saw zero LM Studio models in the dropdown even though LM Studio was clearly running. Fixed: the first non-Ollama backend is always pre-enabled; the selector stays as an educational picker so you can still switch primaries. Reproduced live and verified against a real LM Studio-like endpoint.
 - **No more terminal flashes on Windows** — a couple of subprocess spawns on the Windows code path were missing `CREATE_NO_WINDOW`, so killing ComfyUI/Claude Code during LU shutdown or installing SearXNG briefly flashed a console window. 100% of Windows-branch spawns are now flagged.
+- **`setup.bat` / `setup.sh` no longer lure end-users into dev mode** — previously advertised as "one-click setup" but actually launched `npm run dev` in a browser. Now each script opens with a clear dev-mode banner, a link to the installer, and a one-key prompt to continue or exit. README's setup section reframed for contributors only.

 ### What's still in v2.3.5 from v2.3.4
 This is a hotfix release — v2.3.4's feature surface (chat-history persistence, Ollama 0.21 compat, Codex loop guard, stop-button instant, stale-chip fix, 12 backend auto-detection, Mobile Remote, Codex streaming, Agent Mode, ERNIE-Image, Qwen 3.6, 75+ downloadable models) is unchanged. Every fix from v2.3.4 and earlier still applies.
@@ -186,15 +187,18 @@ npm install
 npm run dev
 ```

-### Windows One-Click Setup
+### For Contributors — Dev-Mode Setup
+
+> ⚠️ **Just want to use the app?** Grab the installer from [Releases](https://github.com/PurpleDoubleD/locally-uncensored/releases/latest) (the `.exe` or `.msi` in the **Download** section above). That gives you the full Tauri desktop app with auto-update. The commands below start LU in **browser dev-mode** — fewer features, Vite proxy noise, meant for contributing to the codebase.
setup.bat (13 additions, 2 deletions)
@@ -1,9 +1,20 @@
 @echo off
-title Locally Uncensored - Setup
+title Locally Uncensored - Dev Setup
 echo.
+powershell -NoProfile -Command ^
+"Write-Host '';Write-Host ' +---------------------------------------------------------------+' -F Yellow;Write-Host ' | This script starts Locally Uncensored in DEVELOPER mode. |' -F Yellow;Write-Host ' | It runs via Vite at http://localhost:5173 in your browser, |' -F Yellow;Write-Host ' | which has fewer features than the installed desktop app. |' -F Yellow;Write-Host ' | |' -F Yellow;Write-Host ' | Just want to USE the app? Download the installer instead: |' -F Yellow;Write-Host ' | https://github.com/PurpleDoubleD/locally-uncensored/releases |' -F Cyan;Write-Host ' +---------------------------------------------------------------+' -F Yellow;Write-Host ''"
+echo.
+choice /C YN /T 8 /D Y /M "Continue with developer setup? (Y/N, auto-Yes in 8s)"