
Commit 7702f24

PurpleDoubleD and claude committed
feat: v2.3.2 — GLM-4.7-Flash, model loading fix, agent badge audit
- Add GLM-4.7-Flash in 11 variants (4 Heretic uncensored + 7 mainstream, IQ2 to Q8)
- Add GLM 5.1 754B MoE as cloud-available model
- Fix 3 model loading bugs (race condition, broken retry, stale cache) — Discussion #22
- Audit all 75+ models for correct agent flags (tool calling capability)
- Remove HOT badges from all models, keep only AGENT badges
- Add agent flag to Qwen 3.5, Qwen3, GLM-4, GPT-OSS, Llama 3.1/3.3, Phi-4
- Think-mode guard for non-thinking models
- Chat homepage null crash fix
- Version bump to 2.3.2, README + landing page updated

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
1 parent 0509f8f commit 7702f24
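The "model loading fix" in this commit names three bugs: a race condition at startup, broken auto-retry logic, and a stale cache after download. A minimal sketch of the pattern such fixes usually take — an in-flight guard, bounded retries that treat an empty list as failure, and explicit cache invalidation — might look like this. Every name here is illustrative; none of it is taken from the repo's actual code.

```typescript
// Hypothetical sketch of the three fixes the commit message describes.
type Loader = () => string[];

let cached: string[] | null = null;
let inFlight = false; // guard: only one load runs at a time (race-condition fix)

function loadModels(load: Loader, maxAttempts = 3): string[] {
  if (cached) return cached;
  if (inFlight) throw new Error("load already in progress");
  inFlight = true;
  try {
    let lastErr: unknown = new Error("no attempt made");
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
      try {
        const models = load();
        if (models.length > 0) {
          // "0 models loaded" is treated as a failure, not a success
          cached = models;
          return models;
        }
        lastErr = new Error("empty model list");
      } catch (e) {
        lastErr = e; // keep retrying on error (broken-retry fix)
      }
    }
    throw lastErr;
  } finally {
    inFlight = false;
  }
}

function invalidateModelCache(): void {
  cached = null; // stale-cache fix: bust the cache after each download
}
```

The key design point is that an empty result counts as a failed attempt, so a loader that briefly returns nothing during startup gets retried instead of cached.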

26 files changed

Lines changed: 403 additions & 124 deletions

README.md

Lines changed: 19 additions & 13 deletions
@@ -34,23 +34,28 @@ No cloud. No data collection. No API keys. Auto-detects 12 local backends. Your
 
 ---
 
-## v2.3.0 — Current Release
+## v2.3.2 — Current Release
 
-**ComfyUI Plug & Play, 20 Model Bundles, Image-to-Image, Z-Image, FramePack I2V**
+**GLM-4.7-Flash, Model Loading Fix, Agent Badge Audit, 75+ Downloadable Models**
+
+- **GLM-4.7-Flash** — ZhipuAI's strongest 30B class model. 11 variants across uncensored (Heretic) and mainstream, IQ2 to Q8. Fits 12GB VRAM (IQ2_M). Native tool calling.
+- **GLM 5.1 754B MoE** — Frontier agentic engineering model listed as cloud-available via Ollama.
+- **Model Loading Fix** — Fixed 3 bugs causing "0 models loaded" in ComfyUI Create View (race condition at startup, broken auto-retry logic, stale cache after download). Models now load reliably within seconds.
+- **Agent Badge Audit** — Consistent agent flags across all 75+ models. Models with native tool calling are correctly marked.
+- **Removed HOT Badges** — Cleaner UI, only AGENT badges shown for tool-calling models.
+
+### v2.3.0 Features (included)
 
 - **ComfyUI Plug & Play** — Auto-detect, one-click install, auto-start. Zero config image and video generation.
-- **20 Model Bundles** — 8 image + 12 video bundles with one-click download. Verified models marked, untested show "Coming Soon".
-- **Z-Image Turbo/Base** — Uncensored image model. 8-15 seconds per image. No safety filters. Text-to-Image and Image-to-Image.
-- **FLUX 2 Klein** — Next-gen FLUX architecture with Qwen 3 text encoder. Fastest FLUX model.
-- **Image-to-Image (I2I)** — Upload a source image, adjust denoise strength (0.0-1.0), transform with any prompt. Works with all image models (SDXL, FLUX, Z-Image).
-- **Image-to-Video (I2V)** — FramePack F1 and CogVideoX support with drag & drop image upload.
-- **FramePack F1** — Revolutionary I2V: runs on 6 GB VRAM via next-frame prediction.
-- **Dynamic Workflow Builder** — 14 strategies. Auto-detects installed nodes and builds the correct pipeline.
-- **VRAM-Aware Model Filtering** — Lightweight / Mid-Range / High-End tabs based on GPU VRAM.
-- **Unified Download Manager** — Track all downloads with progress, speed, retry for failed files.
+- **20 Model Bundles** — 8 image + 12 video bundles with one-click download.
+- **Z-Image Turbo/Base** — Uncensored image model. 8-15 seconds per image. No safety filters.
+- **FLUX 2 Klein** — Next-gen FLUX architecture with Qwen 3 text encoder.
+- **Image-to-Image (I2I)** — Upload a source image, adjust denoise strength, transform with any image model.
+- **Image-to-Video (I2V)** — FramePack F1, CogVideoX, SVD with drag & drop image upload.
+- **Dynamic Workflow Builder** — 14 strategies. Auto-detects installed nodes.
+- **Unified Download Manager** — Track all downloads with progress, speed, retry.
 - **Think Mode in Chat Input** — Toggle thinking mode directly from the message input area.
 - **Process Cleanup** — ComfyUI auto-terminates when app is closed (Windows Job Object).
-- **Hardware-Aware Onboarding** — Recommends Gemma 4, Qwen 3.5, and other models based on your GPU VRAM.
 
 ---
 
@@ -176,9 +181,10 @@ Open the **Create** tab. ComfyUI is auto-detected or one-click installed. Models
 
 | Model | VRAM | Best For |
 |-------|------|----------|
+| **GLM-4.7-Flash IQ2** | 12 GB | Strongest 30B class. Tool calling. 198K context. |
 | **Gemma 4 E4B** | 4 GB | Lightweight, fast, great for small GPUs. |
 | **Qwen 3.5 9B** | 8 GB | Strongest reasoning + coding at 9B. |
-| **Gemma 4 27B** | 16 GB | Frontier dense model, native tools + vision. |
+| **Gemma 4 31B** | 16 GB | Frontier dense model, native tools + vision. |
 | **Qwen 3.5 35B MoE** | 16 GB | Best agentic, 256K context. SWE-bench leader. |
 | Hermes 3 8B | 6 GB | Agent Mode. Uncensored + tool calling. |
 | DeepSeek R1 (8B-70B) | 6-48 GB | Chain-of-thought reasoning. |

docs/index.html

Lines changed: 10 additions & 10 deletions
@@ -24,7 +24,7 @@
 <meta name="theme-color" content="#0a0a0a">
 <link rel="icon" href="./favicon.png">
 <script type="application/ld+json">
-{"@context":"https://schema.org","@type":"SoftwareApplication","name":"Locally Uncensored","applicationCategory":"DeveloperApplication","applicationSubCategory":"Artificial Intelligence","operatingSystem":"Windows","description":"Generate anything — text, images, video. Locally. Uncensored. The only desktop app that combines AI chat, image generation, and video creation in one. No cloud, no API keys, no data collection. Free and open source.","url":"https://locallyuncensored.com/","downloadUrl":"https://github.com/PurpleDoubleD/locally-uncensored/releases/download/v2.3.1/Locally.Uncensored_2.3.1_x64-setup.exe","softwareVersion":"2.3.0","license":"https://www.gnu.org/licenses/agpl-3.0.html","author":{"@type":"Person","name":"PurpleDoubleD","url":"https://github.com/PurpleDoubleD"},"offers":{"@type":"Offer","price":"0","priceCurrency":"USD"},"featureList":["Plug & Play Setup with 12 Local Backend Auto-Detection","Uncensored AI Chat with 20+ Provider Presets","Codex Coding Agent with 13 MCP Tools","Multi-Provider: Ollama, LM Studio, vLLM, KoboldCpp, llama.cpp, LocalAI, Jan, OpenAI, Anthropic, OpenRouter, Groq, and more","Image Generation via ComfyUI (FLUX 2, Z-Image, SDXL)","Image-to-Image with Denoise Control","Video Generation (Wan 2.1, HunyuanVideo, LTX, AnimateDiff, FramePack)","Image-to-Video (FramePack F1, CogVideoX, SVD)","Granular Permission System","File Upload with Vision Support","Thinking Mode","A/B Model Compare","100% Offline and Private"],"screenshot":"https://raw.githubusercontent.com/PurpleDoubleD/locally-uncensored/master/docs/social-preview.png","softwareRequirements":"Any OpenAI-compatible backend (Ollama, LM Studio, vLLM, KoboldCpp, etc.) or cloud API key","memoryRequirements":"8 GB RAM minimum","storageRequirements":"6 GB for default model"}
+{"@context":"https://schema.org","@type":"SoftwareApplication","name":"Locally Uncensored","applicationCategory":"DeveloperApplication","applicationSubCategory":"Artificial Intelligence","operatingSystem":"Windows","description":"Generate anything — text, images, video. Locally. Uncensored. The only desktop app that combines AI chat, image generation, and video creation in one. No cloud, no API keys, no data collection. Free and open source.","url":"https://locallyuncensored.com/","downloadUrl":"https://github.com/PurpleDoubleD/locally-uncensored/releases/download/v2.3.2/Locally.Uncensored_2.3.2_x64-setup.exe","softwareVersion":"2.3.0","license":"https://www.gnu.org/licenses/agpl-3.0.html","author":{"@type":"Person","name":"PurpleDoubleD","url":"https://github.com/PurpleDoubleD"},"offers":{"@type":"Offer","price":"0","priceCurrency":"USD"},"featureList":["Plug & Play Setup with 12 Local Backend Auto-Detection","Uncensored AI Chat with 20+ Provider Presets","Codex Coding Agent with 13 MCP Tools","Multi-Provider: Ollama, LM Studio, vLLM, KoboldCpp, llama.cpp, LocalAI, Jan, OpenAI, Anthropic, OpenRouter, Groq, and more","Image Generation via ComfyUI (FLUX 2, Z-Image, SDXL)","Image-to-Image with Denoise Control","Video Generation (Wan 2.1, HunyuanVideo, LTX, AnimateDiff, FramePack)","Image-to-Video (FramePack F1, CogVideoX, SVD)","Granular Permission System","File Upload with Vision Support","Thinking Mode","A/B Model Compare","100% Offline and Private"],"screenshot":"https://raw.githubusercontent.com/PurpleDoubleD/locally-uncensored/master/docs/social-preview.png","softwareRequirements":"Any OpenAI-compatible backend (Ollama, LM Studio, vLLM, KoboldCpp, etc.) or cloud API key","memoryRequirements":"8 GB RAM minimum","storageRequirements":"6 GB for default model"}
 </script>
 <script type="application/ld+json">
 {"@context":"https://schema.org","@type":"Organization","name":"PurpleDoubleD","url":"https://github.com/PurpleDoubleD","sameAs":["https://github.com/PurpleDoubleD","https://reddit.com/user/GroundbreakingMall54"]}
@@ -206,12 +206,12 @@
 </nav>
 
 <section class="hero">
-<div class="hero-badge">v2.3.1 / Open Source / AGPL-3.0</div>
+<div class="hero-badge">v2.3.2 / Open Source / AGPL-3.0</div>
 <h1>Your AI. Your machine.<br>No limits.</h1>
 <p>Chat, code, generate images, create videos — all running locally. Plug & Play with 12 local backends. 13 built-in tools, coding agent. No cloud, no accounts.</p>
 <p style="margin-top:8px"><a href="./guide/" style="color:#a78bfa;font-size:0.9rem">Getting Started Guide</a></p>
 <div class="hero-cta">
-<a href="https://github.com/PurpleDoubleD/locally-uncensored/releases/download/v2.3.1/Locally.Uncensored_2.3.1_x64-setup.exe" class="btn btn-primary" download>Download for Windows</a>
+<a href="https://github.com/PurpleDoubleD/locally-uncensored/releases/download/v2.3.2/Locally.Uncensored_2.3.2_x64-setup.exe" class="btn btn-primary" download>Download for Windows</a>
 <a href="https://github.com/PurpleDoubleD/locally-uncensored" class="btn" target="_blank">
 <svg width="16" height="16" viewBox="0 0 16 16" fill="currentColor"><path d="M8 0C3.58 0 0 3.58 0 8c0 3.54 2.29 6.53 5.47 7.59.4.07.55-.17.55-.38 0-.19-.01-.82-.01-1.49-2.01.37-2.53-.49-2.69-.94-.09-.23-.48-.94-.82-1.13-.28-.15-.68-.52-.01-.53.63-.01 1.08.58 1.23.82.72 1.21 1.87.87 2.33.66.07-.52.28-.87.51-1.07-1.78-.2-3.64-.89-3.64-3.95 0-.87.31-1.59.82-2.15-.08-.2-.36-1.02.08-2.12 0 0 .67-.21 2.2.82.64-.18 1.32-.27 2-.27.68 0 1.36.09 2 .27 1.53-1.04 2.2-.82 2.2-.82.44 1.1.16 1.92.08 2.12.51.56.82 1.27.82 2.15 0 3.07-1.87 3.75-3.65 3.95.29.25.54.73.54 1.48 0 1.07-.01 1.93-.01 2.2 0 .21.15.46.55.38A8.013 8.013 0 0016 8c0-4.42-3.58-8-8-8z"/></svg>
 View Source
@@ -226,10 +226,10 @@ <h1>Your AI. Your machine.<br>No limits.</h1>
 </section>
 
 <section class="install reveal" id="install">
-<p style="margin-bottom:1.5rem;font-size:1.1rem;color:#ededed;font-weight:600">Download v2.3.1</p>
+<p style="margin-bottom:1.5rem;font-size:1.1rem;color:#ededed;font-weight:600">Download v2.3.2</p>
 <div style="display:flex;gap:1rem;justify-content:center;margin-bottom:2rem;flex-wrap:wrap">
-<a href="https://github.com/PurpleDoubleD/locally-uncensored/releases/download/v2.3.1/Locally.Uncensored_2.3.1_x64-setup.exe" class="btn btn-primary" style="font-size:.9rem;padding:.6rem 1.5rem" download>Windows (.exe)</a>
-<a href="https://github.com/PurpleDoubleD/locally-uncensored/releases/tag/v2.3.1" class="btn" style="font-size:.9rem;padding:.6rem 1.5rem">All Downloads</a>
+<a href="https://github.com/PurpleDoubleD/locally-uncensored/releases/download/v2.3.2/Locally.Uncensored_2.3.2_x64-setup.exe" class="btn btn-primary" style="font-size:.9rem;padding:.6rem 1.5rem" download>Windows (.exe)</a>
+<a href="https://github.com/PurpleDoubleD/locally-uncensored/releases/tag/v2.3.2" class="btn" style="font-size:.9rem;padding:.6rem 1.5rem">All Downloads</a>
 </div>
 <p style="color:var(--text-tertiary);font-size:.8rem">Plug & Play — choose from 20+ providers. The setup wizard auto-detects 12 local backends (Ollama, LM Studio, vLLM, KoboldCpp, Jan, GPT4All, llama.cpp, and more) or configure cloud APIs in Settings. Other platforms: <a href="https://github.com/PurpleDoubleD/locally-uncensored#-quick-start" style="color:var(--text-secondary);border-bottom:1px solid var(--border)">build from source</a>.</p>
 </section>
@@ -326,9 +326,9 @@ <h3>Thinking Mode</h3>
 </section>
 
 <section class="agent-section reveal" id="whats-new">
-<div class="section-label">New in v2.3.1</div>
-<h2 class="section-heading">ComfyUI Plug & Play. 20 Model Bundles. I2I + I2V.</h2>
-<div class="section-desc">One-click ComfyUI install. 20 image + video bundles. Z-Image uncensored. FramePack I2V on 6 GB VRAM. Image-to-Image. Dynamic workflows. Hardware-aware onboarding.</div>
+<div class="section-label">New in v2.3.2</div>
+<h2 class="section-heading">GLM-4.7-Flash. 75+ Models. ComfyUI Plug & Play.</h2>
+<div class="section-desc">GLM-4.7-Flash in 11 variants. 75+ downloadable models with agent badges. Model loading fix. 20 image + video bundles. Z-Image uncensored. FramePack I2V. Image-to-Image.</div>
 
 <div class="agent-grid">
 <div class="agent-card">
@@ -509,7 +509,7 @@ <h2 class="section-heading">Guides and comparisons.</h2>
 <h2>Run your own AI stack. No limits.</h2>
 <p>Free, open source, and yours to keep.</p>
 <div class="cta-buttons">
-<a href="https://github.com/PurpleDoubleD/locally-uncensored/releases/download/v2.3.1/Locally.Uncensored_2.3.1_x64-setup.exe" class="btn btn-primary" download>Download for Windows</a>
+<a href="https://github.com/PurpleDoubleD/locally-uncensored/releases/download/v2.3.2/Locally.Uncensored_2.3.2_x64-setup.exe" class="btn btn-primary" download>Download for Windows</a>
 <a href="https://github.com/PurpleDoubleD/locally-uncensored" class="btn" target="_blank">View on GitHub</a>
 </div>
 </section>

package.json

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 {
   "name": "locally-uncensored",
-  "version": "2.3.1",
+  "version": "2.3.2",
   "private": false,
   "description": "Generate anything — text, images, video. Locally. Uncensored.",
   "license": "AGPL-3.0",

src-tauri/Cargo.lock

Lines changed: 1 addition & 1 deletion
Some generated files are not rendered by default.

src-tauri/Cargo.toml

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 [package]
 name = "locally-uncensored"
-version = "2.3.1"
+version = "2.3.2"
 description = "Private, local AI chat & image/video generation"
 authors = ["purpledoubled"]
 edition = "2021"

src-tauri/src/commands/mod.rs

Lines changed: 1 addition & 0 deletions
@@ -1,4 +1,5 @@
 pub mod agent;
+pub mod claude_code;
 pub mod download;
 pub mod filesystem;
 pub mod install;

src-tauri/src/main.rs

Lines changed: 7 additions & 0 deletions
@@ -68,6 +68,13 @@ fn main() {
             commands::search::search_status,
             commands::search::install_searxng,
             commands::search::searxng_status,
+            // Claude Code
+            commands::claude_code::detect_claude_code,
+            commands::claude_code::install_claude_code,
+            commands::claude_code::install_claude_code_status,
+            commands::claude_code::start_claude_code,
+            commands::claude_code::stop_claude_code,
+            commands::claude_code::send_claude_code_input,
             // Proxy
             commands::proxy::ollama_search,
             commands::proxy::fetch_external,

src-tauri/src/state.rs

Lines changed: 21 additions & 0 deletions
@@ -53,6 +53,9 @@ pub struct AppState {
     pub searxng_install: Mutex<InstallState>,
     pub searxng_available: AtomicBool,
     pub python_bin: String,
+    // Claude Code
+    pub claude_code_process: Mutex<Option<Child>>,
+    pub claude_code_install: Arc<Mutex<InstallState>>,
 }
 
 impl AppState {
@@ -73,6 +76,9 @@ impl AppState {
             searxng_install: Mutex::new(InstallState::default()),
             searxng_available: AtomicBool::new(false),
             python_bin,
+            // Claude Code
+            claude_code_process: Mutex::new(None),
+            claude_code_install: Arc::new(Mutex::new(InstallState::default())),
         }
     }
 }
@@ -96,6 +102,21 @@ impl Drop for AppState {
             }
         }
 
+        // Kill Claude Code process
+        if let Ok(mut proc) = self.claude_code_process.lock() {
+            if let Some(ref mut child) = *proc {
+                let pid = child.id();
+                if cfg!(target_os = "windows") {
+                    let _ = std::process::Command::new("taskkill")
+                        .args(["/pid", &pid.to_string(), "/T", "/F"])
+                        .output();
+                } else {
+                    let _ = child.kill();
+                }
+                println!("[ClaudeCode] Stopped");
+            }
+        }
+
         // Stop Whisper server
         if let Ok(mut whisper) = self.whisper.lock() {
             whisper.stop();
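The Drop handler added in state.rs dispatches on platform: on Windows it shells out to `taskkill /T /F` so the whole process tree dies, while elsewhere a plain kill of the child suffices. The same dispatch can be sketched as a pure function in TypeScript; `killCommandFor` is a hypothetical helper for illustration, not code from this repo.

```typescript
// Hypothetical helper mirroring the Rust Drop logic above: on Windows,
// taskkill /T (tree) /F (force) terminates the child and all of its
// descendants; on other platforms a plain kill on the pid is enough.
function killCommandFor(pid: number, platform: string): { cmd: string; args: string[] } {
  if (platform === "win32") {
    return { cmd: "taskkill", args: ["/pid", String(pid), "/T", "/F"] };
  }
  return { cmd: "kill", args: [String(pid)] };
}
```

The `/T` flag matters for tools like Claude Code that spawn their own subprocesses: without it, killing only the direct child would leave the rest of the tree running.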

src-tauri/tauri.conf.json

Lines changed: 1 addition & 1 deletion
@@ -1,7 +1,7 @@
 {
   "$schema": "https://raw.githubusercontent.com/tauri-apps/tauri/dev/crates/tauri-cli/schema.json",
   "productName": "Locally Uncensored",
-  "version": "2.3.1",
+  "version": "2.3.2",
   "identifier": "com.purpledoubled.locally-uncensored",
   "build": {
     "beforeBuildCommand": "npm run build",

src/api/backend.ts

Lines changed: 7 additions & 0 deletions
@@ -152,6 +152,13 @@ export async function backendCall<T = any>(
   ollama_search: { path: "/ollama-search" },
   fetch_external: { path: "/local-api/proxy-download" },
   fetch_external_bytes: { path: "/local-api/proxy-download" },
+  // Claude Code
+  detect_claude_code: { path: "/local-api/detect-claude-code" },
+  install_claude_code: { path: "/local-api/install-claude-code", method: "POST" },
+  install_claude_code_status: { path: "/local-api/install-claude-code-status" },
+  start_claude_code: { path: "/local-api/start-claude-code", method: "POST" },
+  stop_claude_code: { path: "/local-api/stop-claude-code", method: "POST" },
+  send_claude_code_input: { path: "/local-api/send-claude-code-input", method: "POST" },
   // Agent tools (Phase 1 — new commands)
   shell_execute: { path: "/local-api/shell-execute", method: "POST" },
   fs_read: { path: "/local-api/fs-read", method: "POST" },
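The backend.ts change extends a table that maps command names to HTTP routes, with mutating commands marked `method: "POST"` and reads left to default. A minimal resolver over such a table could look like the following; the table entries are copied from the diff, but the `resolve` function itself is an illustrative assumption, not the repo's actual `backendCall` implementation.

```typescript
type Endpoint = { path: string; method?: string };

// Subset of the endpoint table from the diff above.
const endpoints: Record<string, Endpoint> = {
  detect_claude_code: { path: "/local-api/detect-claude-code" },
  install_claude_code: { path: "/local-api/install-claude-code", method: "POST" },
  start_claude_code: { path: "/local-api/start-claude-code", method: "POST" },
};

// Hypothetical resolver: look up a command, defaulting the method to GET.
function resolve(command: string): { path: string; method: string } {
  const ep = endpoints[command];
  if (!ep) throw new Error(`unknown backend command: ${command}`);
  return { path: ep.path, method: ep.method ?? "GET" };
}
```

Centralizing the mapping this way means a new Tauri command only needs one table entry on the frontend, rather than a new fetch wrapper per command.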
