Conversation
Add a collapsible AI chat panel integrated into the main terminal layout.

- Add ChatService.swift: multi-provider AI service supporting Gemini, GPT, Claude, OpenRouter, DeepSeek, Qwen, and MiniMax with streaming responses
- Add ChatPanelView.swift: SwiftUI chat panel with message history, settings sheet for API key/model/provider configuration, and hide button
- Integrate ChatPanelView into ContentView alongside the terminal area
- Add 'Show/Hide AI Chat Panel' menu item under View (Cmd+Shift+\)
- Persist panel visibility via @AppStorage("chatPanelVisible")

Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
📝 Walkthrough

Adds a new AI chat panel feature: a SwiftUI ChatPanelView and ChatService that stream LLM responses from multiple providers, wires the panel into the app UI and menu/shortcut, and introduces hit-test/responder ownership checks to avoid focus and pointer routing conflicts with the terminal.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant User
    participant ChatPanelView
    participant ChatService
    participant URLSession
    participant LLMProvider
    User->>ChatPanelView: Enter message + Send
    ChatPanelView->>ChatService: sendMessage(text)
    ChatService->>ChatService: Validate API key & trim input
    ChatService->>ChatService: Append user Message
    ChatService->>URLSession: Start provider-specific request (stream)
    activate URLSession
    URLSession->>LLMProvider: POST / stream request
    activate LLMProvider
    LLMProvider-->>URLSession: Streamed chunks (SSE/lines)
    URLSession-->>ChatService: bytes.lines()
    ChatService->>ChatService: Parse chunk, update assistant Message incrementally
    ChatPanelView->>ChatPanelView: Auto-scroll / render updates
    LLMProvider-->>URLSession: [DONE]
    deactivate LLMProvider
    URLSession->>ChatService: EOF
    deactivate URLSession
    ChatService->>ChatService: Mark isStreaming = false
    ChatPanelView->>User: Display final assistant message
```
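The "Parse chunk" step in the diagram can be sketched as a pure function. This is an illustrative helper (`parseSSELine` is a hypothetical name, not the PR's actual `parseSSEChunk`), assuming OpenAI-style chunks of the form `{"choices":[{"delta":{"content":"…"}}]}`:

```swift
import Foundation

/// Extracts the text delta from one SSE line such as
///   data: {"choices":[{"delta":{"content":"Hi"}}]}
/// Returns nil for non-data lines, [DONE], and chunks without content.
func parseSSELine(_ line: String) -> String? {
    guard line.hasPrefix("data: ") else { return nil }
    let payload = String(line.dropFirst(6))
    guard payload != "[DONE]",
          let data = payload.data(using: .utf8),
          let json = try? JSONSerialization.jsonObject(with: data) as? [String: Any],
          let choices = json["choices"] as? [[String: Any]],
          let delta = choices.first?["delta"] as? [String: Any],
          let content = delta["content"] as? String
    else { return nil }
    return content
}
```

Each non-nil result is appended to the in-progress assistant message; `[DONE]` ends the stream. Other providers (Claude, Gemini) use different chunk shapes, so the real service would branch per provider.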
Estimated code review effort: 🎯 4 (Complex) | ⏱️ ~60 minutes
Greptile Summary

This PR adds a collapsible AI chat panel supporting seven providers (Gemini, GPT, Claude, OpenRouter, DeepSeek, Qwen, MiniMax) with streaming responses, integrated alongside the terminal in ContentView.
Confidence Score: 3/5

Not safe to merge until API key storage is moved to the Keychain and the Gemini key is sent via a header rather than a URL parameter. Two P1 security defects (plaintext credential storage and key exposure in URL) directly affect users who enter API keys. A third P1 (stale menu label) creates a confusing UI. These require fixes before shipping. See Sources/ChatService.swift (credential storage and Gemini request builder) and Sources/cmuxApp.swift (menu label reactivity).
| Filename | Overview |
|---|---|
| Sources/ChatService.swift | New multi-provider AI streaming service; API keys stored unencrypted in UserDefaults (should use Keychain), and Gemini key is leaked via URL query parameter instead of a header. |
| Sources/cmuxApp.swift | Adds Show/Hide AI Chat Panel menu item; label reads UserDefaults directly without @AppStorage so it won't reactively update when toggled from ContentView. |
| Sources/ChatPanelView.swift | New SwiftUI chat panel with message bubbles, streaming UI, settings sheet, and a custom NSView-based hit-test registry to prevent the terminal from stealing focus. |
| Sources/ContentView.swift | Integrates ChatPanelView in both sidebar-visible and sidebar-hidden layouts; uses @StateObject with a singleton where @ObservedObject is more appropriate. |
| Sources/GhosttyTerminalView.swift | Small guard added to prevent the terminal surface from stealing firstResponder when the chat panel input is focused. |
| Sources/TerminalWindowPortal.swift | Routes pointer events away from the terminal's custom hit-test path when the cursor is over the chat panel area. |
| Sources/AppDelegate.swift | Small change: skips reassigning firstResponder when the current responder belongs to the chat panel. |
Sequence Diagram

```mermaid
sequenceDiagram
    participant U as User
    participant CPV as ChatPanelView
    participant CS as ChatService
    participant API as AI Provider API
    U->>CPV: Types message, presses Enter
    CPV->>CS: sendMessage(text)
    CS->>CS: Append user Message to messages[]
    CS->>API: URLSession.bytes(for: request) [SSE stream]
    loop Streaming chunks
        API-->>CS: data: {...}
        CS->>CS: parseSSEChunk()
        CS->>CS: Append/update assistant Message
        CS-->>CPV: @Published messages updated
        CPV-->>U: Bubble updates live
    end
    API-->>CS: data: [DONE]
    CS->>CS: isStreaming = false
    CS-->>CPV: UI unlocked
    U->>CPV: Tap stop button
    CPV->>CS: cancelStreaming()
    CS->>CS: streamTask.cancel()
    CS->>CS: isStreaming = false
```
Comments Outside Diff (3)
- Sources/ChatService.swift, lines 808-810 (link): API keys stored unencrypted in UserDefaults

  API keys for all providers (OpenAI, Anthropic, Gemini, etc.) are saved via `UserDefaults.standard.set(key, forKey:)`, which writes them as plaintext into a plist under `~/Library/Preferences/`. Any process running as the same user, or a backup of that directory, exposes the keys. macOS provides the Keychain specifically for this use case — sensitive credentials should be stored and retrieved via `SecItemAdd`/`SecItemCopyMatching` (or a thin wrapper like `KeychainAccess`).

- Sources/ChatService.swift, lines 1005-1012 (link): Gemini API key sent in URL query string

  Appending the API key as a `key` query parameter means it appears in server access logs, any intermediate proxy or network capture, and NSURLSession diagnostics. For all other providers the key is carried in a request header, which is not logged by standard HTTP server software. The Google AI REST API also accepts an `x-goog-api-key` header — prefer that approach to keep the credential out of the URL entirely.
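A sketch of the suggested request shape, with the key in the `x-goog-api-key` header and nothing secret in the URL (the endpoint path and function name are illustrative, not copied from the PR):

```swift
import Foundation
#if canImport(FoundationNetworking)
import FoundationNetworking
#endif

// Builds a Gemini streaming request with the API key carried as a header,
// so it never appears in access logs or proxy captures.
func makeGeminiRequest(apiKey: String, model: String, body: Data) throws -> URLRequest {
    guard let url = URL(string:
        "https://generativelanguage.googleapis.com/v1beta/models/\(model):streamGenerateContent?alt=sse"
    ) else { throw URLError(.badURL) }
    var request = URLRequest(url: url)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "Content-Type")
    request.setValue(apiKey, forHTTPHeaderField: "x-goog-api-key") // not a ?key= query item
    request.httpBody = body
    return request
}
```

Note the guard instead of force-unwrapping the URL, matching the review's suggestion to eliminate `!` in the request builder.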
- Sources/ContentView.swift, line 1096 (link): @StateObject used with a shared singleton

  `@StateObject` is intended for objects that the view creates and owns — SwiftUI will instantiate the object once on first render and keep it alive as long as the view is in the hierarchy. Wrapping `ChatService.shared` (an already-alive singleton) in `@StateObject` is semantically misleading, and if `ContentView` is ever recreated, SwiftUI will allocate a new `@StateObject` wrapper around the same underlying object, which can interact oddly with the published-value observation machinery. Prefer `@ObservedObject` for externally-owned objects:

  `@ObservedObject private var chatService = ChatService.shared`
Reviews (1): Last reviewed commit: "feat: add AI Chat Panel"
```swift
Button(
    UserDefaults.standard.bool(forKey: "chatPanelVisible")
        ? String(localized: "menu.view.hideChatPanel", defaultValue: "Hide AI Chat Panel")
        : String(localized: "menu.view.showChatPanel", defaultValue: "Show AI Chat Panel")
) {
    let current = UserDefaults.standard.object(forKey: "chatPanelVisible") as? Bool ?? true
    UserDefaults.standard.set(!current, forKey: "chatPanelVisible")
}
.keyboardShortcut("\\", modifiers: [.command, .shift])
```
Menu label goes stale after first render
The button title reads directly from UserDefaults.standard at cmuxApp.body evaluation time, but cmuxApp has no @AppStorage property for the chatPanelVisible key. SwiftUI never re-evaluates the menu when ContentView (which binds to that key via @AppStorage) changes the value — so the label can permanently display the wrong string. Adding an @AppStorage binding on cmuxApp and driving both the label and toggle action from it (the same pattern used in ContentView) will keep the menu item consistent.
💡 Codex Review
Here are some automated review suggestions for this pull request.
Reviewed commit: af7e66d171
```swift
    }
}

isStreaming = false
```
Guard isStreaming updates against stale canceled tasks

The task created in sendMessage always executes `isStreaming = false` on completion, even if a newer stream has already started after cancelStreaming(). In the cancel-then-resend flow, the canceled task can finish later and flip isStreaming to false while the new request is still streaming, which re-enables the send UI and allows overlapping requests/responses. Gate the final state update so only the currently active task can clear streaming state.
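One way to implement that gate is a generation token, sketched here under assumed names (`StreamState`, `begin`, `finish` are illustrative, not from the PR):

```swift
// Each new stream bumps the generation; only the task holding the newest
// token is allowed to clear the streaming flag, so a canceled task that
// finishes late becomes a no-op.
final class StreamState {
    private(set) var isStreaming = false
    private var generation = 0

    /// Starts a new stream and returns its token.
    func begin() -> Int {
        generation += 1
        isStreaming = true
        return generation
    }

    /// Clears the streaming flag only if `token` belongs to the newest stream.
    func finish(token: Int) {
        guard token == generation else { return } // stale canceled task: ignore
        isStreaming = false
    }
}
```

In sendMessage, the task would capture the token from `begin()` and call `finish(token:)` in its completion path instead of assigning `isStreaming = false` directly.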
```swift
UserDefaults.standard.bool(forKey: "chatPanelVisible")
    ? String(localized: "menu.view.hideChatPanel", defaultValue: "Hide AI Chat Panel")
    : String(localized: "menu.view.showChatPanel", defaultValue: "Show AI Chat Panel")
```
Use true default when reading chatPanelVisible for menu label

The View menu label uses `UserDefaults.standard.bool(forKey: "chatPanelVisible")`, which returns false when the key is unset, but ContentView defaults this setting to true. On a fresh profile this makes the panel visible while the menu incorrectly shows "Show AI Chat Panel," and the first toggle action does the opposite of what the label implies. Read the label state with the same defaulting logic used in the action (`object(forKey:) as? Bool ?? true`).
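A minimal sketch of that defaulting logic, factored into a helper (the function name is hypothetical, not in the PR):

```swift
import Foundation

// Treat a missing "chatPanelVisible" value as true, matching ContentView's
// @AppStorage default; bool(forKey:) would report false for an unset key.
func chatPanelVisible(from stored: Any?) -> Bool {
    (stored as? Bool) ?? true
}
```

The menu label would then read `chatPanelVisible(from: UserDefaults.standard.object(forKey: "chatPanelVisible"))`, matching the toggle action's logic exactly.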
```swift
Image(systemName: "bubble.left.and.bubble.right")
    .foregroundStyle(.secondary)
    .font(.system(size: 13))
Text(String(localized: "chatPanel.title", defaultValue: "AI Chat"))
```
Add new chat strings to the localization catalog
This commit introduces new user-facing keys (for example chatPanel.title) but does not add corresponding entries/translations in Resources/Localizable.xcstrings (no chatPanel.* keys are present), so non-English users will get English fallbacks for the new UI. Per /workspace/cmux AGENTS.md (“All user-facing strings must be localized”), these keys need catalog entries and translations before shipping.
```swift
let current = UserDefaults.standard.object(forKey: "chatPanelVisible") as? Bool ?? true
UserDefaults.standard.set(!current, forKey: "chatPanelVisible")
}
.keyboardShortcut("\\", modifiers: [.command, .shift])
```
Route chat panel shortcut through shortcut settings
The new chat-panel command hardcodes .keyboardShortcut("\\", modifiers: [.command, .shift]) instead of going through KeyboardShortcutSettings, so users cannot remap/disable it via Settings or ~/.config/cmux/settings.json. /workspace/cmux AGENTS.md explicitly requires every new cmux-owned shortcut to be added to KeyboardShortcutSettings and configuration/docs, so this introduces a settings/config inconsistency.
Actionable comments posted: 8
Caution
Some comments are outside the diff and can’t be posted inline due to platform limitations.
⚠️ Outside diff range comments (1)
Sources/ContentView.swift (1)
2884-2908: ⚠️ Potential issue | 🟠 Major: Resync terminal portal geometry when the chat panel changes width.

Adding/removing `chatPanelView` changes `terminalContentWithSidebarDropOverlay`'s available width, but only sidebar changes currently trigger `TerminalWindowPortalRegistry.scheduleExternalGeometrySynchronize(...)`. Without an `onChange(of: isChatPanelVisible)`, portal-hosted terminals can keep stale bounds after the toolbar/shortcut toggles the shared `chatPanelVisible` key.

Proposed fix

```diff
 view = AnyView(view.onChange(of: sidebarState.isVisible) { _ in
     if let observedWindow {
         TerminalWindowPortalRegistry.scheduleExternalGeometrySynchronize(for: observedWindow)
     } else {
         TerminalWindowPortalRegistry.scheduleExternalGeometrySynchronizeForAllWindows()
     }
     updateSidebarResizerBandState()
     syncTrafficLightInset()
 })
+
+view = AnyView(view.onChange(of: isChatPanelVisible) { _ in
+    if let observedWindow {
+        TerminalWindowPortalRegistry.scheduleExternalGeometrySynchronize(for: observedWindow)
+    } else {
+        TerminalWindowPortalRegistry.scheduleExternalGeometrySynchronizeForAllWindows()
+    }
+})
 view = AnyView(view.onChange(of: sidebarMatchTerminalBackground) { _ in
     guard sidebarState.isVisible,
           sidebarBlendMode == SidebarBlendModeOption.withinWindow.rawValue else { return }
```

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Sources/ContentView.swift` around lines 2884 - 2908, The terminal portal geometry can become stale when the chat panel is shown/hidden because only sidebar visibility changes trigger synchronization; add an onChange(of: isChatPanelVisible) observer next to the existing sidebar-trigger logic so that when isChatPanelVisible toggles you call TerminalWindowPortalRegistry.scheduleExternalGeometrySynchronize(...) (the same synchronization path used for sidebar changes) to resync terminalContentWithSidebarDropOverlay bounds; locate this next to the HStack that composes terminalContentWithSidebarDropOverlay and chatPanelView and mirror the sidebar visibility sync behavior.
🧹 Nitpick comments (2)
Sources/ChatService.swift (1)
214-245: Recommended: offload SSE parsing off the main actor.

The `Task { ... }` inherits the `@MainActor` isolation from this class, so every iteration of `for try await line in bytes.lines` — including `JSONSerialization.jsonObject(...)` inside `parseSSEChunk` — runs on main. For long replies or fast providers this puts noticeable decode work on the main thread. Consider a detached task (or a nonisolated helper) that parses chunks and only hops back to main for the `messages` mutation:

```swift
streamTask = Task.detached { [provider, key, model, sendMessages] in
    // build request, iterate bytes.lines, decode off-main
    // for each parsed delta: await MainActor.run { /* append/update messages */ }
}
```

Not required for correctness, but worth doing before this ships as the default assistant surface.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Sources/ChatService.swift` around lines 214 - 245, The Task assigned to streamTask is running on the actor (MainActor) so SSE parsing in the Task closure (including Self.parseSSEChunk and JSON decoding) happens on the main thread; change the Task { ... } to a background context (e.g., Task.detached or a nonisolated helper) so you build the request (Self.buildRequest), iterate bytes.lines and call Self.parseSSEChunk off-main, and only hop back to the main actor when mutating messages (wrap appending/updating Message instances in await MainActor.run { ... }); keep appendedAssistant logic but perform it inside the MainActor hop to avoid heavy parsing on the main actor.

Sources/ChatPanelView.swift (1)
177-184: Throttle auto-scroll on streaming chunks.
`onChange(of: chatService.messages.last?.content)` fires for every SSE delta (often many per second). `proxy.scrollTo("bottom", anchor: .bottom)` invalidates layout each time, and — more importantly — it will forcibly yank the view back to the bottom even if the user has scrolled up to re-read earlier context mid-stream. Consider (a) only scrolling when the view is already near the bottom, or (b) debouncing to ~10 Hz. The `messages.count` onChange above is fine; the per-character one is the hot path.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Sources/ChatPanelView.swift` around lines 177 - 184, The per-character onChange handler using onChange(of: chatService.messages.last?.content) calls proxy.scrollTo("bottom", anchor: .bottom) for every SSE chunk and should be throttled or gated; modify the handler in ChatPanelView to (a) only call proxy.scrollTo when the scroll is already near bottom (detect via ScrollViewReader/ScrollView offset or a Boolean like isNearBottom) or (b) debounce/throttle the scroll calls to ~10 Hz (e.g., use a DispatchWorkItem/Task with a 100ms debounce or a Combine debounce on a PassthroughSubject) instead of scrolling on every content delta. Keep the existing onChange(of: chatService.messages.count) behavior for new messages, and update the handler tied to messages.last?.content to reference the proximity check or throttled mechanism before invoking proxy.scrollTo("bottom", anchor: .bottom).
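Option (b) from the nitpick can be sketched as a minimum-interval gate so scroll-to-bottom runs at most ~10 Hz during streaming (`IntervalGate` is a hypothetical helper; the clock is injectable purely so the logic is testable):

```swift
import Foundation

// Rate-limits an action to at most one firing per minimumInterval.
final class IntervalGate {
    private let minimumInterval: TimeInterval
    private var lastFire: Date = .distantPast
    private let now: () -> Date // injectable clock for testing

    init(minimumInterval: TimeInterval, now: @escaping () -> Date = Date.init) {
        self.minimumInterval = minimumInterval
        self.now = now
    }

    /// Returns true (and records the time) when enough time has passed.
    func shouldFire() -> Bool {
        let current = now()
        guard current.timeIntervalSince(lastFire) >= minimumInterval else { return false }
        lastFire = current
        return true
    }
}
```

In the onChange handler this becomes `if gate.shouldFire() { proxy.scrollTo("bottom", anchor: .bottom) }`, with a gate of `IntervalGate(minimumInterval: 0.1)` held by the view.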
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@Sources/ChatPanelView.swift`:
- Around line 137-147: The Save button is disabled purely based on apiKeyDraft
being empty which blocks model-only changes; update the disable condition to
allow saving when a stored API key already exists for the selected provider.
Specifically, change the .disabled(...) check on the Button (referencing
apiKeyDraft, settingsProvider, and chatService) to only disable when
apiKeyDraft.trimmed().isEmpty AND there is no existing stored key for
settingsProvider (query via the chatService API such as a getter like
chatService.apiKey(for: settingsProvider) or equivalent); leave the rest of the
action (setting selectedProvider, calling chatService.setApiKey and
chatService.setModel, clearing apiKeyDraft and closing isShowingSettings)
unchanged. Ensure the new predicate uses the same trimming call used elsewhere
to avoid whitespace-only keys being treated as valid.
- Around line 363-415: Annotate ChatPanelHitTestRegistry with `@MainActor` to
ensure all access to its mutable static state (entriesByMarkerId and
focusOwnerTable) runs on the main actor, then add a nonisolated public helper
(e.g. static func _remove(markerId: ObjectIdentifier)) that schedules the actual
deletion onto the main actor so deinit callers can safely invoke it; finally
update MarkerView.deinit to call that helper with ObjectIdentifier(self) (or
dispatch to the main actor via Task/DispatchQueue.main.async to call
ChatPanelHitTestRegistry.remove) so removals never race with ownsFocusResponder
or other registry accesses.
In `@Sources/ChatService.swift`:
- Around line 269-273: migrateLegacyAnthropicKeyIfNeeded currently copies the
legacy "anthropicAPIKey" into the new store but leaves the original in
UserDefaults; after calling setApiKey(legacyKey, for: .claude) remove the legacy
plaintext entry from UserDefaults (e.g.
UserDefaults.standard.removeObject(forKey: "anthropicAPIKey")) so the credential
is not duplicated—perform the removal only after a successful setApiKey(...)
call and retain use of apiKey(for: .claude) to verify success if needed.
- Around line 37-54: The API keys are being stored in plaintext via UserDefaults
using apiKeyDefaultsKey; replace that with Keychain storage: implement
apiKey(for:) and setApiKey(_:for:) to read/write kSecClassGenericPassword items
keyed by the provider (use apiKeyDefaultsKey as the key name), update any
callers to use these methods, and remove the UserDefaults read/write paths; in
migrateLegacyAnthropicKeyIfNeeded() read the legacy "anthropicAPIKey" from
UserDefaults, write it to the Keychain under the corresponding provider key, and
then delete the legacy UserDefaults entries and any old provider keys to
complete migration.
- Around line 218-260: When handling non-2xx responses in the URLSession stream
(inside the do-catch around buildRequest/bytes(for:)), cap the accumulated error
body read from bytes.lines (e.g., stop appending once body.count exceeds a small
threshold like 4096) before throwing ChatAPIError.httpError so a large/streaming
payload can't blow memory; locate the logic that builds `body` and add the
length-check/break. Also remove or correct the dead post-error cleanup that
checks `if assistantContent.isEmpty && appendedAssistant { messages.removeLast()
}` — either invert the condition to `if !assistantContent.isEmpty &&
appendedAssistant` if the intent was to drop a trailing empty assistant message,
or simply delete that block; this affects the variables `assistantContent`,
`appendedAssistant`, and `messages` in the streaming loop/error handler.
- Around line 366-387: In buildGeminiRequest(messages:apiKey:model:) stop
sending the API key as a URL query item and instead set it in the x-goog-api-key
header (request.setValue(apiKey, forHTTPHeaderField: "x-goog-api-key")), remove
the URLQueryItem(name: "key", ...) entry, and eliminate force-unwraps: don't
force-unwrap URLComponents or components.url — guard/throw (e.g.,
URLError(.badURL) or a custom error) if URLComponents(string: ...) or
components.url is nil; keep using JSONSerialization.data(withJSONObject:) with
try as before and preserve the content-type header.
In `@Sources/cmuxApp.swift`:
- Around line 677-685: Replace the direct UserDefaults reads/writes in the
Button with a single `@AppStorage-backed` Bool so the menu label and toggle stay
in sync with the panel's reactive default; add an
`@AppStorage`("chatPanelVisible") var chatPanelVisible: Bool = true (matching the
panel's default) in the view, use chatPanelVisible in the Button's ternary label
instead of UserDefaults.standard.bool(...), and toggle chatPanelVisible.toggle()
in the action (keep the keyboardShortcut as-is).
In `@Sources/GhosttyTerminalView.swift`:
- Around line 10584-10590: The current guard using
ChatPanelHitTestRegistry.ownsFocusResponder(window.firstResponder) misses cases
where window.firstResponder is the shared NSTextView field editor; update the
logic so that if the responder is a field editor (responder.responds(to:
Selector("isFieldEditor")) / cast to NSTextView and isFieldEditor == true) you
resolve its owner (use the cmuxFieldEditorOwnerView pattern from
AppDelegate.swift: walk nextResponder and/or superview to find the actual owning
NSView) and then call ChatPanelHitTestRegistry.ownsFocusResponder with that
owner view (or extend ownsFocusResponder to accept an owner view) before
returning to avoid stealing focus while typing in chat.
---
Outside diff comments:
In `@Sources/ContentView.swift`:
- Around line 2884-2908: The terminal portal geometry can become stale when the
chat panel is shown/hidden because only sidebar visibility changes trigger
synchronization; add an onChange(of: isChatPanelVisible) observer next to the
existing sidebar-trigger logic so that when isChatPanelVisible toggles you call
TerminalWindowPortalRegistry.scheduleExternalGeometrySynchronize(...) (the same
synchronization path used for sidebar changes) to resync
terminalContentWithSidebarDropOverlay bounds; locate this next to the HStack
that composes terminalContentWithSidebarDropOverlay and chatPanelView and mirror
the sidebar visibility sync behavior.
---
Nitpick comments:
In `@Sources/ChatPanelView.swift`:
- Around line 177-184: The per-character onChange handler using onChange(of:
chatService.messages.last?.content) calls proxy.scrollTo("bottom", anchor:
.bottom) for every SSE chunk and should be throttled or gated; modify the
handler in ChatPanelView to (a) only call proxy.scrollTo when the scroll is
already near bottom (detect via ScrollViewReader/ScrollView offset or a Boolean
like isNearBottom) or (b) debounce/throttle the scroll calls to ~10 Hz (e.g.,
use a DispatchWorkItem/Task with a 100ms debounce or a Combine debounce on a
PassthroughSubject) instead of scrolling on every content delta. Keep the
existing onChange(of: chatService.messages.count) behavior for new messages, and
update the handler tied to messages.last?.content to reference the proximity
check or throttled mechanism before invoking proxy.scrollTo("bottom", anchor:
.bottom).
In `@Sources/ChatService.swift`:
- Around line 214-245: The Task assigned to streamTask is running on the actor
(MainActor) so SSE parsing in the Task closure (including Self.parseSSEChunk and
JSON decoding) happens on the main thread; change the Task { ... } to a
background context (e.g., Task.detached or a nonisolated helper) so you build
the request (Self.buildRequest), iterate bytes.lines and call Self.parseSSEChunk
off-main, and only hop back to the main actor when mutating messages (wrap
appending/updating Message instances in await MainActor.run { ... }); keep
appendedAssistant logic but perform it inside the MainActor hop to avoid heavy
parsing on the main actor.
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 930d0b99-c12d-42b4-821c-6d9905d09d39
📒 Files selected for processing (8)
- GhosttyTabs.xcodeproj/project.pbxproj
- Sources/AppDelegate.swift
- Sources/ChatPanelView.swift
- Sources/ChatService.swift
- Sources/ContentView.swift
- Sources/GhosttyTerminalView.swift
- Sources/TerminalWindowPortal.swift
- Sources/cmuxApp.swift
```swift
Button(String(localized: "chatPanel.apiKey.save", defaultValue: "Save")) {
    let trimmed = apiKeyDraft.trimmingCharacters(in: .whitespacesAndNewlines)
    chatService.selectedProvider = settingsProvider
    chatService.setApiKey(trimmed, for: settingsProvider)
    chatService.setModel(modelDraft, for: settingsProvider)
    apiKeyDraft = ""
    isShowingSettings = false
}
.buttonStyle(.borderedProminent)
.disabled(apiKeyDraft.trimmingCharacters(in: .whitespacesAndNewlines).isEmpty)
}
```
Save button disabled blocks legitimate model-only changes.
.disabled(apiKeyDraft.trimmingCharacters(...).isEmpty) prevents saving whenever the draft is empty. But onChange(of: settingsProvider) (line 103-106) re-seeds apiKeyDraft from storage, and for providers where no key has been configured yet the draft starts empty — so a user who just wants to change model or switch the active provider (without re-entering a key) can't press Save. Consider only disabling when both the draft is empty and there's no stored key for the selected provider, or split into separate "Save Key" / "Save" actions.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@Sources/ChatPanelView.swift` around lines 137 - 147, The Save button is
disabled purely based on apiKeyDraft being empty which blocks model-only
changes; update the disable condition to allow saving when a stored API key
already exists for the selected provider. Specifically, change the
.disabled(...) check on the Button (referencing apiKeyDraft, settingsProvider,
and chatService) to only disable when apiKeyDraft.trimmed().isEmpty AND there is
no existing stored key for settingsProvider (query via the chatService API such
as a getter like chatService.apiKey(for: settingsProvider) or equivalent); leave
the rest of the action (setting selectedProvider, calling chatService.setApiKey
and chatService.setModel, clearing apiKeyDraft and closing isShowingSettings)
unchanged. Ensure the new predicate uses the same trimming call used elsewhere
to avoid whitespace-only keys being treated as valid.
```swift
enum ChatPanelHitTestRegistry {
    private struct MarkerEntry {
        let windowId: ObjectIdentifier
        let rectInWindow: NSRect
    }

    private static var entriesByMarkerId: [ObjectIdentifier: MarkerEntry] = [:]
    private static var focusOwnerTable: NSHashTable<NSView> = .weakObjects()

    static func update(marker: NSView) {
        let markerId = ObjectIdentifier(marker)
        guard let window = marker.window else {
            entriesByMarkerId.removeValue(forKey: markerId)
            return
        }
        let rect = marker.convert(marker.bounds, to: nil)
        entriesByMarkerId[markerId] = MarkerEntry(
            windowId: ObjectIdentifier(window),
            rectInWindow: rect
        )
    }

    static func remove(marker: NSView) {
        entriesByMarkerId.removeValue(forKey: ObjectIdentifier(marker))
    }

    static func contains(windowPoint: NSPoint, in window: NSWindow?) -> Bool {
        guard let window else { return false }
        let windowId = ObjectIdentifier(window)
        return entriesByMarkerId.values.contains {
            $0.windowId == windowId && $0.rectInWindow.contains(windowPoint)
        }
    }

    static func registerFocusOwner(_ view: NSView) {
        focusOwnerTable.add(view)
    }

    static func unregisterFocusOwner(_ view: NSView) {
        focusOwnerTable.remove(view)
    }

    /// Returns true if the responder is (or is a descendant of) a chat panel input view.
    static func ownsFocusResponder(_ responder: NSResponder) -> Bool {
        guard let view = responder as? NSView else { return false }
        for owner in focusOwnerTable.allObjects {
            if view === owner || view.isDescendant(of: owner) {
                return true
            }
        }
        return false
    }
}
```
Major: ChatPanelHitTestRegistry mutable static state isn't thread-confined.
entriesByMarkerId (a [ObjectIdentifier: MarkerEntry] dictionary) and focusOwnerTable are plain static mutable storage, with no lock, no actor, and no @MainActor annotation. Swift dictionaries are not thread-safe; concurrent reads and writes will corrupt internal state and crash.
The callers shown in the cross-file snippets (AppDelegate.swift:5696-5703, GhosttyTerminalView.swift:10582-10590) invoke ownsFocusResponder(_:) from first-responder / find-overlay paths, and MarkerView.deinit calls remove(marker:) from wherever ARC happens to release the view. Nothing here guarantees main-thread-only access, so a deinit on a background queue (common for SwiftUI view teardown) racing with an ownsFocusResponder call is a real TOCTOU/data-race risk.
Annotate the registry with @MainActor (matching the SwiftUI/AppKit reality of its callers) and mark the MarkerView overrides accordingly, e.g.:
```diff
-enum ChatPanelHitTestRegistry {
+@MainActor
+enum ChatPanelHitTestRegistry {
     private struct MarkerEntry {
         let windowId: ObjectIdentifier
         let rectInWindow: NSRect
     }
```

…and in the deinit, hop to main explicitly since deinit cannot be @MainActor:
```diff
 deinit {
-    ChatPanelHitTestRegistry.remove(marker: self)
+    let markerId = ObjectIdentifier(self)
+    DispatchQueue.main.async {
+        ChatPanelHitTestRegistry._remove(markerId: markerId)
+    }
 }
```

(exposing a `static func _remove(markerId:)` nonisolated helper or similar). Otherwise a debug build with Thread Sanitizer will flag this immediately and release builds will eventually crash under load.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@Sources/ChatPanelView.swift` around lines 363 - 415, Annotate
ChatPanelHitTestRegistry with `@MainActor` to ensure all access to its mutable
static state (entriesByMarkerId and focusOwnerTable) runs on the main actor,
then add a nonisolated public helper (e.g. static func _remove(markerId:
ObjectIdentifier)) that schedules the actual deletion onto the main actor so
deinit callers can safely invoke it; finally update MarkerView.deinit to call
that helper with ObjectIdentifier(self) (or dispatch to the main actor via
Task/DispatchQueue.main.async to call ChatPanelHitTestRegistry.remove) so
removals never race with ownsFocusResponder or other registry accesses.
```swift
do {
    let request = try Self.buildRequest(messages: sendMessages, provider: provider, apiKey: key, model: model)
    let (bytes, response) = try await URLSession.shared.bytes(for: request)

    guard let http = response as? HTTPURLResponse else {
        throw ChatAPIError.invalidResponse
    }
    guard (200..<300).contains(http.statusCode) else {
        var body = ""
        for try await line in bytes.lines { body += line }
        throw ChatAPIError.httpError(http.statusCode, body)
    }

    for try await line in bytes.lines {
        if Task.isCancelled { break }
        guard line.hasPrefix("data: ") else { continue }
        let data = String(line.dropFirst(6))
        if data == "[DONE]" { break }
        guard let chunk = Self.parseSSEChunk(data, provider: provider) else { continue }

        assistantContent += chunk
        if !appendedAssistant {
            messages.append(Message(role: .assistant, content: assistantContent))
            appendedAssistant = true
        } else if let idx = messages.indices.last {
            messages[idx].content = assistantContent
        }
    }

    if !appendedAssistant {
        messages.append(Message(role: .assistant, content: assistantContent))
    }
} catch {
    if !Task.isCancelled {
        streamingError = (error as? ChatAPIError)?.errorDescription ?? error.localizedDescription
    }
    if assistantContent.isEmpty && appendedAssistant {
        messages.removeLast()
    }
}

isStreaming = false
```
Minor bugs in streaming error handling.
Two small issues in this block:
- Unbounded error body read (lines 226-228). When a non-2xx status is received, the loop appends every line of the response to `body` with no cap. A large error payload (or a server that happens to stream) could balloon memory. Cap it (e.g., `if body.count > 4096 { break }`) before the throw; only the first ~200 chars are used downstream anyway (line 445).
- Dead code at lines 254-256. `appendedAssistant` is only set to `true` after `assistantContent += chunk` with a non-empty `chunk` (lines 238-241), so `assistantContent.isEmpty && appendedAssistant` can never be true and the `removeLast()` never fires. If the intent was to drop a trailing empty assistant bubble on error, the condition is inverted — consider:
-    if assistantContent.isEmpty && appendedAssistant {
-        messages.removeLast()
-    }
+    if appendedAssistant && assistantContent.isEmpty {
+        // unreachable today; kept guard if future refactor appends placeholder first
+        messages.removeLast()
+    }

Otherwise, just delete the block.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@Sources/ChatService.swift` around lines 218 - 260, When handling non-2xx
responses in the URLSession stream (inside the do-catch around
buildRequest/bytes(for:)), cap the accumulated error body read from bytes.lines
(e.g., stop appending once body.count exceeds a small threshold like 4096)
before throwing ChatAPIError.httpError so a large/streaming payload can't blow
memory; locate the logic that builds `body` and add the length-check/break. Also
remove or correct the dead post-error cleanup that checks `if
assistantContent.isEmpty && appendedAssistant { messages.removeLast() }` —
either invert the condition to `if !assistantContent.isEmpty &&
appendedAssistant` if the intent was to drop a trailing empty assistant message,
or simply delete that block; this affects the variables `assistantContent`,
`appendedAssistant`, and `messages` in the streaming loop/error handler.
private static func buildGeminiRequest(messages: [Message], apiKey: String, model: String) throws -> URLRequest {
    var components = URLComponents(string: "https://generativelanguage.googleapis.com/v1beta/models/\(model):streamGenerateContent")!
    components.queryItems = [
        URLQueryItem(name: "alt", value: "sse"),
        URLQueryItem(name: "key", value: apiKey)
    ]

    var request = URLRequest(url: components.url!)
    request.httpMethod = "POST"
    request.setValue("application/json", forHTTPHeaderField: "content-type")

    let body: [String: Any] = [
        "contents": messages.map {
            [
                "role": $0.role == .assistant ? "model" : "user",
                "parts": [["text": $0.content]]
            ]
        }
    ]
    request.httpBody = try JSONSerialization.data(withJSONObject: body)
    return request
}
🧩 Analysis chain
🌐 Web query:
Google Gemini generativelanguage API x-goog-api-key header authentication
💡 Result:
Yes, the Google Gemini API (also known as the Generative Language API at generativelanguage.googleapis.com) uses the x-goog-api-key header for authentication with an API key. To authenticate REST requests, include the header "x-goog-api-key: YOUR_API_KEY" along with Content-Type: application/json. Example curl request from official docs: curl "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-flash:generateContent" \ -H "x-goog-api-key: $GEMINI_API_KEY" \ -H 'Content-Type: application/json' \ -X POST \ -d '{ "contents": [{ "parts": [{"text": "Explain how AI works"}] }] }' API keys are created in Google AI Studio (aistudio.google.com/app/apikey). Set as environment variable GEMINI_API_KEY or GOOGLE_API_KEY for SDKs, but REST requires explicit header. Official docs confirm this is the standard method as of 2026. OAuth is available for stricter controls but API key is primary for Gemini API.
Citations:
- 1: https://ai.google.dev/gemini-api/docs/api-key
- 2: https://ai.google.dev/gemini-api/docs/api-key?authuser=9
- 3: https://ai.google.dev/api
- 4: https://ai.google.dev/api?authuser=1
- 5: https://ai.google.dev/gemini-api/docs/text-generation
🏁 Script executed:
# First, verify the file exists and check the actual code at the specified lines
fd ChatService.swift

Repository: manaflow-ai/cmux
Length of output: 85
🏁 Script executed:
# Once we confirm the file, let's examine the actual code at lines 366-387
if [ -f "Sources/ChatService.swift" ]; then
echo "=== Lines 366-387 of ChatService.swift ==="
sed -n '366,387p' Sources/ChatService.swift
echo ""
echo "=== Full buildGeminiRequest function ==="
grep -n "buildGeminiRequest" Sources/ChatService.swift
fi

Repository: manaflow-ai/cmux
Length of output: 1304
🏁 Script executed:
# Search for any other Gemini API references or streaming content type handling
rg "generativelanguage\.googleapis\.com|streamGenerateContent|x-goog-api-key" --type swift

Repository: manaflow-ai/cmux
Length of output: 223
🏁 Script executed:
# Search for other potential Gemini API calls or auth patterns
rg "apiKey|API.*key|authorization|auth.*header" Sources/ChatService.swift -A 2 -B 2

Repository: manaflow-ai/cmux
Length of output: 4770
🏁 Script executed:
# Check the context around line 278 where buildGeminiRequest is called
sed -n '270,290p' Sources/ChatService.swift

Repository: manaflow-ai/cmux
Length of output: 1038
🏁 Script executed:
# Search for any other streaming or Gemini-related code
rg "alt.*sse|streamGenerateContent|Gemini" --type swift

Repository: manaflow-ai/cmux
Length of output: 3241
Move Gemini API key from URL query parameter to x-goog-api-key header.
Passing key=<apiKey> in the URL means the credential appears in URLSession logs, crash reports, system logs, and proxy logs. Google's Gemini API officially accepts authentication via the x-goog-api-key header, which is the secure standard method. The proposed fix addresses this and also removes force-unwraps that could crash if the model parameter contains unexpected characters.
Proposed fix
private static func buildGeminiRequest(messages: [Message], apiKey: String, model: String) throws -> URLRequest {
- var components = URLComponents(string: "https://generativelanguage.googleapis.com/v1beta/models/\(model):streamGenerateContent")!
- components.queryItems = [
- URLQueryItem(name: "alt", value: "sse"),
- URLQueryItem(name: "key", value: apiKey)
- ]
-
- var request = URLRequest(url: components.url!)
+ let encodedModel = model.addingPercentEncoding(withAllowedCharacters: .urlPathAllowed) ?? model
+ var components = URLComponents(string: "https://generativelanguage.googleapis.com/v1beta/models/\(encodedModel):streamGenerateContent")!
+ components.queryItems = [URLQueryItem(name: "alt", value: "sse")]
+ guard let url = components.url else { throw ChatAPIError.invalidResponse }
+ var request = URLRequest(url: url)
request.httpMethod = "POST"
request.setValue("application/json", forHTTPHeaderField: "content-type")
+    request.setValue(apiKey, forHTTPHeaderField: "x-goog-api-key")

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@Sources/ChatService.swift` around lines 366 - 387, In
buildGeminiRequest(messages:apiKey:model:) stop sending the API key as a URL
query item and instead set it in the x-goog-api-key header
(request.setValue(apiKey, forHTTPHeaderField: "x-goog-api-key")), remove the
URLQueryItem(name: "key", ...) entry, and eliminate force-unwraps: don't
force-unwrap URLComponents or components.url — guard/throw (e.g.,
URLError(.badURL) or a custom error) if URLComponents(string: ...) or
components.url is nil; keep using JSONSerialization.data(withJSONObject:) with
try as before and preserve the content-type header.
Button(
    UserDefaults.standard.bool(forKey: "chatPanelVisible")
        ? String(localized: "menu.view.hideChatPanel", defaultValue: "Hide AI Chat Panel")
        : String(localized: "menu.view.showChatPanel", defaultValue: "Show AI Chat Panel")
) {
    let current = UserDefaults.standard.object(forKey: "chatPanelVisible") as? Bool ?? true
    UserDefaults.standard.set(!current, forKey: "chatPanelVisible")
}
.keyboardShortcut("\\", modifiers: [.command, .shift])
Make the menu item use the same reactive default as the panel.
Line 678 defaults a missing `chatPanelVisible` to `false`, while line 682 defaults it to `true`; on first launch the menu can show “Show AI Chat Panel” even though the panel is visible, and the command title can go stale because the label reads UserDefaults directly. Use `@AppStorage` here too.
Proposed fix
struct cmuxApp: App {
@@
`@AppStorage`(SocketControlSettings.appStorageKey) private var socketControlMode = SocketControlSettings.defaultMode.rawValue
`@AppStorage`(BrowserToolbarAccessorySpacingDebugSettings.key) private var browserToolbarAccessorySpacingRaw = BrowserToolbarAccessorySpacingDebugSettings.defaultSpacing
+ `@AppStorage`("chatPanelVisible") private var chatPanelVisible = true
`@NSApplicationDelegateAdaptor`(AppDelegate.self) private var appDelegate
@@
- Button(
- UserDefaults.standard.bool(forKey: "chatPanelVisible")
- ? String(localized: "menu.view.hideChatPanel", defaultValue: "Hide AI Chat Panel")
- : String(localized: "menu.view.showChatPanel", defaultValue: "Show AI Chat Panel")
- ) {
- let current = UserDefaults.standard.object(forKey: "chatPanelVisible") as? Bool ?? true
- UserDefaults.standard.set(!current, forKey: "chatPanelVisible")
- }
+ Button(
+ chatPanelVisible
+ ? String(localized: "menu.view.hideChatPanel", defaultValue: "Hide AI Chat Panel")
+ : String(localized: "menu.view.showChatPanel", defaultValue: "Show AI Chat Panel")
+ ) {
+ chatPanelVisible.toggle()
+ }
         .keyboardShortcut("\\", modifiers: [.command, .shift])

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@Sources/cmuxApp.swift` around lines 677 - 685, Replace the direct
UserDefaults reads/writes in the Button with a single `@AppStorage-backed` Bool so
the menu label and toggle stay in sync with the panel's reactive default; add an
`@AppStorage`("chatPanelVisible") var chatPanelVisible: Bool = false (matching the
panel's default) in the view, use chatPanelVisible in the Button's ternary label
instead of UserDefaults.standard.bool(...), and toggle chatPanelVisible.toggle()
in the action (keep the keyboardShortcut as-is).
// Don't steal focus from the chat panel input.
if let fr = window.firstResponder, ChatPanelHitTestRegistry.ownsFocusResponder(fr) {
    #if DEBUG
    dlog("find.applyFirstResponder SKIP surface=\(surfaceShort) reason=chatPanelFocused")
    #endif
    return
}
🧩 Analysis chain
🏁 Script executed:
#!/bin/bash
# Description: Inspect ChatPanelHitTestRegistry ownership checks for AppKit field-editor handling.
# Expected: ownsFocusResponder(_:) should handle NSTextView field editors via currentEditor(),
# isFieldEditor + nextResponder/superview traversal, or an existing cmux field-editor owner helper.
rg -n -C5 \
'ChatPanelHitTestRegistry|ownsFocusResponder|currentEditor\(|isFieldEditor|NSTextView|nextResponder|cmuxFieldEditorOwnerView' \
  --glob '*.swift'

Repository: manaflow-ai/cmux
Length of output: 50373
🏁 Script executed:
#!/bin/bash
# Retrieve ChatPanelHitTestRegistry.ownsFocusResponder implementation
# Truncation was at ChatPanelView.swift:363; retrieve full implementation
rg -n 'ownsFocusResponder' Sources/ChatPanelView.swift -A 20 -B 2

Repository: manaflow-ai/cmux
Length of output: 880
This guard only works if the user is not typing in chat.
ChatPanelHitTestRegistry.ownsFocusResponder casts the responder to NSView and checks identity/descendant relationships with registered chat panel owners. This works for direct view responders, but fails for the common case where window.firstResponder is the shared NSTextView field editor while the user is editing a chat input field. Field editors are managed by AppKit's text system and reside in the responder chain, not in the view hierarchy as children of the chat panel owner. The guard will return false, allowing the terminal auto-focus to steal focus back while the user is typing in chat.
To fix this, ownsFocusResponder should also detect field-editor ownership using the pattern from cmuxFieldEditorOwnerView in AppDelegate.swift: check isFieldEditor, then traverse nextResponder or superview to find the owning view and test it against registered owners. Alternatively, resolve the field editor's owner view before calling this check.
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.
In `@Sources/GhosttyTerminalView.swift` around lines 10584 - 10590, The current
guard using ChatPanelHitTestRegistry.ownsFocusResponder(window.firstResponder)
misses cases where window.firstResponder is the shared NSTextView field editor;
update the logic so that if the responder is a field editor
(responder.responds(to: Selector("isFieldEditor")) / cast to NSTextView and
isFieldEditor == true) you resolve its owner (use the cmuxFieldEditorOwnerView
pattern from AppDelegate.swift: walk nextResponder and/or superview to find the
actual owning NSView) and then call ChatPanelHitTestRegistry.ownsFocusResponder
with that owner view (or extend ownsFocusResponder to accept an owner view)
before returning to avoid stealing focus while typing in chat.
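A hedged sketch of the field-editor-aware check described above (the helper name and the delegate/superview fallbacks are assumptions patterned on AppKit's text system, not the project's actual `cmuxFieldEditorOwnerView`):

```swift
import AppKit

// Sketch: resolve the owning view when the first responder is the shared
// field editor, so ownership tests run against the real chat input view.
func responderForOwnershipCheck(_ responder: NSResponder) -> NSResponder {
    guard let textView = responder as? NSTextView, textView.isFieldEditor else {
        return responder
    }
    // A field editor's delegate is usually the NSTextField being edited;
    // fall back to walking the superview chain if the delegate is not a view.
    if let owner = textView.delegate as? NSView {
        return owner
    }
    var candidate: NSView? = textView.superview
    while let view = candidate {
        if view is NSControl { return view }
        candidate = view.superview
    }
    return responder
}
```

The guard would then call `ChatPanelHitTestRegistry.ownsFocusResponder(responderForOwnershipCheck(fr))`, covering both direct view responders and field-editor sessions.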
6 issues found across 8 files
Prompt for AI agents (unresolved issues)
Check if these issues are valid — if so, understand the root cause of each and fix them. If appropriate, use sub-agents to investigate and fix each issue separately.
<file name="Sources/ChatService.swift">
<violation number="1" location="Sources/ChatService.swift:171">
P1: API keys are persisted in UserDefaults (unencrypted settings store) instead of Keychain, exposing sensitive credentials to local extraction.</violation>
<violation number="2" location="Sources/ChatService.swift:192">
P2: Clearing chat does not cancel the in-flight stream, allowing streamed chunks to repopulate messages after a clear.</violation>
<violation number="3" location="Sources/ChatService.swift:239">
P2: Canceled stream can still append an empty assistant message because cancellation breaks the loop but does not guard the post-loop append.</violation>
</file>
<file name="Sources/TerminalWindowPortal.swift">
<violation number="1" location="Sources/TerminalWindowPortal.swift:149">
P2: Chat-panel exclusion was added only to hit-testing; tracking-area cursor updates still run divider detection over chat regions, causing incorrect resize cursors while hovering chat UI.</violation>
</file>
<file name="Sources/cmuxApp.swift">
<violation number="1" location="Sources/cmuxApp.swift:678">
P2: `chatPanelVisible` is read with inconsistent defaults, causing first-launch menu label/action mismatch.</violation>
<violation number="2" location="Sources/cmuxApp.swift:678">
P2: The new AI Chat Panel command title is derived from raw `UserDefaults` instead of observed SwiftUI state, so the menu label can become stale and not reliably reflect current visibility.</violation>
</file>
Reply with feedback, questions, or to request a fix. Tag @cubic-dev-ai to re-run a review.
guard let chunk = Self.parseSSEChunk(data, provider: provider) else { continue }

assistantContent += chunk
if !appendedAssistant {
P2: Canceled stream can still append an empty assistant message because cancellation breaks the loop but does not guard the post-loop append.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At Sources/ChatService.swift, line 239:
<comment>Canceled stream can still append an empty assistant message because cancellation breaks the loop but does not guard the post-loop append.</comment>
<file context>
@@ -0,0 +1,449 @@
+ guard let chunk = Self.parseSSEChunk(data, provider: provider) else { continue }
+
+ assistantContent += chunk
+ if !appendedAssistant {
+ messages.append(Message(role: .assistant, content: assistantContent))
+ appendedAssistant = true
</file context>
-            if !appendedAssistant {
+            if !Task.isCancelled && !assistantContent.isEmpty && !appendedAssistant {
func clearMessages() {
    messages = []
    streamingError = nil
}
P2: Clearing chat does not cancel the in-flight stream, allowing streamed chunks to repopulate messages after a clear.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At Sources/ChatService.swift, line 192:
<comment>Clearing chat does not cancel the in-flight stream, allowing streamed chunks to repopulate messages after a clear.</comment>
<file context>
@@ -0,0 +1,449 @@
+ return [selected] + provider.modelOptions
+ }
+
+ func clearMessages() {
+ messages = []
+ streamingError = nil
</file context>
-    func clearMessages() {
-        messages = []
-        streamingError = nil
-    }
+    func clearMessages() {
+        cancelStreaming()
+        messages = []
+        streamingError = nil
+    }
}

if isPointerEvent {
    if ChatPanelHitTestRegistry.contains(windowPoint: convert(point, to: nil), in: window) {
P2: Chat-panel exclusion was added only to hit-testing; tracking-area cursor updates still run divider detection over chat regions, causing incorrect resize cursors while hovering chat UI.
Prompt for AI agents
Check if this issue is valid — if so, understand the root cause and fix it. At Sources/TerminalWindowPortal.swift, line 149:
<comment>Chat-panel exclusion was added only to hit-testing; tracking-area cursor updates still run divider detection over chat regions, causing incorrect resize cursors while hovering chat UI.</comment>
<file context>
@@ -146,6 +146,11 @@ final class WindowTerminalHostView: NSView {
}
if isPointerEvent {
+ if ChatPanelHitTestRegistry.contains(windowPoint: convert(point, to: nil), in: window) {
+ clearActiveDividerCursor(restoreArrow: true)
+ return nil
</file context>
- Store API keys in macOS Keychain instead of plaintext UserDefaults - Send Gemini API key via x-goog-api-key header instead of URL query param - Make Show/Hide AI Chat Panel menu label reactive via @AppStorage - Route chat panel shortcut through KeyboardShortcutSettings (toggleAIChatPanel) - Guard isStreaming against stale canceled task by tracking stream ID - Switch ContentView from @StateObject to @ObservedObject for singleton - Add all chatPanel.* and related localization keys to Localizable.xcstrings Co-authored-by: Copilot <223556219+Copilot@users.noreply.github.com>
Actionable comments posted: 2
♻️ Duplicate comments (2)
Sources/ChatService.swift (2)
280-287: ⚠️ Potential issue | 🟡 Minor — Dead cleanup branch on stream error.
`appendedAssistant` is only set to `true` after `assistantContent += chunk` with a non-empty chunk, so `assistantContent.isEmpty && appendedAssistant` is unreachable and `messages.removeLast()` never fires. If the intent is to drop a trailing empty assistant bubble on error, invert the condition; otherwise delete the block.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Sources/ChatService.swift` around lines 280 - 287, In the catch block in ChatService.swift, the condition `assistantContent.isEmpty && appendedAssistant` is unreachable because `appendedAssistant` is only set after appending non-empty chunks; either remove the dead block or invert the check to `appendedAssistant && assistantContent.isEmpty` so `messages.removeLast()` will run when an assistant message was appended but remains empty; update the catch to use the inverted condition (or delete the entire if block if you don't want to drop trailing empty assistant bubbles) and keep `streamingError` assignment as-is.
255-259: ⚠️ Potential issue | 🟡 Minor — Cap the error body read to avoid unbounded memory growth.
On non-2xx responses the loop appends every line of the response to `body` with no cap. A large/streamed error payload can balloon memory; only the first ~200 chars are used downstream (line 479).

🛡️ Proposed fix

     guard (200..<300).contains(http.statusCode) else {
         var body = ""
-        for try await line in bytes.lines { body += line }
+        for try await line in bytes.lines {
+            body += line
+            if body.count > 4096 { break }
+        }
         throw ChatAPIError.httpError(http.statusCode, body)
     }

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Sources/ChatService.swift` around lines 255 - 259, The error handling currently concatenates every line from bytes.lines into variable body on non-2xx responses, which can grow without bound; change it to enforce a maximum read length (e.g., MAX_BODY_CHARS = 200 or 512), appending incoming lines only until body.count >= MAX_BODY_CHARS and then stop reading further (or break the loop), and ensure the thrown ChatAPIError.httpError(http.statusCode, body) uses this capped body; update the guard block that reads from bytes.lines and the variable body so the loop checks and respects the max length before appending.
🧹 Nitpick comments (2)
Sources/ChatService.swift (2)
296-300: `cancelStreaming` leaves `streamingError` stale and doesn't advance the stream ID. Two small concerns:
- The in-flight task's completion block (lines 289-292) still matches `currentStreamID` because it wasn't rotated, so the cancelled task will still write `isStreaming = false` after you already set it here. Harmless today but defeats the guard's purpose — bump `currentStreamID = UUID()` here so a stale task can't stomp state if a new send happens in between.
- Consider clearing `streamingError` on explicit cancel so a prior error banner doesn't linger once the user has visibly stopped the stream.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Sources/ChatService.swift` around lines 296 - 300, In cancelStreaming(), rotate the stream identifier and clear any prior error before cancelling so a stale in-flight completion cannot overwrite state: set currentStreamID = UUID() and streamingError = nil, then cancel streamTask, set streamTask = nil and isStreaming = false (referencing cancelStreaming, currentStreamID, streamingError, streamTask, isStreaming). This ensures any completion block that checks currentStreamID won't stomp new state and that previous errors don't linger after an explicit cancel.
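A minimal sketch of the suggested rotation (property names `currentStreamID`, `streamTask`, `isStreaming`, and `streamingError` are taken from the review text; the surrounding class is assumed):

```swift
import Foundation

// Sketch: rotate the stream ID and clear stale error state before cancelling,
// so the old task's completion block no longer matches currentStreamID.
func cancelStreaming() {
    currentStreamID = UUID()   // stale completion blocks now fail the ID check
    streamingError = nil       // don't leave a prior error banner visible
    streamTask?.cancel()
    streamTask = nil
    isStreaming = false
}
```

Any completion block that compares its captured ID against `currentStreamID` before mutating state will then no-op after a cancel, even if a new send starts immediately.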
168-201: Keychain calls run on the MainActor.
`ChatService` is `@MainActor`, so `apiKey(for:)`/`setApiKey(_:for:)` invoke `SecItemCopyMatching`/`SecItemAdd`/`SecItemUpdate` on the main thread. `SecItem*` can block (keychain prompts, locked keychain, contention) and is structurally uncancellable. `apiKey(for:)` in particular is also invoked inside `sendMessage` via the computed `apiKey` property (line 239) and inside `hasApiKey` on the hot path. Consider moving these to `Task.detached` (matching the `ProviderAccountStore` pattern) and caching the value in memory, or at minimum reading the key off-main before streaming starts.

🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed. In `@Sources/ChatService.swift` around lines 168 - 201, ChatService's synchronous apiKey(for:)/setApiKey(_:for:) perform blocking SecItem* calls on the `@MainActor`; move those operations off the main thread by making keychain access asynchronous and running the SecItemCopyMatching/SecItemAdd/SecItemUpdate/SecItemDelete calls inside Task.detached (or another background executor) similar to ProviderAccountStore, and expose async methods or nonisolated helpers that return via continuations; add an in-memory cache (e.g., a dictionary keyed by provider.apiKeyDefaultsKey) that is read synchronously from the actor for hot-path checks (hasApiKey and the apiKey property) and updated from the detached tasks when keychain reads/writes complete, and ensure UI updates (objectWillChange.send()) are invoked on the MainActor after cache changes.
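A hedged sketch of the off-main pattern (the cache type, account keys, and refresh flow are illustrative assumptions; only the `SecItemCopyMatching` call itself reflects the real Security API):

```swift
import Foundation
import Security

// Sketch: keep blocking SecItem* calls off the main actor and serve
// hot-path reads from an in-memory cache.
@MainActor
final class ApiKeyCache: ObservableObject {
    private var cache: [String: String] = [:]

    // Synchronous, non-blocking: suitable for hasApiKey / sendMessage hot paths.
    func key(forAccount account: String) -> String? { cache[account] }

    // Asynchronous refresh: the Keychain read runs on a background task.
    func refresh(account: String) {
        Task.detached(priority: .utility) {
            let value = Self.keychainRead(account: account) // may block; off-main
            await MainActor.run {
                self.cache[account] = value
                self.objectWillChange.send()
            }
        }
    }

    nonisolated private static func keychainRead(account: String) -> String? {
        let query: [String: Any] = [
            kSecClass as String: kSecClassGenericPassword,
            kSecAttrAccount as String: account,
            kSecReturnData as String: true,
            kSecMatchLimit as String: kSecMatchLimitOne
        ]
        var item: CFTypeRef?
        guard SecItemCopyMatching(query as CFDictionary, &item) == errSecSuccess,
              let data = item as? Data else { return nil }
        return String(data: data, encoding: .utf8)
    }
}
```

Writes would follow the same shape: perform `SecItemAdd`/`SecItemUpdate` inside `Task.detached`, then update the cache on the main actor.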
🤖 Prompt for all review comments with AI agents
Verify each finding against the current code and only fix it if needed.
Inline comments:
In `@Sources/ChatService.swift`:
- Around line 309-358: The buildRequest function and its helpers (notably
buildGeminiRequest and buildOpenAICompatibleRequest) currently force-unwrap
URL(string:) and URLComponents(...) results which will crash on malformed model
or provider input; change these to safely unwrap URL/URLComponents returns and
throw ChatAPIError.invalidResponse (or add a dedicated ChatAPIError.invalidURL)
when URL creation fails, and ensure the Gemini path properly percent-encodes the
model when constructing the path before creating URLComponents to avoid invalid
characters; update all branches that do URL(string: "…")! and any
components.url! to guarded let bindings that throw on nil.
In `@Sources/cmuxApp.swift`:
- Line 152: The new AppStorage property `@AppStorage`("chatPanelVisible") private
var isChatPanelVisible: Bool = true (in cmuxApp.swift) is not reset by
SettingsView.resetAllSettings(); update SettingsView.resetAllSettings() to
explicitly restore the "chatPanelVisible"/AppStorage key to its default true
value (i.e., set isChatPanelVisible = true or remove the persisted key) so the
Reset All Settings action re-displays the chat panel; locate the
resetAllSettings logic in SettingsView and add handling for the new
isChatPanelVisible/AppStorage key alongside the other defaults.
---
Duplicate comments:
In `@Sources/ChatService.swift`:
- Around line 280-287: In the catch block in ChatService.swift, the condition
`assistantContent.isEmpty && appendedAssistant` is unreachable because
`appendedAssistant` is only set after appending non-empty chunks; either remove
the dead block or invert the check to `appendedAssistant &&
assistantContent.isEmpty` so `messages.removeLast()` will run when an assistant
message was appended but remains empty; update the catch to use the inverted
condition (or delete the entire if block if you don't want to drop trailing
empty assistant bubbles) and keep `streamingError` assignment as-is.
- Around line 255-259: The error handling currently concatenates every line from
bytes.lines into variable body on non-2xx responses, which can grow without
bound; change it to enforce a maximum read length (e.g., MAX_BODY_CHARS = 200 or
512), appending incoming lines only until body.count >= MAX_BODY_CHARS and then
stop reading further (or break the loop), and ensure the thrown
ChatAPIError.httpError(http.statusCode, body) uses this capped body; update the
guard block that reads from bytes.lines and the variable body so the loop checks
and respects the max length before appending.
---
Nitpick comments:
In `@Sources/ChatService.swift`:
- Around line 296-300: In cancelStreaming(), rotate the stream identifier and
clear any prior error before cancelling so a stale in-flight completion cannot
overwrite state: set currentStreamID = UUID() and streamingError = nil, then
cancel streamTask, set streamTask = nil and isStreaming = false (referencing
cancelStreaming, currentStreamID, streamingError, streamTask, isStreaming). This
ensures any completion block that checks currentStreamID won't stomp new state
and that previous errors don't linger after an explicit cancel.
- Around line 168-201: ChatService's synchronous apiKey(for:)/setApiKey(_:for:)
perform blocking SecItem* calls on the `@MainActor`; move those operations off the
main thread by making keychain access asynchronous and running the
SecItemCopyMatching/SecItemAdd/SecItemUpdate/SecItemDelete calls inside
Task.detached (or another background executor) similar to ProviderAccountStore,
and expose async methods or nonisolated helpers that return via continuations;
add an in-memory cache (e.g., a dictionary keyed by provider.apiKeyDefaultsKey)
that is read synchronously from the actor for hot-path checks (hasApiKey and the
apiKey property) and updated from the detached tasks when keychain reads/writes
complete, and ensure UI updates (objectWillChange.send()) are invoked on the
MainActor after cache changes.
🪄 Autofix (Beta)
Fix all unresolved CodeRabbit comments on this PR:
- Push a commit to this branch (recommended)
- Create a new PR with the fixes
ℹ️ Review info
⚙️ Run configuration
Configuration used: defaults
Review profile: CHILL
Plan: Pro
Run ID: 1bfa84e9-4146-424f-acbf-eedf8e025d8e
📒 Files selected for processing (5)
- Resources/Localizable.xcstrings
- Sources/ChatService.swift
- Sources/ContentView.swift
- Sources/KeyboardShortcutSettings.swift
- Sources/cmuxApp.swift
✅ Files skipped from review due to trivial changes (1)
- Resources/Localizable.xcstrings
🚧 Files skipped from review as they are similar to previous changes (1)
- Sources/ContentView.swift
private static func buildRequest(messages: [Message], provider: Provider, apiKey: String, model: String) throws -> URLRequest {
    switch provider {
    case .gemini:
        return try buildGeminiRequest(messages: messages, apiKey: apiKey, model: model)
    case .gpt:
        return try buildOpenAICompatibleRequest(
            url: URL(string: "https://api.openai.com/v1/chat/completions")!,
            messages: messages,
            apiKey: apiKey,
            model: model,
            extraHeaders: [:]
        )
    case .claude:
        return try buildClaudeRequest(messages: messages, apiKey: apiKey, model: model)
    case .openrouter:
        return try buildOpenAICompatibleRequest(
            url: URL(string: "https://openrouter.ai/api/v1/chat/completions")!,
            messages: messages,
            apiKey: apiKey,
            model: model,
            extraHeaders: [
                "HTTP-Referer": "https://cmux.local",
                "X-Title": "cmux"
            ]
        )
    case .deepseek:
        return try buildOpenAICompatibleRequest(
            url: URL(string: "https://api.deepseek.com/chat/completions")!,
            messages: messages,
            apiKey: apiKey,
            model: model,
            extraHeaders: [:]
        )
    case .qwen:
        return try buildOpenAICompatibleRequest(
            url: URL(string: "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions")!,
            messages: messages,
            apiKey: apiKey,
            model: model,
            extraHeaders: [:]
        )
    case .minimax:
        return try buildOpenAICompatibleRequest(
            url: URL(string: "https://api.minimax.io/v1/chat/completions")!,
            messages: messages,
            apiKey: apiKey,
            model: model,
            extraHeaders: [:]
        )
    }
**Force-unwrapped URLs will crash on malformed model/provider input.**

Every `buildRequest` branch force-unwraps `URL(string:)!`, and `buildGeminiRequest` also force-unwraps `URLComponents(string:)!` / `components.url!`. The Gemini path is especially fragile because `model` is interpolated unescaped into the path: if a user enters a model containing spaces or other non-URL characters in settings, this crashes the app. Guard and throw `ChatAPIError.invalidResponse` (or a dedicated case) instead.
🛡️ Proposed fix for Gemini

```diff
 private static func buildGeminiRequest(messages: [Message], apiKey: String, model: String) throws -> URLRequest {
-    var components = URLComponents(string: "https://generativelanguage.googleapis.com/v1beta/models/\(model):streamGenerateContent")!
-    components.queryItems = [
-        URLQueryItem(name: "alt", value: "sse"),
-    ]
-
-    var request = URLRequest(url: components.url!)
+    let encodedModel = model.addingPercentEncoding(withAllowedCharacters: .urlPathAllowed) ?? model
+    guard var components = URLComponents(string: "https://generativelanguage.googleapis.com/v1beta/models/\(encodedModel):streamGenerateContent") else {
+        throw ChatAPIError.invalidResponse
+    }
+    components.queryItems = [URLQueryItem(name: "alt", value: "sse")]
+    guard let url = components.url else { throw ChatAPIError.invalidResponse }
+    var request = URLRequest(url: url)
```

Also applies to: 383-406
🤖 Prompt for AI Agents

```
Verify each finding against the current code and only fix it if needed.
In Sources/ChatService.swift around lines 309 - 358, The buildRequest function
and its helpers (notably buildGeminiRequest and buildOpenAICompatibleRequest)
currently force-unwrap URL(string:) and URLComponents(...) results which will
crash on malformed model or provider input; change these to safely unwrap
URL/URLComponents returns and throw ChatAPIError.invalidResponse (or add a
dedicated ChatAPIError.invalidURL) when URL creation fails, and ensure the
Gemini path properly percent-encodes the model when constructing the path before
creating URLComponents to avoid invalid characters; update all branches that do
URL(string: "…")! and any components.url! to guarded let bindings that throw on
nil.
```
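Outside the suggested diff, the guarded pattern can be sketched on its own. This is a minimal standalone sketch; the function name `buildGeminiURL` and the `ChatAPIError.invalidURL` case are illustrative here, not the project's actual API:

```swift
import Foundation

// Illustrative error type; the real ChatAPIError in ChatService.swift may differ.
enum ChatAPIError: Error {
    case invalidURL
}

// Percent-encode the user-supplied model, then throw instead of
// force-unwrapping when URL construction fails.
func buildGeminiURL(model: String) throws -> URL {
    let encodedModel = model.addingPercentEncoding(withAllowedCharacters: .urlPathAllowed) ?? model
    guard var components = URLComponents(
        string: "https://generativelanguage.googleapis.com/v1beta/models/\(encodedModel):streamGenerateContent"
    ) else {
        throw ChatAPIError.invalidURL
    }
    components.queryItems = [URLQueryItem(name: "alt", value: "sse")]
    guard let url = components.url else { throw ChatAPIError.invalidURL }
    return url
}
```

With this shape, a model string containing spaces degrades to a thrown error or an encoded path rather than a crash.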
```swift
private var showSidebarDevBuildBanner = DevBuildBannerDebugSettings.defaultShowSidebarBanner
@AppStorage(SocketControlSettings.appStorageKey) private var socketControlMode = SocketControlSettings.defaultMode.rawValue
@AppStorage(BrowserToolbarAccessorySpacingDebugSettings.key) private var browserToolbarAccessorySpacingRaw = BrowserToolbarAccessorySpacingDebugSettings.defaultSpacing
@AppStorage("chatPanelVisible") private var isChatPanelVisible: Bool = true
```
**Reset the new chat panel visibility setting with the rest of AppStorage defaults.**

Line 152 adds a persisted UI toggle, but `resetAllSettings()` does not restore `chatPanelVisible`, so "Reset All Settings" can leave the AI chat panel hidden.
Proposed fix

```diff
 workspacePresentationMode = WorkspacePresentationModeSettings.defaultMode.rawValue
 let defaults = UserDefaults.standard
+defaults.set(true, forKey: "chatPanelVisible")
 defaults.removeObject(forKey: WorkspaceTitlebarSettings.showTitlebarKey)
```

Based on learnings, `SettingsView.resetAllSettings()` must reset newly added AppStorage toggles to defaults.
Also applies to: 6177-6265
🤖 Prompt for AI Agents

```
Verify each finding against the current code and only fix it if needed.
In Sources/cmuxApp.swift at line 152, The new AppStorage property
@AppStorage("chatPanelVisible") private var isChatPanelVisible: Bool = true (in
cmuxApp.swift) is not reset by SettingsView.resetAllSettings(); update
SettingsView.resetAllSettings() to explicitly restore the
"chatPanelVisible"/AppStorage key to its default true value (i.e., set
isChatPanelVisible = true or remove the persisted key) so the Reset All Settings
action re-displays the chat panel; locate the resetAllSettings logic in
SettingsView and add handling for the new isChatPanelVisible/AppStorage key
alongside the other defaults.
```
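The reset itself reduces to clearing the persisted key so the `@AppStorage` compiled-in default (`true`) applies again. A minimal sketch, assuming the reset path writes through `UserDefaults`; the helper name is hypothetical:

```swift
import Foundation

// Hypothetical helper: removing the persisted key (rather than writing true)
// lets the @AppStorage declaration's default value take effect again, which
// stays correct even if the default ever changes in code.
func resetChatPanelVisibility(in defaults: UserDefaults = .standard) {
    defaults.removeObject(forKey: "chatPanelVisible")
}
```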
Summary
Add a collapsible AI chat panel integrated alongside the terminal.
Changes
- Persist panel visibility via `@AppStorage("chatPanelVisible")`

Screenshots
The chat panel appears to the right of the terminal and can be toggled via the View menu or Cmd+Shift+\.
Summary by cubic
Adds a collapsible AI chat panel next to the terminal with streaming, multi‑provider chat. Improves security by moving API keys to the macOS Keychain and fixes streaming/menu behaviors.
New Features
- `ChatService` with SSE streaming and provider-specific parsers; cancel support.
- Panel visibility persisted via `@AppStorage("chatPanelVisible")`.

Bug Fixes

- `x-goog-api-key` header (not URL query).
- `@AppStorage`.
- `KeyboardShortcutSettings` (`toggleAIChatPanel`).
- `isStreaming` by tracking a stream ID.
- `@ObservedObject` for the `ChatService` singleton in `ContentView`.

Written for commit ff8f0fb. Summary will update on new commits.
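The SSE streaming mentioned above can be sketched for the OpenAI-compatible wire format. A minimal sketch, assuming `data: {json}` lines terminated by `data: [DONE]`; the function name is illustrative, and the real provider-specific parsers in `ChatService.swift` may differ:

```swift
import Foundation

// Extract the incremental assistant text from one SSE line of an
// OpenAI-compatible streaming response. Returns nil for non-data lines,
// the [DONE] sentinel, keep-alives, and unparseable payloads.
func parseSSEChunk(_ line: String) -> String? {
    guard line.hasPrefix("data: ") else { return nil }
    let payload = String(line.dropFirst(6))
    guard payload != "[DONE]",
          let data = payload.data(using: .utf8),
          let obj = try? JSONSerialization.jsonObject(with: data),
          let json = obj as? [String: Any],
          let choices = json["choices"] as? [[String: Any]],
          let first = choices.first,
          let delta = first["delta"] as? [String: Any],
          let content = delta["content"] as? String
    else { return nil }
    return content
}
```

In a streaming loop (e.g. over `URLSession`'s `bytes.lines`), each non-nil result would be appended to the in-progress assistant message.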
Summary by CodeRabbit
New Features