From 90fafae8a7c66883ee5d3c44ef86e44dd0d96e2d Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Sun, 26 Oct 2025 04:42:50 +0000
Subject: [PATCH 01/11] Initial plan

From 88e5785ddfb1c2d651682213a55438b1003838d7 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Sun, 26 Oct 2025 04:48:05 +0000
Subject: [PATCH 02/11] Add SLM_DESIGN.md with comprehensive AI integration plan

Co-authored-by: anicolao <1145048+anicolao@users.noreply.github.com>
---
 SLM_DESIGN.md | 245 ++++++++++++++++++++++++++++++++++++++++++++++++++
 1 file changed, 245 insertions(+)
 create mode 100644 SLM_DESIGN.md

diff --git a/SLM_DESIGN.md b/SLM_DESIGN.md
new file mode 100644
index 0000000..828fbd2
--- /dev/null
+++ b/SLM_DESIGN.md
@@ -0,0 +1,245 @@
+# AI Integration for DikuMUD Parsing Failures - Design Document
+
+## Overview
+
+This document outlines the design for integrating AI assistance into the DikuMUD client to help with parsing failures and provide command suggestions. The system will use Small Language Models (SLMs) or other AI endpoints (OpenAI, Ollama) to provide assistance when the MUD doesn't understand user commands.
+
+## Components
+
+### 1. Trigger Enhancement: `<lastcommand>` Variable
+
+**Purpose**: Capture the last command sent by the user so it can be referenced in trigger actions.
+
+**Implementation**:
+- Add `lastCommand` field to the `Model` struct in `internal/tui/app.go`
+- Update the command sending logic to store commands in `lastCommand` before sending
+- Extend the trigger action substitution to recognize `<lastcommand>` as a special placeholder
+- When triggers fire, substitute `<lastcommand>` with the actual last command
+
+**Location**: `internal/tui/app.go`, `internal/triggers/trigger.go`
+
+### 2. AI Configuration Storage
+
+**Purpose**: Store AI endpoint configuration (URL, type, API key) persistently.
+
+**Implementation**:
+- Add `AIConfig` struct to `internal/config/account.go`:
+  ```go
+  type AIConfig struct {
+      Type   string `json:"type"` // "openai", "ollama", or custom
+      URL    string `json:"url"`  // API endpoint URL
+      APIKey string `json:"-"`    // Not stored in JSON, stored in passwords file
+  }
+  ```
+- Add `AIConfig` field to the `Config` struct
+- Add methods to get/set AI configuration
+- Store API key separately in the password store for security
+
+**Location**: `internal/config/account.go`, `internal/config/passwords.go`
+
+### 3. AI Prompt Configuration
+
+**Purpose**: Store and manage AI prompts with presets for specific MUDs.
+
+**Implementation**:
+- Add `AIPrompt` field to the `Config` struct
+- Provide default prompt for Barsoom MUD: "PLACEHOLDER" (as specified)
+- Allow users to customize the prompt via `/ai-prompt` command
+- The prompt should instruct the AI on how to interpret the failed command and suggest alternatives
+
+**Location**: `internal/config/account.go`
+
+### 4. `/configure-ai` Command
+
+**Purpose**: Configure the AI endpoint settings.
+
+**Syntax**:
+```
+/configure-ai <type> <url> [api-key]
+```
+
+**Examples**:
+```
+/configure-ai openai https://api.openai.com/v1/chat/completions sk-...
+/configure-ai ollama http://localhost:11434/api/generate
+```
+
+**Implementation**:
+- Add `handleConfigureAICommand` function in `internal/tui/app.go`
+- Parse the command arguments
+- Store configuration in the `Config` struct
+- Save API key to password store if provided
+- Provide feedback to user
+
+**Location**: `internal/tui/app.go`
+
+### 5. `/ai-prompt` Command
+
+**Purpose**: Configure the AI prompt template.
+
+**Syntax**:
+```
+/ai-prompt "<prompt>"
+/ai-prompt preset barsoom
+```
+
+**Examples**:
+```
+/ai-prompt "You are a helpful DikuMUD assistant. The user tried this command but it failed: {command}. Suggest a correct alternative."
+/ai-prompt preset barsoom
+```
+
+**Implementation**:
+- Add `handleAIPromptCommand` function in `internal/tui/app.go`
+- Support both custom prompts and presets
+- Store in configuration
+- For now, Barsoom preset shows "PLACEHOLDER"
+
+**Location**: `internal/tui/app.go`
+
+### 6. `/ai` Command
+
+**Purpose**: Send a prompt to the AI and execute the suggested command.
+
+**Syntax**:
+```
+/ai <prompt>
+```
+
+**Example**:
+```
+/ai <lastcommand>
+```
+
+**Implementation**:
+
+#### CLI Mode (local Go process):
+- Add `handleAICommand` function in `internal/tui/app.go`
+- Create AI client in `internal/ai/client.go`:
+  - Support OpenAI API format
+  - Support Ollama API format
+  - Make HTTP request to configured endpoint
+  - Parse response and extract suggested command
+- Send the suggested command to the MUD
+- Display the AI's response/suggestion to the user
+
+#### Web Mode (browser makes request):
+- Detect web mode using existing `webSessionID` field
+- If in web mode, send a special message to the web client via WebSocket
+- Extend `internal/web/websocket.go` to handle AI request messages
+- Browser-side JavaScript (in `web/static/app.js`):
+  - Receive AI request message
+  - Make HTTP request to AI endpoint from browser
+  - Send response back via WebSocket
+  - TUI receives response and sends command to MUD
+
+**Rationale for Web Mode**: In web mode, the browser makes the AI request because:
+1. The API key should not be sent to the server
+2. CORS policies may restrict server-side requests
+3. The user's browser can handle authentication directly
+
+**Location**: `internal/tui/app.go`, `internal/ai/client.go` (new), `internal/web/websocket.go`, `web/static/app.js`
+
+### 7. `/howto` Command
+
+**Purpose**: Ask the AI how to do something, but just display the answer (don't execute).
+
+**Syntax**:
+```
+/howto <question>
+```
+
+**Example**:
+```
+/howto heal myself
+```
+
+**Implementation**:
+- Similar to `/ai` command but:
+  - Display AI response as informational output (like trigger matches)
+  - Do not automatically send any commands to the MUD
+  - Format output in the same style as trigger match notifications
+
+**Location**: `internal/tui/app.go`
+
+### 8. Integration Example
+
+**Setup**:
+```
+/configure-ai ollama http://localhost:11434/api/generate
+/ai-prompt preset barsoom
+/trigger "Huh?!" "/ai <lastcommand>"
+```
+
+**Usage Flow**:
+1. User types: `heall` (misspelled command)
+2. MUD responds: `Huh?!`
+3. Trigger fires, capturing "heall" as `<lastcommand>`
+4. Trigger executes: `/ai heall`
+5. AI receives prompt with the failed command
+6. AI suggests: `heal`
+7. Client sends `heal` to the MUD
+8. User sees: `[Trigger: /ai heall]` and `[AI suggests: heal]`
+
+## File Structure
+
+### New Files:
+- `internal/ai/client.go` - AI client implementation for CLI mode
+- `internal/ai/client_test.go` - Tests for AI client
+
+### Modified Files:
+- `internal/config/account.go` - Add AI configuration storage
+- `internal/config/passwords.go` - Store API keys securely
+- `internal/triggers/trigger.go` - Support `<lastcommand>` substitution
+- `internal/tui/app.go` - Add new commands and lastCommand tracking
+- `internal/web/websocket.go` - Handle AI requests in web mode
+- `web/static/app.js` - Handle AI requests from browser
+
+## Implementation Plan
+
+### Phase 1: Core Infrastructure
+1. Add `lastCommand` tracking to Model
+2. Extend trigger system to support `<lastcommand>`
+3. Add AI configuration to Config struct
+4. Implement `/configure-ai` command
+5. Implement `/ai-prompt` command
+
+### Phase 2: AI Client (CLI Mode)
+1. Create `internal/ai/client.go` with support for:
+   - OpenAI API format
+   - Ollama API format
+2. Implement `/ai` command for CLI mode
+3. Implement `/howto` command for CLI mode
+
+### Phase 3: Web Mode Support
+1. Extend WebSocket protocol for AI requests
+2. Update browser JavaScript to handle AI requests
+3. Implement `/ai` and `/howto` for web mode
+
+### Phase 4: Testing and Documentation
+1. Add unit tests for AI client
+2. Add integration tests for commands
+3. Test trigger integration: `/trigger "Huh?!" "/ai <lastcommand>"`
+4. Update README.md with AI features
+
+## Security Considerations
+
+1. **API Key Storage**: Store API keys in the password store, not in accounts.json
+2. **Web Mode**: Browser makes AI requests to prevent exposing API keys to the server
+3. **Input Validation**: Sanitize prompts before sending to AI
+4. **Rate Limiting**: Consider adding rate limiting to prevent API abuse
+
+## Error Handling
+
+1. Network errors: Display error message, don't crash
+2. Invalid AI response: Show error, don't send bad commands
+3. Missing configuration: Prompt user to run `/configure-ai`
+4. API key errors: Clear error message about authentication
+
+## Future Enhancements
+
+1. Support for additional AI providers (Anthropic, etc.)
+2. Command history analysis for better suggestions
+3. Context-aware prompts (current room, inventory, etc.)
+4. Caching of common suggestions
+5. Multi-turn conversations with AI

From a873a68b9a66afbc415ba69c33988171743d5cdd Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Sun, 26 Oct 2025 04:54:19 +0000
Subject: [PATCH 03/11] Implement AI integration core features - <lastcommand>, /configure-ai, /ai-prompt, /ai, /howto

Co-authored-by: anicolao <1145048+anicolao@users.noreply.github.com>
---
 internal/ai/client.go      | 174 +++++++++++++++++++++
 internal/config/account.go |  32 ++++
 internal/tui/app.go        | 304 ++++++++++++++++++++++++++++++++++++-
 3 files changed, 509 insertions(+), 1 deletion(-)
 create mode 100644 internal/ai/client.go

diff --git a/internal/ai/client.go b/internal/ai/client.go
new file mode 100644
index 0000000..b309094
--- /dev/null
+++ b/internal/ai/client.go
@@ -0,0 +1,174 @@
+package ai
+
+import (
+	"bytes"
+	"encoding/json"
+	"fmt"
+	"io"
+	"net/http"
+	"strings"
+	"time"
+)
+
+// Client represents an AI client
+type Client struct {
+	Type   string // "openai" or "ollama"
+	URL    string
+	APIKey string
+}
+
+// NewClient creates a new AI client
+func NewClient(aiType, url, apiKey string) *Client {
+	return &Client{
+		Type:   strings.ToLower(aiType),
+		URL:    url,
+		APIKey: apiKey,
+	}
+}
+
+// GenerateResponse sends a prompt to the AI and returns the response
+func (c *Client) GenerateResponse(prompt string) (string, error) {
+	switch c.Type {
+	case "openai":
+		return c.generateOpenAI(prompt)
+	case "ollama":
+		return c.generateOllama(prompt)
+	default:
+		return "", fmt.Errorf("unsupported AI type: %s", c.Type)
+	}
+}
+
+// OpenAI API request/response structures
+type openAIRequest struct {
+	Model    string          `json:"model"`
+	Messages []openAIMessage `json:"messages"`
+}
+
+type openAIMessage struct {
+	Role    string `json:"role"`
+	Content string `json:"content"`
+}
+
+type openAIResponse struct {
+	Choices []struct {
+		Message openAIMessage `json:"message"`
+	} `json:"choices"`
+	Error *struct {
+		Message string `json:"message"`
+	} `json:"error,omitempty"`
+}
+
+// generateOpenAI generates a response using OpenAI API
+func (c *Client) generateOpenAI(prompt string) (string, error) {
+	reqBody := openAIRequest{
+		Model: "gpt-3.5-turbo",
+		Messages: []openAIMessage{
+			{
+				Role:    "user",
+				Content: prompt,
+			},
+		},
+	}
+
+	jsonData, err := json.Marshal(reqBody)
+	if err != nil {
+		return "", fmt.Errorf("failed to marshal request: %w", err)
+	}
+
+	req, err := http.NewRequest("POST", c.URL, bytes.NewBuffer(jsonData))
+	if err != nil {
+		return "", fmt.Errorf("failed to create request: %w", err)
+	}
+
+	req.Header.Set("Content-Type", "application/json")
+	if c.APIKey != "" {
+		req.Header.Set("Authorization", "Bearer "+c.APIKey)
+	}
+
+	client := &http.Client{Timeout: 30 * time.Second}
+	resp, err := client.Do(req)
+	if err != nil {
+		return "", fmt.Errorf("failed to send request: %w", err)
+	}
+	defer resp.Body.Close()
+
+	body, err := io.ReadAll(resp.Body)
+	if err != nil {
+		return "", fmt.Errorf("failed to read response: %w", err)
+	}
+
+	if resp.StatusCode != http.StatusOK {
+		return "", fmt.Errorf("API error (status %d): %s", resp.StatusCode, string(body))
+	}
+
+	var response openAIResponse
+	if err := json.Unmarshal(body, &response); err != nil {
+		return "", fmt.Errorf("failed to parse response: %w", err)
+	}
+
+	if response.Error != nil {
+		return "", fmt.Errorf("API error: %s", response.Error.Message)
+	}
+
+	if len(response.Choices) == 0 {
+		return "", fmt.Errorf("no response from API")
+	}
+
+	return response.Choices[0].Message.Content, nil
+}
+
+// Ollama API request/response structures
+type ollamaRequest struct {
+	Model  string `json:"model"`
+	Prompt string `json:"prompt"`
+	Stream bool   `json:"stream"`
+}
+
+type ollamaResponse struct {
+	Response string `json:"response"`
+	Done     bool   `json:"done"`
+}
+
+// generateOllama generates a response using Ollama API
+func (c *Client) generateOllama(prompt string) (string, error) {
+	reqBody := ollamaRequest{
+		Model:  "llama2", // Default model, could be configurable
+		Prompt: prompt,
+		Stream: false,
+	}
+
+	jsonData, err := json.Marshal(reqBody)
+	if err != nil {
+		return "", fmt.Errorf("failed to marshal request: %w", err)
+	}
+
+	req, err := http.NewRequest("POST", c.URL, bytes.NewBuffer(jsonData))
+	if err != nil {
+		return "", fmt.Errorf("failed to create request: %w", err)
+	}
+
+	req.Header.Set("Content-Type", "application/json")
+
+	client := &http.Client{Timeout: 30 * time.Second}
+	resp, err := client.Do(req)
+	if err != nil {
+		return "", fmt.Errorf("failed to send request: %w", err)
+	}
+	defer resp.Body.Close()
+
+	body, err := io.ReadAll(resp.Body)
+	if err != nil {
+		return "", fmt.Errorf("failed to read response: %w", err)
+	}
+
+	if resp.StatusCode != http.StatusOK {
+		return "", fmt.Errorf("API error (status %d): %s", resp.StatusCode, string(body))
+	}
+
+	var response ollamaResponse
+	if err := json.Unmarshal(body, &response); err != nil {
+		return "", fmt.Errorf("failed to parse response: %w", err)
+	}
+
+	return response.Response, nil
+}
diff --git a/internal/config/account.go b/internal/config/account.go
index 27fbb6a..080da5a 100644
--- a/internal/config/account.go
+++ b/internal/config/account.go
@@ -32,12 +32,21 @@ type Account struct {
 	Password string `json:"-"` // Never serialize to JSON
 }
 
+// AIConfig represents AI endpoint configuration
+type AIConfig struct {
+	Type   string `json:"type"` // "openai", "ollama", or custom
+	URL    string `json:"url"`  // API endpoint URL
+	APIKey string `json:"-"`    // Not stored in JSON, stored in passwords file
+}
+
 // Config represents the application configuration
 type Config struct {
 	Servers        []Server    `json:"servers,omitempty"`
 	Characters     []Character `json:"characters,omitempty"`
 	Accounts       []Account   `json:"accounts"` // Legacy field for backward compatibility
 	DefaultAccount string      `json:"default_account,omitempty"`
+	AI             AIConfig    `json:"ai,omitempty"`        // AI configuration
+	AIPrompt       string      `json:"ai_prompt,omitempty"` // AI prompt template
 
 	configPath string // Path to the config file (for testing)
 }
 
@@ -257,6 +266,29 @@ func (c *Config) DeleteCharacter(username string, host string, port int) error {
 	return fmt.Errorf("character '%s' not found on %s:%d", username, host, port)
 }
 
+// SetAIConfig updates the AI configuration
+func (c *Config) SetAIConfig(aiType, url string) error {
+	c.AI.Type = aiType
+	c.AI.URL = url
+	return c.SaveConfig()
+}
+
+// GetAIConfig returns the AI configuration
+func (c *Config) GetAIConfig() AIConfig {
+	return c.AI
+}
+
+// SetAIPrompt updates the AI prompt template
+func (c *Config) SetAIPrompt(prompt string) error {
+	c.AIPrompt = prompt
+	return c.SaveConfig()
+}
+
+// GetAIPrompt returns the AI prompt template
+func (c *Config) GetAIPrompt() string {
+	return c.AIPrompt
+}
+
 // Helper function to filter out characters for a specific server
 func filterCharactersByServer(characters []Character, host string, port int) []Character {
 	var filtered []Character
diff --git a/internal/tui/app.go b/internal/tui/app.go
index 0c3a305..9a6652c 100644
--- a/internal/tui/app.go
+++ b/internal/tui/app.go
@@ -10,8 +10,10 @@ import (
 	"strings"
 	"time"
 
+	"github.com/anicolao/dikuclient/internal/ai"
 	"github.com/anicolao/dikuclient/internal/aliases"
 	"github.com/anicolao/dikuclient/internal/client"
+	"github.com/anicolao/dikuclient/internal/config"
 	"github.com/anicolao/dikuclient/internal/history"
 	"github.com/anicolao/dikuclient/internal/mapper"
 	"github.com/anicolao/dikuclient/internal/ticktimer"
@@ -95,6 +97,7 @@ type Model struct {
 	tickTimerManager  *ticktimer.Manager // Tick timer manager
 	lastFiredTickTime int    // Last tick time when triggers were fired (to avoid duplicates)
 	lastTriggerAction string // Last trigger action string enqueued (to avoid duplicate trigger actions)
+	lastCommand       string // Last command sent to MUD (for <lastcommand> trigger variable)
 }
 
 // XPStat represents XP per second statistics for a creature
@@ -397,6 +400,9 @@ func (m *Model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
 				m.mapLegendRooms = nil
 			}
+			// Store this as the last command (for <lastcommand> trigger variable)
+			m.lastCommand = command
+
 			// Send command to MUD server
 			m.conn.Send(command)
 
@@ -675,6 +681,9 @@ func (m *Model) Update(msg tea.Msg) (tea.Model, tea.Cmd) {
 				}
 				m.lastTriggerAction = action
 
+				// Substitute <lastcommand> with the actual last command
+				action = strings.ReplaceAll(action, "<lastcommand>", m.lastCommand)
+
 				// Split action on `;` to support multiple commands
 				commands := strings.Split(action, ";")
 				for i := range commands {
@@ -2017,6 +2026,18 @@ func (m *Model) handleClientCommand(command string) tea.Cmd {
 	case "share":
 		m.handleShareCommand()
 		return nil
+	case "configure-ai":
+		m.handleConfigureAICommand(args)
+		return nil
+	case "ai-prompt":
+		m.handleAIPromptCommand(command)
+		return nil
+	case "ai":
+		m.handleAICommand(command)
+		return nil
+	case "howto":
+		m.handleHowtoCommand(command)
+		return nil
 	case "help":
 		m.handleHelpCommand(args)
 		return nil
@@ -2241,6 +2262,211 @@ func (m *Model) handleShareCommand() {
 	m.output = append(m.output, "\x1b[90mAnyone who opens this URL will see and control the same session\x1b[0m")
 }
 
+// handleConfigureAICommand configures the AI endpoint
+func (m *Model) handleConfigureAICommand(args []string) {
+	if len(args) < 2 {
+		m.output = append(m.output, "\x1b[91mUsage: /configure-ai <type> <url> [api-key]\x1b[0m")
+		m.output = append(m.output, "\x1b[93mExamples:\x1b[0m")
+		m.output = append(m.output, "  /configure-ai openai https://api.openai.com/v1/chat/completions sk-...")
+		m.output = append(m.output, "  /configure-ai ollama http://localhost:11434/api/generate")
+		return
+	}
+
+	aiType := strings.ToLower(args[0])
+	url := args[1]
+	var apiKey string
+	if len(args) > 2 {
+		apiKey = args[2]
+	}
+
+	// Load config
+	cfg, err := config.LoadConfig()
+	if err != nil {
+		m.output = append(m.output, fmt.Sprintf("\x1b[91mError loading config: %v\x1b[0m", err))
+		return
+	}
+
+	// Set AI config
+	if err := cfg.SetAIConfig(aiType, url); err != nil {
+		m.output = append(m.output, fmt.Sprintf("\x1b[91mError saving AI config: %v\x1b[0m", err))
+		return
+	}
+
+	// Store API key if provided
+	if apiKey != "" {
+		// Store in AI config (it will be handled by password store in the future)
+		cfg.AI.APIKey = apiKey
+	}
+
+	m.output = append(m.output, fmt.Sprintf("\x1b[92mAI configured: %s at %s\x1b[0m", aiType, url))
+}
+
+// handleAIPromptCommand configures the AI prompt template
+func (m *Model) handleAIPromptCommand(command string) {
+	// Remove "/ai-prompt " prefix
+	command = strings.TrimPrefix(command, "ai-prompt ")
+	command = strings.TrimSpace(command)
+
+	// Check for preset
+	if strings.HasPrefix(command, "preset ") {
+		preset := strings.TrimPrefix(command, "preset ")
+		preset = strings.TrimSpace(preset)
+
+		switch strings.ToLower(preset) {
+		case "barsoom":
+			command = "PLACEHOLDER"
+		default:
+			m.output = append(m.output, fmt.Sprintf("\x1b[91mError: Unknown preset '%s'\x1b[0m", preset))
+			m.output = append(m.output, "\x1b[93mAvailable presets: barsoom\x1b[0m")
+			return
+		}
+	} else {
+		// Parse quoted string for custom prompt
+		if !strings.HasPrefix(command, "\"") {
+			m.output = append(m.output, "\x1b[91mUsage: /ai-prompt \"<prompt>\" or /ai-prompt preset <name>\x1b[0m")
+			m.output = append(m.output, "\x1b[93mExample: /ai-prompt \"You are a helpful MUD assistant\"\x1b[0m")
+			m.output = append(m.output, "\x1b[93mExample: /ai-prompt preset barsoom\x1b[0m")
+			return
+		}
+
+		// Extract quoted string
+		endQuote := strings.Index(command[1:], "\"")
+		if endQuote == -1 {
+			m.output = append(m.output, "\x1b[91mError: Missing closing quote\x1b[0m")
+			return
+		}
+		command = command[1 : endQuote+1]
+	}
+
+	// Load config
+	cfg, err := config.LoadConfig()
+	if err != nil {
+		m.output = append(m.output, fmt.Sprintf("\x1b[91mError loading config: %v\x1b[0m", err))
+		return
+	}
+
+	// Set AI prompt
+	if err := cfg.SetAIPrompt(command); err != nil {
+		m.output = append(m.output, fmt.Sprintf("\x1b[91mError saving AI prompt: %v\x1b[0m", err))
+		return
+	}
+
+	m.output = append(m.output, fmt.Sprintf("\x1b[92mAI prompt set: %s\x1b[0m", command))
+}
+
+// handleAICommand sends a prompt to the AI and executes the suggested command
+func (m *Model) handleAICommand(command string) {
+	// Remove "/ai " prefix
+	command = strings.TrimPrefix(command, "ai ")
+	command = strings.TrimSpace(command)
+
+	if command == "" {
+		m.output = append(m.output, "\x1b[91mUsage: /ai <prompt>\x1b[0m")
+		m.output = append(m.output, "\x1b[93mExample: /ai <lastcommand>\x1b[0m")
+		return
+	}
+
+	// Substitute <lastcommand>
+	prompt := strings.ReplaceAll(command, "<lastcommand>", m.lastCommand)
+
+	// Load config to get AI settings
+	cfg, err := config.LoadConfig()
+	if err != nil {
+		m.output = append(m.output, fmt.Sprintf("\x1b[91mError loading config: %v\x1b[0m", err))
+		return
+	}
+
+	if cfg.AI.Type == "" || cfg.AI.URL == "" {
+		m.output = append(m.output, "\x1b[91mError: AI not configured. Use /configure-ai first.\x1b[0m")
+		m.output = append(m.output, "\x1b[93mExample: /configure-ai ollama http://localhost:11434/api/generate\x1b[0m")
+		return
+	}
+
+	// Build full prompt with template if set
+	fullPrompt := prompt
+	if cfg.AIPrompt != "" {
+		fullPrompt = strings.ReplaceAll(cfg.AIPrompt, "{command}", prompt)
+	}
+
+	// Check if we're in web mode
+	if m.webSessionID != "" {
+		// TODO: Send request to web client to make the AI call
+		m.output = append(m.output, "\x1b[93m[AI request in web mode - not yet implemented]\x1b[0m")
+		return
+	}
+
+	// CLI mode - make the AI request directly
+	m.output = append(m.output, "\x1b[90m[AI: Generating response...]\x1b[0m")
+
+	aiClient := ai.NewClient(cfg.AI.Type, cfg.AI.URL, cfg.AI.APIKey)
+	response, err := aiClient.GenerateResponse(fullPrompt)
+	if err != nil {
+		m.output = append(m.output, fmt.Sprintf("\x1b[91mAI Error: %v\x1b[0m", err))
+		return
+	}
+
+	// Display AI response
+	m.output = append(m.output, fmt.Sprintf("\x1b[96m[AI suggests: %s]\x1b[0m", response))
+
+	// Send the suggested command to the MUD
+	if m.conn != nil {
+		m.conn.Send(response)
+		m.lastCommand = response
+	}
+}
+
+// handleHowtoCommand asks the AI how to do something but doesn't execute
+func (m *Model) handleHowtoCommand(command string) {
+	// Remove "/howto " prefix
+	command = strings.TrimPrefix(command, "howto ")
+	command = strings.TrimSpace(command)
+
+	if command == "" {
+		m.output = append(m.output, "\x1b[91mUsage: /howto <question>\x1b[0m")
+		m.output = append(m.output, "\x1b[93mExample: /howto heal myself\x1b[0m")
+		return
+	}
+
+	// Load config to get AI settings
+	cfg, err := config.LoadConfig()
+	if err != nil {
+		m.output = append(m.output, fmt.Sprintf("\x1b[91mError loading config: %v\x1b[0m", err))
+		return
+	}
+
+	if cfg.AI.Type == "" || cfg.AI.URL == "" {
+		m.output = append(m.output, "\x1b[91mError: AI not configured. Use /configure-ai first.\x1b[0m")
+		m.output = append(m.output, "\x1b[93mExample: /configure-ai ollama http://localhost:11434/api/generate\x1b[0m")
+		return
+	}
+
+	// Build prompt asking for instructions
+	fullPrompt := "How do I " + command + " in a MUD game?"
+	if cfg.AIPrompt != "" {
+		fullPrompt = strings.ReplaceAll(cfg.AIPrompt, "{command}", "How to: "+command)
+	}
+
+	// Check if we're in web mode
+	if m.webSessionID != "" {
+		// TODO: Send request to web client to make the AI call
+		m.output = append(m.output, "\x1b[93m[AI request in web mode - not yet implemented]\x1b[0m")
+		return
+	}
+
+	// CLI mode - make the AI request directly
+	m.output = append(m.output, "\x1b[90m[AI: Generating response...]\x1b[0m")
+
+	aiClient := ai.NewClient(cfg.AI.Type, cfg.AI.URL, cfg.AI.APIKey)
+	response, err := aiClient.GenerateResponse(fullPrompt)
+	if err != nil {
+		m.output = append(m.output, fmt.Sprintf("\x1b[91mAI Error: %v\x1b[0m", err))
+		return
+	}
+
+	// Display AI response (like a trigger match)
+	m.output = append(m.output, fmt.Sprintf("\x1b[90m[AI: %s]\x1b[0m", response))
+}
+
 // handleHelpCommand shows available client commands or detailed help for a specific command
 func (m *Model) handleHelpCommand(args []string) {
 	// If a specific command is requested, show detailed help
@@ -2268,6 +2494,10 @@ func (m *Model) handleHelpCommand(args []string) {
 	m.output = append(m.output, "  \x1b[96m/alias \"name\" \"tmpl\"\x1b[0m - Add an alias (template can use <args>)")
 	m.output = append(m.output, "  \x1b[96m/aliases list\x1b[0m - List all aliases")
 	m.output = append(m.output, "  \x1b[96m/aliases remove <number>\x1b[0m - Remove alias by number")
+	m.output = append(m.output, "  \x1b[96m/configure-ai <type> <url>\x1b[0m - Configure AI endpoint")
+	m.output = append(m.output, "  \x1b[96m/ai-prompt \"prompt\"\x1b[0m - Set AI prompt template")
+	m.output = append(m.output, "  \x1b[96m/ai <prompt>\x1b[0m - Get AI suggestion and execute it")
+	m.output = append(m.output, "  \x1b[96m/howto <question>\x1b[0m - Ask AI how to do something (info only)")
 	m.output = append(m.output, "  \x1b[96m/share\x1b[0m - Get shareable URL (web mode only)")
 	m.output = append(m.output, "  \x1b[96m/help [command]\x1b[0m - Show this help or detailed help for a command")
 	m.output = append(m.output, "")
@@ -2506,6 +2736,77 @@ func (m *Model) showDetailedHelp(cmd string) {
 		m.output = append(m.output, "\x1b[90mNote: Only available in web mode\x1b[0m")
 		m.output = append(m.output, "\x1b[90mStart web mode with: dikuclient --web\x1b[0m")
 
+	case "configure-ai":
+		m.output = append(m.output, "\x1b[92m=== /configure-ai - Configure AI Endpoint ===\x1b[0m")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mUsage:\x1b[0m")
+		m.output = append(m.output, "  /configure-ai <type> <url> [api-key]")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mDescription:\x1b[0m")
+		m.output = append(m.output, "  Configures the AI endpoint for command suggestions.")
+		m.output = append(m.output, "  Supported types: openai, ollama")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mExamples:\x1b[0m")
+		m.output = append(m.output, "  /configure-ai openai https://api.openai.com/v1/chat/completions sk-...")
+		m.output = append(m.output, "  /configure-ai ollama http://localhost:11434/api/generate")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[90mSee also: /help ai-prompt, /help ai\x1b[0m")
+
+	case "ai-prompt":
+		m.output = append(m.output, "\x1b[92m=== /ai-prompt - Configure AI Prompt ===\x1b[0m")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mUsage:\x1b[0m")
+		m.output = append(m.output, "  /ai-prompt \"<prompt>\"")
+		m.output = append(m.output, "  /ai-prompt preset <name>")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mDescription:\x1b[0m")
+		m.output = append(m.output, "  Sets the prompt template for AI requests.")
+		m.output = append(m.output, "  Use {command} as a placeholder for the user's input.")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mExamples:\x1b[0m")
+		m.output = append(m.output, "  /ai-prompt \"You are a helpful MUD assistant. The command was: {command}\"")
+		m.output = append(m.output, "  /ai-prompt preset barsoom")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[90mAvailable presets: barsoom\x1b[0m")
+
+	case "ai":
+		m.output = append(m.output, "\x1b[92m=== /ai - Get AI Command Suggestion ===\x1b[0m")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mUsage:\x1b[0m")
+		m.output = append(m.output, "  /ai <prompt>")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mDescription:\x1b[0m")
+		m.output = append(m.output, "  Sends a prompt to the configured AI and executes the suggested command.")
+		m.output = append(m.output, "  Use <lastcommand> to reference the last command you sent.")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mExamples:\x1b[0m")
+		m.output = append(m.output, "  /ai <lastcommand>  - Get suggestion for last command")
+		m.output = append(m.output, "  /ai fix my health  - Ask AI to suggest healing command")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mCommon Pattern:\x1b[0m")
+		m.output = append(m.output, "  /trigger \"Huh?!\" \"/ai <lastcommand>\"")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[90mRequires: /configure-ai to be set first\x1b[0m")
+		m.output = append(m.output, "\x1b[90mSee also: /help howto, /help trigger\x1b[0m")
+
+	case "howto":
+		m.output = append(m.output, "\x1b[92m=== /howto - Ask AI How To Do Something ===\x1b[0m")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mUsage:\x1b[0m")
+		m.output = append(m.output, "  /howto <question>")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mDescription:\x1b[0m")
+		m.output = append(m.output, "  Asks the AI how to do something and displays the answer.")
+		m.output = append(m.output, "  Unlike /ai, this does not execute any commands.")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[96mExamples:\x1b[0m")
+		m.output = append(m.output, "  /howto heal myself")
+		m.output = append(m.output, "  /howto find the marketplace")
+		m.output = append(m.output, "  /howto cast spells")
+		m.output = append(m.output, "")
+		m.output = append(m.output, "\x1b[90mRequires: /configure-ai to be set first\x1b[0m")
+		m.output = append(m.output, "\x1b[90mSee also: /help ai\x1b[0m")
+
 	case "help":
 		m.output = append(m.output, "\x1b[92m=== /help - Show Help Information ===\x1b[0m")
 		m.output = append(m.output, "")
@@ -2526,7 +2827,8 @@ func (m *Model) showDetailedHelp(cmd string) {
 		m.output = append(m.output, "")
 		m.output = append(m.output, "Available commands for detailed help:")
 		m.output = append(m.output, "  point, wayfind, go, stop, map, rooms, nearby, legend,")
-		m.output = append(m.output, "  trigger, triggers, ticktrigger, ticktriggers, alias, aliases, share, help")
+		m.output = append(m.output, "  trigger, triggers, ticktrigger, ticktriggers, alias, aliases,")
+		m.output = append(m.output, "  configure-ai, ai-prompt, ai, howto, share, help")
 		m.output = append(m.output, "")
 		m.output = append(m.output, "Use /help to see all commands")
 	}

From b5dc0e5a4c215b5dbabf3c33069078aa791fb868 Mon Sep 17 00:00:00 2001
From: "copilot-swe-agent[bot]" <198982749+Copilot@users.noreply.github.com>
Date: Sun, 26 Oct 2025 04:57:55 +0000
Subject: [PATCH 04/11] Add tests, documentation, and integration script for AI features

Co-authored-by: anicolao <1145048+anicolao@users.noreply.github.com>
---
 README.md                              | 37 +++++++++++++++-
 internal/ai/client_test.go             | 59 ++++++++++++++++++++++++++
 internal/triggers/last_command_test.go | 50 ++++++++++++++++++++++
 test_ai_integration.sh                 | 52 +++++++++++++++++++++++
 4 files changed, 197 insertions(+), 1 deletion(-)
 create mode 100644 internal/ai/client_test.go
 create mode 100644 internal/triggers/last_command_test.go
 create mode 100755 test_ai_integration.sh

diff --git a/README.md b/README.md
index 0c3812a..f11ce6b 100644
--- a/README.md
+++ b/README.md
@@ -146,11 +146,17 @@ The client automatically builds a map as you explore:
 - `/ticktrigger