v1.18.0 - code-server (VS Code in the Browser)
Release Date: 2026-01-26
New Features
code-server - VS Code in the Browser
Full Visual Studio Code experience running in your browser with AI coding assistant support.
- AI Coding Extensions: Support for Claude Code, OpenCode, Continue, and other AI assistants
- Persistent Workspace: Files, settings, and extensions persist across sessions
- Shared Folder Integration: Direct access to AI LaunchKit's ./sharedfolder
- Full Terminal Access: Complete Linux terminal with sudo capabilities
- Extension Marketplace: Access to Open VSX Registry
Access: https://code.yourdomain.com
New Environment Variables:
- CODESERVER_HOSTNAME - Subdomain for code-server
- CODESERVER_PASSWORD - Login password (auto-generated)
- CODESERVER_SUDO_PASSWORD - Sudo password for the terminal (auto-generated)
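In the AI LaunchKit `.env` file these take the usual `KEY=value` form. A sketch with placeholder values only (the installer auto-generates the two passwords, so you normally don't set them by hand):

```bash
# Placeholder values for illustration; real passwords are auto-generated.
CODESERVER_HOSTNAME=code        # subdomain -> https://code.yourdomain.com
CODESERVER_PASSWORD=change-me
CODESERVER_SUDO_PASSWORD=change-me-too
```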
Installation
Select "code-server" in the installation wizard under AI-Powered Development tools.
Post-Installation: Enable AI Coding Tools
Node.js for Claude Code Extension
code-server requires Node.js 18+ for the Claude Code extension. This is automatically installed via init script on container start.
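To confirm the runtime is in place, a quick version check in the code-server terminal (the comparison logic below is a sketch; `node --version` output like `v20.11.0` is assumed):

```bash
# Confirm the Node.js 18+ requirement. 'node --version' prints e.g. v20.11.0;
# strip the leading 'v' and compare the major version number.
ver=$(node --version 2>/dev/null)
ver=${ver#v}
major=${ver%%.*}
if [ "${major:-0}" -ge 18 ] 2>/dev/null; then
  echo "Node.js $ver is new enough"
else
  echo "Node.js 18+ required (found: ${ver:-none})" >&2
fi
```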
Manual installation (if needed):
docker exec -it code-server bash -c "curl -fsSL https://deb.nodesource.com/setup_20.x | bash - && apt-get install -y nodejs"

Install OpenCode CLI
docker exec -it code-server npm i -g opencode-ai

Install Ollama CLI (for ollama launch)
docker exec -it code-server bash -c "apt-get update && apt-get install -y zstd && curl -fsSL https://ollama.com/install.sh | sh"

Configure OpenCode with Ollama Cloud
For cloud models (recommended for larger context windows):
- Get your API key from https://ollama.com/settings/keys
- Create the config file in the code-server terminal:
mkdir -p /config/.config/opencode
cat > /config/.config/opencode/opencode.json << 'EOF'
{
"$schema": "https://opencode.ai/config.json",
"model": "ollama/glm-4.7:cloud",
"provider": {
"ollama": {
"npm": "@ai-sdk/openai-compatible",
"name": "Ollama Cloud",
"options": {
"baseURL": "https://ollama.com/v1",
"apiKey": "YOUR_OLLAMA_API_KEY"
},
"models": {
"glm-4.7:cloud": {
"name": "GLM 4.7 Cloud"
}
}
}
}
}
EOF

- Start OpenCode:
opencode

Configure OpenCode with Local Ollama
For local models via AI LaunchKit's Ollama container:
mkdir -p /config/.config/opencode
cat > /config/.config/opencode/opencode.json << 'EOF'
{
"$schema": "https://opencode.ai/config.json",
"model": "ollama/qwen2.5:7b-instruct-q4_K_M",
"provider": {
"ollama": {
"npm": "@ai-sdk/openai-compatible",
"name": "Ollama Local",
"options": {
"baseURL": "http://ollama:11434/v1"
},
"models": {
"qwen2.5:7b-instruct-q4_K_M": {
"name": "Qwen 2.5 7B"
}
}
}
}
}
EOF

n8n-MCP for Workflow Generation
If you have n8n-MCP enabled in AI LaunchKit, you can install the MCP client in code-server:
docker exec -it code-server npm i -g n8n-mcp

Configuration for OpenCode/Claude Code:
# In code-server terminal:
cat >> ~/.bashrc << 'EOF'
export N8N_MCP_URL="http://n8nmcp:3000"
export N8N_MCP_TOKEN="your-token-from-env"
export N8N_API_URL="http://n8n:5678"
export N8N_API_KEY="your-n8n-api-key"
EOF
source ~/.bashrc

This enables AI assistants to generate n8n workflows directly!
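Claude Code can also register MCP servers through a project-level `.mcp.json`. The snippet below is a sketch only: it assumes the globally installed `n8n-mcp` binary is on PATH and that it honors `MCP_MODE` and the `N8N_API_*` variables shown above — check the n8n-MCP documentation for the exact keys your version expects:

```json
{
  "mcpServers": {
    "n8n-mcp": {
      "command": "n8n-mcp",
      "env": {
        "MCP_MODE": "stdio",
        "N8N_API_URL": "http://n8n:5678",
        "N8N_API_KEY": "your-n8n-api-key"
      }
    }
  }
}
```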
Integration with AI LaunchKit Services
| Service | Access from code-server |
|---|---|
| n8n | http://n8n:5678 |
| n8n-MCP | http://n8nmcp:3000 |
| Ollama (local) | http://ollama:11434 |
| Ollama Cloud | https://ollama.com/v1 |
| Gitea | http://gitea:3000 |
| Shared Folder | /config/workspace/shared |
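One quick way to confirm these endpoints are reachable from inside the container (assuming `curl` is available in the image, which is not guaranteed):

```bash
# Reachability check from the code-server terminal. The hostnames are the
# internal Docker-network names from the table above; any HTTP response
# (even an error page) counts as reachable.
for url in http://n8n:5678 http://n8nmcp:3000 http://ollama:11434 http://gitea:3000; do
  if curl -s -o /dev/null --max-time 3 "$url"; then
    echo "OK   $url"
  else
    echo "FAIL $url"
  fi
done
```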
Documentation
Full documentation added to README.md including:
- Initial setup guide
- AI extension configuration (Claude Code, OpenCode)
- Ollama Cloud and local model configuration
- n8n-MCP integration for workflow generation
- Troubleshooting guide
Full Changelog: v1.17.2...v1.18.0