Merged
Binary file added playbooks/dependencies/assets/ChatwithLogs.png
Binary file added playbooks/dependencies/assets/multi_modality.png
67 changes: 47 additions & 20 deletions playbooks/dependencies/lemonade.md

#### Installing Lemonade

<!-- @os:windows -->
Download the latest installer from [lemonade-server.ai](https://github.com/lemonade-sdk/lemonade/releases/latest/download/lemonade.msi) and run the `.msi` file. The installer adds `lemonade-server` to your system PATH automatically.

You can also install silently from the command line:

```cmd
msiexec /i lemonade-server-minimal.msi /qn
```
<!-- @os:end -->

<!-- @os:linux -->
**Ubuntu (snap):**
```bash
sudo snap install lemonade-server
```

**Arch Linux (AUR):**
```bash
yay -S lemonade-server
```

For other distributions or to install from source, see the [full installation options](https://lemonade-server.ai/install_options.html).
<!-- @os:end -->

<!-- @os:linux -->
#### Verifying Lemonade Installation

Open a terminal and run:

```
lemonade-server --version
```

You should see output like:

```
lemonade-server x.y.z
```

If you see a version number, Lemonade is installed and ready to go.
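In automation (for example a setup script), the same check can be scripted so the script reports clearly whether the CLI is on PATH. This is a small sketch, not part of Lemonade itself:

```shell
#!/bin/sh
# Report whether the lemonade-server CLI is available, without aborting the script.
if command -v lemonade-server >/dev/null 2>&1; then
  lemonade-server --version
else
  echo "lemonade-server not found on PATH" >&2
fi
```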


#### Starting Lemonade

To start the server, open a terminal and run:
```bash
lemonade-server serve
```

The server starts on `http://localhost:8000` with an OpenAI-compatible API at `/api/v1`.
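Any OpenAI-compatible client can talk to this endpoint. As a quick smoke test while the server is running, you can hit it with `curl` (the paths below follow the `/api/v1` prefix mentioned above; the model name is just the example used in this guide):

```shell
# List the models the server currently exposes.
curl -s http://localhost:8000/api/v1/models

# Send a minimal chat completion request; use a model you have already pulled.
curl -s http://localhost:8000/api/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{"model": "Gemma-3-4b-it-GGUF", "messages": [{"role": "user", "content": "Hello!"}]}'
```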

To run a specific model immediately, use the `run` command:

```bash
lemonade-server run Gemma-3-4b-it-GGUF
```

> **Tip**: Use `lemonade-server list` to see available models, or `lemonade-server pull <MODEL_NAME>` to download new ones.
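Put together, a typical first run looks like this (model name as in the example above; `list` shows the valid names, and these commands assume `lemonade-server` is installed):

```shell
lemonade-server list                       # browse available model names
lemonade-server pull Gemma-3-4b-it-GGUF    # download the model once
lemonade-server run Gemma-3-4b-it-GGUF     # start the server with that model
```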

<!-- @os:end -->
To start the Lemonade server with the ROCm backend:

```bash
lemonade-server serve --llamacpp rocm
```

For the latest installation options or troubleshooting, please refer to the official Lemonade documentation.