Binary file added playbooks/dependencies/assets/ChatwithLogs.png
Binary file added playbooks/dependencies/assets/multi_modality.png
61 changes: 49 additions & 12 deletions playbooks/dependencies/lemonade.md

#### Installing Lemonade


<!-- @os:windows -->
Download the latest installer from [lemonade-server.ai](https://github.com/lemonade-sdk/lemonade/releases/latest/download/lemonade.msi) and run the `.msi` file. The installer adds `lemonade-server` to your system PATH automatically.

You can also install silently from the command line:

```cmd
msiexec /i lemonade-server-minimal.msi /qn
```
<!-- @os:end -->

<!-- @os:linux -->
**Ubuntu (snap):**
```bash
sudo snap install lemonade-server
```

**Arch Linux (AUR):**
```bash
yay -S lemonade-server
```

For other distributions or to install from source, see the [full installation options](https://lemonade-server.ai/install_options.html).
<!-- @os:end -->

<!-- @os:linux -->
#### Verifying Lemonade Installation

Open a terminal and run:

```bash
lemonade-server --version
```

You should see output like:

```
lemonade-server x.y.z
```

If you see a version number, Lemonade is installed and ready to go.
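For setup scripts, the same check can be automated. A minimal sketch (the fallback message is illustrative, not Lemonade output):

```bash
#!/usr/bin/env bash
# Print the installed version if the CLI is on PATH,
# otherwise print a notice instead of failing hard.
if command -v lemonade-server >/dev/null 2>&1; then
  lemonade-server --version
else
  echo "lemonade-server not found on PATH"
fi
```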


#### Starting Lemonade

To start the server, open a terminal and run:
```bash
lemonade-server serve
```

The server starts on `http://localhost:8000` with an OpenAI-compatible API at `/api/v1`.
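Once the server is up, you can exercise the API directly. A sketch that builds an OpenAI-style chat request, assuming the server follows the usual OpenAI endpoint layout under `/api/v1` (the model name and endpoint path are illustrative):

```bash
#!/usr/bin/env bash
# Build an OpenAI-style chat-completion request body.
# Model name and endpoint path below are assumptions for illustration.
PAYLOAD=$(cat <<'JSON'
{
  "model": "Gemma-3-4b-it-GGUF",
  "messages": [{"role": "user", "content": "Say hello in one sentence."}]
}
JSON
)
echo "$PAYLOAD"

# With the server running, send it (requires curl):
# curl -s http://localhost:8000/api/v1/chat/completions \
#   -H "Content-Type: application/json" \
#   -d "$PAYLOAD"
```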

To run a specific model immediately, use the `run` command:

```bash
lemonade-server run Gemma-3-4b-it-GGUF
```

> **Tip**: Use `lemonade-server list` to see available models, or `lemonade-server pull <MODEL_NAME>` to download new ones.
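The tip above can be folded into a small idempotent helper. A sketch, assuming `lemonade-server list` prints one model name per line (the model name is just an example):

```bash
#!/usr/bin/env bash
# Pull MODEL only if it isn't already available locally.
# Assumes `lemonade-server list` prints model names, one per line.
MODEL="Gemma-3-4b-it-GGUF"
if ! command -v lemonade-server >/dev/null 2>&1; then
  echo "lemonade-server not installed; skipping"
elif lemonade-server list | grep -qx "$MODEL"; then
  echo "$MODEL already available"
else
  lemonade-server pull "$MODEL"
fi
```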

<!-- @os:end -->
To start the Lemonade server with the ROCm backend (for supported AMD GPUs):

```bash
lemonade-server serve --llamacpp rocm
```

For the latest installation options or troubleshooting, please refer to the official Lemonade documentation.