
Commit 75702fa

mschristensen and claude committed
docs: reframe README intro around the problems Ably solves
Replace the feature list with a problem-first "Why" section that contrasts the default HTTP transport with what this SDK provides. Each bullet now leads with the limitation it addresses.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
1 parent b97f455

1 file changed

Lines changed: 12 additions & 15 deletions

File tree

README.md

```diff
@@ -8,21 +8,18 @@ A durable transport layer between AI agents and users. Streams AI responses over
 
 ---
 
-## What it does
-
-Without a durable session layer, AI streaming is fragile. Connections drop mid-response, refreshing the page loses the conversation, switching devices means starting over, and cancellation requires custom plumbing on every project.
-
-This SDK handles the transport between your AI backend and your frontend:
-
-- **Token streaming** — AI responses stream in real time over Ably channels
-- **Connection recovery** — Ably automatically reconnects after network blips and delivers any messages the client missed
-- **Resumable streams** — Clients that join or rejoin mid-response receive the in-progress stream immediately on subscribing to the channel
-- **Cancellation** — Client publishes a cancel signal, server aborts the generation
-- **Barge-in** — Users send new messages while the AI is still responding
-- **Branching conversations** — Regenerate or edit messages, creating forks in the conversation tree
-- **Multi-device sync** — Multiple clients on the same channel see the same conversation in real time
-- **History** — Conversations persist on the channel; new clients or returning sessions hydrate from history
-- **Turn management** — Concurrent turns, per-turn streams, turn lifecycle events
+## Why
+
+The default AI SDK transport streams tokens over an HTTP response body. This works for simple single-tab chat, but breaks down when you need:
+
+- **Resumable streaming** — If the user's connection drops mid-response, the HTTP stream is lost. With Ably, the client reconnects and picks up where it left off. No token loss, no restart.
+- **Multi-device and multi-tab sync** — Two browser tabs, a phone and a laptop, or multiple users on the same conversation. All devices subscribe to the same Ably channel and see the same stream in real time.
+- **Reliable cancellation** — Cancel signals travel over the Ably channel, not the HTTP connection. Cancellation works across devices and survives network interruptions.
+- **Concurrent turns** — Multiple request-response cycles can run in parallel on the same channel. Each turn has its own stream and abort signal. The transport multiplexes them automatically.
+- **Conversation history from the channel** — The Ably channel *is* the conversation history. Clients can hydrate their UI from channel history on load or reconnection — no separate database query needed.
+- **Serverless compatibility** — The AI response streams over Ably, not the HTTP response body. The HTTP request returns immediately, and Next.js `after()` keeps the function alive until streaming completes. No timeout risk.
+- **Branching conversations** — Regenerate or edit messages to create forks in the conversation tree. The SDK tracks parent/child relationships and exposes a navigable tree.
+- **Barge-in** — Users send new messages while the AI is still responding, with composable primitives for cancel-and-resend or queue-until-complete patterns.
 
 The SDK is codec-agnostic. A `Codec` translates between your AI framework's types and the Ably wire format. A Vercel AI SDK codec ships built-in.
```
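The "Branching conversations" bullet in the new section describes forks tracked via parent/child links that form a navigable tree. As a rough sketch of that data shape only — every name below is invented for illustration and is not this SDK's actual API — a minimal fork-tracking structure might look like:

```typescript
// Hypothetical sketch of a branching conversation tree (illustrative only,
// NOT this SDK's API): each message records its parent, and regenerating a
// response adds a second child under the same parent, i.e. a fork.
type MsgNode = {
  id: string;
  parentId: string | null;
  text: string;
  children: string[];
};

class ConversationTree {
  private nodes = new Map<string, MsgNode>();
  private nextId = 0;

  // Append a message under `parentId` (null for the root of the tree).
  add(parentId: string | null, text: string): string {
    const id = String(this.nextId++);
    this.nodes.set(id, { id, parentId, text, children: [] });
    if (parentId !== null) this.nodes.get(parentId)!.children.push(id);
    return id;
  }

  // Walk parent links from a leaf to reconstruct one linear branch,
  // root first — the path a chat UI would render for that fork.
  branch(leafId: string): string[] {
    const path: string[] = [];
    for (let cur: string | null = leafId; cur !== null; cur = this.nodes.get(cur)!.parentId) {
      path.unshift(this.nodes.get(cur)!.text);
    }
    return path;
  }

  // Siblings under one parent are the alternative continuations (forks).
  childrenOf(id: string): string[] {
    return this.nodes.get(id)!.children;
  }
}
```

For example, adding two answers under the same question leaves two children on the question node, and `branch()` on either leaf yields that fork's linear transcript.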
