230 changes: 230 additions & 0 deletions .cursor/rules/vercel-ai-sdk.mdc
---
description: Correct AI SDK patterns for this project — prevents hallucination of outdated APIs
globs: packages/giselle/src/generations/**/*.ts
alwaysApply: false
Comment on lines +1 to +4


Action required

1. V6 rule vs repo ai@5 🐞 Bug ✓ Correctness

The rule is labeled as “Vercel AI SDK v6 patterns” and applies to all `*.ts`/`*.tsx` files, but the repo catalog pins `ai` to 5.0.101. This can drive agents to generate APIs and patterns that do not exist in the installed SDK, leading to compilation and runtime failures when code is added based on this rule.
Agent Prompt
## Issue description
The new Cursor rule is presented as “Vercel AI SDK v6 patterns” but the repo pins `ai` to 5.0.101. Since the rule’s `globs` cover all TS/TSX files, agents may generate code using APIs/patterns that don’t exist for the repo’s installed SDK version, causing compile/runtime errors.

## Issue Context
Cursor rules influence generated code. If the rule is version-specific, it should either be scoped to the relevant code areas or clearly gated by the dependency version.

## Fix Focus Areas
- .cursor/rules/vercel-ai-sdk.mdc[1-12]
- pnpm-workspace.yaml[52-60]


---

# AI SDK — Correct Patterns for This Project

> **How this rule differs from `update-ai-sdk.mdc`:** That rule covers *upgrading SDK versions* (running `pnpm outdated`, updating the catalog). This rule covers *writing correct code* against the current pinned SDK version.

LLMs frequently hallucinate outdated AI SDK APIs. Follow these rules strictly.

## 1. `toolContext` → `experimental_context`

**NEVER** use `toolContext`. It was removed.

```typescript
// ✅ CORRECT
import { streamText, tool } from "ai";
import { z } from "zod";

const myTool = tool({
  description: "Do something with user context",
  inputSchema: z.object({
    query: z.string().describe("The search query"),
  }),
  execute: async (input, { experimental_context }) => {
    const ctx = experimental_context as { userId: string };
    // use ctx.userId
  },
});

// Pass context at the call site:
const result = streamText({
  model,
  tools: { myTool },
  experimental_context: { userId: "123" },
  // ...
});
```

```typescript
// ❌ WRONG — removed API, will fail silently
execute: async (args, { toolContext }) => { ... }
```

**Source:** https://sdk.vercel.ai/docs/ai-sdk-core/tools-and-tool-calling#context-experimental

---

## 2. `parameters` → `inputSchema`

The `tool()` function property for defining input is `inputSchema`, not `parameters`.

```typescript
// ✅ CORRECT
tool({
  description: "Get weather",
  inputSchema: z.object({
    location: z.string().describe("City name"),
  }),
  execute: async ({ location }) => { ... },
});
```

```typescript
// ❌ WRONG — old property name
tool({
  parameters: z.object({ ... }), // Does not exist
});
```

**Source:** https://sdk.vercel.ai/docs/ai-sdk-core/tools-and-tool-calling#tool-calling

---

## 3. `strict: true` is a Tool Option (not Zod)

`strict` is a **top-level property on the tool**, not a method on the Zod schema. Always add `.describe()` to every field.

```typescript
// ✅ CORRECT
tool({
  description: "Get weather",
  inputSchema: z.object({
    location: z.string().describe("City name, e.g. London"),
  }),
  strict: true, // <-- top-level tool option
  execute: async ({ location }) => { ... },
});
```

```typescript
// ❌ WRONG — .strict() is a Zod method, not the SDK's strict mode
inputSchema: z.object({ ... }).strict(),
```

**Source:** https://sdk.vercel.ai/docs/ai-sdk-core/tools-and-tool-calling#strict-mode

---

## 4. `toDataStreamResponse()` → `toUIMessageStream()`

This project uses `toUIMessageStream()` — not `toDataStreamResponse()` or `toUIMessageStreamResponse()`.

```typescript
// ✅ CORRECT — what this project uses
const uiMessageStream = streamTextResult.toUIMessageStream({
  onFinish: async ({ messages }) => {
    // handle completion
  },
});
```

```typescript
// ❌ WRONG — outdated helpers
result.toDataStreamResponse();
result.toUIMessageStreamResponse();
```
Comment on lines +115 to +119
⚠️ Potential issue | 🟡 Minor

toUIMessageStreamResponse() is mislabeled as "outdated."

toUIMessageStreamResponse was introduced in the v5 SDK and is still a valid, current helper — it simply wraps the raw stream into a Response object. Calling it "outdated" alongside the genuinely removed toDataStreamResponse() is factually wrong and may cause an agent to avoid it even in contexts where it's appropriate.

📝 Suggested wording fix
 // ❌ WRONG — outdated helpers
 result.toDataStreamResponse();
-result.toUIMessageStreamResponse();
+// ❌ AVOID in this project — wraps the stream into a Response directly;
+//    use toUIMessageStream() and compose the response via createUIMessageStreamResponse()
+// result.toUIMessageStreamResponse();
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.cursor/rules/vercel-ai-sdk.mdc around lines 115 - 119, Update the
documentation so it does not label toUIMessageStreamResponse as "outdated":
explicitly state that toUIMessageStreamResponse (introduced in the v5 SDK) is a
current helper that wraps a raw stream into a Response object, while
toDataStreamResponse is the removed/outdated helper; locate the example lines
showing result.toDataStreamResponse() and result.toUIMessageStreamResponse() and
adjust the wording to reflect this distinction.


---

## 5. `maxSteps` → `stopWhen`

Multi-step tool calls use `stopWhen`, not `maxSteps`. Import `stepCountIs` from `"ai"` for simple step limits, or use a custom function.

```typescript
// ✅ CORRECT — simple step limit
import { streamText, stepCountIs } from "ai";

const result = streamText({
  model,
  stopWhen: stepCountIs(5),
  tools: { ... },
});

// ✅ CORRECT — custom stop condition (used in this project)
stopWhen: ({ steps }) => {
  const lastStep = steps[steps.length - 1];
  return lastStep.finishReason !== "tool-calls";
},
```

```typescript
// ❌ WRONG — deprecated option
streamText({ maxSteps: 5, ... });
```

**Source:** https://sdk.vercel.ai/docs/ai-sdk-core/tools-and-tool-calling#multi-step-calls-using-stopwhen

---

## 6. Messages: `UIMessage` + `ModelMessage`

Client messages are `UIMessage` type. Model calls expect `ModelMessage`. Use `convertToModelMessages()` to convert between them.

```typescript
// ✅ CORRECT
import { streamText, type UIMessage, type ModelMessage } from "ai";
```

```typescript
// ❌ WRONG — using untyped message arrays
const { messages } = await req.json();
streamText({ messages, ... }); // Missing type safety
```
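To make the conversion step concrete, here is a rough sketch of what `convertToModelMessages()` does for plain-text parts. The local types and helper below are simplified illustrations, not the SDK's implementation — in real code, import `convertToModelMessages` from `"ai"` and pass its result to `streamText({ messages: ... })`; the real helper also handles tool calls, files, and other part types.

```typescript
// Simplified stand-ins for the SDK's UIMessage / ModelMessage shapes
// (illustrative only — use the real types from "ai" in project code).
type UIMessageSketch = {
  id: string;
  role: "user" | "assistant";
  parts: { type: "text"; text: string }[];
};

type ModelMessageSketch = {
  role: "user" | "assistant";
  content: string;
};

// Roughly what the conversion does for text parts: collapse each
// UI message's text parts into a single model-facing content string.
function toModelMessages(messages: UIMessageSketch[]): ModelMessageSketch[] {
  return messages.map((m) => ({
    role: m.role,
    content: m.parts.map((p) => p.text).join(""),
  }));
}

const converted = toModelMessages([
  { id: "1", role: "user", parts: [{ type: "text", text: "Hello" }] },
]);
// converted[0] → { role: "user", content: "Hello" }
```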
Comment on lines +153 to +166
⚠️ Potential issue | 🟠 Major

Rule 6's ✅ CORRECT example never demonstrates convertToModelMessages() usage.

The prose on line 155 says to use convertToModelMessages() to convert between UIMessage[] and ModelMessage[], but the code example shows only type imports — the actual call site is never demonstrated. An agent following this rule won't know where or how to invoke the conversion.

Compounding this, the canonical pattern (lines 211–213) passes messages directly to streamText without any conversion, directly contradicting the advice here.

📝 Proposed example additions
 // ✅ CORRECT
-import { streamText, type UIMessage, type ModelMessage } from "ai";
+import { streamText, convertToModelMessages, type UIMessage } from "ai";
+
+// In your route handler:
+const { messages }: { messages: UIMessage[] } = await req.json();
+
+const result = streamText({
+  model,
+  messages: convertToModelMessages(messages), // UIMessage[] → ModelMessage[]
+  tools: { ... },
+});

And in the canonical pattern (line 213):

-  messages,
+  messages: convertToModelMessages(messages),
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.cursor/rules/vercel-ai-sdk.mdc around lines 153 - 166, Update the
docs/examples to actually show converting UIMessage[] to ModelMessage[] using
convertToModelMessages() before passing messages into model APIs: replace or
augment the “✅ CORRECT” snippet (which currently just imports UIMessage and
ModelMessage) with a short example that calls convertToModelMessages(messages)
and then calls streamText({ messages: modelMessages, ... }); also fix the
canonical pattern that currently passes messages directly (lines around
streamText call) to instead use the converted modelMessages so the examples
consistently demonstrate convertToModelMessages(), referencing the types
UIMessage, ModelMessage, the helper convertToModelMessages, and the streamText
call site.


**Source:** https://sdk.vercel.ai/docs/getting-started/nextjs-app-router#create-a-route-handler

---

## 7. `generateObject` — Heads-Up for Future Migration

> **Note:** This project currently uses `generateObject` which is valid for our pinned SDK version. However, upstream AI SDK v6 has deprecated `generateObject` and `streamObject` in favor of `generateText` / `streamText` with `Output.object({ schema })`. Keep this in mind for the next SDK upgrade.

```typescript
// Current (v5) — valid for this project
import { generateObject } from "ai";
const { object } = await generateObject({ model, schema, prompt });

// Future (v6) — for when the project upgrades
import { generateText, Output } from "ai";
const { output } = await generateText({
  model,
  output: Output.object({ schema }),
  prompt,
});
```

**Source:** https://sdk.vercel.ai/docs/ai-sdk-core/generating-structured-data#generateobject-and-streamobject-legacy

---

## Reference: Canonical Streaming Pattern

A correct streaming setup following this project's conventions:

```typescript
import {
  streamText,
  stepCountIs,
  smoothStream,
  type UIMessage,
} from "ai";
import { createGateway } from "@ai-sdk/gateway";

const gateway = createGateway({
  headers: aiGatewayHeaders,
});

const streamTextResult = streamText({
  model: gateway("openai/gpt-4o"),
  messages,
  tools: toolSet,
  stopWhen: stepCountIs(5),
  experimental_transform: smoothStream({
    delayInMs: 1000,
    chunking: "line",
  }),
  onError: ({ error }) => {
    // handle error
  },
});

const uiMessageStream = streamTextResult.toUIMessageStream({
  onFinish: async ({ messages: generateMessages }) => {
    // handle completion
  },
});
```
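Whatever wraps this pattern still has to return the stream to the client as an HTTP response. A minimal sketch of that final step, using only Web-standard `ReadableStream` and `Response` (global in Node 18+ and edge runtimes) — the stand-in stream and header values below are illustrative, not taken from this project's code:

```typescript
// Stand-in for the UI message stream produced above — any ReadableStream works.
const encoder = new TextEncoder();
const stream = new ReadableStream({
  start(controller) {
    controller.enqueue(encoder.encode("hello "));
    controller.enqueue(encoder.encode("world"));
    controller.close();
  },
});

// Wrap the stream in a Response so the route handler can return it.
const response = new Response(stream, {
  headers: { "Content-Type": "text/plain; charset=utf-8" },
});
```

From a route handler, `return response;` completes the request; the SDK also exports `createUIMessageStreamResponse()`, which composes an equivalent `Response` from a UI message stream.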
Comment on lines +194 to +230

⚠️ Potential issue | 🟠 Major

Canonical "Route Handler" pattern is incomplete: no HTTP response is returned.

The PR describes this as a "canonical Route Handler reference template for copy-paste," but the snippet ends after creating uiMessageStream (line 225) with no return statement. A copy-pasted route handler would silently return nothing.

It should close with something like:

 const uiMessageStream = streamTextResult.toUIMessageStream({
   onFinish: async ({ messages: generateMessages }) => {
     // handle completion
   },
 });
+
+return new Response(uiMessageStream, {
+  headers: { "Content-Type": "text/plain; charset=utf-8" },
+});
+// — or, using the SDK helper —
+// import { createUIMessageStreamResponse } from "ai";
+// return createUIMessageStreamResponse({ stream: uiMessageStream });
🤖 Prompt for AI Agents
Verify each finding against the current code and only fix it if needed.

In @.cursor/rules/vercel-ai-sdk.mdc around lines 194 - 230, The example stops
after creating uiMessageStream (from streamTextResult.toUIMessageStream) and
never returns an HTTP response; update the canonical Route Handler snippet to
return the stream as the route's HTTP response (take uiMessageStream produced by
streamTextResult.toUIMessageStream and send it back from the handler with
appropriate status and streaming/content-type headers) so a copy-pasteable
handler actually returns the streaming response to the client.