Declarative agent prototype: Add OpenResponses streaming format support to AI SDK provider#335

Open
smurching wants to merge 3 commits into databricks:main from smurching:feature/add-openresponses-support

Conversation

@smurching
Collaborator

Summary

Add OpenResponsesLanguageModel to the AI SDK provider to support agent apps that return OpenResponses SSE format.

This moves OpenResponses support from individual consumer apps (like e2e-chatbot-app-next) into the shared library, making it available to all TypeScript/JavaScript consumers.

Related Issue: Improves architecture for OpenResponses streaming support
Specification: https://github.com/databricks/openresponses

Changes

New Module: open-responses-language-model/

  • open-responses-schema.ts - Zod schemas for event validation (text delta, item done, error)
  • open-responses-api-types.ts - TypeScript types for request/response
  • open-responses-convert-to-input.ts - AI SDK → OpenResponses format conversion
  • open-responses-convert-to-message-parts.ts - OpenResponses → AI SDK stream parts
  • open-responses-stream-transformer.ts - SSE stream parser and transformer
  • open-responses-language-model.ts - Main LanguageModelV3 implementation
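The PR doesn't inline the schemas, but the three event kinds validated by open-responses-schema.ts (text delta, item done, error) can be sketched as a discriminated union with a parsing guard. This is a minimal sketch: the real module uses Zod, and the event `type` strings and field names (`delta`, `item`, `message`) are assumptions here, not the actual wire format.

```typescript
// Hypothetical event shapes; the actual schemas in
// open-responses-schema.ts use Zod and may name fields differently.
type TextDeltaEvent = { type: "response.output_text.delta"; delta: string };
type ItemDoneEvent = { type: "response.output_item.done"; item: { id: string } };
type ErrorEvent = { type: "error"; message: string };
type OpenResponsesEvent = TextDeltaEvent | ItemDoneEvent | ErrorEvent;

// Parse one SSE payload; returns null for invalid JSON or unknown event types.
function parseEvent(raw: string): OpenResponsesEvent | null {
  try {
    const data = JSON.parse(raw);
    if (
      data?.type === "response.output_text.delta" ||
      data?.type === "response.output_item.done" ||
      data?.type === "error"
    ) {
      return data as OpenResponsesEvent;
    }
    return null;
  } catch {
    return null;
  }
}
```

Returning `null` rather than throwing lets the stream transformer skip malformed or unrecognized events without aborting the whole stream.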

Updated Files

  • databricks-provider.ts - Added openResponses() method
  • index.ts - Export schemas, types, and utilities
  • package.json - Bumped version to 0.5.0
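The new openResponses() method on the provider factory presumably follows the same shape as the existing responsesAgent/fmapi entry points. A hedged sketch of that wiring (the settings interface, constructor signature, and class internals here are assumptions, not the actual databricks-provider.ts code):

```typescript
// Hypothetical provider settings; the real type lives in databricks-provider.ts.
interface ProviderSettings {
  baseURL: string;
  headers?: Record<string, string>;
}

// Stub standing in for the real LanguageModelV3 implementation in
// open-responses-language-model.ts.
class OpenResponsesLanguageModel {
  constructor(
    readonly modelId: string,
    readonly settings: ProviderSettings,
  ) {}
}

// Sketch of how openResponses() might hang off the provider factory.
function createDatabricksProvider(settings: ProviderSettings) {
  return {
    openResponses(endpointName: string) {
      return new OpenResponsesLanguageModel(endpointName, settings);
    },
  };
}
```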

Tests

  • open-responses.test.ts - 10 comprehensive tests for schema validation and conversion
  • All 248 tests passing

Architecture

Follows the same pattern as existing implementations (responses-agent-language-model, fmapi-language-model):

const provider = createDatabricksProvider({ baseURL, headers });
const model = provider.openResponses('agent-endpoint');
const result = await streamText({ model, messages });

Benefits

  1. Reusable - Available to all TypeScript consumers, not just one app
  2. Consistent - Follows established patterns in databricks-ai-bridge
  3. Tested - Comprehensive test coverage
  4. Maintainable - Centralized implementation reduces duplication

Testing

npm run build  # ✅ Build successful
npm test       # ✅ All 248 tests passing

Follow-up

After this PR merges, we can update consumer apps (like e2e-chatbot-app-next) to use @databricks/ai-sdk-provider@^0.5.0 and remove their embedded OpenResponses implementations.

Companion PR: databricks/app-templates@main...smurching:app-templates:feature/openresponses-support

🤖 Generated with Claude Code

smurching and others added 2 commits February 19, 2026 00:21
Add OpenResponsesLanguageModel to support agent apps that return
OpenResponses SSE format (https://github.com/databricks/openresponses).

This moves OpenResponses support from individual consumer apps into the
shared AI SDK provider library, making it available to all TypeScript/
JavaScript consumers of the Databricks AI Bridge.

Changes:
- Add open-responses-language-model module with schema, conversion, and streaming
- Export OpenResponses schemas and utilities for consumer use
- Follow existing pattern (parallel to responses-agent-language-model)
- Add comprehensive test coverage (10 tests, all passing)
- Bump version to 0.5.0

Architecture:
- Zod schemas for event validation
- SSE stream transformer for format conversion
- AI SDK v3 LanguageModel implementation
- Full streaming and non-streaming support
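The SSE transformer step above can be sketched as a small buffering parser: split incoming chunks on newlines, hold back the trailing partial line for the next chunk, and emit only `data:` payloads, dropping the `[DONE]` sentinel. This is a simplified assumption of what open-responses-stream-transformer.ts does, not its actual code:

```typescript
// Parse one chunk of an SSE stream. Returns the complete "data:" payloads
// found and the trailing partial line to prepend to the next chunk.
function parseSseChunk(chunk: string): { payloads: string[]; rest: string } {
  const lines = chunk.split("\n");
  const rest = lines.pop() ?? ""; // last element may be an incomplete line
  const payloads: string[] = [];
  for (const line of lines) {
    if (line.startsWith("data: ")) {
      const payload = line.slice("data: ".length).trim();
      if (payload !== "[DONE]") payloads.push(payload);
    }
  }
  return { payloads, rest };
}
```

In the real transformer this logic would sit inside a TransformStream, with the parsed payloads fed through the Zod schemas before being converted to AI SDK stream parts.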

Usage:
  const provider = createDatabricksProvider({...});
  const model = provider.openResponses('agent-endpoint');
  const result = await streamText({ model, messages });

Testing:
  - All 248 tests passing (including 10 new OpenResponses tests)
  - Schema validation tests
  - Stream conversion tests

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Add a root package.json that proxies to the ai-sdk-provider subdirectory,
enabling direct installation from GitHub branches for prototyping.

This allows consumers to test unreleased changes by installing directly
from the PR branch:

  npm install git+https://github.com/smurching/databricks-ai-bridge.git#feature/add-openresponses-support

The prepare script automatically builds the package after git install.
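A root package.json along the lines described might look roughly like this; the field values and prepare command are assumptions, not the actual file:

```json
{
  "name": "@databricks/ai-sdk-provider",
  "scripts": {
    "prepare": "cd ai-sdk-provider && npm install && npm run build"
  }
}
```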

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
@smurching changed the title from "feat: Add OpenResponses streaming format support" to "Declarative agent prototype: Add OpenResponses streaming format support to AI SDK provider" on Feb 19, 2026
Signed-off-by: Sid Murching <sid.murching@databricks.com>