
[Feature Request]: Integrate with Vercel AI SDK #70

Open
@toan5ks1

Description

Problem Description

Would you consider integrating mlc-ai/web-llm with Vercel AI SDK to provide:

Easier API abstraction – Developers could interact with web-llm through Vercel’s familiar AI tooling (see the sketch after this list).
Automatic streaming & caching – The SDK’s built-in response streaming and caching would improve perceived performance while keeping in-browser inference efficient.
Better compatibility with Next.js – Many AI developers already use Next.js and Vercel, so native support would lower the barrier to entry.
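
For context, web-llm already exposes an OpenAI-compatible `chat.completions` API in the browser, so an AI SDK provider would largely be a thin wrapper around code like the sketch below. This is only an illustration based on the web-llm README; the helper name `streamWebLLMChat` and the chosen model id are my own examples, not part of either library.

```ts
import { CreateMLCEngine } from "@mlc-ai/web-llm";

// Illustrative helper (not an existing API): streams tokens from an
// in-browser model using web-llm's OpenAI-style interface.
async function streamWebLLMChat(prompt: string, onToken: (t: string) => void) {
  // Downloads weights and compiles WebGPU kernels on first use.
  const engine = await CreateMLCEngine("Llama-3.1-8B-Instruct-q4f32_1-MLC");

  // OpenAI-compatible chat completion with streaming enabled.
  const chunks = await engine.chat.completions.create({
    messages: [{ role: "user", content: prompt }],
    stream: true,
  });

  for await (const chunk of chunks) {
    onToken(chunk.choices[0]?.delta?.content ?? "");
  }
}
```

A native integration could hide this behind the AI SDK’s custom language-model/provider interface, so Next.js apps could drive it from hooks like `useChat` without touching web-llm directly.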

Why This Could Help the Community

It could streamline browser-based AI deployments and encourage more developers to adopt WebGPU-powered LLMs.
Vercel AI SDK is already popular among AI developers, so native support for mlc-ai/web-llm would make integration much smoother.
A collaboration between MLC AI & Vercel AI SDK could push WebGPU adoption forward in the AI space.

Would love to hear your thoughts! Thanks again for all the amazing work you do. 😊

Solution Description

I’m sharing my project here as an example of what’s possible:
🔗 Deepseek Local on Web GPU - Vercel AI SDK
🔗 Demo

Alternatives Considered

No response

Additional Context

No response
