LiquidChat - a showcase of on-device AI capabilities using Apple Foundation Models, MLC LLM Engine and Vercel AI SDK.
Demo.
- On-device LLMs via Apple Foundation Models or MLC LLM Engine.
- AI-generated chat topics and group conversations.
- Light and Dark themes with flexible customization.
- CI/CD with GitHub Actions and EAS.
- Modularized architecture with Dependency Injection.
- For Apple Foundation Models, macOS 26 and Xcode 26 are required.
- The above restrictions do not apply to MLC models; however, testing for them is limited.
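The modularized architecture with dependency injection mentioned above could look roughly like the following sketch. The `AIProvider` interface and the `EchoProvider` and `ChatService` names are illustrative assumptions, not the app's actual API:

```typescript
// Hypothetical sketch of constructor-based dependency injection for AI providers.
// Interface and class names are illustrative; the real app's modules may differ.
interface AIProvider {
  readonly name: string;
  generate(prompt: string): Promise<string>;
}

// A trivial stand-in provider; in the app this slot would be filled by an
// Apple Foundation Models or MLC-backed implementation.
class EchoProvider implements AIProvider {
  readonly name = "echo";
  async generate(prompt: string): Promise<string> {
    return `echo: ${prompt}`;
  }
}

// The chat service receives its provider instead of constructing it, so
// on-device backends can be swapped per platform or replaced in tests.
class ChatService {
  constructor(private readonly provider: AIProvider) {}
  async reply(message: string): Promise<string> {
    return this.provider.generate(message);
  }
}

const service = new ChatService(new EchoProvider());
```

Because the backend is injected at construction time, switching between Apple and MLC providers does not require changes to the chat logic itself.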
Install Bun.

```shell
bun i
cp .env.example .env
```

(Optionally) update the `EXPO_PUBLIC_AI_FALLBACK_MODEL` value in `.env`.

Start the Metro bundler and follow the instructions in the terminal to run the app.
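For reference, the fallback model variable might be set like this; the value shown is only a placeholder, not a model recommended by the project:

```shell
# .env — EXPO_PUBLIC_AI_FALLBACK_MODEL names the model used when the
# preferred on-device backend is unavailable (value below is a placeholder).
EXPO_PUBLIC_AI_FALLBACK_MODEL=<your-model-id>
```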
```shell
bun run start
```

Not testable: the library has runtime errors when attempting to load the model after download.
MLC models require a physical device, making them not testable on the Simulator.
Mac / Physical device:
- Build the Xcode project:

  ```shell
  npx expo prebuild
  open ios/liquidchat.xcworkspace
  ```

- Select "My Mac (Designed for iPhone)" as the run destination
- Adjust Signing & Capabilities with your Personal Team
Simulator:
- Remove MLC deps:
  ```shell
  bun rm @react-native-ai/mlc
  npx expo prebuild --clean
  ```

  Then adjust `ai/index` to always return `AppleAIProvider`.
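The `ai/index` tweak described above might look something like this; the function name and the shape of `AppleAIProvider` are assumptions based on the text, not the repository's actual code:

```typescript
// ai/index.ts — hypothetical Simulator-only variant that skips MLC entirely.
// Provider names mirror those mentioned in this README; actual code may differ.
interface AIProvider {
  readonly name: string;
}

class AppleAIProvider implements AIProvider {
  readonly name = "apple-foundation-models";
}

// Instead of branching between the MLC and Apple providers, always return
// Apple, since MLC requires a physical device and was removed from the build.
export function getAIProvider(): AIProvider {
  return new AppleAIProvider();
}
```

With MLC removed from `package.json`, any remaining branch that references the MLC provider would fail to resolve, which is why the selector is hard-wired here.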
