LiquidChat - a showcase of on-device AI capabilities using Apple Foundation Models, the MLC LLM Engine, and the Vercel AI SDK.
Demo.
- On-device LLMs via Apple Foundation Models or MLC LLM Engine.
- AI-generated chat topics and group conversations.
- Light and Dark themes with flexible customization.
- CI/CD with GitHub Actions and EAS.
- Modularized architecture with Dependency Injection.
- Apple Foundation Models require macOS 26 with Xcode 26.
- These restrictions don't apply to MLC models; however, testing for them is limited.
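The requirements above imply a runtime choice between providers. A minimal sketch of that decision logic, assuming illustrative names (`chooseProvider`, `Runtime`) that are not taken from the repo:

```typescript
// Hedged sketch of provider selection; names and thresholds are assumptions
// based on the stated requirements, not the project's actual ai/index code.
export type Runtime = {
  os: "ios" | "macos" | "android";
  isSimulator: boolean;
  osMajorVersion: number;
};

export type ProviderChoice = "apple-foundation-models" | "mlc" | "fallback";

export function chooseProvider(rt: Runtime): ProviderChoice {
  // Apple Foundation Models need OS version 26+ on Apple platforms.
  if ((rt.os === "ios" || rt.os === "macos") && rt.osMajorVersion >= 26) {
    return "apple-foundation-models";
  }
  // Pre-built MLC models: physical iOS devices only, not yet on Android.
  if (rt.os === "ios" && !rt.isSimulator) {
    return "mlc";
  }
  // Otherwise a remote fallback model (e.g. EXPO_PUBLIC_AI_FALLBACK_MODEL).
  return "fallback";
}
```

This is only one plausible ordering; the actual app may weigh the providers differently.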
Install Bun, then install dependencies and set up the environment:

```sh
bun i
cp .env.example .env
# (Optionally) update the EXPO_PUBLIC_AI_FALLBACK_MODEL
```
Start the Metro bundler and follow the instructions in the terminal to run the app:

```sh
bun run start
```
Pre-built MLC models aren't available for Android, so the platform can't be tested at the moment. Follow callstackincubator/ai for updates.
MLC models require a physical device, so they can't be tested on the Simulator. As a workaround, run the app on a Mac or a physical device, or build for the Simulator without MLC.
Mac / Physical device:
- Build the Xcode project:

```sh
npx expo prebuild
open ios/liquidchat.xcworkspace
```

- Adjust Signing & Capabilities with your Personal Team.
Simulator:
- Remove the MLC dependency and rebuild:

```sh
bun rm @react-native-ai/mlc
npx expo prebuild --clean
```

- Then adjust `ai/index` to always return `AppleAIProvider`.
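The `ai/index` adjustment might look like the sketch below. The module shape is assumed, not copied from the repo, and `AppleAIProvider` here is an illustrative stand-in for the real export:

```typescript
// Hedged sketch of the Simulator workaround for ai/index.
// With @react-native-ai/mlc removed, the factory no longer branches on
// device capabilities and unconditionally returns the Apple provider.
type AIProvider = { name: string };

// Illustrative stand-in for the project's actual AppleAIProvider export.
const AppleAIProvider: AIProvider = { name: "apple" };

export function getAIProvider(): AIProvider {
  // Previously this may have chosen MLC on physical devices; for Simulator
  // builds, always return the Apple provider.
  return AppleAIProvider;
}
```

Keeping the branch-free version behind a build flag would avoid re-editing when switching back to device builds.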