
LiquidChat


Description

LiquidChat is a showcase of on-device AI capabilities using Apple Foundation Models, the MLC LLM Engine, and the Vercel AI SDK.

Demo.

Stack

  • expo
  • react-native
  • react-navigation
  • react-native-unistyles
  • react-native-ai
  • mobx-react
  • inversifyjs
  • jest
  • react-native-testing-library
  • typescript
  • eslint

Features

  • On-device LLMs via Apple Foundation Models or MLC LLM Engine (see the sketch after this list).
  • AI-generated chat topics and group conversations.
  • Light and Dark themes with flexible customization.
  • CI/CD with GitHub Actions and EAS.
  • Modularized architecture with Dependency Injection.
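For illustration, a chat turn with the on-device model could be generated through the Vercel AI SDK along these lines. This is a minimal sketch, not the project's actual code: the @react-native-ai/apple package name and its apple() factory are assumptions based on the react-native-ai ecosystem, not on this repository.

// Minimal sketch — package name and apple() factory are assumed, not taken from this repo.
import { apple } from '@react-native-ai/apple';
import { generateText } from 'ai';

export const generateReply = async (prompt: string): Promise<string> => {
  // Inference runs on-device via Apple Foundation Models; no network request is made.
  const { text } = await generateText({ model: apple(), prompt });
  return text;
};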

Requirements

  • Apple Foundation Models require macOS 26 and Xcode 26.
  • MLC models have no such restrictions; however, testing is limited.

Setup

Install Bun.

bun i
cp .env.example .env
# (Optionally) update the EXPO_PUBLIC_AI_FALLBACK_MODEL
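Because the variable carries Expo's EXPO_PUBLIC_ prefix, it is inlined into the JavaScript bundle at build time and can be read directly from process.env. A minimal sketch of how it might be consumed (the actual usage in this repo may differ):

// Illustrative only: how an EXPO_PUBLIC_ variable is typically read in an Expo app.
export const fallbackModelId: string | undefined =
  process.env.EXPO_PUBLIC_AI_FALLBACK_MODEL;

if (!fallbackModelId) {
  console.warn('EXPO_PUBLIC_AI_FALLBACK_MODEL is not set; the default provider will be used.');
}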

Running

Start the Metro bundler and follow the instructions in the terminal to run the app.

bun run start

Android

Pre-built MLC models aren't available for Android, so the platform isn't testable at the moment. Follow callstackincubator/ai for updates.

iOS

MLC models require a physical device, so they can't be tested on the Simulator. Run the app on a Mac or a physical device instead, or strip the MLC dependency for Simulator builds as described below.

Mac / Physical device:

  • Build the Xcode project: npx expo prebuild, then open ios/liquidchat.xcworkspace
  • Adjust Signing & Capabilities to use your Personal Team

Simulator:

  • Remove the MLC dependency: bun rm @react-native-ai/mlc, then run npx expo prebuild --clean, and adjust ai/index to always return AppleAIProvider (a sketch follows).
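A sketch of what that adjustment could look like; the file layout, import path, and createAIProvider name below are hypothetical and should be adapted to the actual ai/index module:

// ai/index.ts (hypothetical shape) — Simulator build: bypass MLC and always use Apple.
import { AppleAIProvider } from './apple'; // import path assumed

export const createAIProvider = () => new AppleAIProvider(); // factory name assumed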

Other apps

Schiffradar

Author

Artur Yersh
