# Iris

An intelligent meeting assistant that transcribes, analyzes, and extracts action items from meetings. Powered by Whisper, Llama 3.2, and a modern Next.js stack.
## Features

- Meeting recording and transcription
- AI-powered meeting analysis
- Automated task extraction
- Meeting insights and summaries
- Task management integration
- Dark/light mode support
## Tech Stack

- Next.js 14 (App Router)
- TypeScript
- Tailwind CSS
- shadcn/ui components
- Clerk authentication
- Local storage for data
- OpenAI Whisper for transcription
- Ollama (Llama 3.2) for local AI analysis
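Because meeting data lives in the browser's local storage rather than a database, persistence can be a thin typed wrapper. A minimal sketch — the `Meeting` shape and the `iris:meetings` key are illustrative assumptions, not the app's actual schema:

```typescript
// Minimal persistence layer over localStorage. The Meeting shape and
// the "iris:meetings" key are illustrative assumptions.
interface Meeting {
  id: string;
  title: string;
  transcript: string;
  createdAt: string; // ISO timestamp
}

// Structural type so the functions also accept an in-memory stand-in.
interface KVStore {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
}

const STORAGE_KEY = "iris:meetings";

function loadMeetings(store: KVStore): Meeting[] {
  const raw = store.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as Meeting[]) : [];
}

function saveMeeting(store: KVStore, meeting: Meeting): void {
  // Replace any existing record with the same id, otherwise append.
  const rest = loadMeetings(store).filter((m) => m.id !== meeting.id);
  store.setItem(STORAGE_KEY, JSON.stringify([...rest, meeting]));
}
```

In the browser, `window.localStorage` satisfies this interface directly; the structural type just keeps the functions testable outside the DOM.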
**Next.js**
React-based framework with support for the App Router, Server Components, and API routes, used here for fast, scalable frontend and backend logic.
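App Router route handlers receive standard Web `Request` objects and return Web `Response` objects, so the backend logic stays testable outside Next.js. A minimal sketch (the `app/api/meetings/route.ts` path and payload shape are assumptions for illustration):

```typescript
// Sketch of an App Router route handler, e.g. app/api/meetings/route.ts.
// The exported function name maps to the HTTP method (POST here).
export async function POST(req: Request): Promise<Response> {
  const body = (await req.json()) as { title?: string };
  if (!body.title) {
    // Reject malformed requests with a standard Web Response.
    return Response.json({ error: "title is required" }, { status: 400 });
  }
  // In the real app, this is where the meeting record would be created.
  return Response.json({ title: body.title, status: "created" }, { status: 201 });
}
```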
**Clerk Auth**
Handles user authentication with sessions, MFA, and identity-provider support, integrated for secure, production-ready auth.
**Whisper**
Used for both live and post-meeting transcription. Supports multi-language input, speaker separation, and solid accuracy.
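When Whisper is asked for a `verbose_json` response, it returns per-segment timestamps alongside the text. A small helper can turn those segments into a readable transcript; the `{ start, end, text }` fields follow the documented response shape, while the rendering format is our own choice:

```typescript
// A Whisper verbose_json response contains an array of segments,
// each with start/end times in seconds and the transcribed text.
interface Segment {
  start: number; // seconds from the beginning of the audio
  end: number;
  text: string;
}

function formatTimestamp(seconds: number): string {
  const m = Math.floor(seconds / 60);
  const s = Math.floor(seconds % 60);
  return `${String(m).padStart(2, "0")}:${String(s).padStart(2, "0")}`;
}

// Render segments as one timestamped line per utterance.
function renderTranscript(segments: Segment[]): string {
  return segments
    .map((seg) => `[${formatTimestamp(seg.start)}] ${seg.text.trim()}`)
    .join("\n");
}
```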
**Ollama + Llama 3.2**
Runs locally via Ollama, powering fast, offline analysis. Extracts meeting summaries, insights, and to-do tasks from the transcript.
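The task-extraction step boils down to prompting the local model and parsing its reply. A sketch of both halves — the prompt wording and the expected JSON shape are illustrative assumptions; in the app, the prompt would be sent to the local Ollama server, whose HTTP API exposes `POST /api/generate` on port 11434:

```typescript
// Illustrative task shape for extracted action items.
interface Task {
  description: string;
  owner?: string;
}

// Ask the model for machine-readable output so parsing stays simple.
function buildTaskPrompt(transcript: string): string {
  return [
    "Extract action items from this meeting transcript.",
    'Reply with JSON only: {"tasks": [{"description": "...", "owner": "..."}]}',
    "",
    transcript,
  ].join("\n");
}

// Local models sometimes wrap JSON in prose or code fences; grab the
// first-to-last brace span and parse it defensively.
function parseTasks(reply: string): Task[] {
  const match = reply.match(/\{[\s\S]*\}/);
  if (!match) return [];
  try {
    const parsed = JSON.parse(match[0]) as { tasks?: Task[] };
    return parsed.tasks ?? [];
  } catch {
    return [];
  }
}
```

Keeping prompt construction and response parsing as pure functions makes the extraction logic easy to test without a running model.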
## Setup

To run Llama 3.2 locally:

```bash
ollama run llama3.2
```

- Clone the repo:

```bash
git clone https://github.com/yourusername/iris.git
cd iris
```

- Install dependencies:

```bash
npm install
```

- Set up environment variables. Create a `.env.local` file and add:

```env
OPENAI_API_KEY=your_openai_api_key
NEXT_PUBLIC_CLERK_PUBLISHABLE_KEY=your_clerk_publishable_key
CLERK_SECRET_KEY=your_clerk_secret_key
```

- Start the dev server:

```bash
npm run dev
```
Pull requests are welcome. Feel free to fork, tweak, and build.