# Skippy

Organize classes, generate smart study notes and flashcards, and chat with a friendly voice assistant. Built with React + Vite + Tailwind + shadcn/ui, powered by Azure OpenAI via a Vercel serverless proxy.

- Website: https://skippy-kohl.vercel.app
- Demo video: `Recording.2025-08-21.012134.mp4`

Skippy is a sleek, student-focused dashboard that helps you:
- Build a clean weekly timetable that's truly date-wise, day-wise, and time-aligned
- Upload files and generate structured notes and flashcards with AI
- Chat with a playful assistant that can speak out responses (with a permission-friendly UX)
- Deploy easily to Vercel with a secure serverless proxy for Azure OpenAI
The app runs great locally and in production on Vercel. In production, all AI calls go through a secure serverless function at `/api/azure-openai/chat`.

## Features
- Weekly Timetable (date-wise/day-wise/time-wise)
  - Generates specific instances for the current week from day-wise class storage
  - Accurate per-day and per-time-slot filtering
  - Buttons to clear only timetable items or delete all schedule items
- AI-Powered Notes & Flashcards
  - Uses Azure OpenAI for structured markdown with headings, key points, and summaries
  - Fallback formatting when the API is unavailable, to avoid single-paragraph dumps
- Skippy Assistant (Voice + Chat)
  - Greeting flow with browser-safe speech permissions
  - Password-gated access flow (no direct bypass in production)
  - Conversational guide that never reveals the password
- Zero-config Production API
  - Vercel serverless function at `api/azure-openai/chat.js`
  - Client auto-detects prod vs local and routes requests appropriately
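The proxy-first routing can be sketched roughly like this (a hypothetical helper; the actual logic lives in `src/services/azureOpenAI.ts` and may differ):

```typescript
// Hypothetical sketch of proxy-first endpoint resolution (not the actual
// implementation in src/services/azureOpenAI.ts).
export function resolveChatEndpoint(hostname: string): string {
  const isLocal = hostname === "localhost" || hostname === "127.0.0.1";
  // Local dev can talk to the Express proxy; production always uses the
  // Vercel serverless function at a relative path.
  return isLocal
    ? "http://localhost:5174/api/azure-openai/chat"
    : "/api/azure-openai/chat";
}
```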
## Tech Stack

- React 18, TypeScript, Vite 5
- Tailwind CSS, shadcn/ui, lucide-react
- Azure OpenAI (Chat Completions) via serverless proxy
- Vercel for deployment
## Getting Started

Requirements: Node 18+ and npm.
```sh
# Install dependencies
npm i

# Start Vite dev server
npm run dev

# Optional: start the local Azure OpenAI proxy (Express)
npm run server

# Or run both (Windows PowerShell)
npm run dev:all
```

Local endpoints used by the app:
- Frontend: http://localhost:5173
- Local proxy (optional): http://localhost:5174/api/azure-openai/chat
You can use the local proxy, or configure direct Azure env vars in `.env.local` so the browser calls Azure directly. In production, the proxy is always used.
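For reference, both proxies forward to the Azure OpenAI Chat Completions endpoint, whose URL is built from the base, deployment, and API version. A rough sketch (hypothetical helper, not the project's actual code):

```typescript
// Hypothetical helper: build the Azure OpenAI Chat Completions URL that a
// proxy forwards requests to, following Azure's documented
// /openai/deployments/{deployment}/chat/completions?api-version=... pattern.
export function azureChatUrl(base: string, deployment: string, apiVersion: string): string {
  const trimmed = base.replace(/\/+$/, ""); // drop any trailing slash
  return `${trimmed}/openai/deployments/${deployment}/chat/completions?api-version=${apiVersion}`;
}
```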
## Environment Variables

Create `.env.local` for local dev (do not commit secrets). The app recognizes both Vite and generic names; prefer Vite names on the client.
Required for production (set in Vercel Project → Settings → Environment Variables):
- VITE_OPENAI_API_BASE = https://YOUR-RESOURCE-NAME.openai.azure.com
- VITE_AZURE_OPENAI_KEY = your_azure_openai_key
- VITE_AZURE_OPENAI_DEPLOYMENT = your_model_deployment_name (e.g., gpt-4o)
- VITE_AZURE_OPENAI_API_VERSION = 2025-01-01-preview
Optional equivalents (used by server/local):
- OPENAI_API_BASE
- AZURE_OPENAI_API_KEY
- AZURE_OPENAI_DEPLOYMENT_NAME
- AZURE_OPENAI_API_VERSION
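A hypothetical `.env.local` for local development might look like this (placeholder values; never commit this file):

```env
# Vite-prefixed names (used by the client)
VITE_OPENAI_API_BASE=https://YOUR-RESOURCE-NAME.openai.azure.com
VITE_AZURE_OPENAI_KEY=your_azure_openai_key
VITE_AZURE_OPENAI_DEPLOYMENT=gpt-4o
VITE_AZURE_OPENAI_API_VERSION=2025-01-01-preview

# Generic names (used by the local proxy / serverless function)
OPENAI_API_BASE=https://YOUR-RESOURCE-NAME.openai.azure.com
AZURE_OPENAI_API_KEY=your_azure_openai_key
AZURE_OPENAI_DEPLOYMENT_NAME=gpt-4o
AZURE_OPENAI_API_VERSION=2025-01-01-preview
```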
Where they're used:

- Client service: `src/services/azureOpenAI.ts`
- Vercel function: `api/azure-openai/chat.js`
- Local proxy: `server/index.mjs`
More notes: see `vercel-env-fix.md` for the production env fix context.
## Scripts

Defined in `package.json`:
- `npm run dev` – start Vite dev server
- `npm run server` – start local Express proxy at http://localhost:5174
- `npm run dev:all` – launch proxy and Vite together (Windows-friendly)
- `npm run build` – production build
- `npm run preview` – preview built app locally
- `npm run deploy` – deploy to Vercel (CLI)
- `npm run deploy:prod` – deploy to Vercel production
## Deploying to Vercel

- Set the four required env vars in Vercel:
- VITE_OPENAI_API_BASE
- VITE_AZURE_OPENAI_KEY
- VITE_AZURE_OPENAI_DEPLOYMENT
- VITE_AZURE_OPENAI_API_VERSION
- Deploy
  - Via CLI: `npm run deploy:prod`
  - Or connect the GitHub repo to Vercel and trigger a production deployment
- Production API path
  - All AI calls go to `/api/azure-openai/chat`, implemented at `api/azure-openai/chat.js`
  - The function is configured in `vercel.json`
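As a rough illustration (hypothetical contents; the `maxDuration` value and rewrite pattern are assumptions, so check the repo's actual `vercel.json`), the function config plus a SPA fallback rewrite might look like:

```json
{
  "functions": {
    "api/azure-openai/chat.js": { "maxDuration": 30 }
  },
  "rewrites": [
    { "source": "/((?!api/).*)", "destination": "/index.html" }
  ]
}
```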
For deeper guidance, see VERCEL_DEPLOYMENT.md and DEPLOYMENT_READY.md.
## Skippy Assistant

File: `src/components/SkippyAssistant.tsx`
- The greeting appears and attempts to speak automatically; if blocked, a friendly modal prompts the user to enable voice
- The "Skip Instructions" button shows after ~5 s and leads to the password prompt (no direct dashboard button)
- The assistant accepts only the intended password variants (e.g., "onestring7", "one string seven")
- On success, `onPasswordUnlock("unlocked")` is called to navigate into the dashboard
More details are documented in PASSWORD_FIX.md.
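The variant matching can be sketched as a small normalizer (a hypothetical helper; the actual check lives in `SkippyAssistant.tsx`):

```typescript
// Hypothetical sketch of the password-variant check: normalize case and
// whitespace, then compare against the accepted typed/spoken variants.
const ACCEPTED = new Set(["onestring7", "one string seven"]);

export function isPasswordMatch(input: string): boolean {
  const normalized = input.trim().toLowerCase().replace(/\s+/g, " ");
  // Accept both the spaced spoken form and the collapsed typed form.
  return ACCEPTED.has(normalized) || ACCEPTED.has(normalized.replace(/ /g, ""));
}
```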
## Weekly Timetable

File: `src/components/WeeklyTimetableView.tsx`
- Instances are generated for the current Monday–Sunday week using `generateTimetableInstancesForWeek()`
- Each class is attached to an exact date string (YYYY-MM-DD) for the week, avoiding "placeholder dates"
- Day view and time-slot filters pull items by the calculated date and align to slots within ±30 minutes
- Clear actions:
  - Clear Timetable: removes only recurring class items (and clears day-wise storage)
  - Delete All: wipes everything, including assignments/notes
Timetable storage is managed via `TimetableStorage` and general schedule items via `ScheduleStorage` in `src/lib/storage.ts`.
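The date-wise instance generation above can be sketched like this (hypothetical shapes and helper names; the real types live in `src/lib/storage.ts`):

```typescript
// Hypothetical sketch: map day-wise stored classes onto exact YYYY-MM-DD
// dates within the current Monday–Sunday week.
interface StoredClass { day: number; title: string; time: string } // day: 0 = Monday … 6 = Sunday
interface ClassInstance extends StoredClass { date: string }

const pad = (n: number): string => String(n).padStart(2, "0");
const toDateString = (d: Date): string =>
  `${d.getFullYear()}-${pad(d.getMonth() + 1)}-${pad(d.getDate())}`;

export function generateInstancesForWeek(classes: StoredClass[], today: Date): ClassInstance[] {
  // JS getDay() treats 0 as Sunday, so shift to a Monday-based offset.
  const offset = (today.getDay() + 6) % 7;
  const monday = new Date(today.getFullYear(), today.getMonth(), today.getDate() - offset);
  return classes.map((c) => {
    const d = new Date(monday.getFullYear(), monday.getMonth(), monday.getDate() + c.day);
    return { ...c, date: toDateString(d) };
  });
}
```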
## Troubleshooting

- 500 error from `/api/azure-openai/chat` on Vercel
  - Cause: missing env vars
  - Fix: set the four `VITE_*` env vars in Vercel and redeploy (see `vercel-env-fix.md`)
- "Only a big paragraph is showing" in generated notes
  - Cause: the AI API failed and fallback formatting kicked in
  - Fix: ensure env vars are configured and the serverless function is healthy
- Browser won't speak the greeting
  - Cause: autoplay/speech blocked by the browser until user interaction
  - Fix: click "Enable Voice & Hear Skippy" in the modal, or press replay
- Local dev can't call Azure directly from the browser
  - Fix: use the local proxy (`npm run server`) or set the Vite env vars and allow the browser call; in production, the proxy is always used
## Screenshots

Place images under `public/preview/` and update these paths if needed.
## Project Structure

```text
src/
  components/
    WeeklyTimetableView.tsx   # Weekly calendar with date-wise instances
    SkippyAssistant.tsx       # Voice/chat assistant + password flow
    ...                       # Other dashboard features
  services/
    azureOpenAI.ts            # Proxy-first AI client with formatting fallbacks
server/
  index.mjs                   # Local Express proxy for Azure OpenAI
api/
  azure-openai/
    chat.js                   # Vercel serverless proxy for production
vercel.json                   # Function config + rewrites
```
## Security Notes

- Never commit secrets; use `.env.local` for local dev and the Vercel dashboard for production
- The client uses only `VITE_*` variables; the server and function support both `VITE_*` and generic names
## Contributing

- Create a feature branch
- Keep PRs focused and small
- Add brief notes in PR description (what/why/impact)
## License

Proprietary. All rights reserved (or update this section to your preferred license).
## Acknowledgments

- Built with React + Vite + Tailwind + shadcn/ui
- Azure OpenAI for AI features
- Vercel for painless deployment