An inclusive Android app with two core capabilities:
- ✋ ASL Mode – Speech → Text → ASL Video
- 👁️ Object Detection Mode – Camera → Detect → Speak
- 🎙️ Listens from mic → converts speech to text
- 🔤 Maps text to ASL signs
- ▶️ Fetches ASL videos from the web and plays them in-app
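The final text → ASL step can be sketched as a lookup from recognized words to sign-video URLs. This is a minimal illustration, assuming one video per word; the `SignRepository` name and URL pattern are placeholders, not the app's actual code:

```kotlin
// Hypothetical word-to-sign lookup; the URL pattern is a placeholder.
class SignRepository(private val baseUrl: String) {
    // Normalize a recognized word and build the URL of its sign video.
    fun videoUrlFor(word: String): String? {
        val key = word.trim().lowercase().filter { it.isLetter() }
        return if (key.isEmpty()) null else "$baseUrl/$key.mp4"
    }

    // Map a whole utterance to an ordered playlist of sign videos.
    fun playlistFor(utterance: String): List<String> =
        utterance.split(Regex("\\s+")).mapNotNull { videoUrlFor(it) }
}
```

In the app, each URL in the playlist would be handed to the in-app video player in order.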
- 🤟 Many Deaf and Hard of Hearing (HoH) individuals use ASL as their primary language, not written/spoken English.
- 🗣️ This feature bridges the gap by translating spoken words into sign language videos, enabling more natural understanding.
- 🧑‍🤝‍🧑 It promotes inclusion in everyday scenarios such as conversations, announcements, and meetings.
- 🌍 Designed with accessibility in mind: large buttons and high contrast.
- ⚡ Partial results (near word-by-word feel)
- 🎯 Candidate ranking (choose the best sign/video)
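Candidate ranking could be sketched as picking the best-scoring sign entry for a recognized word. The scoring rule below (exact match beats prefix match beats shared-prefix similarity) is an assumption for illustration, not the app's actual heuristic:

```kotlin
// Illustrative ranking of candidate sign entries for one recognized word.
data class SignCandidate(val label: String, val videoUrl: String)

fun rankCandidates(word: String, candidates: List<SignCandidate>): List<SignCandidate> {
    val w = word.lowercase()
    return candidates.sortedByDescending { c ->
        val l = c.label.lowercase()
        when {
            l == w -> 3.0                   // exact match wins
            l.startsWith(w) -> 2.0          // then prefix matches
            else ->                         // otherwise, shared-prefix ratio
                commonPrefixLen(l, w) / maxOf(l.length, w.length).toDouble()
        }
    }
}

private fun commonPrefixLen(a: String, b: String): Int =
    a.zip(b).takeWhile { (x, y) -> x == y }.count()
```

The top-ranked candidate's video is the one played; the rest could be offered as alternatives.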
🎬 Demo: `asl.mp4`
- 📷 Uses camera to detect objects in real time
- 🗣️ Speaks object names via Text-to-Speech (TTS)
- 🏷️ Optionally shows detected labels on screen
- 👓 Helps users who are blind or have low vision identify objects in their surroundings.
- 🗣️ Provides audio feedback so users can hear what’s in front of them without needing to see it.
- 🧭 Improves independence by enabling safer navigation and better awareness of the environment.
- 🖼️ Optional on-screen labels can support users with partial vision, combining text with speech.
- ⚡ Real-time detection (CameraX + ML Kit/TFLite)
- 🔕 Debounced announcements (avoid spam)
- 📊 Confidence threshold + label coalescing
- 📦 Works offline using local models
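The debouncing and confidence thresholding above could look roughly like this. It is a sketch of the technique only; the threshold value and cooldown window are illustrative defaults, not the app's actual settings:

```kotlin
// Illustrative announcer gate: drops low-confidence labels and repeats
// of recently spoken labels, so TTS isn't spammed on every camera frame.
class AnnouncementFilter(
    private val minConfidence: Float = 0.6f, // assumed confidence threshold
    private val cooldownMs: Long = 3_000     // assumed per-label debounce window
) {
    private val lastSpoken = mutableMapOf<String, Long>()

    // Returns true if the label should be sent to TTS now.
    fun shouldAnnounce(label: String, confidence: Float, nowMs: Long): Boolean {
        if (confidence < minConfidence) return false
        val last = lastSpoken[label]
        if (last != null && nowMs - last < cooldownMs) return false
        lastSpoken[label] = nowMs
        return true
    }
}
```

Each detection frame would call `shouldAnnounce(label, confidence, SystemClock.elapsedRealtime())` and only speak when it returns true, which coalesces a stream of identical detections into one announcement per cooldown window.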