Vyaakhya

Context-Aware Smart Glasses for the Visually Impaired


After my IIT Delhi internship, I knew NAVI was just the beginning. The mobile app worked, but it needed constant clicking. That wasn't good enough.


Timeline

September 2024 – Present

Role

Final Year Project
Prof. Rohan Paul

Tools

C++
Python
Node.js
AWS
LLMs

Disciplines

Hardware Integration
AI/ML Engineering
Embedded Systems
User Research
Accessibility Design


The Click-Click-Click Problem

Testing NAVI showed the problem clearly. Click to see ahead. Click to find doors. Click to check entrances. Users needed continuous awareness, not a manual camera trigger.

Building Continuous Awareness

I designed Vyaakhya to run continuously: the glasses capture images automatically, process them with context-aware AI, and speak only when they have something relevant to say. No clicking required.
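
To make that pipeline concrete, here's a minimal sketch of the capture-analyze-speak loop. Every name in it (capture_frame, describe_scene, speak, the 2-second interval) is a placeholder standing in for the real camera driver, model call, and text-to-speech, not Vyaakhya's actual API.

```python
import time
from typing import Optional

CAPTURE_INTERVAL_S = 2.0  # sampling period; the real value would be tuned on-device


def capture_frame() -> bytes:
    """Placeholder for the glasses' camera capture; returns encoded image bytes."""
    raise NotImplementedError("wire up the camera driver here")


def describe_scene(frame: bytes, context: str) -> Optional[str]:
    """Placeholder for the context-aware model call.

    Returns a short narration if the frame contains something relevant
    to the user's journey, or None if there is nothing worth saying.
    """
    raise NotImplementedError("call the vision-language model here")


def speak(text: str) -> None:
    """Placeholder for on-device text-to-speech."""
    raise NotImplementedError("send text to the TTS engine here")


def run_loop(journey_context: str) -> None:
    """Capture continuously; narrate only when something new and relevant appears."""
    last_spoken = ""
    while True:
        description = describe_scene(capture_frame(), context=journey_context)
        if description and description != last_spoken:  # suppress silence and repeats
            speak(description)
            last_spoken = description
        time.sleep(CAPTURE_INTERVAL_S)
```

The key design choice is in the `if`: the loop never stops looking, but it only speaks when the model returns something new, so the user gets awareness without a constant stream of chatter.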

The Intelligence Layer

The AI understands context. It knows where you're going and what matters for that journey. Walking down a hallway? It won't narrate every door—just yours.
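
As a toy illustration of that filtering idea: given everything the vision model detects in a hallway, keep hazards unconditionally and keep a door only if its signage matches the destination. The Detection type and the policy below are illustrative assumptions, not the production logic.

```python
from dataclasses import dataclass


@dataclass
class Detection:
    label: str   # e.g. "door", "staircase", "person"
    detail: str  # e.g. "Room 304" read from signage


def filter_for_journey(detections: list[Detection], goal: str) -> list[Detection]:
    """Drop detections that don't matter for this journey.

    Example policy: hazards and people are always relevant; a door is
    relevant only if its signage matches the destination.
    """
    always_relevant = {"staircase", "obstacle", "person"}
    kept = []
    for d in detections:
        if d.label in always_relevant:
            kept.append(d)
        elif d.label == "door" and goal.lower() in d.detail.lower():
            kept.append(d)  # narrate only the door the user is looking for
    return kept


# Walking a hallway toward Room 304: the other two doors stay silent.
hallway = [
    Detection("door", "Room 301"),
    Detection("door", "Room 304"),
    Detection("door", "Room 307"),
]
print(filter_for_journey(hallway, goal="Room 304"))
# -> [Detection(label='door', detail='Room 304')]
```

In practice the policy lives in the LLM's prompt rather than hard-coded rules, but the effect is the same: the journey context decides what is worth saying out loud.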

From Research to Reality

NAVI proved the concept. Vyaakhya makes it practical.
