This is a system designed for Blind and Visually Impaired (BVI) users that lets them access services like Yelp and Uber through simple hand gestures. The project has three major sections.
Mobile applications have not reached their full potential among Blind and Visually Impaired (BVI) users due to the following constraints:
These constraints arise from limitations in input and output mechanisms. To overcome them, we developed a new navigation system that uses hand gestures and aural cues to simplify services like Yelp and Uber for BVI users.
The aural free flow is designed to take the user to a specific point of interest quickly and then provide finer navigation through hand gestures. This also reduces gesture fatigue, since fewer gestures are needed.
Every gesture is associated with an earcon. This feedback improves the learnability and error handling of the system.
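The gesture-to-earcon feedback described above can be sketched as a simple lookup table. The gesture names and earcon file names here are hypothetical placeholders, since the system's actual vocabulary is not listed in this document:

```python
# Hypothetical mapping from recognized gestures to earcon sound files.
# The names below are illustrative assumptions, not the system's real set.
EARCONS = {
    "swipe_left": "earcon_back.wav",
    "swipe_right": "earcon_forward.wav",
    "fist": "earcon_help.wav",
}

def earcon_for(gesture):
    """Return the earcon file for a recognized gesture, or None if unknown."""
    return EARCONS.get(gesture)
```

Returning `None` for an unrecognized gesture gives the system a hook for error handling: silence (or a distinct error earcon) tells the user the gesture was not understood.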
At any point, the user can perform the "fist" gesture to get contextual help, which provides two kinds of information: where you are, and what you can do from there.
Since the system is aural rather than visual, helping users regain context was very important.
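The contextual-help behavior can be sketched as follows. The node names and actions are placeholders for illustration, not the system's actual navigation structure; the returned string would be spoken aloud via text-to-speech:

```python
# Hypothetical sketch of the "fist" contextual-help gesture. It answers
# two questions: "where are you?" and "what can you do from here?".
class NavNode:
    def __init__(self, name, actions):
        self.name = name          # where the user currently is
        self.actions = actions    # what the user can do from here

def contextual_help(current):
    """Build the spoken help message for the current navigation node."""
    where = f"You are at: {current.name}."
    what = "From here you can: " + ", ".join(current.actions) + "."
    return where + " " + what

# Example node; in the real system this would be spoken via text-to-speech.
node = NavNode("Yelp results", ["read next result", "go back", "call business"])
```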
This system borrows the idea of binary search to navigate apps through hand gestures and aural cues. The apps are split into two nodes based on their origin: native apps to the left of the center cursor and from-store apps to the right.
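The binary-search-style navigation can be sketched as repeated halving of the app list around a center cursor. The gesture names ("left"/"right") and app list are illustrative assumptions; each gesture discards half of the remaining apps until one is reached:

```python
# Sketch of binary-search navigation over an app list. A "left" gesture
# keeps the half before the center cursor (e.g. native apps), a "right"
# gesture keeps the half after it (e.g. from-store apps), and the process
# repeats until a single app remains. Gesture names are assumptions.
def navigate(apps, gestures):
    """Return the app selected after applying a sequence of gestures."""
    lo, hi = 0, len(apps) - 1
    for g in gestures:
        if lo >= hi:
            break                  # only one app left; stop early
        mid = (lo + hi) // 2       # position of the center cursor
        if g == "left":
            hi = mid               # keep the left half
        elif g == "right":
            lo = mid + 1           # keep the right half
    return apps[(lo + hi) // 2]

# Illustrative app list: native apps first, from-store apps after.
apps = ["Phone", "Messages", "Camera", "Yelp", "Uber", "Maps"]
```

Because each gesture halves the candidate set, reaching any app among N takes about log2(N) gestures, which is what keeps gesture fatigue low even as the app list grows.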
This is an ongoing project and will be updated as it progresses. In the meantime, feel free to reach out with any questions.
Here’s my e-mail address: jdara@iupui.edu