According to the WHO, 285 million people worldwide are blind or visually impaired. Using smartphones is a painstaking task for them, let alone typing on existing keyboards, and typing on the go is another major problem. Screenless Typing explores a new mobile typing experience for the blind and visually impaired (BVI) community using hand gestures. This is ongoing research with Prof. Davide Bolchini at the U.S.E.R. lab at IUPUI.
Research, ideation, prototyping, Android Development
May 2018 - current (ongoing project)
Aishwarya Sheth, Reeti Mathur, Parimal Vyas,
Google Faculty Research Awards Winner 2019
How might we make typing on smartphones easy for the blind and visually impaired community?
Difficulty typing with standard methods
Standard typing methods include Android's TalkBack and iOS's VoiceOver, which are not easy to use and require holding the device in hand.
Other techniques, such as voice commands, raise privacy concerns in public spaces, making them inconvenient for the BVI community to use.
Blind people can understand speech at much higher playback speeds than sighted people
Blind people can comprehend speech played back at up to 2.7 times normal speed, allowing them to interpret speech that sighted people might not be able to follow.
Typing on the go with current techniques is not an easy task.
Typing on the go requires holding the device, which is difficult because blind people often have a cane or a guide dog with them. Hands-free typing would make it far more convenient for them.
Conventional text-entry keyboards work in a spatial manner. What if we created a time-based aural keyboard instead? We conceived the Auditory Keyboard as an aural flow that recites the characters A-Z in a loop; with the help of a few easy gestures, the user can keep their phone in their pocket. We built an Android application for the keyboard and used the MYO band for gesture recognition.
The prototype pairs an Android application with the MYO band over Bluetooth. As the user hears the letters recited in a loop through bone-conduction earphones, they type using simple gestures.
The Mobile Application
The mobile application is built for Android and communicates with the MYO band through the MYO software development kit. The prototype recites the alphabet in a loop, which we call the Syntagm. We mapped the MYO gestures so that the user can type words using five simple gestures.
The Syntagm is divided into chunks of five for easy navigation back and forth. The user can jump ahead five characters at once, increasing navigation and typing speed.
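The looping Syntagm and its chunk-of-five navigation can be sketched as a cursor over the alphabet. This is an illustrative sketch only, not the actual app code; the class and method names are assumptions, and the real prototype drives this cursor from MYO gesture callbacks and text-to-speech playback.

```java
// Illustrative sketch of the Syntagm cursor: the alphabet recited in a
// loop, with single-step and chunk-of-five navigation gestures.
class SyntagmCursor {
    private static final int ALPHABET = 26; // A-Z
    private static final int CHUNK = 5;     // chunk size for fast skips

    private int index = 0; // current position in the loop, 0 = 'A'

    // The character the aural flow is currently reciting.
    public char current() {
        return (char) ('A' + index);
    }

    // Advance one character, wrapping Z back to A.
    public void next() {
        index = (index + 1) % ALPHABET;
    }

    // Go back one character, wrapping A back to Z.
    public void previous() {
        index = (index + ALPHABET - 1) % ALPHABET;
    }

    // Jump ahead a whole chunk of five for faster navigation.
    public void nextChunk() {
        index = (index + CHUNK) % ALPHABET;
    }

    // Jump back a chunk of five.
    public void previousChunk() {
        index = (index + ALPHABET - CHUNK) % ALPHABET;
    }
}
```

In the prototype, each of these moves would be bound to one of the five MYO gestures (with a remaining gesture reserved for selecting the current character), so the user navigates the aural flow entirely hands-free.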
Keyflow divided in chunks of five
3. Gestures for Navigation
4. Video of the working prototype
We tested this prototype with 20 blind and visually impaired participants. They typed simple words using the prototype, and we asked follow-up questions to understand their impressions of the auditory keyboard concept and whether they would use it in their daily lives.
User Testing Insights
Easy to understand and remember
Easy to perform
Can relate each gesture to its functionality
Gestures are not discrete enough.
Might be uncomfortable to perform in public places.
Some participants found them a little confusing.
There is a learning curve to performing them well.
Sturdy; unlikely to break easily
Doesn't feel like you're wearing it.
New interesting technology
Another thing to carry along with the phone
Recognizes gestures based on the person's muscle movement and sometimes misses them
Not comfortable to wear for a long time
A nice new concept, like a deck of cards
Would be great to integrate it with Instant Messenger and other applications
Would use it as an alternative to other typing techniques
Words per minute is lower than with usual typing techniques
Need a way to control speed.
Has a steep learning curve.