
Screenless Typing

An auditory keyboard for people with visual impairment.

The Problem

According to the WHO, 285 million people worldwide are blind or visually impaired. Using smartphones is a painstaking task for them, let alone typing with existing keyboards. Typing on the go is another big problem. Screenless Typing explores a new mobile typing experience for the BVI (blind and visually impaired) community using hand gestures. This is ongoing research with Prof. Davide Bolchini at the U.S.E.R. lab at IUPUI.


Research, ideation, prototyping, Android development


May 2018 - present (ongoing project)


Aishwarya Sheth, Reeti Mathur, Parimal Vyas, 

Google Faculty Research Awards Winner 2019

How might we make typing using smartphones an easy task for the blind and visually impaired community?

Research Insights

Difficulty typing with regular methods

Regular typing methods include the Android TalkBack feature and the iOS VoiceOver feature, which are not easy to use and require keeping the device in hand.

Privacy issues

Other techniques, like voice commands, raise privacy issues in public areas, which makes them inconvenient for the BVI community to use.


Blind people can comprehend speech much faster than sighted people

Studies suggest blind people can comprehend speech up to 2.7 times faster, making it easy for them to interpret rapid speech that sighted people might not be able to follow.


Typing on the go with current techniques is not an easy task.

Typing on the go requires holding the device, which is difficult because blind people often carry a cane or walk with a guide dog. Hands-free typing would make it far more convenient for them.


The Solution

The usual keyboards for text entry work in a spatial manner. What if we created a time-based aural keyboard instead? We designed the Auditory Keyboard as an aural flow that recites the characters from A to Z in a loop; with the help of a few easy gestures, users can type while keeping their phones in their pockets. We built an Android application for the keyboard and used the MYO band for gesture recognition.

The Set-up


The prototype includes an Android application synced with the MYO band over Bluetooth. Users type with simple gestures as they hear the letters recited in a loop through bone-conduction earphones.

The Prototype


  1. The Mobile Application

The mobile application is coded in Android and communicates with the MYO band through the MYO software development kit. This prototype application recites the letters in a loop, i.e., the syntagm. We defined the MYO gestures so that the user can type words using five simple gestures.
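The core of the keyboard can be sketched as a timed cursor cycling over the syntagm, with gestures acting on the letter just recited. The sketch below is a simplified, non-Android illustration under our own assumptions: the class and method names are not from the MYO SDK, and the gesture-to-action mapping is illustrative since the five gestures are not enumerated here.

```java
// Minimal sketch of the aural keyflow: a cursor cycles over the
// syntagm (A-Z); gesture handlers act on the letter just recited.
public class KeyflowDemo {
    static final String SYNTAGM = "ABCDEFGHIJKLMNOPQRSTUVWXYZ";
    private int cursor = 0;            // next letter to recite
    private char lastSpoken = '\0';    // letter most recently recited
    private final StringBuilder typed = new StringBuilder();

    // One tick of the aural loop: speak the next letter and advance.
    public char tick() {
        lastSpoken = SYNTAGM.charAt(cursor);
        cursor = (cursor + 1) % SYNTAGM.length();  // wrap Z back to A
        return lastSpoken;  // in the app this would be handed to TTS
    }

    // Hypothetical gesture: select the letter that was just recited.
    public void select() {
        if (lastSpoken != '\0') typed.append(lastSpoken);
    }

    // Hypothetical gesture: delete the last typed letter.
    public void deleteLast() {
        if (typed.length() > 0) typed.deleteCharAt(typed.length() - 1);
    }

    public String word() {
        return typed.toString();
    }
}
```

In the real prototype, `tick()`'s output would be spoken via Android text-to-speech, and the gesture handlers would be invoked from the MYO SDK's pose callbacks.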



   2. Chunks

The syntagm is divided into chunks of five for easy navigation back and forth. The user can jump ahead five characters at once, increasing their navigation and typing speed.
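The chunk jumps reduce to simple modular index arithmetic. A minimal sketch (the chunk size of five and the 26-letter syntagm are from the prototype; the method names are our own illustration):

```java
// Sketch of chunk navigation over a 26-letter syntagm: "next chunk"
// and "previous chunk" gestures move the cursor five letters at a
// time, with wraparound, so the user can skip toward a target letter.
public class ChunkNav {
    static final int LENGTH = 26;  // letters A-Z
    static final int CHUNK = 5;    // chunk size from the prototype

    // Advance the cursor by one chunk (e.g. from 'A' to 'F').
    public static int nextChunk(int cursor) {
        return (cursor + CHUNK) % LENGTH;
    }

    // Move the cursor back by one chunk, wrapping below zero.
    public static int prevChunk(int cursor) {
        return ((cursor - CHUNK) % LENGTH + LENGTH) % LENGTH;
    }
}
```

For example, reaching 'T' (index 19) from 'A' takes three chunk jumps and four single-letter ticks instead of nineteen ticks.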



Keyflow divided in chunks of five

3. Gestures for Navigation


4. Video of the working prototype

User Testing

We tested this prototype with 20 blind and visually impaired people. We made them type simple words with the help of this prototype and asked them follow up questions to understand their thinking of the concept of this auditory keyboard and if they would use it in their daily lives.

User Testing Insights

Gestures



  • Easy to understand and remember

  • Easy to perform

  • Could relate each gesture to its functionality




  • Gestures are not discrete enough.

  • Might be uncomfortable to perform in public places.

  • Some participants found them a little confusing.

  • There is a learning curve to performing them well.



Myo Band


  • Sturdy to use, might not easily break

  • Doesn't feel like you're wearing it.

  • New interesting technology




  • Another thing to carry along with the phone

  • Recognizes gestures based on the person's muscle movement and sometimes misses them

  • Not comfortable to wear for a long time

Keyflow



  • A nice new concept, like a deck of cards 

  • Would be great to integrate it with Instant Messenger and other applications 

  • Would use it as an alternative to other typing techniques




  • Words per minute are lower than with usual typing techniques

  • Need a way to control speed.

  • Has a steep learning curve.

Future Work

We are currently writing a paper on our user testing and its findings for the ASSETS 2019 conference, and we plan to explore gestural input methods beyond the MYO band, since our user tests revealed it has some drawbacks.
