Bio-Acoustic Sensors and Machine Learning Let Fingers Control Mobile Devices

A groundbreaking technology called Skinput has been jointly developed by Dan Morris and Desney Tan of Microsoft Research and Chris Harrison, a third-year Ph.D. student at Carnegie Mellon University's Human-Computer Interaction Institute (HCII).

The Skinput technology

The technology combines advanced machine learning with bio-acoustic sensors, enabling people to use their forearms, fingers, or other parts of the body as touchpads for controlling mobile devices such as smartphones.

Harrison will describe the technology in a paper to be presented on April 12, 2010 at CHI 2010, the Association for Computing Machinery's annual Conference on Human Factors in Computing Systems, in Atlanta, Ga.

Skinput will let people better exploit the massive computing power now packed into compact gadgets that can be easily carried or worn. The same miniaturization that makes MP3 players, smartphones, and other devices portable also limits the size and utility of the touchscreens, jog wheels, and keypads typically used to control them.

The prototype, developed by Harrison last summer during a Microsoft Research internship, used acoustic sensors attached to the upper arm. These sensors captured the sound generated by actions such as tapping the forearm or tapping and flicking the fingers together. This sound is transmitted through the skin as transverse waves, not through the air, and travels through the bones as compressive, or longitudinal, waves.
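To make the classification step concrete, here is a minimal sketch of how tap locations might be recognized from multi-channel vibration-sensor data. This is not the researchers' actual pipeline: the sample rate, frequency bands, feature choice (per-channel band energies), and the toy nearest-centroid classifier are all illustrative assumptions standing in for the machine-learning stage.

```python
# Illustrative sketch only: classify which body location was tapped from
# a short window of multi-channel vibration-sensor data. All signal
# parameters below are hypothetical, not from the Skinput paper.
import numpy as np

RATE = 5000          # assumed sample rate, Hz
CHANNELS = 5         # assumed number of piezo sensor channels
BANDS = [(25, 100), (100, 300), (300, 600)]  # assumed frequency bands, Hz

def features(window):
    """Per-channel log energy in a few frequency bands -> feature vector.

    window: array of shape (CHANNELS, samples).
    """
    feats = []
    freqs = np.fft.rfftfreq(window.shape[1], d=1.0 / RATE)
    for ch in window:
        spectrum = np.abs(np.fft.rfft(ch)) ** 2
        for lo, hi in BANDS:
            mask = (freqs >= lo) & (freqs < hi)
            feats.append(spectrum[mask].sum())
    return np.log1p(np.array(feats))

class NearestCentroid:
    """Toy classifier: one mean feature vector per tap location."""

    def fit(self, X, y):
        self.labels = sorted(set(y))
        self.centroids = {
            c: np.mean([x for x, lab in zip(X, y) if lab == c], axis=0)
            for c in self.labels
        }
        return self

    def predict(self, x):
        return min(self.labels,
                   key=lambda c: np.linalg.norm(x - self.centroids[c]))
```

A real system would use a trained classifier (and many more features) in place of the nearest-centroid toy, but the shape of the problem is the same: tap, extract features from the resulting vibration, predict a location.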

In trials with 20 subjects, the system classified inputs with an overall accuracy of 88%. Accuracy depended in part on how close the sensors were to the input: with the sensors fixed below the elbow, accuracy was 88%. Finger flicks could be identified with 97% accuracy and forearm taps with 96% accuracy.
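The figures above mix an overall rate with per-gesture rates. As a reminder of how such numbers relate, here is a small sketch computing both from a confusion matrix; the counts are invented purely for illustration and are not the study's data, though they are chosen so the arithmetic echoes the reported 97%, 96%, and 88% figures.

```python
# Overall vs. per-class accuracy from a confusion matrix.
# These counts are invented for illustration, NOT the Skinput study's data.
import numpy as np

confusion = np.array([
    [97,  2,  1],   # true class: finger flick
    [ 3, 96,  1],   # true class: forearm tap
    [10, 19, 71],   # true class: a third, hypothetical gesture
])

# Fraction of each true class that was predicted correctly.
per_class = confusion.diagonal() / confusion.sum(axis=1)

# Fraction of all trials predicted correctly.
overall = confusion.diagonal().sum() / confusion.sum()
```

The point is simply that a high overall accuracy can coexist with quite different per-gesture accuracies, which is why the article reports both.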

The system uses an array of highly tuned vibration sensors in the form of cantilevered piezo films. The prototype armband combined this sensor array with a small projector, which could superimpose colored buttons onto the wearer's forearm; a keypad can also be projected onto the palm of the hand. Even without projected buttons, simple gadgets such as MP3 players can be controlled just by tapping the fingertips. Skinput can also exploit proprioception, a person's sense of his or her own body configuration, to allow eyes-free interaction.
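The eyes-free MP3-player control described above amounts to mapping each recognized gesture to a device command. A minimal sketch of that idea follows; the gesture names, command set, and player class are all hypothetical, chosen only to illustrate the mapping.

```python
# Hypothetical mapping from recognized Skinput gestures to MP3-player
# commands, illustrating eyes-free control without projected buttons.
# Gesture names and commands are invented for this sketch.
COMMANDS = {
    "thumb_tap": "play_pause",
    "index_flick": "next_track",
    "forearm_tap_upper": "volume_up",
    "forearm_tap_lower": "volume_down",
}

class Mp3Player:
    """Toy player whose state is driven entirely by classified gestures."""

    def __init__(self):
        self.playing = False
        self.track = 0
        self.volume = 5

    def handle(self, gesture):
        """Apply the command for a recognized gesture; return it (or None)."""
        cmd = COMMANDS.get(gesture)
        if cmd == "play_pause":
            self.playing = not self.playing
        elif cmd == "next_track":
            self.track += 1
        elif cmd == "volume_up":
            self.volume = min(10, self.volume + 1)
        elif cmd == "volume_down":
            self.volume = max(0, self.volume - 1)
        return cmd
```

Because the wearer knows where their own fingers and forearm are without looking, a fixed gesture-to-command table like this is enough for eyes-free operation.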

Skinput's accuracy was lower for heavier, fleshier participants, and age and sex may also affect it. According to the researchers, jogging or running generates noise and degrades the signal. However, testing under these conditions was limited, and further training of the machine-learning software under such conditions is likely to improve accuracy.

