Today, people have become used to their smartphones and smartwatches sensing what their bodies are doing, be it driving, walking, or sleeping. But what about a person's hands? It turns out that smartwatches, with a few modifications, can detect a surprising number of things one's hands are doing.
Scientists at Carnegie Mellon University’s Human-Computer Interaction Institute (HCII) have used a regular smartwatch to work out when a wearer was typing on a keyboard, pouring from a pitcher, petting a dog, washing dishes, or cutting with scissors.
By modifying a smartwatch's operating system, they were able to use its accelerometer to identify hand motions and, in some cases, bio-acoustic sounds associated with 25 different hand activities at about 95% accuracy. And those 25 activities are just the beginning of what it may be possible to detect.
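The approach described above, at its core, reduces to windowing a motion signal, extracting features, and classifying them. The following is a minimal, purely illustrative sketch of that kind of pipeline; the function names, features, window size, and nearest-centroid classifier are assumptions for the example, not the researchers' actual design.

```python
# Hypothetical sketch: per-window accelerometer features feeding a tiny
# classifier. All numbers and centroids here are made up for illustration.
import math
from statistics import mean, stdev

def extract_features(window):
    """Per-axis mean, standard deviation, and signal energy for one
    window of (x, y, z) accelerometer samples."""
    feats = []
    for axis in zip(*window):  # iterate over x, y, z columns
        feats.append(mean(axis))
        feats.append(stdev(axis))
        feats.append(sum(v * v for v in axis) / len(axis))
    return feats

def nearest_centroid(feats, centroids):
    """Return the label whose centroid is closest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(centroids, key=lambda label: dist(feats, centroids[label]))

# Toy "trained" centroids (in practice these would be learned from
# labeled windows): a still wrist with gravity on z vs. vigorous motion.
centroids = {
    "typing":  [0.0, 0.1, 0.02, 0.0, 0.1, 0.02, 1.0, 0.05, 1.0],
    "washing": [0.5, 0.8, 0.90, 0.5, 0.8, 0.90, 0.2, 0.60, 0.4],
}

window = [(0.01, 0.02, 1.00), (-0.02, 0.03, 0.99), (0.00, 0.01, 1.01)]
print(nearest_centroid(extract_features(window), centroids))  # "typing"
```

A production system would use far richer features (e.g. frequency-domain statistics from the high-rate bio-acoustic signal) and a trained model rather than hand-set centroids, but the window-featurize-classify shape is the same.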
"We envision smartwatches as a unique beachhead on the body for capturing rich, everyday activities," said Chris Harrison, assistant professor in the HCII and director of the Future Interfaces Group. "A wide variety of apps could be made smarter and more context-sensitive if our devices knew the activity of our bodies and hands."
Harrison and HCII Ph.D. student Gierad Laput will present their findings on this new sensing capability at CHI 2019, the Association for Computing Machinery's Conference on Human Factors in Computing Systems, held May 4-9 in Glasgow, Scotland.
Similar to how smartphones can now block text messages while a user is driving, future devices that sense hand activity might learn not to disturb someone while they are doing a particular task with their hands, such as operating power equipment or cutting vegetables, Laput said. Sensing hand activity also could be useful in health-related apps — keeping track of activities such as washing hands, brushing teeth, or smoking a cigarette.
Hand-sensing might also be used by apps that give feedback to users learning a new skill, such as playing a musical instrument, or going through physical rehabilitation. Apps might alert users to typing habits that could lead to repetitive strain injury (RSI), or detect the onset of motor impairments such as those associated with Parkinson's disease.
Laput and Harrison began their exploration of hand activity detection by recruiting 50 people to wear specially programmed smartwatches for nearly 1,000 hours while going about their daily routines. Intermittently, the watches would record hand orientation, hand motion, and bio-acoustic information, then prompt the wearer to describe the hand activity: shaving, scratching, clapping, applying lipstick, and so on. More than 80 hand activities were labeled in this way, yielding a unique dataset.
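The collection procedure above is a form of experience sampling: periodically capture a sensor window, then ask the wearer to label it. A minimal sketch of that loop follows; `record_window` and `prompt_label` are hypothetical stand-ins for real smartwatch sensor and UI APIs, and the random data is purely illustrative.

```python
# Illustrative experience-sampling loop for building a labeled dataset.
# record_window() and prompt_label() are invented placeholders, not real APIs.
import random

ACTIVITIES = ["shaving", "scratching", "clapping", "applying lipstick"]

def record_window(n_samples=3):
    """Pretend accelerometer read: n_samples of (x, y, z), gravity on z."""
    return [(random.gauss(0, 0.1), random.gauss(0, 0.1), random.gauss(1, 0.1))
            for _ in range(n_samples)]

def prompt_label():
    """Stand-in for an on-watch prompt; a real app would show a picker."""
    return random.choice(ACTIVITIES)

def collect(n_prompts):
    """Pair each recorded sensor window with the wearer-supplied label."""
    return [(record_window(), prompt_label()) for _ in range(n_prompts)]

dataset = collect(5)
```

Scaled up to 50 wearers and nearly 1,000 hours, pairs like these become the training data for the activity classifier.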
For now, users must wear the smartwatch on their dominant arm, rather than on the non-dominant arm where people typically wear wristwatches, for the system to work. Future experiments will investigate which activities can be sensed from the non-dominant arm.
The 25 hand activities we evaluated are a small fraction of the ways we engage our arms and hands in the real world.
Gierad Laput, Ph.D. Student, HCII, Carnegie Mellon University
Future work will likely focus on classes of activities tied to particular applications, such as elder care, smoking cessation, or typing and RSI.
The research was supported by the Packard Foundation, the Sloan Foundation, and a Google Ph.D. Fellowship.