
A High-Fidelity Input Device for Computers Using a New Sensing System

A newly developed method for recording and analyzing surface acoustic waves could allow almost any object to serve as a touch input device and enable privacy-sensitive sensing systems.

A sensing system called SAWSense takes advantage of acoustic waves traveling along the surface of an object to enable touch inputs on devices almost anywhere. Here, a table serves as a laptop’s trackpad. Image Credit: Interactive Sensing and Computing Lab, University of Michigan

A new sensing system developed at the University of Michigan can turn tables, couches, sleeves, and more into high-fidelity input devices for computers.

The system repurposes technology from new bone-conduction microphones, called Voice Pickup Units (VPUs), which detect only those acoustic waves that travel along the surface of objects.

It works even in noisy environments, along odd geometries like arms and toys, and on soft fabrics like clothing and furniture.

Known as SAWSense, for the surface acoustic waves it relies on, the system recognizes various inputs, such as scratches, taps, and swipes, with 97% accuracy. In one demonstration, the team used an ordinary table as a replacement for a laptop’s trackpad.

This technology will enable you to treat, for example, the whole surface of your body like an interactive surface. If you put the device on your wrist, you can do gestures on your skin. We have preliminary findings that demonstrate this is entirely feasible.

Yasha Iravantchi, Doctoral Candidate, Computer Science and Engineering, University of Michigan

Swipes, taps, and other gestures send acoustic waves along the surfaces of materials. The system then classifies these waves with machine learning to convert any touch into a robust set of inputs. The system was presented last week at the 2023 Conference on Human Factors in Computing Systems, where it won a best paper award.

As more objects integrate smart or connected technology, designers face numerous difficulties in giving them intuitive input mechanisms.

Iravantchi says this leads to many clunky incorporations of input methods such as touch screens and capacitive or mechanical buttons. Touch screens may be too expensive to enable gesture inputs across large surfaces like refrigerators and countertops, while buttons allow only one kind of input at predefined locations.

Past approaches to overcoming these limitations have included using cameras and microphones for audio- and gesture-based inputs. However, the authors say these methods have limited practicality in the real world.

When there’s a lot of background noise, or something comes between the user and the camera, audio and visual gesture inputs don’t work well.

Yasha Iravantchi, Doctoral Candidate, Computer Science and Engineering, University of Michigan

To overcome these limitations, the sensors powering SAWSense are housed in a hermetically sealed chamber that blocks out even very loud ambient noise. The only entry point is a mass-spring system that conducts surface acoustic waves into the housing without ever coming into contact with sounds in the surrounding environment.

Combined with the team’s signal-processing software, which extracts features from the data before feeding it into a machine learning model, the system can record and classify events along the object’s surface.
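The article does not detail the team’s actual feature-extraction or model architecture, so the following is only a minimal illustrative sketch of the general pipeline it describes: short-time spectral features are computed from a 1-D vibration signal and a simple classifier assigns an event label. The synthetic "tap" and "swipe" signals and the nearest-centroid classifier are hypothetical stand-ins, not the paper’s method.

```python
import numpy as np

def stft_features(signal, frame_len=256, hop=128):
    """Log-magnitude short-time spectrum of a 1-D surface-vibration signal."""
    window = np.hanning(frame_len)
    n_frames = 1 + (len(signal) - frame_len) // hop
    frames = np.stack([signal[i * hop : i * hop + frame_len] * window
                       for i in range(n_frames)])
    return np.log1p(np.abs(np.fft.rfft(frames, axis=1)))

def classify(signal, centroids):
    """Nearest-centroid label for the time-averaged spectrum."""
    pooled = stft_features(signal).mean(axis=0)
    return min(centroids, key=lambda lbl: np.linalg.norm(pooled - centroids[lbl]))

# Hypothetical stand-ins for VPU recordings, for illustration only:
def make_tap(rng, n=4000):
    sig = np.zeros(n)
    t = np.arange(600)
    # short, decaying impact burst
    sig[500:1100] = np.sin(2 * np.pi * 0.05 * t) * np.exp(-t / 80.0)
    return sig + 0.01 * rng.standard_normal(n)

def make_swipe(rng, n=4000):
    # sustained broadband friction noise
    return 0.4 * rng.standard_normal(n)

rng = np.random.default_rng(0)
centroids = {
    "tap":   np.mean([stft_features(make_tap(rng)).mean(axis=0)
                      for _ in range(5)], axis=0),
    "swipe": np.mean([stft_features(make_swipe(rng)).mean(axis=0)
                      for _ in range(5)], axis=0),
}
print(classify(make_tap(rng), centroids))    # expected: tap
print(classify(make_swipe(rng), centroids))  # expected: swipe
```

In practice the published system uses far richer features and a learned model; the point of the sketch is only that brief, broadband events like taps and swipes leave distinct spectral signatures that a classifier can separate.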

“There are other ways you could detect vibrations or surface-acoustic waves, like piezoelectric sensors or accelerometers, but they can’t capture the broad range of frequencies that we need to tell the difference between a swipe and a scratch, for instance,” stated Alanson Sample, U-M associate professor of electrical engineering and computer science.

The high fidelity of the VPUs enables SAWSense to detect a wide range of activities on a surface beyond user touch events. For example, a VPU on a kitchen countertop could detect stirring, chopping, blending, or whisking, as well as identify electronic devices in use, like a microwave or blender.

VPUs do a good job of sensing activities and events happening in a well-defined area. This allows the functionality that comes with a smart object without the privacy concerns of a standard microphone that senses the whole room, for example.

Yasha Iravantchi, Doctoral Candidate, Computer Science and Engineering, University of Michigan

When several VPUs are used in combination, SAWSense could enable highly specific and sensitive inputs, particularly those that require a sense of space and distance, like the keys on a keyboard or the buttons on a remote.

In addition, the researchers are exploring the use of VPUs for medical sensing, such as picking up faint sounds from joints and connective tissues as they move. The high-fidelity audio data VPUs provide could enable real-time analysis of a person’s health, Sample says.

The study was funded in part by Meta Platforms Inc.

The research group has applied for patent protection with the help of U-M Innovation Partnerships and is looking for collaborators to bring the technology to market.

SAWSense: Using Surface Acoustic Waves for Surface-bound Event Recognition

Video Credit: University of Michigan

Journal Reference

Iravantchi, Y., et al. (2023) SAWSense: Using Surface Acoustic Waves for Surface-bound Event Recognition. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems.
