UltraHaptics System That Provides Multi-Point Haptic Feedback in the Air to be Unveiled at UIST 2013

A system that allows users to experience multi-point haptic feedback above an interactive surface without having to touch or hold any device will be unveiled this week [Friday 11 October] at one of the world’s most important conferences for innovations in human-computer interfaces.

A person demonstrating UltraHaptics

Multi-touch surfaces offer easy interaction in public spaces, with people being able to walk up and use them. However, people cannot feel what they have touched. A team from the University of Bristol’s Interaction and Graphics (BIG) research group have developed a solution that not only allows people to feel what is on the screen, but also to receive invisible information before they touch it.

The research paper, to be presented at the ACM Symposium on User Interface Software and Technology (UIST) 2013 by Tom Carter from the Department of Computer Science, will unveil UltraHaptics, a system designed to provide multi-point, mid-air haptic feedback above a touch surface.

UltraHaptics uses the principle of acoustic radiation force, whereby a phased array of ultrasonic transducers exerts forces on a target in mid-air. Haptic sensations are projected through a screen and directly onto the user’s hands.
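As a rough, back-of-the-envelope illustration of that principle (not a calculation from the paper), the force exerted on the skin can be estimated from standard plane-wave acoustics: a wave of pressure amplitude p carries intensity p²/(2ρc), and an absorbing surface feels a radiation pressure of roughly intensity/c. The focal pressure amplitude and spot size in the Python sketch below are assumed values, chosen only to show the order of magnitude involved.

# Back-of-the-envelope estimate of the acoustic radiation force on the skin.
# Illustrative only: the focal pressure amplitude and spot radius are assumed
# values, not figures taken from the UltraHaptics paper.

import math

RHO_AIR = 1.2    # density of air, kg/m^3
C_AIR = 343.0    # speed of sound in air, m/s

def radiation_force(p_amplitude, spot_radius):
    """Approximate force (N) of a focused ultrasound beam on an absorbing surface."""
    intensity = p_amplitude ** 2 / (2 * RHO_AIR * C_AIR)  # plane-wave intensity, W/m^2
    pressure = intensity / C_AIR                           # radiation pressure, Pa
    return pressure * math.pi * spot_radius ** 2           # pressure x focal-spot area

# Assumed example: ~2 kPa peak pressure focused onto a spot of ~4 mm radius
print(f"{radiation_force(2000.0, 0.004) * 1000:.2f} mN")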

The use of ultrasonic vibrations is a new technique for delivering tactile sensations to the user. An array of ultrasonic transducers emits very high-frequency sound waves. When all of the sound waves arrive at the same location at the same time, they combine to create a sensation on the skin.
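The release does not give implementation details, but the focusing it describes is the standard phased-array technique: each transducer is driven with a phase offset proportional to its distance from the desired focal point, so that every emission arrives there in phase and the contributions add up. The array geometry (a 16 x 16 grid at 10 mm pitch) and the focal point in the sketch below are assumptions for illustration, not parameters from the paper.

# Minimal sketch of phased-array focusing: per-transducer phase offsets that
# make all 40 kHz emissions arrive in phase at a single point in mid-air.
# The 16x16 grid, 10 mm pitch and focal point are illustrative assumptions.

import numpy as np

FREQ = 40_000.0                            # transducer frequency, Hz
C_AIR = 343.0                              # speed of sound in air, m/s
WAVENUMBER = 2 * np.pi * FREQ / C_AIR      # radians per metre

def transducer_grid(n=16, pitch=0.01):
    """Positions (n*n, 3) of a square transducer array in the z = 0 plane, metres."""
    coords = (np.arange(n) - (n - 1) / 2) * pitch
    xs, ys = np.meshgrid(coords, coords)
    return np.column_stack([xs.ravel(), ys.ravel(), np.zeros(n * n)])

def focus_phases(positions, focal_point):
    """Phase (radians) for each transducer so a wave emitted as sin(w*t + phase)
    arrives at focal_point as sin(w*t), in phase with every other contribution."""
    distances = np.linalg.norm(positions - focal_point, axis=1)
    return np.mod(WAVENUMBER * distances, 2 * np.pi)

# Assumed focal point: 20 cm above the centre of the array
phases = focus_phases(transducer_grid(), np.array([0.0, 0.0, 0.2]))
print(phases.reshape(16, 16).round(2))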

By carrying out technical evaluations, the team have shown that the system can create individual points of feedback well above the perception threshold of the human hand. The researchers have also established the necessary properties of a display surface that is transparent to 40 kHz ultrasound.

The results from two user studies demonstrate that feedback points with different tactile properties can be distinguished at smaller separations than points with identical properties. The researchers also found that, with training, users are able to identify different tactile properties.

Finally, the research team explored three new interaction possibilities that UltraHaptics enables: mid-air gestures, tactile information layers and visually restricted displays, and created an application to demonstrate each.

Tom Carter, PhD student in the Department of Computer Science’s BIG research group, said: “Current systems with integrated interactive surfaces allow users to walk up and use them with bare hands. Our goal was to integrate haptic feedback into these systems without sacrificing their simplicity and accessibility.

“To achieve this, we have designed a system with an ultrasound transducer array positioned beneath an acoustically transparent display. This arrangement allows the projection of focused ultrasound through the interactive surface and directly onto the users’ bare hands. By creating multiple simultaneous feedback points, and giving them individual tactile properties, users can receive localised feedback associated with their actions.”
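The quote mentions giving simultaneous feedback points individual tactile properties. A common way to do this in mid-air ultrasound haptics, consistent with the tactile-property studies described above, is to amplitude-modulate each focal point at a different low frequency that the skin can feel; the specific frequencies, refresh rate and simple point-switching loop below are assumptions for illustration, not details taken from the paper.

# Sketch of giving each mid-air focal point its own "tactile property" by
# modulating its strength at a different low frequency. The 200 Hz / 300 Hz
# values, the 10 kHz refresh rate and the alternating schedule are assumed.

import numpy as np

FOCAL_POINTS = [
    {"pos": np.array([-0.03, 0.0, 0.20]), "mod_hz": 200.0},  # assumed positions (m)
    {"pos": np.array([+0.03, 0.0, 0.20]), "mod_hz": 300.0},
]
UPDATE_RATE = 10_000.0   # array refresh rate, Hz (assumed)

def envelope(mod_hz, t):
    """Amplitude envelope in [0, 1]: a raised sine at the modulation frequency."""
    return 0.5 * (1.0 + np.sin(2.0 * np.pi * mod_hz * t))

def frame_schedule(duration=0.01):
    """Alternate between focal points on every refresh, scaling each by its own
    envelope so that each point vibrates at a distinguishable frequency."""
    frames = []
    for step in range(int(duration * UPDATE_RATE)):
        t = step / UPDATE_RATE
        point = FOCAL_POINTS[step % len(FOCAL_POINTS)]
        frames.append((t, point["pos"], envelope(point["mod_hz"], t)))
    return frames

for t, pos, amp in frame_schedule()[:4]:
    print(f"t={t * 1000:.1f} ms  focus={pos}  drive amplitude={amp:.2f}")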

Source: http://www.bristol.ac.uk/
