Using ultrasound, researchers at the University of Bristol (UK) have devised a computer interface that allows users to interact with a digital screen without touching it. The Kinect and Leap Motion already do this, but the catch is that this system also provides haptic (touch) feedback. So, whenever a user traces a motion in front of the system, not only does the system react, it also relays feedback that the user senses as touch. The device was unveiled this week at the ACM Symposium on User Interface Software and Technology in Scotland.
Dubbed UltraHaptics, the system’s main advantage, the researchers claim, is that it allows the user to “feel” what is on the screen.
“UltraHaptics uses the principle of acoustic radiation force where a phased array of ultrasonic transducers is used to exert forces on a target in mid-air,” co-developer Tom Carter explained. “Haptic sensations are projected through a screen and directly onto the user’s hands.”
The system works by means of an ultrasound transducer array positioned beneath an acoustically transparent display, which doesn’t interfere with the haptic interaction. The transducers collectively emit very high-frequency sound waves; when all of the waves meet at the same location at the same time, they create sensations on the skin. By creating multiple simultaneous feedback points, and giving them individual tactile properties, users can receive localized feedback associated with their actions. A Leap Motion device tracks the user’s hand movements.
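The focusing idea behind the phased array can be illustrated with a short sketch. This is not the researchers’ code: it is a hypothetical example showing how per-transducer time delays are chosen so that every wavefront arrives at the focal point simultaneously, where constructive interference concentrates the acoustic pressure. The grid layout, frequency, and function names below are illustrative assumptions.

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air
FREQ = 40_000.0         # 40 kHz, a typical airborne ultrasound frequency

def focusing_delays(transducer_xyz, focal_point):
    """Per-transducer firing delays (seconds) that focus the array
    on focal_point: delayed waves all arrive at the focus in phase."""
    positions = np.asarray(transducer_xyz, dtype=float)
    focus = np.asarray(focal_point, dtype=float)
    distances = np.linalg.norm(positions - focus, axis=1)
    # The farthest transducer fires first (zero delay); nearer ones
    # wait so that all wavefronts coincide at the focal point.
    return (distances.max() - distances) / SPEED_OF_SOUND

# A 4x4 grid of transducers spaced 1 cm apart, focusing 20 cm above centre.
grid = [(x * 0.01, y * 0.01, 0.0) for x in range(4) for y in range(4)]
delays = focusing_delays(grid, (0.015, 0.015, 0.20))

# Equivalent phase offsets for driving each transducer at FREQ.
phases = (2 * np.pi * FREQ * delays) % (2 * np.pi)
```

Moving the focal point, or computing several sets of delays and interleaving them, is how multiple simultaneous feedback points with distinct tactile properties could be produced in such a scheme.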
Finally, the research team explored three new areas of interaction possibilities that UltraHaptics can provide: mid-air gestures, tactile information layers and visually restricted displays, and created an application for each.
A video demonstration of the UltraHaptics system can be viewed below.
Tom Carter, PhD student in the Department of Computer Science’s BIG research group, said: “Current systems with integrated interactive surfaces allow users to walk-up and use them with bare hands. Our goal was to integrate haptic feedback into these systems without sacrificing their simplicity and accessibility.
“To achieve this, we have designed a system with an ultrasound transducer array positioned beneath an acoustically transparent display. This arrangement allows the projection of focused ultrasound through the interactive surface and directly onto the users’ bare hands. By creating multiple simultaneous feedback points, and giving them individual tactile properties, users can receive localised feedback associated to their actions.”
UltraHaptics: Multi-Point Mid-Air Haptic Feedback for Touch Surfaces, Thomas Carter, Sue Ann Seah, Benjamin Long, Bruce Drinkwater, Sriram Subramanian, UIST 2013, 8-11 October, St Andrews, UK.