PARIS — Founded in 2013, Bristol-based startup Ultrahaptics uses a compact array of ultrasound transducers to send inaudible sound waves through the air, applying phase-shift techniques to precisely control the focus and intensity of the acoustic radiation pressure and turn it into something tangible.
While the ultrasound transducers lie flat in what could pass for a thick mouse mat, the air-pressure differences created at the focal points (where all the sound waves arrive at the same time, thanks to the phase delays) can be felt like invisible contours projected into the air. What's more, different textures can be created by varying the modulation frequency or by pulsing the feedback effect on the skin.
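The phase-delay trick behind such arrays can be sketched in a few lines: to create a focal point, each transducer's phase is advanced in proportion to its distance from the target, so that every wavefront arrives in phase and the pressure peaks there. A minimal sketch in Python/NumPy, assuming a 40 kHz carrier and a 16 x 16 array on a 10 mm pitch (illustrative values, not Ultrahaptics' published specifications):

```python
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature
FREQ = 40e3             # Hz; 40 kHz is a common airborne-ultrasound frequency (assumed)

def focus_phases(transducer_xy, focal_point):
    """Phase offset (radians) per transducer so that all emitted
    waves arrive in phase at the focal point."""
    # 3-D positions of the flat array (all transducers in the z = 0 plane)
    positions = np.column_stack([transducer_xy, np.zeros(len(transducer_xy))])
    distances = np.linalg.norm(positions - focal_point, axis=1)
    # Advance each phase in proportion to the path length so the
    # wavefronts coincide at the focus.
    return (-2.0 * np.pi * FREQ * distances / SPEED_OF_SOUND) % (2.0 * np.pi)

# 16 x 16 transducers on a 10 mm pitch, centred on the origin
pitch = 0.01
grid = np.array([(x, y) for x in range(16) for y in range(16)], dtype=float) * pitch
grid -= grid.mean(axis=0)

# Focus 20 cm above the centre of the array
phases = focus_phases(grid, np.array([0.0, 0.0, 0.2]))
```

Modulating the amplitude of the focused carrier at a frequency the skin's receptors respond to is what produces the perceived texture; the sketch above only covers the focusing step.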
To make things even more interesting, the company can combine these haptic effects with real-time video tracking of a user's fingers, so as to follow the user's gestures with a consistent feel, for example progressively defining and unveiling the contour of a large virtual object as the user swipes their fingers across it.
"The feedback force that can be felt ranges from 4 to 10 Pa, maybe the equivalent of tenths of grams, but that is significant enough for our hands' tactile receptors to resolve," Tom Carter, co-founder and CTO of Ultrahaptics, told us.
In blind demos (running the haptic effects without any visual cues), the company claims it can now create invisible yet easily recognized primitive shapes, such as a sphere, a cube, a pyramid, or a cone, floating above the mat.
"In a user study, people were asked to come up to the interface and guess primitive shapes, and they guessed right 80% of the time," Carter said.
There are several ways Ultrahaptics can create these floating shapes.
"We could always produce haptics in a limited space, so people would feel it when they reached where the haptic effect took place. Or, with gesture tracking, we could adapt in 3D where to produce the effects, at the point of your fingertips," explained Carter.
"To create the entire shape, we would need from a hundred to a thousand points at the same time, but with finger-tracking it is much more effective to create only the points of the object that the hands are intersecting with," he clarified.
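The intersection approach Carter describes can be illustrated with a hypothetical sketch (not Ultrahaptics' actual API): given tracked fingertip positions, keep only the ones that lie on the surface of the virtual object, and drive the focal points there.

```python
import numpy as np

def sphere_contact_points(fingertips, centre, radius, tol=0.005):
    """Keep only the tracked fingertip positions (metres) that lie
    within `tol` of a virtual sphere's surface; those become the
    focal points to render, instead of the whole shape at once."""
    tips = np.atleast_2d(np.asarray(fingertips, dtype=float))
    dist_to_surface = np.abs(np.linalg.norm(tips - centre, axis=1) - radius)
    return tips[dist_to_surface <= tol]

centre = np.array([0.0, 0.0, 0.20])   # virtual sphere 20 cm above the array
tips = np.array([[0.0, 0.0, 0.25],    # touching the top of the sphere
                 [0.10, 0.0, 0.20]])  # 10 cm off to the side, not in contact
contacts = sphere_contact_points(tips, centre, radius=0.05)
```

Only the first fingertip survives the filter, so the array would spend its acoustic power on one focal point rather than the hundreds needed to render the full sphere.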
With an operating range of about one meter and a field of view of 60 degrees, the haptic effect can take place within a fairly large dome-like space, and Ultrahaptics has put together a 16 x 16 cm hardware evaluation kit to help partners develop new interface concepts.
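Those two figures define the dome-shaped working volume. A rough sketch of the bounds check, reading the 60-degree field of view as a 30-degree half-angle about the vertical (an assumption about how the figure is quoted):

```python
import math

def in_workspace(x, y, z, max_range=1.0, half_fov_deg=30.0):
    """Rough test that a target point (metres, array at the origin,
    z pointing up) lies inside the dome-shaped working volume:
    within ~1 m of the array and inside the field-of-view cone."""
    r = math.sqrt(x * x + y * y + z * z)
    if r == 0.0 or r > max_range or z <= 0.0:
        return False
    angle = math.degrees(math.acos(z / r))  # angle from the vertical axis
    return angle <= half_fov_deg
```

A point half a meter straight above the array is inside the dome; one 1.5 m up is out of range, and one far off to the side falls outside the cone.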
According to Carter, you could create customizable 3D cockpits. In the automotive industry, for example, the central dashboard knob could be made entirely user-configurable, offering different feels such as a slider, a turning knob, or a swinging arm, and it could also be adapted to different users' morphologies, with longer or shorter arms.
It would take only a little initial training, with guidance through the dashboard's screen to help users get acquainted with the new feel.
Another promising arena for Ultrahaptics is the realm of virtual reality, where VR headset wearers could actually touch and feel the objects in their digital environment without having to wear special gloves.
The company secured seed funding in 2014 and now has a haptic solution ready to be packaged into consumer products.
It could take five to seven years before you get to feel such haptics in a car, according to Carter, but the consumer space could adopt the technology a lot faster, realistically within the next 12 to 18 months.
Visit Ultrahaptics at http://ultrahaptics.com
Article originally posted on EE Times Europe.