While most of us are already familiar with the silent mode of mobile phones, a crude form of haptics (sensing the buzz of a vibrating mass in your pocket), there is much more to come on the display side.
From research labs to startup companies, the race is on to add physically perceptible volumes and textures to whatever is displayed on screen, ranging from a simple keyboard with a "click" feel to the complex rendering of 3D shapes and textures, whether in volume or on a seemingly flat surface.
The EuroHaptics 2014 conference, which took place in Versailles from June 24 to 26, was buzzing with actuators and haptic devices of all sorts. Well over a hundred papers and posters, and dozens of demos, were presented, spanning a spectrum from experimental setups probing human touch perception at one end to tangible haptic interfaces at the other, with plenty of force-feedback encoding schemes in between.
Before sensory information can be put to good use in a haptic interface, one must understand how humans perceive touch, and how our perception and experience of the world affect our individual capacity to discriminate features and objects. A lot of fundamental research goes into understanding the limitations of touch-only haptic devices versus multimodal haptics, where touch is combined with vision and/or sound to provide a better perceptual illusion.
Often, the experiments show that a multisensory interface, closer to how most of us naturally experience real-world objects, provides a much stronger illusion and makes it easier for the end user to manipulate virtual objects. Sometimes they simply show that a two-sense combination (sight and touch, or sound and touch) is the most effective.
Haptic device designers can then exploit these perceptual illusions, creating feedback effects that are felt more strongly than, or differently from, what the interface material should physically provide (for example, feeling a textured shape on a truly flat glass surface).
One of the posters, presented by Anke Brock of the CNRS and University of Toulouse, explored which combinations of flat displays and haptics would best suit visually impaired people for gestural interaction (touch displays typically offer only visual cues).
For the purpose of her investigation, Brock designed an interactive map prototype including a raised-line map overlay for gestural interaction, with contoured buttons for accessing different types of information such as opening hours and distances. The drawings were made in Scalable Vector Graphics (SVG), with a configuration file written in XML so it could be interpreted by the interface application. The physical overlay was painstakingly custom-made and static; ideally, this is an area where dynamically reconfigurable haptic displays could play a bigger role.
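To give a flavor of how an interface application might interpret such a configuration file, here is a minimal sketch in Python. The XML schema below (the `button` and `info` element names, attributes, and sample data) is entirely hypothetical, since Brock's actual file format is not described here; the sketch only illustrates the general idea of mapping touch coordinates on the overlay to spoken or displayed information.

```python
# Hypothetical sketch: parse a map configuration and resolve touches to
# buttons. Element/attribute names and sample values are invented.
import xml.etree.ElementTree as ET

CONFIG = """
<map>
  <button id="townhall" cx="120" cy="80" r="15">
    <info type="opening_hours">Mon-Fri 9:00-17:00</info>
    <info type="distance">350 m</info>
  </button>
  <button id="station" cx="300" cy="210" r="15">
    <info type="opening_hours">Daily 5:00-23:00</info>
    <info type="distance">1.2 km</info>
  </button>
</map>
"""

def load_buttons(xml_text):
    """Map each contoured button to its position, size, and information."""
    root = ET.fromstring(xml_text)
    buttons = {}
    for btn in root.findall("button"):
        buttons[btn.get("id")] = {
            "center": (float(btn.get("cx")), float(btn.get("cy"))),
            "radius": float(btn.get("r")),
            "info": {i.get("type"): i.text for i in btn.findall("info")},
        }
    return buttons

def hit_test(buttons, x, y):
    """Return the id of the button under a touch at (x, y), or None."""
    for name, b in buttons.items():
        cx, cy = b["center"]
        if (x - cx) ** 2 + (y - cy) ** 2 <= b["radius"] ** 2:
            return name
    return None

buttons = load_buttons(CONFIG)
print(hit_test(buttons, 118, 83))  # a touch near the town hall button
print(buttons["station"]["info"]["distance"])
```

In a real prototype, the coordinates would come from the touchscreen beneath the raised-line overlay, and the retrieved information would be passed to a speech synthesizer rather than printed.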