Portland, Ore. -- Digitizing three-dimensional objects today is a tedious process requiring the user to trace an object's outlines using a tethered stylus. And even after laboriously running the stylus over every nook and cranny, the user captures only the object's shape; its other properties cannot be determined.
Now researchers with the Virtual Reality Lab of the State University of New York at Buffalo have created a thimble-like fingertip digitizer that not only eliminates the stylus but also captures an object's material properties (hardness, homogeneity, texture) as well as its shape. Further, the digitizer can double as a universal input device, allowing a machine to interpret a user's gestures.
"Users wearing our device not only can run their finger over objects to capture 3-D shapes, but also can tap, poke, scratch and so forth to collect information about its hardness, consistency, texture and things like that," said the lab's director, Thenkurussi Kesavadas. The Buffalo professor created the device with mechanical and aerospace engineer Young-Seok Kim, who recently completed a doctoral thesis on the work.
After creating the device, the researchers discovered that their biomechanical model for the digitizer allowed it to double as a universal input device. The digitizer can function as a mouse, joystick, trackball or keyboard, as well as turn any computer display into a touchscreen. The fingertip-mounted haptic sensory device can recognize such gestures as pointing, wagging, tapping, scratching, rubbing and palpating.
"Initially we were only designing this as a digitizing device, but it has evolved into a very versatile input device too," said Kesavadas. "We realized that the same things you need to model to know how a finger reacts to objects [it touches] can be used by the computer to tell what a user's intentions are--to recognize gestures."
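The kind of gesture recognition described above can be illustrated with a toy classifier. This is purely a sketch of the idea, not the lab's driver: the feature names and thresholds are hypothetical, standing in for whatever the biomechanical model actually extracts from the force, position and acceleration streams.

```python
# Illustrative sketch (not the lab's actual software): classify a
# fingertip gesture from coarse features of the sensor streams.
# All thresholds here are invented for illustration.

def classify_gesture(peak_force, peak_accel, travel_mm):
    """Return a coarse gesture label.

    peak_force -- peak contact force during the event (N)
    peak_accel -- peak fingertip acceleration (m/s^2)
    travel_mm  -- net fingertip travel during the event (mm)
    """
    if peak_force > 1.0 and peak_accel > 20.0 and travel_mm < 2.0:
        return "tap"        # sharp impact, almost no travel
    if peak_force > 0.5 and travel_mm > 10.0:
        return "rub"        # sustained contact while moving
    if peak_force < 0.2 and travel_mm > 10.0:
        return "point/wag"  # motion in free space, no real contact
    return "unknown"
```

A real implementation would classify over windows of the raw data stream rather than pre-computed peaks, but the division of labor is the same: sensors report raw physics, and the model turns that physics into intent.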
The researchers believe the haptic fingertip digitizer has potential application in areas that require delicate touch, such as art, medical diagnostics, gaming, design and engineering.
The hardware for the fingertip-mounted digitizer is not unique, consisting of a magnetic position sensor, a force sensor and an accelerometer. Those components could be interchanged with similar technologies; indeed, for their second-generation device, the researchers may use an infrared sensor in place of the magnetic position sensor.
What sets the digitizer apart, according to its creators, is that its software drivers use a detailed biomechanical model of the finger. Other approaches, such as conventional touchscreens and pressure pads, use a passive model of the finger, whereas the Buffalo model is active. The model captures not only the shape and orientation of the finger but also how the joints bend; the impedance of the joints; viscoelastic tissue behavior (how the tissues conform to surfaces they touch); and how rubbing, scratching, tapping, squeezing, stroking or gliding a finger over a surface conveys texture.
As data comes in from the digitizer, the modeling algorithm deduces how much the finger is moving, its rate of change, its conformability to the surface and other aspects not explicitly present in the data stream, but deducible from it by virtue of the underlying model.
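The simplest of those derived quantities, the finger's rate of change, can be estimated from the position stream alone by finite differences. The sketch below assumes a fixed sample interval and omits filtering and sensor fusion, which any real driver would need:

```python
# Hedged sketch: estimate speed and change of speed from a sampled
# fingertip position stream by finite differences. A real driver would
# filter the noisy sensor data first; that is omitted here.
import math

def derive_motion(positions, dt):
    """positions: list of (x, y, z) samples; dt: sample interval (s).
    Returns per-interval speeds and per-interval changes in speed."""
    speeds = []
    for p0, p1 in zip(positions, positions[1:]):
        speeds.append(math.dist(p0, p1) / dt)   # distance / time
    rates = [(s1 - s0) / dt for s0, s1 in zip(speeds, speeds[1:])]
    return speeds, rates
```

Quantities like conformability are harder: they are not present in the data at all and can only be inferred through the biomechanical model, which is the point the researchers emphasize.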
"We created a very detailed model of how the joints bend, how the tissues react to touching and many other aspects of the finger movement," said Kesavadas. "For instance, when you tap an object, the finger deforms on its tip because the tissue is soft there."
After gaining experience with the haptic fingertip digitizer, the researchers realized their detailed model enabled them to deduce the precise orientation of a finger moving in free space. So they configured an algorithm that could sense when a finger was pointing and then deduce to which file or folder it was pointing on a desktop's display.
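Geometrically, deducing the pointed-at item amounts to casting a ray from the finger along its pointing direction and intersecting it with the screen plane. The sketch below assumes, purely for illustration, that the screen lies in the plane z = 0 of the tracker's coordinate frame and that icons are axis-aligned rectangles:

```python
# Hedged sketch: ray-cast from the finger onto the screen plane (taken
# to be z = 0, an assumption) and report which icon, if any, is hit.

def pick_icon(finger_pos, direction, icons):
    """finger_pos, direction: (x, y, z) tuples.
    icons: {name: (x0, y0, x1, y1)} rectangles on the screen plane."""
    px, py, pz = finger_pos
    dx, dy, dz = direction
    if dz == 0:
        return None                       # ray parallel to the screen
    t = -pz / dz                          # parameter where ray meets z = 0
    if t < 0:
        return None                       # pointing away from the screen
    hx, hy = px + t * dx, py + t * dy     # hit point on the screen plane
    for name, (x0, y0, x1, y1) in icons.items():
        if x0 <= hx <= x1 and y0 <= hy <= y1:
            return name
    return None
```

The hard part, as the article notes, is upstream of this geometry: recovering the finger's orientation in free space at all, which is what the detailed joint model makes possible.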
"Usually you need some type of hardware built into a display to create a touchscreen," said Kesavadas. "But our device turns any screen into a touchscreen without modifying it in any way."
Having succeeded with the pointing recognition function, the researchers wrote algorithms to recognize gestures that would inform the computer what it should do with the indicated file or folder, Kesavadas said. Linear motion of the finger would move the file or folder on the desktop; a flick would move the file or folder to the trash.
After mastering file and folder manipulation on the desktop, the researchers developed example applications to demonstrate the utility of their invention. In the simplest of those demos, tapping on any surface in the distinctive Qwerty pattern instantly turns that surface into a virtual keyboard. Next, the team created a virtual painting application that accepts fingertip position, force and acceleration as inputs. For example, splashing paint onto a canvas is simulated by jerking the finger forward.
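One plausible reading of the virtual-keyboard demo is that the calibration taps fix a Qwerty grid on the tabletop, after which each tap is mapped to the nearest key center. The layout geometry below (key pitch, half-key row stagger, anchor point) is an assumption for illustration, not the lab's published design:

```python
# Hedged sketch of a tap-to-key mapping for a virtual Qwerty surface.
# Key pitch and row stagger are illustrative assumptions.

QWERTY_ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def build_layout(origin, key_pitch):
    """Return {key: (x, y)} centers for a Qwerty grid anchored at origin."""
    ox, oy = origin
    layout = {}
    for row, keys in enumerate(QWERTY_ROWS):
        for col, key in enumerate(keys):
            # each row is offset half a key, as on a physical keyboard
            layout[key] = (ox + col * key_pitch + row * key_pitch / 2,
                           oy + row * key_pitch)
    return layout

def key_for_tap(tap, layout):
    """Map a tap position to the nearest key center."""
    tx, ty = tap
    return min(layout,
               key=lambda k: (layout[k][0] - tx) ** 2
                             + (layout[k][1] - ty) ** 2)
```

Nearest-center lookup is forgiving of sloppy taps, which matters when there is no physical key edge to guide the finger.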
"We don't need a keyboard anymore; we can just tap on a tabletop, since we are tracking the finger at all times and know its rate of change," said Kesavadas. "And our painting program lets you draw or paint by direct finger touch."
Though Kim has graduated, the fingertip digitizer will continue to be developed at Buffalo's Virtual Reality Lab. The current focus of the research is on more complicated applications for scientific work. For instance, the lab's Fluid Touch application simulates the motion of particles in a fluid by solving the Navier-Stokes equations. Using the vorticity stream-function formulation, Fluid Touch models a fluid surface that the user can touch to create waves.
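At the heart of any vorticity stream-function solver is a Poisson solve: given the vorticity field w, recover the stream function psi from laplacian(psi) = -w. The fragment below shows only that step on a small square grid with psi = 0 on the boundary, using simple iterative relaxation; advection, diffusion and the touch coupling that Fluid Touch adds are omitted.

```python
# Reduced sketch of the stream-function step in a vorticity solver:
# solve laplacian(psi) = -w by iterative relaxation on an n x n grid,
# with psi held at 0 on the boundary. Grid spacing is h.

def solve_stream_function(w, iters=200, h=1.0):
    n = len(w)
    psi = [[0.0] * n for _ in range(n)]
    for _ in range(iters):
        for i in range(1, n - 1):
            for j in range(1, n - 1):
                # 5-point discretization: psi = (neighbors + h^2 * w) / 4
                psi[i][j] = 0.25 * (psi[i + 1][j] + psi[i - 1][j]
                                    + psi[i][j + 1] + psi[i][j - 1]
                                    + h * h * w[i][j])
    return psi
```

The velocity field then follows from spatial derivatives of psi, which is how a user's touch, injected as vorticity, propagates into visible waves.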
"We also want to develop applications for people who, for whatever reason, cannot use traditional input devices like a mouse," said Kesavadas. "We are also striving to make the device much lighter [and to make it] wireless, with hopes of commercializing a haptic fingertip digitizer within three years."
The lab has filed a provisional patent application on the device.