PORTLAND, Ore. A new Kionix software engine can capture and subsequently recognize user-defined gestures. The Gesture Designer, intended to help OEMs design gesture-based interfaces using Kionix's microelectromechanical system (MEMS) accelerometers, widens the potential range of motion-enabled functionality for consumer electronics.
Kionix three-axis accelerometers have built-in algorithms for recognizing a user's tap patterns from six directions (top, bottom and all four sides). These built-in algorithms detect both the direction of the tap and whether it was a single or double tap, thereby enabling 12 possible actions for user control of a consumer device.
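The six-directions-times-two-tap-types arithmetic can be sketched as a small dispatch table. This is a minimal illustration, not Kionix's actual API; the direction names and the command-key format are assumptions made here for clarity.

```python
# Illustrative sketch: enumerating the 12 tap-based command slots
# (6 directions x single/double tap). Names are hypothetical, not
# taken from the Kionix firmware interface.
from enum import Enum

class Direction(Enum):
    TOP = "top"
    BOTTOM = "bottom"
    LEFT = "left"
    RIGHT = "right"
    FRONT = "front"
    BACK = "back"

def tap_command(direction: Direction, double: bool) -> str:
    """Return a command key for one of the 12 possible tap events."""
    prefix = "double" if double else "single"
    return f"{prefix}_{direction.value}"

# Build the full command table: 6 directions x 2 tap types = 12 slots.
commands = {tap_command(d, dbl) for d in Direction for dbl in (False, True)}
```

An OEM would then bind each of these 12 keys to a device operation, such as mapping `double_top` to "skip track" on a media player.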
The Gesture Designer adds the ability to create unique gestures for specific types of consumer gear. OEMs use the software engine to build and manage a library of motions by capturing and analyzing the data stream from the accelerometer while a user performs the gesture. The unique signatures of any number of gestures can be recorded, analyzed and subsequently recognized. Once a gesture is recognized by the onboard recognition engine, OEMs can use the command to execute any operation on their device. Kionix recommends the Gesture Designer to developers of intuitive user interfaces and interactive games.
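The record-then-recognize workflow described above can be sketched with a generic template-matching approach. Kionix's actual recognition engine is proprietary; the sketch below assumes a simple dynamic-time-warping (DTW) comparison of 3-axis accelerometer traces, with a distance threshold to reject unknown motions. The class and method names are hypothetical.

```python
# Hedged sketch of a gesture library: record named accelerometer traces,
# then recognize a new trace by nearest DTW distance. This is a generic
# illustration, not the Kionix algorithm.
import math

def dtw_distance(a, b):
    """Dynamic-time-warping distance between two traces of (x, y, z) samples.
    DTW tolerates gestures performed at slightly different speeds."""
    n, m = len(a), len(b)
    INF = float("inf")
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = math.dist(a[i - 1], b[j - 1])  # Euclidean step cost
            cost[i][j] = d + min(cost[i - 1][j],      # insertion
                                 cost[i][j - 1],      # deletion
                                 cost[i - 1][j - 1])  # match
    return cost[n][m]

class GestureLibrary:
    """Build a library of recorded gestures and match new ones against it."""

    def __init__(self, threshold=2.0):
        self.templates = {}        # gesture name -> recorded trace
        self.threshold = threshold # distances above this are "no match"

    def record(self, name, trace):
        """Store the signature captured during a gesture-recording session."""
        self.templates[name] = trace

    def recognize(self, trace):
        """Return the name of the closest recorded gesture, or None."""
        best_name, best_dist = None, float("inf")
        for name, template in self.templates.items():
            d = dtw_distance(trace, template)
            if d < best_dist:
                best_name, best_dist = name, d
        return best_name if best_dist <= self.threshold else None
```

In use, the recognized name would be routed to a command handler, mirroring how an OEM would map a recognized gesture to any device operation.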
Kionix (Ithaca, N.Y.) is a Cornell University spinoff that pioneered high-aspect-ratio silicon micromachining and was subsequently acquired by Rohm Co. Ltd. (Japan), which operates it as a wholly owned subsidiary.