SAN FRANCISCO — The founder and chief technology officer of Leap Motion shared his views on the future of user interfaces as the company released updated software for its motion sensor.
David Holz said he’s most excited about changing the way humans interact with technology, but he noted that changing people's behavior is not easy. “The slowest part [of developing technology] is people, it’s often a bigger mental shift” than a technical shift, Holz told EE Times.
“The good news is there is very little limiting fast growth, but the bad news is we can’t just make a new processor and have the whole world begin to use gesture control to its fullest.”
Leap Motion uses two ultrawide-angle cameras with global-shutter sensors that run at 120 frames per second. Rather than exposing pixels sequentially, the sensors capture every pixel in a frame at once and process the images afterward. The company claims this approach yields motion tracking fast enough to follow sharp, rapid movements.
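Leap Motion has not published its tracking pipeline, but a two-camera setup makes standard stereo triangulation possible: a point's depth can be recovered from how far it shifts between the two views. The sketch below illustrates only that textbook pinhole-stereo relation; the focal length and baseline values are made-up calibration numbers, not Leap's.

```python
def stereo_depth(focal_px: float, baseline_mm: float, disparity_px: float) -> float:
    """Depth (mm) of a point seen by two horizontally offset cameras.

    focal_px     -- focal length in pixels (hypothetical calibration value)
    baseline_mm  -- distance between the two camera centers
    disparity_px -- horizontal shift of the point between the two images
    """
    if disparity_px <= 0:
        raise ValueError("point must appear shifted between the two views")
    # Classic pinhole-stereo relation: Z = f * B / d
    return focal_px * baseline_mm / disparity_px

# Example: a fingertip with 40 px of disparity, f = 700 px, 40 mm baseline
print(stereo_depth(700, 40, 40))  # 700.0 mm from the cameras
```

At 120 frames per second, each such depth estimate has a budget of roughly 8.3 ms per frame, which is why a fast capture-then-process path matters for tracking sharp movements.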
About 200 applications use the product to deliver gesture control for a wide variety of uses. Grammy-nominated musicians use it to compose music. Surgeons use it to control MRI imagery without having to re-scrub.
Leap recently released a software update to beta testers that adds occlusion robustness: it treats fingers, palms, and arms as parts of a single model that can be tracked even when portions of the hand are out of the controller's view.
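One way to picture the idea, a hedged sketch rather than Leap's actual algorithm, is to track the hand as one model: when a finger drops out of view, its position can still be estimated from the visible palm plus the finger's last-known offset. The `HandModel` class and its update rule below are invented for illustration.

```python
from dataclasses import dataclass, field

Vec = tuple[float, float, float]

def add(a: Vec, b: Vec) -> Vec:
    return (a[0] + b[0], a[1] + b[1], a[2] + b[2])

def sub(a: Vec, b: Vec) -> Vec:
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

@dataclass
class HandModel:
    palm: Vec = (0.0, 0.0, 0.0)
    # last-known fingertip offsets relative to the palm
    offsets: dict[str, Vec] = field(default_factory=dict)

    def update(self, palm: Vec, visible_fingers: dict[str, Vec]) -> dict[str, Vec]:
        """Return a position for every known finger, measured or estimated."""
        self.palm = palm
        # refresh offsets from whatever fingertips were actually measured
        for name, tip in visible_fingers.items():
            self.offsets[name] = sub(tip, palm)
        # occluded fingers reuse their stored offset from the moving palm
        return {name: add(palm, off) for name, off in self.offsets.items()}

hand = HandModel()
hand.update(palm=(0, 0, 0), visible_fingers={"index": (10, 0, 50)})
# index finger now hidden; its estimate follows the palm
est = hand.update(palm=(5, 0, 0), visible_fingers={})
print(est["index"])  # (15, 0, 50)
```

Treating the hand as a whole this way is what lets tracking survive brief occlusions, at the cost of the estimate drifting if a finger actually moves while hidden.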
Holz says Leap's product was conceived out of frustration with existing 3D programs that made modeling on a computer more difficult than using clay. “I could never do 3D modeling, I never had the time to pick up all the skills and shortcuts and tricks.”
With the Freeform app, "I was able to sculpt things in the air to the exact same capability I could do physically, and more. I could undo then grow the clay; I could paint the clay then distort its colors.”
Now Holz wants to supply a platform for new use cases that require complicated, quick movements similar to those needed for 3D sculpting. The most interesting uses for advanced motion tracking are those without existing input devices, where gesture can be the main interface, he told us.
“As output becomes more real, input needs to become more physically inspired.”
— Jessica Lipsky, Associate Editor, EE Times