Another Bell Labs veteran who returned to Taiwan is IEEE Fellow Liang-Gee Chen, who came back to NTU to open the Digital Signal Processing and Integrated Circuit (DSP-IC) Lab. The DSP-IC Lab emulates the visual recognition algorithms of the human brain, casting them into state-of-the-art 28-nanometer signal-processing chips fabricated by TSMC. By modeling the architecture of the human neocortex, the lab is building applications that perform real-time object, face and "action" recognition.
"Many other groups are working on pattern recognition of static objects, but one of our goals is to also recognize actions in real time," said Liang-Gee Chen.
Applications being developed there range from collision-avoidance systems for automobiles to more intuitive gaming to robots that can decide how to accomplish their assigned tasks based on what they see going on around them in the real world. The team is also working on a medical diagnostic chip that can infer that a patient is about to have a heart attack up to an hour beforehand, giving patients wearing 24/7 monitors enough time to reach the nearest hospital and save their lives.
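The article doesn't say how the diagnostic chip makes its inference. One common ingredient in this kind of early-warning system is watching for a collapse in heart-rate variability over a sliding window of inter-beat (RR) intervals; the sketch below illustrates only that idea, with invented thresholds and synthetic data:

import numpy as np

def hrv_alarm(rr_intervals_ms, baseline_sdnn_ms, window=60, threshold=0.5):
    # Flag the first window whose heart-rate variability (SDNN) drops well
    # below the patient's baseline, one crude precursor signal.
    # All parameters here are invented for illustration.
    rr = np.asarray(rr_intervals_ms, dtype=float)
    for start in range(0, len(rr) - window + 1):
        sdnn = rr[start:start + window].std()
        if sdnn < threshold * baseline_sdnn_ms:
            return start  # index of the first suspicious window
    return None

# usage: score roughly an hour of beats against a resting baseline
beats = np.random.normal(800, 50, 3600)           # stand-in RR intervals (ms)
beats[2000:2100] = np.random.normal(800, 5, 100)  # simulated variability collapse
print(hrv_alarm(beats, baseline_sdnn_ms=50))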
Mobile Human-Computer Interface Lab
"Mike" Yen-Yang Chen returned to Taiwan, after a stint at IBM Research, to found his own Mobile Human Computer Interface Lab. There his team designs apps that solve all sorts of common tablet and smartphone human-computer interface problems. For instance, when reading in bed, users who do not know how to lock their screen-orientation often end up propping their heads up to keep the screen from automatically rotating. The Mobile HCI Lab's iRotate app solves that problem by using the front-facing camera to infer at which angle your face is viewing the screen, thus locking screen-orientation to head-orientation even when viewing at an angle.
"First we study the psychology of the user--what they normally do to cause an HCI problem," said Mike Chen. "Then we solve that problem with an app that does what the user expects."
Other apps recently developed in the Mobile HCI Lab include iGrasp, which moves the on-screen keyboard to match where the hands are holding the tablet, so that its keys are always within reach. iGrasp automatically splits the keyboard to keep its keys within reach of both thumbs when the user holds the tablet with two hands, then reunifies it when the user switches to holding the tablet with one hand and typing with the other.
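The published iGrasp heuristics aren't spelled out here, but the behavior described above is easy to caricature: infer the grip from which bezel edges report touch, then pick a split or unified layout. The function below, including its pixel widths, is an invented toy, not the app's actual logic:

def keyboard_layout(left_edge_touch, right_edge_touch, screen_width=1024):
    # Touches on both bezel edges imply a two-handed grip, so split the
    # keyboard into two thumb-reachable halves; otherwise keep it unified.
    if left_edge_touch and right_edge_touch:
        return {"mode": "split",
                "left": (0, 340),  # pixel spans, chosen arbitrarily
                "right": (screen_width - 340, screen_width)}
    if left_edge_touch or right_edge_touch:
        # one-handed grip: dock the unified keyboard toward the typing hand
        x = 0 if right_edge_touch else screen_width - 680
        return {"mode": "unified", "span": (x, x + 680)}
    return {"mode": "unified", "span": ((screen_width - 680) // 2,
                                        (screen_width + 680) // 2)}

print(keyboard_layout(left_edge_touch=True, right_edge_touch=True))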