Do you remember when the only thing you could do with a cellphone was make a phone call? Do you recall how amazing it was to see the user interface on the first iPhones -- the fact that you could turn the phone on its side, and it would detect its change in position and flip the display over to reflect that?
Over the past few years, more and more devices have become equipped with these capabilities. Also, more and more sensors -- often in the form of micro-electromechanical systems -- have been added to things like smartphones and tablets. We are now on the cusp of a new paradigm where sensors, the systems that process data from them, and the applications that use this data dominate the mobile industry.
The thing is that we've only just begun to dip our toes in the water of sensor technology. Accelerometers, magnetometers, and gyroscopes are only the tip of the iceberg. These are being augmented with proximity sensors, sensors that detect gestures, sensors that determine where the user is looking, and a host of environmental sensors for things like ambient light, temperature, humidity, atmospheric pressure, and so forth.
Recently I wrote some blogs about my Fitbit Zip, which counts the number of steps I take throughout the day. I really like my Fitbit, but it has to be said that this gives only a taste of what's possible. Though the Fitbit counts steps, it cannot work out the actual distance I've travelled. Also, it gets confused when I'm on a treadmill or an escalator.
What I'm leading up to is that next-generation devices will be both environmentally and contextually aware. They will know if we are walking, running, or riding a bike. They will know if they are in our hands, in a pocket, or in a purse. They will know if we've fallen over or if we've been involved in a car crash. (If we are not moving, they may call 911 automatically on our behalf.)
Another aspect of all this is sensor fusion, which involves combining the raw data coming out of multiple sensors and processing it to generate useful information. Not so long ago, we were happy if our smartphones could approximate our location to an intersection between two streets. In the not-so-distant future, if we enter a building (say, a mall), walk around, ride an elevator, walk some more, ride an escalator, and walk some more, our smartphones will be able to determine our location (based on the fusion of all the sensor data) to within a few square feet.
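To give a flavor of what sensor fusion means in practice, here's a minimal sketch of one of the classic techniques: a complementary filter that blends a gyroscope's rate data (accurate short-term, but drifts) with an accelerometer's gravity reading (noisy short-term, but stable long-term) into a single tilt estimate. This is purely illustrative Python; it is not QuickLogic's algorithm, and the sample values are made up.

```python
import math

def accel_to_pitch(ax, ay, az):
    """Derive a pitch angle (degrees) from raw accelerometer axes
    (any consistent unit, e.g. g)."""
    return math.degrees(math.atan2(-ax, math.sqrt(ay * ay + az * az)))

def complementary_filter(pitch, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse a gyro rate (deg/s) with an accelerometer-derived pitch (deg).

    Integrating the gyro gives a smooth short-term estimate; blending in
    a small fraction of the accelerometer reading corrects its drift.
    """
    return alpha * (pitch + gyro_rate * dt) + (1 - alpha) * accel_pitch

# Toy run: the device is held still, tilted ~30 degrees, so gravity
# reads as ax = -0.5 g, az = 0.866 g, and the gyro reports no rotation.
pitch = 0.0
for _ in range(300):  # 3 seconds of samples at 100 Hz
    pitch = complementary_filter(pitch, gyro_rate=0.0,
                                 accel_pitch=accel_to_pitch(-0.5, 0.0, 0.866),
                                 dt=0.01)
# pitch converges toward ~30 degrees
```

The same idea, scaled up to many sensors and much fancier math (Kalman filters and the like), is what lets a device turn raw axis readings into "the user just walked up an escalator."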
Of course, to perform these tasks, the sensors and their data processing systems must be always on. This mandates a sensor hub that consumes 1-2 percent or less of the entire system's power budget. Current sensor hub solutions range from ASSPs to MCUs and application processors. ASSPs are power efficient, but their algorithms are frozen in silicon (implemented in fixed logic). Microcontrollers and application processors are software programmable, but they are extremely inefficient in terms of power.
To address this issue, QuickLogic has introduced the ArcticLink 3 S1 -- an ultra-low-power sensor hub solution platform for mobile devices. The S1 platform integrates sensor management and fusion, optimizes application processor communication, and enables always-on context awareness by reducing power consumption to approximately 1 percent of system power.
The idea is that the S1 can be monitoring the sensors and processing data while the main (power-guzzling) application processor snoozes away. Front-end sensor management is performed by a micro-coded state machine, which employs a very low-power I2C master interface to communicate with multi-axis sensors such as accelerometers, magnetometers, gyroscopes, ambient light sensors, and pressure sensors. The sensor manager samples and buffers multiple seconds of sensor data at extremely low power levels while allowing the system's application processor to remain asleep. The S1 wakes the main application processor only when the occasion demands.
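To make the buffer-and-wake idea concrete, here's a toy model of that behavior in Python. The threshold value, buffer depth, and sample rate are all hypothetical numbers of my own choosing, not S1 specifications; the point is simply that the hub accumulates samples cheaply and only interrupts the host when something interesting happens.

```python
from collections import deque

# Hypothetical parameters -- not taken from any S1 datasheet.
WAKE_THRESHOLD_G = 1.5   # accel magnitude that counts as "interesting"
BUFFER_SECONDS = 4
SAMPLE_HZ = 50

class SensorHub:
    """Toy sensor hub: buffer samples while the host sleeps, and wake
    the host (with the buffered history) only on significant motion."""

    def __init__(self):
        self.buffer = deque(maxlen=BUFFER_SECONDS * SAMPLE_HZ)
        self.host_awake = False

    def on_sample(self, accel_magnitude):
        """Called once per sensor sample. Returns the buffered history
        when waking the host, or None while the host stays asleep."""
        self.buffer.append(accel_magnitude)
        if accel_magnitude > WAKE_THRESHOLD_G and not self.host_awake:
            self.host_awake = True        # fire the wake-up interrupt
            return list(self.buffer)      # hand over the buffered samples
        return None

# Quiet samples are absorbed silently; a spike wakes the host.
hub = SensorHub()
for _ in range(10):
    hub.on_sample(1.0)          # host stays asleep
history = hub.on_sample(2.0)    # host wakes with the full history
```

Handing over several seconds of buffered history matters: when the application processor finally wakes, it can see the motion that led up to the event, not just the single sample that triggered the interrupt.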
The sensor manager is augmented by an ultra-low-power CISC-based arithmetic logic unit for real-time sensor data processing. This, in turn, is augmented by an embedded array of reprogrammable logic. The integrated flexible fusion engine is specifically designed to process, filter, and interpret sensor data before it goes to the host application processor.
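What "process, filter, and interpret before it goes to the host" might look like at its simplest is something like a low-pass filter that smooths out sensor noise. The sketch below is a plain moving average, chosen only because it's the most basic example of the category; the S1's fusion engine obviously runs far more sophisticated (and reprogrammable) logic.

```python
def moving_average(samples, window=8):
    """Smooth a list of sensor readings with a running average --
    the simplest kind of low-pass filtering a fusion engine might
    apply before forwarding data to the host processor."""
    out = []
    acc = 0.0
    for i, s in enumerate(samples):
        acc += s
        if i >= window:
            acc -= samples[i - window]   # drop the oldest sample
        out.append(acc / min(i + 1, window))
    return out

# A single-sample spike gets spread out and attenuated.
smoothed = moving_average([0.0, 0.0, 0.0, 4.0, 0.0, 0.0, 0.0, 0.0], window=4)
```

Doing this kind of work in dedicated low-power hardware, rather than on the application processor, is exactly what lets the big processor stay asleep.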
If we were living in the Star Wars universe, I would be tempted to say that I can feel a disturbance in the Force. Sensors have been around in one form or another for hundreds of years. They have evolved and become more sophisticated over time, but this generally has been a gradual process. In recent years, however, developments in microminiaturization and sensor technology have occurred at blinding speed, and even more incredible sensors are poised to leap to center stage in the coming months.
I think that QuickLogic has cleverly positioned itself to be in the perfect place at the perfect time. The ultra-low-power S1 sensor hub solution platform is ideally targeted at mobile devices, but I think we will also see it appearing in a wide range of other products. The really interesting thing here is that I believe we have little conception as to how all this technology will change our lives in the years to come. I, for one, cannot wait.