Wearable computers such as glasses open the door to new user experiences in contextual awareness, but they need new power architectures to drive them.
Contextual awareness will likely emerge as one of the most exciting new user experiences enabled by wearable products. It happens when a mobile device, carried or worn, senses the user’s surroundings and presents information, offers advice, or controls itself and other devices according to that specific environment.
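At its simplest, that sense-then-act pattern can be sketched as a small rules engine that maps a sensed context to a suggested action. The context fields, rule conditions, and action strings below are illustrative assumptions for the sketch, not any shipping wearable API:

```python
from dataclasses import dataclass

# Hypothetical snapshot of what the device has sensed; the field
# names here are invented for illustration.
@dataclass
class Context:
    location: str      # e.g. "museum", "street", "office"
    moving: bool       # is the user walking?
    time_of_day: str   # "morning", "afternoon", "evening"

def suggest(ctx: Context) -> str:
    """Map a sensed context to a suggested action -- the core of a
    minimal rules-based contextual-awareness engine."""
    if ctx.location == "museum" and not ctx.moving:
        return "show exhibit description"
    if ctx.location == "street" and ctx.moving:
        return "overlay walking directions"
    if ctx.location == "office" and ctx.time_of_day == "evening":
        return "suggest commute route home"
    return "no suggestion"
```

A real system would replace the hand-written rules with learned models, but the shape is the same: sensors feed a context record, and the context record selects the experience.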
This new experience will influence how users see the world, affecting their interactions with other types of screens, some of which have not even been invented yet. New styles of personal screens, such as smart glasses and smart watches, are already popping up like weeds.
A screen that you wear becomes an extension of you, making it more natural and personal than a smartphone you carry. So look into glasses platforms and you just might see the future, or at least the future of the mobile platform.
With glasses an image can be projected right into users’ eyes, overlaid on their view of the real world. This merging of real and virtual realities will create a bit of an Alice in Wonderland world where reality is altered like never before.
The first major use of augmented reality will likely be location-based services. Descriptions and information about the user’s current neighborhood, a museum exhibit, the building, the product the user is looking at, or the person the user is talking to (which is sort of creepy) will be available at the literal blink of an eye.
Connected glasses can provide context for people you meet.
Major mobile handset makers are already patenting contextual awareness methodologies. The implementations are rudimentary right now, but point to a future of augmented reality.
Augmented reality is all about creating a more natural interaction with the digital world while living an analog life. A simple contextual-awareness application could be a phone that recognizes what the user is doing and presents options and services that make sense at that moment.
Just imagine a smarter Siri that knows where you are and suggests what you might want to do. Now imagine a phone that has learned your particular walking gait, senses when someone else has taken it, and locks itself down until that person is authenticated. There is already a smartphone on the market that keeps the screen lit while the user has eye contact with it.
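A gait check like the one imagined above could be sketched, in rough form, as comparing a few summary features of the accelerometer signal against an enrolled profile. The features, tolerance, and function names here are illustrative assumptions, not any vendor’s algorithm:

```python
import math

def gait_features(samples, rate_hz=50):
    """Extract crude gait features from a window of accelerometer
    magnitudes: mean level, standard deviation, and step rate
    estimated by counting upward crossings of the mean."""
    n = len(samples)
    mean = sum(samples) / n
    std = math.sqrt(sum((s - mean) ** 2 for s in samples) / n)
    # Count transitions from below the mean to at-or-above it
    # as a rough proxy for steps.
    steps = sum(1 for a, b in zip(samples, samples[1:]) if a < mean <= b)
    step_rate = steps * rate_hz / n  # estimated steps per second
    return (mean, std, step_rate)

def matches_profile(features, profile, tolerance=0.25):
    """Return True if each feature is within `tolerance` (relative)
    of the enrolled profile; otherwise the phone should lock."""
    for f, p in zip(features, profile):
        if p == 0:
            if abs(f) > tolerance:  # fall back to absolute check
                return False
        elif abs(f - p) / abs(p) > tolerance:
            return False
    return True
```

In use, the device would enroll the owner by recording `gait_features` over known-good walks, then periodically run `matches_profile` on fresh sensor windows; a production system would use far richer features and a learned classifier, but the enroll-then-compare loop is the essence of the idea.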
These new platforms present many new opportunities to pursue innovation in miniaturization, power management, connectivity, sensing, and control. Their requirements play right into the hands of semiconductor innovators, most notably those that make power products.
Industrial design is paramount in wearable products such as glasses. These new form factors will put a lot of stress on a product’s physical design, so routing power will become more complicated.
In smartphones the trend until now has been to gather power blocks in one place: the power-management IC (PMIC). However, the PMIC’s monopoly is being challenged as power functions are disintegrated and spread around the board in novel configurations.
Disintegrated PMICs, which are starting to be called micro-PMICs, make sense. They are already appearing in modular phone concepts from companies like ZTE and Motorola, and there will be others. While these concepts are exotic and experimental, they point to a more distributed, decentralized power architecture well suited to tomorrow's wearable devices.