Users are becoming more knowledgeable and are likely to move beyond just counting megapixels; dynamic range, shutter speed, zoom, etc. are likely to move to the front. Consumers will need to be educated, but many already have experience with digital cameras.
One interesting side effect of more (and smaller) pixels is the lower light level (fewer photons) that each pixel can sense. I don't think that Moore's Law applies to image sensors.
Yes. Pixel-rate processing will likely be replaced by metadata, in which the pixels are preprocessed depending on the application. Professor Ishikawa maintains that the sensors, system architecture, and algorithms are all interrelated and need to be optimized together to achieve high-frame-rate applications. More on this in a future blog.
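As a rough sketch of what "pixels preprocessed into metadata" could mean in practice (the function name, threshold, and centroid-based reduction below are my own illustration, not from the post): a high-frame-rate smart sensor might emit a few numbers per frame instead of streaming every pixel.

```python
# Hypothetical sketch: reduce a frame to compact metadata -- the
# centroid and area of bright pixels -- instead of shipping raw pixels.
# Useful for high-frame-rate tracking, where bandwidth is the bottleneck.

def frame_to_metadata(frame, threshold=128):
    """Reduce a 2-D list of pixel values to (centroid_x, centroid_y, area)."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, value in enumerate(row):
            if value >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # nothing bright enough in this frame
    area = len(xs)
    return (sum(xs) / area, sum(ys) / area, area)

# A tiny 4x4 frame with a bright 2x2 blob in the top-left corner
frame = [
    [200, 210, 0, 0],
    [190, 255, 0, 0],
    [0,   0,   0, 0],
    [0,   0,   0, 0],
]
print(frame_to_metadata(frame))  # (0.5, 0.5, 4)
```

Three numbers per frame instead of thousands of pixels is the kind of reduction that makes kilohertz frame rates tractable downstream.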
I don't think presbyopia (or its electronic equivalent) will be a problem. The active systems that take control have radar and other sensors to complement the image sensors, plus diagnostics for the system. Surely mechanics will need to become more like applications engineers, with specialization likely.
Agreed. Embedded vision in automobiles is pretty cool, as long as it works. What happens when the car ages? Does its vision get bad? I suppose the systems will need tuning or testing every so often. Mechanics will have to become embedded systems engineers.
Quite agree; image sensors are one of the big things in the electronics industry. Now that smartphones are almost a necessity, image sensors will play a great role. Ever greater image clarity is becoming a requirement, and there are so many applications where these sensors are used, many of them critical.
NASA's Orion Flight Software Production Systems Manager Darrel G. Raines joins Planet Analog Editor Steve Taranovich and Embedded.com Editor Max Maxfield to talk about embedded flight software used in Orion Spacecraft, part of NASA's Mars mission. Live radio show and live chat. Get your questions ready.