(NEW YORK) I'm on a train heading up to Boston for the annual SID conference -- the Society for Information Display -- taking place this week (it's still going on, if you're nearby). More on this after I actually see the convention floor, which opens tomorrow, but one big motivation for the trip was Motorola's announcement a few weeks ago that they'll be showing their new carbon nanotube (CNT) emissive display screen -- see Motorola Says New Nanotube Technology Will Shrink Flat-Panel Display Costs. At CES in January, Canon and Toshiba held a highly secretive demo of their similar SED screen technology, which I missed. This time, I want to see one of these new flat-panel, CRT-like displays with my own eyes. More on that later.
Meanwhile, I've got CMOS image sensors on my mind. Once clearly inferior to CCD image sensors, CMOS has made big improvements of late -- so much so that both Ikegami and Sony are using them in their latest high-end cameras (see CMOS Image Sensors Come of Age).
Nothing new to report from last week's Streaming Media East conference in New York, except that after many years of lurking in the background, IPTV now appears to be all the rage. Despite what the early-adopter "streamer" community thinks, however, it still appears that IPTV has a tough road ahead (see Telcos Face Tough Road Deploying IPTV: Report).
The great thing about video is that there's always something new: on the image-acquisition side, on the display side, and in practically everything that processes and distributes the signal in between.
What's your opinion of CMOS versus CCD image sensors? Do CMOS Image Sensors Match CCD Quality? Share your thoughts in our Video/Imaging DesignLine Forum.