Increasing the pixel count of an image sensor is often achieved by shrinking each pixel: resolution goes up, but sensitivity usually drops and noise rises. Finding the right balance between resolution and sensitivity for a given application is important.
The iPhone 3, with a 3.5" screen, has a resolution of 320x480; the iPhone 4 has 640x960, doubling its predecessor's linear resolution, and Apple quotes 326 pixels per inch. For a 40-inch TV, the pixel density is roughly 54 pixels per inch. The distance between viewer and TV should be taken into account as well. Nonetheless, the industry will likely push resolution further because, soon, current HDTVs will have very little profit margin.
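The pixel-density figures above can be checked with a quick calculation; a minimal sketch using the screen sizes and resolutions quoted in the thread:

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch from a display's pixel dimensions and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# iPhone 4: 640x960 on a 3.5" diagonal -> roughly 330 ppi
# (Apple quotes 326; the small gap comes from exact panel dimensions)
print(round(ppi(640, 960, 3.5)))

# 40" 1080p TV -> roughly 55 ppi, close to the ~54 quoted above
print(round(ppi(1920, 1080, 40)))
```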
Just as in optical lithography, where sub-resolution features on a mask still enhance the quality of the printed feature, I think more pixels will render better images and better zooming capabilities, but at a reduced return on investment.
The Apple iPad is meant to have a "retina" display, i.e., at the expected viewing distance its resolution matches or exceeds that of the human retina. That display is roughly 2 megapixels. Now, I know that with print systems you need a bit more resolution than this, but it would still imply that unless you are so close to the image that you cannot see the whole of it, 16 megapixels is rather excessive.
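The "retina" claim can be made concrete with the common one-arcminute rule of thumb for visual acuity: a display counts as "retina" once a single pixel subtends less than about 1/60 of a degree at the viewing distance. A hedged sketch (the 1-arcminute figure and the 12-inch viewing distance are my assumptions, not from the post):

```python
import math

ARCMIN = math.radians(1 / 60)  # ~1 arcminute, a common visual-acuity limit

def retina_ppi(viewing_distance_in):
    """Minimum pixel density at which individual pixels become unresolvable."""
    pixel_pitch_in = viewing_distance_in * math.tan(ARCMIN)
    return 1 / pixel_pitch_in

# At a typical ~12" phone/tablet viewing distance this gives ~286 ppi,
# which is roughly why 300+ ppi displays get marketed as "retina"
print(round(retina_ppi(12)))
```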
In photography, of course, you might crop away much of the image after taking it, reducing the total number of pixels used. The Nokia phone has a lot of pixels, but most of the time it is either averaging them together or giving the user only part of the image. This saves the cost of an expensive zoom lens: you just crop to a smaller part of the sensor and enlarge that for display. However, the more pixels, the less light per sensor cell, and the grainier the results are likely to be, unless the sensor is so big as to be uneconomic.
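The two readout modes described, averaging neighbouring pixels together and cropping to a central region for lens-free "zoom", can be sketched as plain-Python operations on a 2-D array of pixel values (the function names are mine, purely illustrative):

```python
def bin2x2(img):
    """Average each 2x2 block of pixels: quarter the pixel count, less noise."""
    h, w = len(img), len(img[0])
    return [[(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) // 4
             for x in range(0, w, 2)]
            for y in range(0, h, 2)]

def center_crop(img, crop_h, crop_w):
    """Read out only the central region of the sensor ("zoom" by cropping)."""
    h, w = len(img), len(img[0])
    y0, x0 = (h - crop_h) // 2, (w - crop_w) // 2
    return [row[x0:x0 + crop_w] for row in img[y0:y0 + crop_h]]

# A toy 4x4 "sensor":
img = [[1, 2, 3, 4], [5, 6, 7, 8], [9, 10, 11, 12], [13, 14, 15, 16]]
print(bin2x2(img))          # 2x2 image of block averages
print(center_crop(img, 2, 2))  # central 2x2 region, full pixel pitch
```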
I would think that increased pixel counts and resolution will be helpful for large-format displays like movie theatres and large printed posters. At some point I do wonder about the need for more pixels, but a higher frame rate is always welcome!
I wonder if HD video already goes beyond what human beings can really resolve with the naked eye! Maybe such high resolution is good for machine vision, but that is a limited market in which I wonder whether OmniVision has any interest.
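One way to test that intuition: using the same ~1-arcminute visual-acuity figure (an assumption on my part, not from the post), you can solve for the viewing distance beyond which individual pixels on a 1080p set blur together:

```python
import math

ARCMIN = math.radians(1 / 60)  # approximate limit of human visual acuity

def max_useful_distance_in(diagonal_in, width_px, height_px):
    """Distance beyond which the eye can no longer resolve single pixels."""
    pixel_pitch_in = diagonal_in / math.hypot(width_px, height_px)
    return pixel_pitch_in / math.tan(ARCMIN)

# For a 40" 1080p TV this comes out around 62 inches (about 5 feet);
# viewers sitting further back than that gain little from extra resolution.
print(round(max_useful_distance_in(40, 1920, 1080)))
```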
Do yourself a favour and go and look at red.com
4K and 5K movie/still cameras, 4K video projectors.
Being used by Peter Jackson to film The Hobbit at 48 fps in 4K. Also for The Great Gatsby: http://www.red.com/news/the-great-gatsbys-first-trailer
Make sure to have a look at their RED reels (if your PC can handle them).
Doesn't Nokia have a phone with a 48-Mpixel camera?