Bert, I think the point of confusion is the difference between decoding compressed video and post-processing of the resulting uncompressed frames. Clearly, every smartphone and tablet can correctly decode H.264 video, so those devices are correctly handling the motion vectors to decode predictive & interpolated frames. Those algorithms, as you said, are ubiquitous.
But it seems that in the mobile world, decoding the compressed video and basic FRC is pretty much where the video processing ends, whereas in a TV set, BD player or set-top box, there is typically another stage of video processing that occurs before sending pixels to the LCD. It seems that this post-decoder stage -- "display processing" -- is what Pixelworks is addressing for mobile devices.
It does seem hard to believe that in 2014 our mobile devices don't have this type of video enhancement that has been common in TV sets for so many years and is in fact one of the differentiators between one TV make/model and another. Perhaps this is because it is only recently that tablets & smartphones became very commonly used as mainstream video content consumption devices -- even becoming "first screens" for many people. Until recently, the demand for a high quality video experience on a mobile just wasn't there, or at least not to the degree it was there for the TV set market.
Junko, understood that there are better and worse ways of doing motion estimation and frame rate conversion. However, these algorithms are still required in MPEG decoders. So although many companies have gotten out of that work, those algorithms are still ubiquitous, and some companies are still producing them.
Examples: Motion vectors are part of all MPEG decoders. Between the MPEG intra-coded frames (I-frames) there are a number of predictive and interpolated frames; that's why MPEG is so much more efficient than motion JPEG. The encoder needs motion estimation to generate the motion vectors that formulate the predictive frames (P-frames) and the bidirectionally interpolated frames (B-frames), and the decoder then applies those vectors as motion compensation.
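To make the block-matching idea concrete, here is a toy full-search motion estimator in Python. This is purely illustrative, not any shipping codec's algorithm; the block size, search range, and synthetic frames are arbitrary choices.

```python
# Toy encoder-side motion estimation: full-search block matching that
# minimizes the sum of absolute differences (SAD). Illustrative only --
# block size, search range, and the synthetic frames are arbitrary choices.

def sad(ref, cur, bx, by, dx, dy, bs):
    """SAD between the current block at (bx, by) and the reference
    block displaced by the candidate vector (dx, dy)."""
    return sum(abs(cur[by + y][bx + x] - ref[by + y + dy][bx + x + dx])
               for y in range(bs) for x in range(bs))

def best_motion_vector(ref, cur, bx, by, bs=4, search=2):
    """Exhaustively search a (2*search+1)^2 window; return the (dx, dy)
    candidate with the lowest SAD."""
    h, w = len(ref), len(ref[0])
    best, best_cost = None, float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            # skip candidates that would read outside the reference frame
            if not (0 <= by + dy <= h - bs and 0 <= bx + dx <= w - bs):
                continue
            cost = sad(ref, cur, bx, by, dx, dy, bs)
            if cost < best_cost:
                best_cost, best = cost, (dx, dy)
    return best

# Synthetic 8x8 frames: cur is ref shifted right by one pixel, so the best
# vector points one pixel back into the reference frame.
ref = [[(x * 7 + y * 13) % 17 for x in range(8)] for y in range(8)]
cur = [[ref[y][x - 1] if x > 0 else 0 for x in range(8)] for y in range(8)]
mv = best_motion_vector(ref, cur, 2, 2)
print(mv)  # -> (-1, 0)
```

Real encoders replace the exhaustive search with fast hierarchical or diamond searches, since full search over every candidate is far too expensive at video rates.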
Also, you need these techniques for the 100 Hz and 120 Hz TV sets on the market these days.
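To sketch what those 100/120 Hz sets do: a motion-compensated FRC engine synthesizes a frame halfway between two decoded frames by fetching pixels along the motion vector. The sketch below is heavily simplified, assuming one uniform, caller-supplied vector; real MEMC estimates per-block vectors and handles occlusions.

```python
# Toy motion-compensated frame interpolation: build the in-between frame
# by sampling halfway back along the motion vector in the previous frame
# and halfway forward in the next frame, then averaging. One uniform,
# caller-supplied vector -- a deliberate simplification of real MEMC.

def clamp(v, lo, hi):
    return max(lo, min(hi, v))

def interpolate_midframe(prev, nxt, dx, dy):
    """Synthesize the frame midway between prev and nxt, assuming the
    whole scene moved by (dx, dy) pixels from prev to nxt."""
    h, w = len(prev), len(prev[0])
    mid = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # sample half the displacement back in prev, forward in nxt
            a = prev[clamp(y - dy // 2, 0, h - 1)][clamp(x - dx // 2, 0, w - 1)]
            b = nxt[clamp(y + dy - dy // 2, 0, h - 1)][clamp(x + dx - dx // 2, 0, w - 1)]
            mid[y][x] = (a + b) // 2
    return mid

# A bright pixel moves two columns right between frames; the interpolated
# frame places it in the middle, one column over.
prev = [[0, 0, 9, 0, 0, 0]]
nxt = [[0, 0, 0, 0, 9, 0]]
print(interpolate_midframe(prev, nxt, 2, 0))  # -> [[0, 0, 0, 9, 0, 0]]
```

The hard part in practice is not this blend but estimating reliable per-block vectors and deciding what to do where objects appear or disappear between frames, which is where TV makers differentiate.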
Perhaps the issue is that smartphones and tablets have been getting by with sub-par MPEG decoders so far, and that the decoders in common use in TV sets are too power hungry to just plug into these handheld devices?
Also, the upcoming H.265 (HEVC) standard will need more advanced forms of motion estimation, much like H.264 did compared with H.262 (MPEG-2 compression).
I guess I'm puzzled about the notion that this is a "lost art." The companies producing MPEG decoder chips can't have lost that art, even if the initial large number of companies might have been culled down to a few survivors.
You are right Junko. I worked on video chips for two different major semiconductor companies. Both ended up scrapping the projects and getting out of that business. There was a lot of learning and tweaking of the algorithms to get the best picture quality when decompressing the video stream. It's a shame to lose that learning, but that happens in a competitive market.
But apparently, according to a few video experts I talked to, "there are a billion different ways to up convert video, whether it is a motion estimation, motion vector or others."
In other words, there are many different ways to enhance video processing algorithms, and most consumer chip companies -- no longer in the TV SoC business -- are not investing in those things any more.
I'm assuming "motion estimation motion compensation" and "frame rate conversion," which are part of MPEG decoders. Could this not be a case of a maturing technology, where a few companies survive and do the lion's share of the chip manufacturing? Both of those techniques should be ubiquitous, just to decode the various mandatory modes for ATSC and DVB digital TV, I would think. No? Even if not yet in tablets and smartphones.
When I used to cover TV chip companies (and MPEG video compression standards, etc.), I had always thought MEMC is a well understood technology. What I didn't realize until recently is that the knowledge base of MEMC and FRC, etc., has been dispersed (if not lost) as more chip vendors get out of the very tough TV SoC business.
Clearly, as things like UHDTV get rolled out, updated MEMC and FRC IP is in fresh demand. But apparently very little investment has been made in that area in recent years, creating an opening for a company like Pixelworks, a display processing specialist, to find its way not just into big-screen UHDTVs but also into smaller tablet and smartphone devices.