VP8 was introduced long after H.264, so it was at a disadvantage from the start. This time around, it seems that H.265 and VP9 will emerge at about the same time, and both should be available for UHDTV. It would be nice to see them compared in terms of compression efficiency, measured as image quality versus bit rate and, for any given image quality, the CPU resources required for encoding and for decoding.
I'm not sure whether HEVC is the only option worth discussing this time around.
The biggest issue is really frame rate: current implementations only support 50/60Hz, when in reality the large frame size doesn't suit low frame rates. There is research going on into higher frame rates (120/150Hz), but I would really like to see the industry take a longer-term view and build toward 300Hz.
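One appeal of 300Hz worth noting: it is the least common multiple of the two broadcast frame-rate families, so both 50Hz and 60Hz content divide into it evenly. A trivial sketch of that arithmetic:

```python
from math import gcd

def lcm(a, b):
    """Least common multiple of two positive integers."""
    return a * b // gcd(a, b)

# The smallest rate that both 50Hz and 60Hz content
# divide into without frame-rate conversion artifacts:
print(lcm(50, 60))  # 300
```

That common-multiple property is one reason 300Hz keeps coming up in longer-term broadcast research, rather than yet another rate that favors one region's legacy standards over the other's.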
I am glad that I am not the only one traumatized (no, not really) by the tulips and windmills video! (Well, as a reporter covering the field, I was shown the same video over and over at every MPEG demo I attended...)
I always thought the standards body's "tool box" strategy genius. But oh boy, there appears to be so much flexibility in HEVC encoding.
It may take some time for vendors to "settle on a few subsets," though.
Ah yes, the tulips and windmills! There were a few other streams I watched so many times that they are forever etched into my memory.
On a more serious note, a standard verification suite will help, even if it isn't yet an official standard. Eventually, within the vast space of options on the encoder side, people will hopefully settle on a subset of tools that work well for most UHDTV delivery schemes.
When I was interviewing Argon Design, two things popped up.
The company explained:
...In HEVC, multi-core chips can encode streams in independent tiles, while hardware implementations may choose to minimize cache sizes by using wavefront encoding. However, decoders need to support all options.
...HEVC has moved from a fixed partitioning scheme to a quad-tree decomposition scheme. This means there are orders of magnitude more variations of block sizes for coding/transform/prediction.
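The "orders of magnitude more variations" point can be made concrete with a rough back-of-the-envelope count. The sketch below counts only split/no-split decisions in the coding quad-tree (it ignores prediction and transform partitioning, so the real mode space is far larger still), assuming a 64x64 coding tree unit and an 8x8 minimum coding unit:

```python
def partitionings(size, min_size=8):
    """Count the distinct quad-tree partitionings of a size x size
    block into coding units no smaller than min_size x min_size."""
    if size == min_size:
        return 1  # leaf only: cannot split further
    # either keep the block whole, or split into four quadrants,
    # each of which is partitioned independently
    return 1 + partitionings(size // 2, min_size) ** 4

for s in (16, 32, 64):
    print(s, partitionings(s))
# 16 -> 2, 32 -> 17, 64 -> 83522
```

So a single 64x64 CTU already admits 83,522 distinct coding-tree shapes, versus exactly one fixed 16x16 macroblock grid in H.264. That is the encoder search space vendors have to prune, and the decoder conformance space they have to cover.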
Now both of these things pointed out above tell me that yes, verifying your decoder against your own encoder may work, but it certainly doesn't guarantee that your decoder can support the various encoders that will soon be out there! That sure sounds like "unintended consequences" waiting to happen.
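For readers unfamiliar with the wavefront option Argon Design mentioned: under wavefront parallel processing, each CTU row can start once the row above is two CTUs ahead, which lets rows proceed in parallel along a diagonal front. A toy schedule, under the simplifying assumptions that each CTU costs one time step and that a CTU waits only on its left neighbour and the CTU above-and-to-the-right:

```python
def wpp_schedule(rows, cols):
    """Earliest finish step of each CTU under a simplified wavefront
    dependency model: CTU (r, c) waits on (r, c-1) and (r-1, c+1),
    and every CTU takes one time step to process."""
    t = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            left = t[r][c - 1] if c > 0 else 0
            above_right = t[r - 1][min(c + 1, cols - 1)] if r > 0 else 0
            t[r][c] = 1 + max(left, above_right)
    return t

t = wpp_schedule(4, 6)
print(t[-1][-1])  # 12 steps with wavefront, vs 4 * 6 = 24 serially
```

Each row lags the one above by two steps, so with enough cores the picture finishes in roughly (rows x 2 + cols) steps rather than rows x cols. The catch, as the comment above notes, is that a conforming decoder has to handle streams encoded with tiles, with wavefronts, or with neither.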