The HD QoS dilemma comes down to this: Does high definition represent an improvement over standard definition, or is it more a tradeoff?
I don't claim to have "golden eyes," but I've certainly looked at enough video encode/decode samples to claim bronze status. Let's put it this way -- when I watch TV, I may not be looking for artifacts, but I do notice them when they hit me over the head. Which brings me to the subject of this blog, possibly my last for a while (see below), so it seems fitting to return to an old refrain here: quality of service.
A few nights ago, having just finished a big project, I gave myself the rare luxury of watching some late-night TV -- an episode of "The Late Show with David Letterman." Don't get me wrong; it's not that I never have the TV on, but usually it's in the background while I'm working or doing chores. This time I actually watched the program, in HD, via Time Warner cable-TV. What I saw, from a QoS perspective, was horrendous. At least once per minute, and often four or five times per minute, there were glaring problems with the image. Sometimes the picture would freeze for several frames. Sometimes it would exhibit artifacts where you'd expect them, amid fast-moving hands or other objects. Sometimes the picture would break up locally in seemingly random, visually unchallenging spots. But it kept happening, over and over, throughout the entire program. At one point I got so fed up that I switched to the standard-definition signal, which thankfully exhibited no such problems.
Clearly the engineers at the CBS television network would never let such a poor HD signal leave the studios, so my gut tells me Time Warner Cable, with its constrained constant-bit-rate channel bandwidth, is the culprit (some reports have measured U.S. cable-TV's MPEG-2 CBR HD signals at roughly 10 megabits per second). And if it's happening on my local cable system, it's happening on other cable systems everywhere (see the USA Today article).
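To get a feel for just how constrained that is, here's a quick back-of-the-envelope sketch in Python. The frame rate, bit depth, and chroma subsampling are my assumptions -- typical 1080i broadcast parameters -- not figures from Time Warner:

```python
# Back-of-the-envelope: how hard must an MPEG-2 encoder squeeze a 1080i
# signal to fit a ~10 Mbit/s CBR cable channel?  Assumes 8-bit, 4:2:0
# video at ~30 frames/s -- typical 1080i broadcast parameters.

width, height = 1920, 1080
frames_per_sec = 30           # 1080i: 60 interlaced fields ~= 30 full frames/s
bits_per_pixel = 8 * 1.5      # 8-bit luma plus 4:2:0 chroma averages 12 bits/pixel

uncompressed_bps = width * height * bits_per_pixel * frames_per_sec
channel_bps = 10e6            # the roughly 10 Mbit/s figure cited in those reports

print(f"Uncompressed 1080i: {uncompressed_bps / 1e6:.0f} Mbit/s")
print(f"Compression needed at 10 Mbit/s: {uncompressed_bps / channel_bps:.0f}:1")
# Roughly 746 Mbit/s uncompressed, i.e. about a 75:1 squeeze -- which
# leaves a CBR MPEG-2 encoder very little headroom when the scene gets busy.
```

No wonder the picture falls apart when Letterman waves his arms: a fixed 10 Mbit/s pipe has nothing extra to give on the hard frames.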
The HD QoS dilemma comes down to this fundamental question: Does high definition represent an improvement over standard definition, or is it more a tradeoff?
Back in the late 1980s, when the U.S. first contemplated the HD transition and there wasn't even talk of DTV -- HD was initially presumed to be analog, as the pioneering Japanese NHK "HiVision" system had been -- HD was expected to represent a total improvement. HD would offer everything standard def had, plus higher resolution and a wider screen.
Two decades later, in practice, things have proved to be more complicated. HD does have more detail and the wider screen, but on cable-TV it can come at the expense of image artifacts. Lots of image artifacts.
This conundrum isn't limited to cable-TV. With camcorders, consumers and camera designers face a similar HD tradeoff: you can use a higher-resolution image sensor, but it comes at the expense of low-light sensitivity. This is just basic image-sensor physics -- for a given sensor size and technology, more pixels means less surface area per pixel. Storage, of course, presents yet another set of HD tradeoffs.
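To put a rough number on the sensor tradeoff, here's a quick sketch. The sensor dimensions are an assumption -- a nominal 1/3-inch imager -- not the specs of any particular camcorder:

```python
# Illustration of the pixel-size tradeoff: the same imaging area divided
# among SD vs. HD pixel counts.  The dimensions below are an assumed
# nominal 1/3-inch sensor (about 4.8 mm x 3.6 mm active area), not any
# particular camcorder.

sensor_w_um = 4.8 * 1000      # sensor width in microns
sensor_h_um = 3.6 * 1000      # sensor height in microns
sensor_area_um2 = sensor_w_um * sensor_h_um

pixel_counts = {
    "SD  (720 x 480)":  720 * 480,
    "HD (1920 x 1080)": 1920 * 1080,
}

for name, pixels in pixel_counts.items():
    print(f"{name}: {sensor_area_um2 / pixels:5.1f} square microns per pixel")
# The HD pixel ends up with roughly one-sixth the light-gathering area,
# which is exactly why low-light sensitivity suffers.
```

Same glass, same silicon, one-sixth the light per pixel -- that's the tradeoff in a nutshell.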
In the rush to make everything HD, product designers and engineers should always keep the consumer's perspective -- and these tradeoffs that must be made -- in mind. And carriers of HD signals, such as Time Warner, need to clean up their HD QoS act, probably by devoting a few million more bits per second to each HD channel (or by switching to a more efficient H.264-based compression scheme -- something that's inevitable, but that requires considerable infrastructure investment).
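For a sense of why operators drag their feet, here's the rough economics of a single 6 MHz cable slot. The payload figure and the per-program bit rates are ballpark numbers I'm assuming for illustration, not measurements of any particular system:

```python
# Rough channel economics for one 6 MHz cable slot.  The ~38.8 Mbit/s
# usable payload for 256-QAM and the per-program bit rates below are
# commonly cited ballpark figures assumed here, not measurements.

slot_bps = 38.8e6

scenarios = {
    "MPEG-2 HD today (~10 Mbit/s)":          10e6,
    "MPEG-2 HD with more headroom (~14)":    14e6,
    "H.264 HD at comparable quality (~8)":    8e6,
}

for name, per_program_bps in scenarios.items():
    print(f"{name}: {int(slot_bps // per_program_bps)} HD programs per slot")
# Giving each MPEG-2 program more bits costs the operator a program per
# slot; H.264 wins that capacity back and then some, but only after the
# headend encoders and millions of set-top boxes are upgraded.
```

That's the bind: better pictures per channel means fewer channels per slot, unless the operator spends on new compression infrastructure.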
I've said goodbye in this space so many times that I'm losing count, but it now appears I'll be on hiatus again, for at least a while if not longer. In the meantime, please check out some of my other related activities at my own site, cliffroth.com.