On the final day of IBC I made my way to the EBU demo comparing 1080i, 720p, and 1080p material after H.264 encoding at various bit rates. It was an impressive and carefully prepared demo, and unlike similar comparisons I've seen previously in the U.S., this one had a very clear message: 720p is better than 1080i.
The demo featured three identical Pioneer 1080p plasma displays, all 50" diagonal, arranged with 1080i on top, 720p in the middle, and 1080p on the bottom. Seats were located at "3H" distance from the screen -- three times the height of the picture. That's where I sat, twice, through the 12-minute demo to make sure I got it.
The preparation for this demo was extensive, using scanned 65mm film as source material, as well as identical-looking footage shot in 1080p, 720p, and 1080i. To demonstrate the effects of the production process and of transmission using H.264 compression, each series of shots was first copied through seven generations of JPEG 2000 compression (to simulate a high-quality production environment), and then compressed with H.264 to fit into a variety of bit rates, from 18 Mb/s down to 6 Mb/s.
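Out of curiosity about what such a chain involves, here's a minimal sketch of how one might approximate it with ffmpeg driven from Python. To be clear, this is my own back-of-the-envelope reconstruction, not the EBU's actual workflow: the file names, the use of ffmpeg's built-in JPEG 2000 and x264 encoders, and the containers are all my assumptions; only the seven generations and the 18, 12, and 6 Mb/s rates come from the demo itself.

    import subprocess

    SOURCE = "master_source.mov"   # hypothetical name for the scanned/shot source clip
    GENERATIONS = 7                # production generations quoted in the demo
    BITRATES_MBPS = [18, 12, 6]    # emission bit rates mentioned in the demo

    def run(cmd):
        """Run an ffmpeg command and stop on any error."""
        subprocess.run(cmd, check=True)

    # Simulate multi-generation production losses by repeatedly
    # encoding and re-encoding the clip with JPEG 2000.
    clip = SOURCE
    for gen in range(1, GENERATIONS + 1):
        out = f"gen{gen}_j2k.mov"
        run(["ffmpeg", "-y", "-i", clip, "-c:v", "jpeg2000", out])
        clip = out

    # Then produce an H.264 emission copy at each target bit rate.
    for mbps in BITRATES_MBPS:
        run(["ffmpeg", "-y", "-i", clip, "-c:v", "libx264",
             "-b:v", f"{mbps}M", f"emission_{mbps}mbps.mp4"])

In a real test one would of course match frame sizes, frame rates and encoder settings far more carefully than this crude sketch does.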
The speaker who led us through the demo, which had no recorded sound, was adamant. "Which looks better to your eyes?" he would ask at various points in the demo. And if anyone ever suggested that 1080i looked good in any way, he'd cut their argument short by saying things like, "Yes, but look at the grass and the detail lost there -- isn't the 720p really better?"
By the end of each demo, he made sure the consensus was that 720p was better, and if anyone disagreed, they were treated as an oddball whose eyes were obviously not "golden." (That's my term, not his!)
What did I think of all this? The demo was good but not compelling, and an argument could be made for the efficiency of 1080i. (I was reminded of a weekend in my early 20s that I spent at what turned out, unbeknownst to me, to be a Moonie recruitment camp, where we were expected to absorb their catch phrases by repeating them over and over, without ever quite making sense of them.)
To me it seems obvious that 720p would be better than 1080i, since it has wider bandwidth. The real question, which was never raised at this demo, is whether 1080i, with roughly a third less bandwidth than 720p, can do almost as good a job. But why would anyone expect it to be just as good as, or better than, 720p? The whole premise of what was being asked here seemed absurd.
Most frustrating of all was that the demo kept moving. The video never paused to compare individual frames, and viewers were expected to catch quick-changing details on the fly. The narrator would talk about fleeting artifacts without pointing out where they were. It was like a twist on the old saying that if you have to ask how much it costs, you can't afford it -- if you have to ask where the artifacts are, you don't belong here. Then again, the intended audience here consisted of videographers, cinematographers, and other professionals.
The demo was completely uncompelling at high bit rates, such as the 18 and 12 Mb/s H.264 versions. Only when serious compression was applied did big differences between the formats appear. Which led anyone with half a brain to wonder how much the compression algorithms, rather than the original production formats, really influenced all this.
In particular, for the final, most compressed 6 Mb/s versions, we were told that the 720p source material actually looked better than the 1080p source material because "today's compressor technology can't handle all the extra information."
The speaker was very clear about the conclusions that should be drawn from all this. "Use 720p for today, and in the future, use 1080p as the compression gets better. And as far as 1080i is concerned, don't ever use it. 720p is always better. Interlace was useful in its day, but not today. Never again for interlace!"
Ostensibly, the purpose of the demo was to convince the production community -- particularly camera operators and directors of photography -- that 720p is better than 1080i.
I must admit though that, given the investment Sony and other Japanese companies have made in the 1080i format, and given its adoption in the U.S. for HDTV by many parties, and given the history of Europe creating different TV standards from the U.S. going back more than half a century, one can't help but wonder a bit about EBU's adamant, one-sided presentation.
It's nothing new. The EBU has been promoting progressive over interlace for several years, such as in this 2005 EBU editorial. They ran a similar demo at IBC last year, but without the H.264 compression.
About a decade ago Microsoft was doing similar demos at big trade shows like CES, trying to convince the world that 720p was better than 1080i for the purpose of moving consumers away from TV and towards the PC (where progressive scan is native). Back then, their self-interest was clear.
Unlike its U.S. counterpart (the NAB), the EBU (European Broadcasting Union) isn't the host of this huge IBC trade show -- the show is independently organized. Which might partly explain why the EBU is so willing to go out on a limb criticizing some of the very technology that's on display at this convention.
With such questions of international intrigue, big business and HD format war competition still swirling through my head, I boarded the plane and headed home from IBC.