Technological change happens so fast these days that it can be hard to keep up. We are the generation that has witnessed these changes, and the transition shapes the way we consume information, the speed at which we consume it, and the quality we demand. Gone are the days when products were dictated by our needs. Nowadays, our escalating needs are created and dictated by the products that surround us every day. Who would have thought the iPod would revolutionize the way we treat music? We never imagined it until we experienced what the iPod could deliver. In the same way, HDMI will soon create new needs in us. We may never realize it, but it will shape the way we 'generate' new kinds of needs, and eventually more and more technology will be created, probably better than HDMI, to answer them. It is a never-ending cycle. http://www.discountdisplays.co.uk
Although it is an older article, it is a good overview of HDMI, but it focuses entirely on "perceived" advantages while overlooking the major disadvantages and problems HDMI causes.
The most significant drawback of HDMI is that it is fundamentally a point-to-point connection of one source to one display.
Many of us have whole-house video distribution systems and have become accustomed to watching our TiVo, STB, and DVD outputs simultaneously on every TV in our homes. It's an incredible convenience that I won't give up, but HDMI may mean I have to live with SD to keep it.
From the article:
BUILT-IN INTELLIGENCE OVER HDMI
HDMI uses bi-directional communication and the increased processing power. . . .
This very bi-directional interactivity thwarts simultaneous viewing on multiple TVs throughout a home.
In a feeble attempt to correct this, HDMI now provides for "repeaters" and allows up to 28 displays, but this reveals another fundamental flaw in the original point-to-point concept: the EDID!
From the article:
HDMI Intelligence: Automatic Device Configuration.
First and foremost, HDMI uses Extended Display Identification Data (EDID), which is a set of detailed capabilities data for a specific device that is stored in a Read Only Memory (ROM) chip housed inside the device. The EDID ROM contains information such as the display's supported video resolutions, timings and audio capabilities. EDID is always read by the source device upon power up so that the source can quickly and automatically determine which specific format of video and audio signals the attached display can support.
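The read-at-power-up handshake the article describes can be sketched in a few lines. This is an illustrative outline, not production code: a base EDID block is 128 bytes, begins with a fixed 8-byte header, and carries a checksum byte chosen so that all 128 bytes sum to zero modulo 256, which is how a source sanity-checks what it reads from the display's ROM.

```python
# Hypothetical sketch of the first thing a source device does with a
# sink's EDID at power-up: validate the 128-byte base block before
# trusting any of the capability data inside it. Offsets and rules
# follow the public EDID 1.3 layout; this is illustrative only.

EDID_HEADER = bytes([0x00, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0xFF, 0x00])

def edid_block_is_valid(block: bytes) -> bool:
    """A base EDID block is exactly 128 bytes, starts with the fixed
    8-byte header, and its bytes sum to 0 mod 256 (the final byte is
    a checksum chosen to make that true)."""
    return (
        len(block) == 128
        and block[:8] == EDID_HEADER
        and sum(block) % 256 == 0
    )
```

Only after this check passes would the source go on to parse supported resolutions, timings, and audio capabilities out of the block.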
Clearly there was never any thought given to supporting multiple displays, and the added repeater support accommodating 28 displays is compromised by the single set of EDID information the HDMI source requires.
That EDID must reflect the "lowest" video resolution and audio quality of any of the 28 displays. That's right: if you want to drive a 720p kitchen TV with internal stereo speakers through a "repeater" whose source is attached to your family-room 60" 1080p TV with surround sound, all you're going to get on your big TV is 720p and stereo!
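The least-common-denominator behavior is easy to see in a toy model. The function below is my own illustration, not anything from the HDMI spec: a repeater presenting one merged EDID for many sinks can only advertise capabilities that every attached display supports, so one weak display drags down the whole chain.

```python
# Toy model (not from the article) of a repeater merging sink
# capabilities into the single EDID it presents to the source.

def common_capabilities(displays):
    """Each display is a (vertical_resolution, audio_channels) tuple.
    The merged EDID can only claim the minimum of each capability,
    so the weakest display limits every other one."""
    return (
        min(d[0] for d in displays),
        min(d[1] for d in displays),
    )
```

With a 1080p/5.1 family-room TV and a 720p/stereo kitchen TV, `common_capabilities([(1080, 6), (720, 2)])` comes back as `(720, 2)`: exactly the scenario above.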
And the schemes to overcome this (MoCA, UWB, etc.) require "receiver boxes" at each TV in the $400 range, more than most smaller TVs cost these days. They also suffer from the least-common-denominator EDID problem, though I suppose more sophisticated devices could present the "highest" common EDID and have the intelligence to reformat the video and audio for the less capable displays. None that do that have been announced, and it is now 2009, not 2006 when this article was written. In fact, only one company to my knowledge (Gefen) has released a MoCA or UWB solution, and it makes no mention of supporting more than one point-to-point link, so there is still no simultaneous viewing of the same program on multiple TVs.
Now for a comment in my area of expertise:
Do you think HDMI 1.3's "automatic lip-sync correction" feature detects lip-sync error and corrects it? It definitely DOES NOT, but that's what the HDMI hype has consumers believing.
This article actually explains what the feature does pretty well in one place, but in another it is highly misleading when it says (about HDMI 1.3):
"and supports the ability to automatically and accurately adjust the audio to maintain lip-sync with the video image."
That is simply not true in the sense that most of us would interpret it.
Later in the article:
"This video and audio latency data is included in the display's EDID profile, and can include latency values for both progressive and interlaced video formats. Not only is this HDMI feature precise, but it can be designed to be done automatically and transparently without user interaction."
There you can see that all this feature can do is allow an HDMI 1.3 display to tell a mating HDMI 1.3 A/V receiver to add a fixed audio delay equal to the fixed video delay the display adds.
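To make concrete how little this feature does, here is a sketch of the latency handshake. As I read the HDMI 1.3 spec, each latency byte in the display's Vendor-Specific Data Block encodes milliseconds as (value − 1) × 2, with 0 meaning "no information" and 255 meaning "not supported"; treat that encoding as my assumption, and note the receiver-side helper is my own illustration. Nothing in it looks at the incoming program, which is the whole point.

```python
def decode_latency_ms(raw: int):
    """Decode one latency byte from the display's EDID latency fields.
    Assumed encoding (my reading of HDMI 1.3): 0 = no information,
    255 = not supported, otherwise latency_ms = (raw - 1) * 2."""
    if raw in (0, 255):
        return None
    return (raw - 1) * 2

def receiver_audio_delay_ms(video_latency_raw: int, audio_latency_raw: int) -> int:
    """All the receiver can do with these numbers: add a FIXED audio
    delay matching the display's own video-minus-audio latency.
    Note: no measurement of the arriving signal's sync whatsoever."""
    video = decode_latency_ms(video_latency_raw)
    audio = decode_latency_ms(audio_latency_raw) or 0
    if video is None:
        return 0
    return max(video - audio, 0)
```

If the display reports 100 ms of video processing delay (raw value 51) and negligible audio delay, the receiver adds a fixed 100 ms to the audio, whether or not the program arriving at the source was in sync to begin with.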
This will actually aggravate the lip-sync problem when the audio arrives already delayed, which is quite common on DVDs and also happens in broadcasts where there is over-correction.
True "automatic lip-sync correction" is impossible, since there is no watermark in the audio or video to indicate when they were ever in sync.
The author mentions that the feature allows a different delay to be added for different video formats with different video latencies, but in reality those differences may be an order of magnitude smaller than the lip-sync error variance between arriving programs and DVDs.
The only way to actually correct lip-sync is to do it subjectively, with a variable audio delay, while watching the actual material being corrected. Three companies (Alchemy2, Felston, and Primare) make remote-controlled digital audio delays for that purpose, and HDMI's false claim of "automatic lip-sync correction" hurts those products, since most consumers believe HDMI, if and when they upgrade their receivers, will correct the problem. It not only won't correct the lip-sync error in the arriving signals, it will actually exacerbate it in many cases.
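At its core, a user-adjustable audio delay of the kind those products provide is just a FIFO of buffered samples whose length the viewer tweaks by eye. The sketch below is a minimal, assumption-laden model of that idea (real devices work on digital audio streams with proper DSP); the class name and interface are my own invention.

```python
from collections import deque

class VariableAudioDelay:
    """Minimal sketch of a user-adjustable audio delay line: buffer
    samples in a FIFO sized to the requested delay, so each output
    sample is the input from delay_ms ago. Illustrative only."""

    def __init__(self, sample_rate: int, delay_ms: int):
        self.sample_rate = sample_rate
        self.set_delay(delay_ms)

    def set_delay(self, delay_ms: int) -> None:
        # The viewer turns the knob: resize the FIFO (pre-filled with
        # silence) to hold delay_ms worth of samples.
        n = self.sample_rate * delay_ms // 1000
        self.fifo = deque([0.0] * n, maxlen=n)

    def process(self, sample: float) -> float:
        if not self.fifo:        # zero delay: straight pass-through
            return sample
        out = self.fifo[0]       # oldest buffered sample
        self.fifo.append(sample) # bounded deque drops it as we push
        return out
```

The viewer watches the picture, nudges `set_delay` up or down until lips and voice match, and that setting is correct only for the material being watched at that moment, which is exactly why no fixed, EDID-driven delay can do the job.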
And my last comment relates to the "absurd" capability of 8 channels of 192 kHz uncompressed audio. A double-blind study published by the AES found no audible difference between 192 kHz/24-bit audio and 44.1 kHz/16-bit audio. That means HD Audio could just as well have been 8 channels of 44.1 kHz audio. Even that, in my opinion, would be absurd considering how little information there is in the rear channels and how much redundancy there is between all channels. I have no similarly credible double-blind study to quote, but I suspect such a study would find no audible difference between 8 uncompressed channels and the long-established but older Dolby and DTS compressed audio formats.
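The back-of-envelope arithmetic makes the point: uncompressed PCM bit rate is simply channels × sample rate × bit depth, so the headline format carries roughly 26 times the data of CD-quality stereo that the AES study suggests is audibly indistinguishable from it.

```python
# Bit-rate arithmetic for uncompressed PCM audio:
# channels * sample_rate * bit_depth, expressed in Mbps.

def pcm_bitrate_mbps(channels: int, sample_rate_hz: int, bit_depth: int) -> float:
    return channels * sample_rate_hz * bit_depth / 1e6

hd = pcm_bitrate_mbps(8, 192_000, 24)  # HDMI's headline format: 36.864 Mbps
cd = pcm_bitrate_mbps(2, 44_100, 16)   # CD-quality stereo:       1.4112 Mbps
```

That is a 26x bandwidth cost for what the cited study indicates is no audible benefit.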
That was going to be my last comment, but while I am on this soapbox I can't refrain from one more, relating to the claimed compatibility "advantages" of HDMI.
Nothing could be further from the truth. The HDMI 1.1 spec ran over 300 pages, and it is a moving target, now at 1.3. You can't have a 300-page spec and not expect different engineers at different companies to interpret it differently; hence: PROBLEMS.
We've had cases where our customers had to switch from HDMI to component output because they could not defeat their HDMI display's EDID requesting 2-channel PCM (because the TV has only 2 speakers and no Dolby/DTS decoder). The STB took that EDID request to mean it should send 2-channel stereo not only via HDMI audio but also via its S/PDIF audio output, destroying any advantage of the customer's surround sound system. Maybe you chalk that up to a firmware error in the STB, but my point is that all this ridiculous complexity in HDMI has created far more problems than it has solved. DVI with a single additional S/PDIF cable would have provided (assuming its bandwidth were increased similarly) the same quality of video and audio with far fewer problems, in my opinion.