The view from Digital Hollywood looks pretty grim to me. 3-D TV was a disappointment, 4K TV (aka Ultra HDTV) is pretty much a useless extravagance, quality video is a poor stepchild in today’s wireless world and nobody wants to buy movies anymore.
Two Digital Hollywood insiders I talked to on a recent trip there expressed optimism, given the breadth and pace of activity these days. Still, I see a town in search of the next big thing.
After its big debut a couple years ago, 3-D TV was hardly mentioned at the recent Consumer Electronics Show and broadcasters have pulled the plug on some 3-D channels. The lack of content and the need for glasses are both taking the blame for 3-D’s fizzle in the home.
Some had high hopes for the autostereoscopic approach Philips pioneered. But few think Dolby, which now owns the technology, has the clout to drive it forward. After a big belly flop, climbing back on the diving board is harder.
Samsung is pushing 3-D TV forward with work on a Bluetooth standard for active shutter glasses with backing from Panasonic and Sony. It also set up a facility in South Korea to convert 2-D content to 3-D.
Live sports is a big missing piece. “If the Super Bowl was broadcast in 3-D, this would be a different discussion,” said Brad Hunt, principal of Digital Media Directions (Westlake Village, Calif.).
The Ultra HDTV (4K x 2K) displays at CES seem to have left everyone cold. To really see all those extra pixels you need the equivalent of a 96-inch home TV, but even the more standard-size screens are way too expensive for the average Joe, I am told.
The trouble is without a next big thing like 3-D or 4K, TVs remain stuck with their role as a commodity product. “We are back to TV sets sold by the inch so it’s hard for anyone to make a profit--and 4K is potentially a race to the bottom,” said Andrew G. Setos, chief executive of Blackstar Engineering (Pacific Palisades, Calif.) and a veteran audio-visual engineer.
“I think the improvement could come in truly lower cost of manufacturing--not by lower labor costs but a new mechanism that may be protected by patents--then TV manufacturers could make profits, but until then they are in a world of hurt,” he said.
Aaah, but you ignore the fact that a lot of old TV shows, maybe all of them actually, and movies were recorded on 35mm film, if not a wider format than that.
I actually enjoy watching some of the original Star Trek shows on DTV occasionally, when stations air them late at night. It turns out that the picture quality is very good (audio not so good!). The picture is good enough that one can see how primitive the props really are, which you couldn't tell on fuzzy old analog TV.
Good standard 35mm movie images, with 18mm by 24mm frames, contain approximately the equivalent of 8.5 Mpixels. This is limited mainly by the camera lens, which resolves ~70 line pairs per mm on average. A superb lens would beat that figure.
So even HDTV, where the image provides at best 2 Mpixels, is not providing you all the image quality potentially in the original film. UHDTV would improve this "shortcoming."
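The arithmetic behind those figures can be sketched as follows. The assumptions, taken from the comment above, are an 18mm x 24mm frame, a lens resolving ~70 line pairs per mm, and the usual Nyquist factor of two pixels per line pair:

```python
# Rough pixel-equivalent of a 35mm movie frame vs. HDTV and UHDTV.
# Assumed figures: 18 mm x 24 mm frame, ~70 line pairs/mm lens resolution.

LP_PER_MM = 70   # average lens resolution, line pairs per mm
PX_PER_LP = 2    # Nyquist: two pixels needed to resolve one line pair

def film_megapixels(width_mm, height_mm, lp_per_mm=LP_PER_MM):
    """Pixel-equivalent of a film frame at the given lens resolution."""
    w_px = width_mm * lp_per_mm * PX_PER_LP
    h_px = height_mm * lp_per_mm * PX_PER_LP
    return w_px * h_px / 1e6

film = film_megapixels(24, 18)    # standard 35mm movie frame -> ~8.5 Mpix
hdtv = 1920 * 1080 / 1e6          # 1080p HDTV -> ~2.1 Mpix
uhd  = 3840 * 2160 / 1e6          # 4K UHDTV  -> ~8.3 Mpix

print(f"35mm film ~{film:.1f} Mpix, HDTV {hdtv:.1f} Mpix, UHDTV {uhd:.1f} Mpix")
```

Which is why 4K, at roughly 8.3 Mpixels, comes much closer than HDTV to exhausting what a well-shot 35mm frame can deliver.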
Since all the good movies and TV shows were made a long, long time ago, it doesn't make sense to buy an Ultra HDTV to watch something that was recorded in black and white and have to stretch it out to make it fill the screen. If all a person cares about is special effects, oh yeah, buy the 3-D and put on the glasses :)
I think many people keep cable solely because they are afraid of missing something. My mother-in-law, who lived in Philadelphia, was always complaining about her $100 cable bill.
I noticed her viewing habits included only the networks: ABC, CBS, NBC, Fox and PBS. I proposed she have a roof-top antenna installed so she could get all the channels she was watching, in HD, for free.
She refused. She somehow didn't think this was possible, because, if it was, why doesn't everyone do it?
So it appears people are chained to their perceptions. She continued complaining about how expensive her cable bill was, but wasn't willing to do anything about it.
Now what's wrong with 96" in your house? And why not in every room and on your kitchen table surface? And panels on your wall covered in pixels? If it's cheap? And why necessarily use the full area for the content at all times? Where is the fantasy? Hollywood will only be a small portion of this content, but I can see they struggle with gaming and other social internet interaction. Maybe Hollywood should buy Facebook and some tech company that doesn't care about excess pixels and start developing a full everyday experience. This is probably something Apple could do in 10-20 years, if it got focus.
"Connected TV" has long existed. A long, long time ago, there were Viiv and AMD Now. Now there are Google TV, Connected TV, Apple TV, or whatever. This is how many people in the EE trade, TV trade and CE trade justify their existence: by re-packaging old stuff!
"In regards to the hassle, people who frequent EETimes aren't really the norm to judge complexity against. Internet TV will take off when it's about the same in complexity as a typical cable box to plug in, turn on and use."
But Duane, these people use the Internet all day long. Why should we expect that they will become thoroughly incompetent when they want to watch TV?
With my PC-become-TV-STB, I can reach any Internet TV site I use with a single click of the mouse (a remote one, from my couch). Could anything be easier? And I can search out any content not available at those sites with any search engine; I have found numerous portals that way.
Connected TVs should provide this same functionality. It's hardly rocket science. And all the companies you mentioned would indeed be accessible easily.
In regards to the hassle, people who frequent EETimes aren't really the norm to judge complexity against. Internet TV will take off when it's about the same in complexity as a typical cable box to plug in, turn on and use.
Companies like Netflix, Blockbuster online and Hulu need to be accessible as easily as is a typical cable TV channel today.
When that's the case, couch potatoes can rule again and TV makers will have their brief window of high margins followed by high volumes until the next big thing hits.