WiFi everywhere makes sense...but it needs to migrate to the 5 GHz band, otherwise 2.4 GHz will be too congested (it might be already)...I don't see UHDTV happening. What for? So I can see the Swiss Alps more clearly?
Frank, why do you say that? At normal viewing distances, unless you have an extremely large TV, you won't be able to see the difference. It'll be years before there's any 4K content.
Cable companies are already overcompressing HD channels to crowd in ever larger numbers of channels; does anyone doubt the same fate would eventually befall 4K?
I think 4K will probably be successful in the same way 3D was, from a sales standpoint only. It'll eventually be bundled with every higher-end set, like 3D, so it'll achieve great market penetration.
But like 3D, only a small number of buyers will make regular use of the 4K capability.
The one thing I do look forward to with 4K is that when it becomes mainstream, monitors with 4K resolution will become mainstream as well. That's where there's an obvious benefit, because I sit maybe 20" away from my 27" monitor, versus 10' from my 50" plasma.
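For what it's worth, here's a rough back-of-the-envelope sketch of that monitor-vs-TV point. The 2560x1440 monitor resolution and the ~60 pixels-per-degree acuity threshold are assumptions for illustration, not figures from the article:

```python
import math

def pixels_per_degree(diag_in, horiz_px, vert_px, distance_in):
    """Approximate horizontal pixels subtended per degree of visual angle."""
    aspect = horiz_px / vert_px
    # Screen width from the diagonal and aspect ratio.
    width_in = diag_in * aspect / math.sqrt(1 + aspect ** 2)
    px_per_inch = horiz_px / width_in
    # One degree of visual angle spans about 2*d*tan(0.5 deg) inches at distance d.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return px_per_inch * inches_per_degree

# Roughly 20/20 vision resolves on the order of 60 pixels per degree.
print(pixels_per_degree(27, 2560, 1440, 20))    # 27" monitor from 20": ~38, pixels visible
print(pixels_per_degree(50, 1920, 1080, 120))   # 50" 1080p set from 10 ft: ~92, already past acuity
print(pixels_per_degree(50, 3840, 2160, 120))   # same set at 4K: ~184, extra detail largely wasted
```

Under those assumptions, the desktop monitor is the place where 4K pays off, while the living-room set at 10 feet already exceeds what the eye resolves at 1080p.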
"At normal viewing distances... you won't be able to see the difference."
There's a simple solution to that: move the couch closer to the television!
My kids always want to sit in the back of a movie theater. I tell them that we paid $12 to watch it on a big screen; if you sit in the back, you might as well be watching the movie on a cell phone.
I like to sit up front. And when you do, you can easily see the pixels in the 4K digital movies (it is kind of disappointing, really--we need 8K!).
I like full immersion, but we are decades from that, so sitting up close is the best you can get for now.
(Okay, just maybe the fact that I'm a little nearsighted might have something to do with wanting to sit up front.)
"unless you have an extremely large TV..." is part of the answer. The other part is, as tb1 says below, the "immersive experience." I expect average screen size as well as resolution to continue to increase, subject to the usual caveats about the economic sweet spots. Not many consumers will pay $20k for a huge UHDTV display, but millions will pay $2K for such a display. The same applies to your comment about bandwidth. As long as there is insatiable demand for more bandwidth, there are profits to be made in finding ways to provide more bandwidth.
It is the nature of our business to make things better & cheaper over time, and the main issue is rarely "if," but rather "when."
I remember when Kodak first came out with its flat-grain Ektar 25 negative film. Even though theoretically "no one" would appreciate the improvement over previous negative films in 4x6 prints, the improvement was obvious. An image so smooth that you'd swear the film's emulsion was liquid.
Same deal here. People were saying that HDTV was unnecessary. And yet, even on not-so-large screens, it's gorgeous. With UHD, you can get a little closer to your normal size TV set, and still see a beautifully smooth image.
The eye/brain system is complicated. Even if first-order approximations imply that "no one" will notice the difference, your brain will inform you otherwise. Just look at the way people gush over "retina displays" on very small screens--screens that pack more pixels per inch than a 1080p set many times their size.
What especially appeals to me about UHD is that codec improvements since the introduction of HDTV should make UHD require little or no more channel capacity than HDTV did. So it should actually be practical and doable, as opposed to being a bandwidth hog.
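To put a hedged number on that "little or no more channel capacity" point: if an MPEG-2 HD broadcast channel runs around 17 Mb/s and HEVC is roughly 4x as efficient as MPEG-2 (both figures are assumptions for illustration, not measurements), the 4x pixel count of UHD roughly cancels out:

```python
# Rough sanity check of the "codec improvements since HDTV's introduction" point.
# Assumed, illustrative figures only.
hd_mpeg2_mbps = 17.0                          # assumed: typical MPEG-2 HD broadcast rate
pixel_ratio = (3840 * 2160) / (1920 * 1080)   # UHD has 4x the pixels of 1080p
hevc_vs_mpeg2 = 4.0                           # assumed compression gain, MPEG-2 -> HEVC

uhd_hevc_mbps = hd_mpeg2_mbps * pixel_ratio / hevc_vs_mpeg2
print(f"UHD over HEVC: roughly {uhd_hevc_mbps:.0f} Mb/s, "
      f"vs {hd_mpeg2_mbps:.0f} Mb/s for HD over MPEG-2")
```

Under those assumptions, a UHD/HEVC stream lands in the same ballpark as the HD channel allocations broadcasters already use.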
This 5 GHz comment makes me think that perhaps it's not widely known that the 5 GHz band was always part of the 802.11 protocol.
802.11a operated at 54 Mb/s in the 5 GHz band, while 802.11b and 802.11g used the 2.4 GHz band. And I'm using 802.11n in the 5 GHz band right now. So it's not something new with the 802.11ac standard.
The main techniques 802.11ac uses to increase the bit rate beyond 802.11n are to increase the constellation from 64-QAM to 256-QAM, which means more power is needed to achieve the same range, and to widen the channel from 20 or 40 MHz to 80 or 160 MHz.
Which means the crowding you now see in the 2.4 GHz WiFi band will eventually occur in the 5 GHz WiFi band too.
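For a rough sense of where the 802.11ac speedup comes from, here's a back-of-the-envelope scaling of those two levers. The 150 Mb/s single-stream 802.11n baseline is an assumption, and real PHY rates also depend on coding rate, guard interval, and spatial streams, so treat these as rough ratios only:

```python
# Back-of-the-envelope scaling of the two 802.11ac levers: denser constellation
# and wider channels. All figures are approximate.
n_baseline_mbps = 150        # assumed: 1 spatial stream, 40 MHz, 64-QAM, short GI
qam_gain = 8 / 6             # 256-QAM carries 8 bits/symbol vs 6 for 64-QAM
width_gain_80 = 80 / 40      # 80 MHz channel vs 40 MHz
width_gain_160 = 160 / 40    # 160 MHz channel vs 40 MHz

print(n_baseline_mbps * qam_gain * width_gain_80)    # ~400 Mb/s per stream at 80 MHz
print(n_baseline_mbps * qam_gain * width_gain_160)   # ~800 Mb/s per stream at 160 MHz
```

The wider channels buy most of the speedup, and they are exactly what eats up the extra spectrum in the 5 GHz band.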
Please see the Sony Ultra HD TV. Unfortunately the price is high (it needs to come down by at least 6x to be affordable), but as always, cost will come down. It is a beauty, with faster frame rates and fill rates. You feel like you are in Switzerland.
A couple of comments. First, it is not 802.11ac that will make the biggest impact. With 802.11n, I already get 270 Mb/s in the house. The weak link in the chain is clearly the broadband connection TO the home, and perhaps DOCSIS 3.0 can solve that problem for those who have cable. If you rely on xDSL, I can see a path to 50+ Mb/s, but not anything close to 1 Gb/s.
So, I think the emphasis in the article is on the wrong part of the distribution chain. It's the last mile connection that should have been discussed, not the in-home network so much.
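As a rough, hedged comparison of those last-mile options: the channel counts and per-channel rates below are assumptions for illustration, and actual provisioning and plant conditions vary widely.

```python
# Rough last-mile comparison. Assumed, illustrative figures only.
docsis_ch_mbps = 38     # ~38 Mb/s usable per 6 MHz, 256-QAM downstream channel
bonded_channels = 8     # assumed: typical DOCSIS 3.0 downstream bonding group
vdsl2_mbps = 50         # assumed: a decent VDSL2 loop

docsis_mbps = docsis_ch_mbps * bonded_channels
print(f"DOCSIS 3.0, {bonded_channels} bonded channels: ~{docsis_mbps} Mb/s")
print(f"xDSL (VDSL2): ~{vdsl2_mbps} Mb/s")
print("802.11n link in the house (per the comment above): ~270 Mb/s")
```

Under those assumptions the in-home 802.11n link and a bonded DOCSIS 3.0 plant are in the same range, while the xDSL loop is the link that lags well behind.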
As to smart TVs, I agree that it's more than just the software that would require upgrades. But (a) hardware upgrades are required much less frequently, and (b) clever design could easily take care of that too. How hard can it be to provide a cell-phone-sized replaceable module with the smart TV set?
Honestly, I continue to be utterly baffled by the unimaginative discussions surrounding smart TV. With all the clever devices available out there, when it comes to TV, it seems like everyone hits a brick wall.
Which link has the bigger impact depends on the usage model & behavior of the people in your household. If you've got a large archive of content on a DVR or media server, with multiple people accessing it wirelessly, then upgrading from 802.11n to 802.11ac could be a big improvement at your house.
If on the other hand you've got multiple people streaming content from the cloud, the WAN connection might be the bigger bottleneck.
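A toy comparison of the two usage models, with all numbers assumed purely for illustration:

```python
# Which link is the bottleneck depends on the usage model.
lan_mbps = 270          # assumed in-home 802.11n link
wan_mbps = 50           # assumed broadband (last-mile) link
stream_mbps = 16        # assumed per-viewer UHD stream
viewers = 4

demand = viewers * stream_mbps   # 64 Mb/s of simultaneous streams

# DVR / media-server model: streams only cross the home network.
print("DVR model saturated?  ", demand > lan_mbps)    # False: the LAN has headroom
# Cloud-streaming model: every stream also crosses the WAN link.
print("Cloud model saturated?", demand > wan_mbps)    # True: the WAN is the bottleneck
```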
It's frustrating that in the U.S., the broadband service model is that we pay a lot for a little, compared to some other countries where people pay far less for huge (100 Mb/s) pipes.
Frank, while what you describe is theoretically true, I don't think it's the wave of the future. The idea of people setting up home servers to send around massive files between computers in the home was never a big hit, and is becoming less so.
This kind of file transfer is going to "the cloud," as current lingo calls it. So in fact, your last mile connection is where the action is, rather than the home network. And any IoT appliances in the home, that may actually remain within the home network, are likely to be noise level in terms of bit rate.
As TV migrates to the Internet, this will be even more true. Multiple people watching different TV programs will result in multiple high bit rate streams from "prime time anytime" web servers, through the broadband link, then through the home network, rather than from any sort of centralized in-home PVRs.
Dear Junko - thank you for another informative interview. Dr. Samueli's comment that the war over the number of compute cores (processor compute power) may be overblown is interesting -- coming from a company that is a master of very large (MIPS) cores in networking....
Perhaps you could have asked two more pertinent questions:
1. MIPS versus ARM cores - BRCM just purchased architectural licenses for 32- and 64-bit cores ("real men fine-tune standard ARM cores") -- any trends?
2. BRCM is the king of wireless connectivity combos. Number two is Qualcomm, and Qualcomm, while it still has a combo IC (the RF radio portions), integrates the digital portion of connectivity into the processor... for a variety of well-detailed, specific reasons. Qualcomm is the king of baseband processors, of course. Any thoughts on that from BRCM's CTO?
Many thanks in advance if you could follow-up on the above - as you did with your Rockchip interview.
And -- HNY Junko!
One of the things I find interesting about this article is the number of times he said something to the effect of "the standard is already here." It wasn't that long ago that products more often preceded the final standard. 802.11n was that way, and if I remember correctly, 802.11a was also. Are companies getting smarter about getting together on standards early on? Or are some of these future standards wishful thinking that will need a lot of modifications when the implementation technology is available?
Of course they do. They won't give us any faster Internet until they are ready for it. I'm sure it is already possible to provide a much faster connection, but they wait. Who can tell what they are waiting for? As for the standards, they set them, for sure. All together.