NEW YORK – While the debate over divergent wired and wireless home networking schemes and standards rages on, one wireless application stood out at the International CES: Miracast.
The very idea of being able to wirelessly beam what a user is watching on his or her handheld device (a smartphone or tablet) to a bigger-screen TV struck a chord with a lot of conventioneers.
Miracast has shown a path that enables projection of personal media (or Web content readily available on mobile devices) onto a bigger-screen TV. It does so by completely bypassing artificial constraints put up by broadcasters or other service operators who often prefer a walled garden approach to their Internet offerings.
Meanwhile, a Miracast vs. UltraGig debate is brewing.
As Brian O'Rourke, senior principal analyst at IHS, explains it, “Miracast (previously known as Wi-Fi Display) is a software layer that enables Wi-Fi silicon with peer-to-peer connection capability.” Although Miracast departs from Wi-Fi’s traditional point-to-multipoint architecture, it is a standard created and maintained by the Wi-Fi Alliance. A Miracast application can “mirror whatever is on the smaller screen onto the larger screen.”
In contrast, UltraGig is a 60GHz technology that Silicon Image is pursuing. The underlying 60GHz wireless technology, WirelessHD, comes from SiBEAM, which Silicon Image acquired in 2011. Silicon Image has since given the WirelessHD technology its own product brand name: UltraGig.
Just to refresh your memory, WirelessHD is the highest-bandwidth wireless video transport solution currently available in commercial quantities. O’Rourke explained that it’s optimized for uncompressed 1080p video transmission, so it should not suffer from the packet loss or artifacts created by compressed solutions, such as those using MPEG compression.
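A quick back-of-the-envelope calculation shows why uncompressed 1080p pushes a link to 60GHz in the first place. The sketch below assumes 1920x1080 resolution, 8-bit RGB (24 bits per pixel) and 60 frames per second; the figures are illustrative, not a claim about WirelessHD's exact on-air rate.

```python
# Back-of-the-envelope: raw bit rate of uncompressed 1080p60 video.
# Assumptions: 1920x1080, 24 bits/pixel (8-bit RGB), 60 frames per second.
width, height = 1920, 1080
bits_per_pixel = 24
frames_per_second = 60

bits_per_second = width * height * bits_per_pixel * frames_per_second
print(f"Uncompressed 1080p60: {bits_per_second / 1e9:.2f} Gbit/s")
# About 3 Gbit/s -- well beyond a practical 2.4/5GHz Wi-Fi link,
# which is why compressed schemes (or a 60GHz band) are needed.
```

That roughly 3 Gbit/s figure is comfortably inside WirelessHD's 60GHz budget but far above what 802.11n sustains in a real living room, which is the nub of the Miracast-vs-UltraGig argument.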
Both Miracast and UltraGig showed similar demos of peer-to-peer wireless connectivity in a living room using smartphones.
Nvidia shows off Miracast.
Many chip companies--including MediaTek, Nvidia and Broadcom--touted Miracast at CES. However, Tim Vehling, vice president and general manager of Silicon Image’s wireless division, found CES the perfect forum to pitch UltraGig’s virtues, since it operates at 60GHz.
Vehling pointed out that the UltraGig demos “suffered from none of the interference that Miracast has from the many Wi-Fi devices in the market.” He claimed that “the only way people could show Miracast at CES was by doing it off the show floor (a la MediaTek in a private suite at the Sands),” or in some instances, by faking it (using an MHL cable, for example).
Peter Cooney, practice director for wireless connectivity and semiconductors at ABI Research, agreed. He said that some of the Miracast demos might have been using 802.11n, “so it would not have been a fair comparison.” At any rate, Cooney said, “it was an incredibly noisy environment at CES.”
This article seems a bit confused. Miracast is basically a standardized version of AirPlay, built on WiFi Direct. It is used in Wii U, and is certainly here to stay.
60GHz technologies have been cooking for years, and are now reaching the mass market. They offer very high bandwidth but very short range--basically in-room. Like 2.4/5GHz, the band is unlicensed.
There are two questions here: how valuable the high-bandwidth, short-range use cases for 60GHz are; and whether anything other than WiGig will survive, given the general industry support behind it.
My first comment: it would be nice to know more about how Miracast works. The point that we should wonder about congestion when Miracast and WiFi attempt to coexist is very valid. Miracast is not WiFi, but if it uses the same frequency bands as WiFi to carry video, one should ask.
My second comment: if you can get that streaming media content to your tablet or smartphone, why not think in terms of streaming it directly to the smart TV, without this peer-to-peer link?
Why assume you need a tablet, a Miracast or UltraGig reception box at the TV set, and the TV set, to do what a smart TV should be able to do all by itself? Who or what is keeping smart TVs from giving the user the same UI as any tablet or PC out there? Anyone?
Think of 60GHz as a replacement for an HDMI cable used to link a web-enabled device - or just a set-top box - with a display device. (No reason why the display screen should have its own personal web access.)
@plk & @Bert22306: I agree with your points. I don't know why one needs a handheld (to stream media) that is largely redundant, wasting materials and resources. On the other hand, if we are talking about a handheld that can double as a remote and meet the plethora of confusing standards (RF4CE, 6LoWPAN, etc.), there is a play for it.
I am still holding out for a TV that can be gesture-controlled.
Simple. Your Smart TV is across the room. Your tablet or Smart Phone is in your lap. Mirroring the video strikes me as an inefficient way to do this, but having a console app in your tablet routing whatever you want to your TV makes all kinds of sense.
I think something got lost in translation in this debate. Yes, if you can stream all that Internet content directly to your smart TV, that's all the better and efficient. But the reality is that many consumers today already have a smartphone and/or a tablet. Sitting in a living room with your kids, chances are, some of them are surfing the Net on a tablet or smartphone already, while you are watching a big screen TV.
Wouldn't it be nice if relevant (or irrelevant) content found on a smartphone could be beamed onto a big-screen TV during a commercial break, so everyone could share the laugh?
I am talking about the reality of the multi-screen era in a living room. It's already here. If so, how best to connect those multiple screens?
Junko, it is NOT efficient to stream content from the Internet to a smart TV via a handheld! I can do that simply by transmitting the hyperlink (a couple of Kbytes at the most!) to the smart TV from my tablet (if one doesn't want to use the TV remote to do that). I can do that today with my smartphone and WiFi.
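The commenter's point can be sketched in a few lines: push only the hyperlink over the home network and let the TV fetch the content itself. The plain-TCP listener below is a hypothetical stand-in for the smart TV (real sets use discovery-and-launch schemes such as DIAL or DLNA); the point is that the payload is tens of bytes, not gigabits.

```python
# Sketch: send a hyperlink (not a video stream) from a "handheld" to a "TV".
# The local TCP listener here is a stand-in for a smart TV's launch service.
import socket
import threading

def tv_listener(server: socket.socket, received: list) -> None:
    """Stand-in for the smart TV: accept one connection, read the URL."""
    conn, _ = server.accept()
    with conn:
        received.append(conn.recv(4096).decode("utf-8"))

# "TV" side: listen on an ephemeral local port.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
received: list = []
t = threading.Thread(target=tv_listener, args=(server, received))
t.start()

# "Handheld" side: transmit only the hyperlink -- a handful of bytes.
url = "https://example.com/watch/12345"  # hypothetical content URL
with socket.create_connection(server.getsockname()) as sock:
    sock.sendall(url.encode("utf-8"))

t.join()
server.close()
print(f"TV received {received[0]!r} ({len(received[0])} bytes)")
```

Compare the payload here (around 30 bytes) with the roughly 3 Gbit/s of an uncompressed 1080p stream: if the TV can fetch the content itself, the handheld only needs to act as a pointer.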
What makes sense is to stream stored content from a handheld. But that works better any day in a wired manner, like plugging a USB storage device into your TV!
In the end, it is the consumer adoption that will prove or kill UltraGig. Time will tell.
I just don't like too much Emag smog in my living room!
Think of it this way, then. Instead of spending money on a Miracast to HDMI conversion box, and the Miracast transmitter at the tablet, make yourself a smart TV. Connect a PC to the TV, via HDMI, and control the PC with a wireless mouse (and wireless keyboard, when and if necessary).
Now you have a really smart TV that can stand on its own, and the tablet can be used for anything else.
This is reality too. It happens to be my reality. I went this route when I saw that the CE companies didn't know how to design a smart TV. So I gave up on them and did the obvious. Without waiting for Miracast or anything else.
Is it not simpler to have a low-bandwidth remote mouse next to you, to remotely control a smart TV, than to have to use a tablet that costs about as much as the set itself?
If you already have a tablet, great. Why not use it while watching TV, if you like, for something more useful than as a duplicate screen? It just seems silly to create such an expensive and bandwidth-hungry remote control for a TV, is all.
Perhaps the question proponents of this idea should ask is: are the uncompressed-video-link electronics required at the TV and at the tablet going to cost LESS than a low-cost thin client built into smart TVs? Maybe one using Atom or ARM processors?
EVEN IF there will be times when you want to put those photos from your smart phone on the TV set, does that mean that all of your Internet TV watching must tie up a tablet too? I watch Internet TV all the time, on my TV, and have yet to need a tablet to do so.
I am all for increased bandwidth, lower frame-rate loss, improved resolutions, etc.... but I am not finding either UltraGig or Miracast very exciting. Perhaps it is because I do not own a smartphone and have limited interest in watching TV? I see the 60GHz band as too short-range to be really useful in the home (say, floor to floor, or end to end of the house). The cost added to every device the user would want supported is also a drawback: how many manufacturers would willingly add $10 or $15 to their product for a maybe-someday use case?
I think the future looks bleak for Silicon Image's UltraGig, aka WirelessHD, given that the Wi-Fi Alliance has embraced the WiGig Alliance's 60 GHz technology as its own and made it part of its road map.
So WiFi will have 60 GHz too, and it will be an industry standard, unlike UltraGig/WirelessHD, which did not get backing from the folks it aimed to disrupt. The empire fought back--and won, it seems.
To the consumer, it makes little difference which of the 60 GHz schemes "wins." To the consumer, this will in essence be a wireless HDMI link, of limited range and limited wall-penetrating acumen, but way more bandwidth than you could hope for in the 2.4 or 5 GHz bands. So it's all good stuff.
I see that it is called 802.11ad, and that products combining 802.11ad and 802.11n are already being demoed. This is good. It would allow standard house coverage of WiFi, plus this "wireless HDMI" link within a room, for example.
The interesting historical aspect of this is that the 60 GHz band started being considered when it became obvious that the ultrawideband (UWB) radio hype wasn't panning out. Remember that hype a few years ago? 60 GHz fills the same functional role, though: short-range, say 10 meters or so, very high-bandwidth wireless.