@plk & @Bert22306: I agree with your points. I don't see why one needs a hand-held (to stream media) that is largely redundant, wasting materials and resources. On the other hand, if we are talking about a hand-held that can double as a remote and navigate the plethora of confusing standards (RF4CE, 6LoWPAN, etc.), there is a play for it.
I am still holding out for a TV that can be gesture-controlled.
Think of 60GHz as a replacement for an HDMI cable used to link a web-enabled device - or just a set-top-box - with a display device. (No reason why the display screen should have its own personal web-access)
My first comment: it would be nice to know more about how Miracast actually works. The observation that we should wonder about congestion when Miracast and WiFi attempt to coexist is very valid. Miracast is not WiFi, but if it streams video over the same frequency bands that WiFi uses, one should ask how well the two will share the spectrum.
The second comment: if you can get that streaming media content to your tablet or smartphone, why not think in terms of streaming it directly to the smart TV, without this peer-to-peer link?
Why assume you need a tablet, a Miracast or UltraGig reception box at the TV set, and the TV set, to do what a smart TV should be able to do all by itself? Who or what is keeping smart TVs from giving the user the same UI as any tablet or PC out there? Anyone?
This article seems a bit confused. Miracast is basically a standardized version of AirPlay, built on WiFi Direct. It is used in Wii U, and is certainly here to stay.
60GHz technologies have been cooking for years, and are now reaching the mass market. They offer very high bandwidth but very short range--basically in-room. Like 2.4/5GHz, the 60GHz band is unlicensed.
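The "high bandwidth, short range" tradeoff mentioned above has a simple physical basis: free-space path loss grows with frequency, so at 60GHz a signal loses far more energy over the same distance than at 2.4GHz (beamforming antenna gain, plus oxygen absorption near 60GHz, shape the picture further, but the baseline gap is easy to compute). A minimal sketch, using only the standard Friis free-space path loss formula:

```python
import math

def fspl_db(distance_m: float, freq_hz: float) -> float:
    """Free-space path loss in dB: 20*log10(4*pi*d*f / c)."""
    c = 299_792_458.0  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

# Path loss over one metre at each unlicensed band
loss_24 = fspl_db(1, 2.4e9)   # ~40 dB at 2.4 GHz
loss_60 = fspl_db(1, 60e9)    # ~68 dB at 60 GHz

print(f"2.4 GHz @ 1 m: {loss_24:.1f} dB")
print(f"60  GHz @ 1 m: {loss_60:.1f} dB")
print(f"extra loss at 60 GHz: {loss_60 - loss_24:.1f} dB")
```

Roughly 28 dB of extra loss at the same distance is why 60GHz links are in-room technology, and why WiGig-class devices lean on directional antennas to claw some of that back.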
There are two questions here: how valuable the high-bandwidth, short-range use cases for 60GHz really are; and whether anything other than WiGig will survive, given general industry support.