Cellular network architecture is becoming heterogeneous, with large macro cells plus pico and femto cells. Essentially, operators are densifying the network in hotspots so that the system capacity requirement can be met via smaller cell sizes. In this context, mmWave can be used as the air interface in these small cells. See e.g. http://nsn.com/news-events/insight-newsletter/articles/5g-ultra-wideband-enhanced-local-area-systems-at-millimeter-wave
Rappaport is brilliant, but millimeter-wave networks have very short range without line of sight. The backhaul cost today would be brutal unless you are the incumbent telco or cableco. That raises tough policy issues.
That said, Paul Baran's old scheme of a node on every lightpole is the way to go. Because the frequencies can be reused potentially hundreds of times in the space occupied by a single cell site today, you get far more throughput per MHz. Maybe millimeter wave, maybe more unlicensed spectrum at lower frequencies.
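To put rough numbers on the reuse argument: if the same spectrum is active in every small cell, area throughput scales roughly linearly with cell count (interference aside). A back-of-envelope sketch -- the bandwidth and per-cell spectral-efficiency figures here are illustrative assumptions, not numbers from the thread:

```python
# Back-of-envelope spectrum-reuse arithmetic (all figures illustrative).
bandwidth_mhz = 100            # spectrum available (assumed)
spec_eff_bps_per_hz = 2.0      # per-cell spectral efficiency (assumed)

per_cell_mbps = bandwidth_mhz * spec_eff_bps_per_hz  # 200 Mbit/s per cell

# Reusing the same MHz in N small cells where one macro cell stands today:
for n_cells in (1, 10, 100):
    print(f"{n_cells:>3} cells -> {per_cell_mbps * n_cells:,.0f} Mbit/s "
          "over the same footprint")
```

The point is just that the multiplier comes from geography, not from squeezing more bits out of each hertz.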
France is proving this, with four DSL networks turning on a second SSID on the home gateway, strictly opt-in. Nearly everywhere in Paris my iPad picked up three networks: France Telecom, SFR & Free. A fabulously cheap way to deliver capacity.
I asked Rappaport about distance, and he said this Gbit/s to the handset was based on 200 to 400 meters from a base station, implying rapid reuse of spectrum, rapid handoffs and small cell base stations--such as on lightposts and in cafes--and maybe other techniques such as base stations collaborating on beamforming. Sounds like stuff confined to urban areas.
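Rappaport's 200-400 meter figure lines up with simple free-space numbers. A quick Friis path-loss sketch (the 60 GHz and 2.4 GHz comparison frequencies are my assumptions for illustration, not his):

```python
import math

C = 3.0e8  # speed of light, m/s

def fspl_db(freq_hz, dist_m):
    """Friis free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)

# Illustrative comparison at 200 m (assumed frequencies, not from the thread):
loss_60ghz = fspl_db(60e9, 200)   # ~114 dB
loss_24ghz = fspl_db(2.4e9, 200)  # ~86 dB
print(f"extra loss at 60 GHz vs 2.4 GHz: {loss_60ghz - loss_24ghz:.1f} dB")
```

The gap is exactly 20*log10(60/2.4), about 28 dB -- that is what high-gain antenna arrays and beamforming have to claw back, which is why the short ranges and dense base stations follow directly.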
Amazing! This will open up a whole new set of protocols, technologies, wave propagation models, antennas, electronics, spectrum licensing issues and many more associated things. It will require effort from many researchers as well.
@rick: Yes, exactly. There are of course two questions.
One: is it viable to design consumer devices such as smartphones with chips that support these millimeter-wave frequencies? Standards for personal/local area networks are already leveraging the unlicensed 60 GHz band, e.g. IEEE 802.11ad (WiGig is the market term). Granted, this has not taken off in a big way yet, but chips are being released and more are in the works. So the answer seems to be yes.
Second: how does it work in a cellular architecture? Adaptive beamforming is part of the IEEE 802.11ad standard, to combat path loss and NLOS scenarios. For cellular, fast hand-offs and fast adaptation of the beamforming vector would be desired, along with an architecture that puts a base station on each light pole (maybe not quite that dense), as you indicated. For urban areas, even without millimeter wave, the current cellular architecture is getting denser with pico and femto cells. So by the time we are ready to deploy millimeter-wave cellular in the real world, we should already have done a lot of real-world engineering in this direction.
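As a toy illustration of why fast adaptation of the beamforming vector matters: a matched (conjugate) beamformer on an N-element uniform linear array gives about 10*log10(N) dB of gain toward the steered angle, and that gain collapses as the steering drifts off the actual arrival angle -- exactly the problem during a hand-off. A minimal sketch, with a hypothetical 16-element array (array size and angles are my assumptions):

```python
import cmath
import math

def steering_vector(n_elems, angle_rad, spacing_wl=0.5):
    """Phase ramp across a uniform linear array, half-wavelength spacing."""
    return [cmath.exp(-2j * math.pi * spacing_wl * n * math.sin(angle_rad))
            for n in range(n_elems)]

def array_gain_db(n_elems, steer_rad, arrive_rad):
    """Power gain (vs. one element) of a conjugate beamformer steered at
    steer_rad, for a plane wave arriving from arrive_rad."""
    w = steering_vector(n_elems, steer_rad)
    a = steering_vector(n_elems, arrive_rad)
    # Normalized matched-filter response: 1.0 when perfectly steered.
    r = abs(sum(wi.conjugate() * ai for wi, ai in zip(w, a))) / n_elems
    return 10 * math.log10(n_elems * r * r)

print(array_gain_db(16, 0.0, 0.0))   # ~12.0 dB when perfectly steered
print(array_gain_db(16, 0.0, 0.2))   # gain collapses when mis-steered
```

A 16-element array buys ~12 dB -- a meaningful chunk of the extra millimeter-wave path loss -- but only while the beam actually tracks the user.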
There are definitely some very interesting engineering challenges in both of the areas I listed above. But we have definitely come a long way from a decade ago, when people did not take IEEE 802.15.3c seriously as it tried to do personal area networks at 60 GHz.
I'm not the expert here, but the Marconi Society is doing a free webinar with three world-class engineers next Thursday. Bring your questions. http://bit.ly/1cNGfTK An EE Times article not long ago featured Samueli, one of our speakers, on whether Moore's Law will slow down.
The Remarkable Wireless Future, next Thursday. Featuring 2014 Marconi Prize winner AJ Paulraj, the inventor of MIMO; 2012 Marconi winner Henry Samueli, the founder of leading chipmaker Broadcom; and Stanford Professor Andrea Goldsmith, who is creating a wireless future at 10 gigabits. Thursday, March 6: 10 a.m. Pacific, 1 p.m. New York, 6 p.m. London. Please join us -- there's no charge to register at http://bit.ly/1cNGfTK. We promise a lively set of answers to the key questions of wireless. There won't be any lectures; your questions will be welcome. Wireless capacity will improve tens and maybe hundreds of times in the next five to ten years, the 5G era. We'll open by asking our experts: given the range of estimates from tens of times to more than a thousand-fold, what's realistic?
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.