Kris, yes indeed. One reads about a mix of macrocells and femtocells, for instance, and of these femtocell base stations located on street lamps and such. One issue will be all the handover signaling involved.
I think the backhaul requirements for this should never be underestimated, which is why I like reading articles like this. More discussion than just the user interface to the base station.
Great analysis Bert...what this implies, though, is that you would need many more microcells than today...if a 2 km reach is replaced with a 150 m reach, you need (2000/150) squared, about 178 times as many cells, to get the same area coverage...lots of hardware and power dissipation! Kris
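[Editor's note: Kris's scaling argument can be checked in one line. Coverage area grows with the square of cell radius, so shrinking the radius from 2 km to 150 m multiplies the cell count by the squared ratio:]

```python
# Rough check: how many small cells replace one macrocell footprint,
# assuming coverage area scales with cell radius squared.
macro_radius_m = 2000   # today's macrocell reach
small_radius_m = 150    # 60 GHz small-cell reach
ratio = (macro_radius_m / small_radius_m) ** 2
print(f"~{ratio:.0f} small cells per macrocell footprint")  # ~178
```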
Kris, even today, obstacles affect 3G and 4G. At home, I'm lucky to get one bar on my 3G device, and it's not enough signal to transmit graphics. So, with more and smaller cells at 60 GHz, as much as the propagation qualities work against you, the range requirements go way down and the potential for a close-by alternative goes way up.
I think that moving up to the 60 GHz band, or a similarly high frequency, is pretty much a given, for WiFi as well as for 5G. There's simply no other way to offer link capacities of 10 Gb/s and more, with reasonable channel bandwidths, down in the UHF or L bands. If WiFi is supposed to be a shared medium, it doesn't really make sense to require each user to grab 80 MHz or more of spectrum in the 2.4 or 5 GHz bands, does it? So, moving up in frequency was the logical move.
At the same time, increased consumer demand for wireless spectrum can only be met realistically with more frequency reuse, aka small cells. Which works well with these higher frequencies that don't propagate very far. It helps keep inter-cell interference in check.
So, this is all logical and predictable. Which is why I have been questioning the FCC's drive to take back TV UHF spectrum for use in RF cellular broadband. The TV frequencies now being targeted are in the 600 MHz band. Way too low to give any good payoff.
As to propagation loss: free-space signal attenuation at 800 MHz and 2 km range is 96.5 dB. This is what you'd expect in today's larger cells. Attenuation of a 60 GHz channel at the same 2 km is a much higher 134 dB. But the point is, you wouldn't use such a large cell at 60 GHz. Scale the cell down to 150 meters, and now attenuation is 111.5 dB.
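[Editor's note: the three attenuation figures quoted above follow from the standard free-space path loss formula, FSPL(dB) = 92.45 + 20·log10(f/GHz) + 20·log10(d/km). A quick sanity check:]

```python
import math

def fspl_db(freq_ghz: float, dist_km: float) -> float:
    """Free-space path loss in dB, with frequency in GHz and distance in km."""
    return 92.45 + 20 * math.log10(freq_ghz) + 20 * math.log10(dist_km)

print(round(fspl_db(0.8, 2.0), 1))    # 96.5  (800 MHz, 2 km macrocell)
print(round(fspl_db(60.0, 2.0), 1))   # 134.0 (60 GHz at the same 2 km)
print(round(fspl_db(60.0, 0.15), 1))  # 111.5 (60 GHz, 150 m small cell)
```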
So, assuming that the mobile device sensitivity is -70 dBm, which is not unreasonable, the base station transmitter would need to transmit an ERP of 14 Watts to reach the mobile device at the max range of 150 meters. Electronically steered antennas, of course, would provide antenna gain, and reduce that power requirement. I think this is feasible.
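[Editor's note: the 14-watt figure is just the -70 dBm sensitivity plus the 111.5 dB of free-space loss at 150 m, converted from dBm to watts:]

```python
# ERP needed so the received signal meets the -70 dBm sensitivity
# after 111.5 dB of free-space loss (60 GHz, 150 m cell edge).
sensitivity_dbm = -70.0
path_loss_db = 111.5
erp_dbm = sensitivity_dbm + path_loss_db     # 41.5 dBm
erp_watts = 10 ** ((erp_dbm - 30) / 10)      # dBm -> watts
print(f"{erp_dbm} dBm ~ {erp_watts:.1f} W")  # 41.5 dBm ~ 14.1 W
```

Antenna gain from the electronically steered arrays mentioned above would come straight off the 41.5 dBm, so even 20 dBi of gain cuts the transmit power requirement a hundredfold.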
thank you @WiLess...I do realize that the algorithm will try to find another path in case of blockage (your 747 example)...perhaps in a city environment there is enough bouncing off buildings to create those alternative paths...time will tell...I would rather have fewer bits more reliably than more bits interrupted, but it could be just me, I don't watch movies on my cell phone ;-)...Kris
The 802.11ad spec includes dynamic beamforming and beam tracking. With a proper implementation, if an object blocks the propagation path, the protocol will find another path that maintains the link, and the connection will be kept. Of course, there are limits to what that algorithm can do. For example, a bird crossing the RF link is different from a Boeing 747 crossing it. Having a duplicate RF path can help with such events as well.
thank you @y_sasaki...yes, with directional antennas it might be theoretically possible...but in practice various objects can enter the direct line of sight and Internet connections would be lost...so even if this can be built, it will not be highly reliable, I am afraid...Kris
Friis' equation shows it is theoretically possible to achieve a little over 200 m of link range with 15 dBm (31 mW) TX power, -60 dBm receive sensitivity, and 20 dBi TX antenna gain + 20 dBi RX antenna gain.
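[Editor's note: the "little over 200 m" claim checks out. The total allowed path loss is TX power minus sensitivity plus both antenna gains, and inverting the free-space loss formula for distance at 60 GHz gives roughly 220 m:]

```python
import math

# Link budget from the figures quoted above.
tx_dbm, sens_dbm = 15.0, -60.0    # TX power, RX sensitivity
gain_tx = gain_rx = 20.0          # dBi, directional antenna at each end
allowed_loss_db = tx_dbm - sens_dbm + gain_tx + gain_rx  # 115 dB

# Invert FSPL(dB) = 92.45 + 20*log10(f_GHz) + 20*log10(d_km) for distance.
f_ghz = 60.0
d_km = 10 ** ((allowed_loss_db - 92.45 - 20 * math.log10(f_ghz)) / 20)
print(f"{d_km * 1000:.0f} m")  # a little over 200 m
```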
Shannon's equation shows a minimum required SNR = 1.0 (the signal merely equal to the noise floor!) to achieve 2 Gbps in a 2 GHz channel bandwidth. Assuming a -60 dBm received signal over a -90 dBm noise floor, the resulting 30 dB margin should be good enough to be practical.
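[Editor's note: both halves of this Shannon check are easy to reproduce. Solving C = B·log2(1 + SNR) for SNR at C = 2 Gbps and B = 2 GHz gives SNR = 1 (0 dB); with the assumed 30 dB SNR instead, capacity is nearly 20 Gb/s:]

```python
import math

bw_hz = 2e9                   # 2 GHz channel bandwidth
target_bps = 2e9              # 2 Gbps target throughput
# Solve C = B * log2(1 + SNR) for the minimum SNR:
snr_required = 2 ** (target_bps / bw_hz) - 1
print(snr_required)           # 1.0, i.e. 0 dB -- signal equals the noise floor

# With -60 dBm received over a -90 dBm noise floor, SNR is 30 dB:
snr_db = -60 - (-90)
capacity_bps = bw_hz * math.log2(1 + 10 ** (snr_db / 10))
print(f"{capacity_bps / 1e9:.1f} Gb/s")  # 19.9 -- ample margin over 2 Gb/s
```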
This is just a very basic theoretical reality check. You may need a lot more margin in a real implementation: cable losses, obstacles in the antenna line of sight, interference, etc.
My calculation depends on highly directional antennas (20 dBi each). Very high-gain dish antennas (40-50 dBi) are common for backhaul or satellite links, but I'm not sure whether such antennas are available for the 60 GHz band.