Good article, Junko, especially because you address both sides.
One thing I'd point out is, I'm not sure I agree with your definition of "white spaces." White spaces are locally unused TV channels within the TV broadcast bands. The 700 MHz spectrum should no longer be considered "white space," because TV lost channels 52-69 in 2009, with the end of (full power) analog TV broadcasting. So what we're really talking about, with "white space" spectrum, should now only be up to Ch 51. True, the FCC is also trying to subtract Ch 32 and beyond (I believe) from TV broadcasting. The 600 MHz band is in principle Ch 36-51, but the FCC is interested in a few more. Still, that's a separate discussion from "white spaces." Once these are subtracted from TV broadcasting, I don't think they qualify as white spaces either. The 800 MHz band was also taken away from TV years ago, for cellular service (Ch 70-83), and that's not considered white spaces.
I've seen very confused definitions of white space. Suffice it to say, though, that a major effort in allowing use of unused TV channels is creating a database keyed to precise location, and designing channel-sensing devices. It would make no sense to create a database of available TV channels if we were talking about a vacated swath of spectrum.
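To make the database point concrete, here is a minimal sketch of what a geolocation white-space lookup does. The grid, coordinates, and channel lists are all hypothetical illustration, not real database contents or any actual provider's API.

```python
# Minimal sketch of a TV white-space database lookup (hypothetical data).
# A real TVWS database maps a device's precise location to the set of
# TV channels not in use by licensed broadcasters at that location.

# Hypothetical table: (lat, lon) grid cell -> locally unused TV channels
WHITE_SPACE_DB = {
    (40.7, -74.0): [],                  # dense urban market: nothing free
    (44.5, -100.3): [14, 21, 27, 33],   # rural area: several open channels
}

def available_channels(lat, lon):
    """Snap the position fix to the grid and return channels a device may use."""
    cell = (round(lat, 1), round(lon, 1))
    return WHITE_SPACE_DB.get(cell, [])

print(available_channels(44.51, -100.29))  # -> [14, 21, 27, 33]
print(available_channels(40.71, -74.04))   # -> []
```

The key point the sketch illustrates: availability is a function of location, which is exactly why a single vacated band would not need a database at all.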
As you state, this "super WiFi" is not all that super, compared with real WiFi, at least not in terms of bit rates. What I can't figure out is why there should be any controversy about its application. In rural areas, TV white space spectrum should be plentiful, and a long-range but low-bit-rate wireless network might be okay. Low population density helps.
In urban and suburban areas, TV white spaces are going to be really hard to come by. But at the same time, why be so insistent? It's not like the IEEE 802.22 range of 4.54 Mb/s to 22.69 Mb/s (the high figure is the least robust) can serve any appreciable fraction of the urban/suburban population. And these are gross numbers, not net.
A problem with this super-WiFi is that BECAUSE it has to coexist with TV stations, it only uses the TV channel width (6 MHz in the US, as opposed to 20 MHz channels for WiFi). And BECAUSE it is primarily aimed at long range coverage, it can't use MIMO techniques very reliably. So there should be no big controversy about its use. Should mainly apply to rural areas.
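The channel-width point above can be quantified with the Shannon capacity formula, C = B·log2(1 + SNR). The 20 dB SNR below is my own illustrative assumption, not a figure from 802.22 or 802.11; the point is only that capacity scales linearly with bandwidth, so a 6 MHz channel starts at a 3.3x disadvantage against a 20 MHz one before MIMO even enters the picture.

```python
from math import log2

def shannon_capacity_mbps(bandwidth_mhz, snr_db):
    """Shannon limit C = B * log2(1 + SNR), returned in Mb/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_mhz * log2(1 + snr_linear)

# Same assumed link quality (20 dB SNR), different channel widths:
c_tv = shannon_capacity_mbps(6, 20)    # one 6 MHz TV channel
c_wifi = shannon_capacity_mbps(20, 20)  # one 20 MHz WiFi channel
print(round(c_tv, 1), round(c_wifi, 1), round(c_wifi / c_tv, 2))
```

At equal SNR the ratio is exactly 20/6, i.e. the narrower TV channel caps "Super Wi-Fi" at under a third of single-channel WiFi capacity by bandwidth alone.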
...or maybe it's not enough, soon enough. I'm sorry, phone companies and cable vendors, but it's becoming increasingly clear that you are holding up progress. The world needs universal free, high-speed WiFi to support education, public safety, and public utilities. This is no longer a tool of convenience. It's a basic public service, like running water. Data must flow, too.
You are right. There shouldn't be controversy about Super Wi-Fi.
And yet, the controversy began when the FCC proposed to make a substantial amount of additional spectrum available for unlicensed uses.
Further, the FCC wants a significant portion of this spectrum to become available on a nationwide basis. The FCC describes this as important because there is currently little or no white space in the TV bands in parts of many major markets.
In making these proposals, the FCC said that the agency "seeks to promote greater innovation in new products and services, including increased access for wireless broadband services across the country."
Apparently, this very idea of "additional spectrum for unlicensed devices" is making some stakeholders (who want to use the spectrum to expand their own services) unhappy.
The cellular providers have worked very hard to choke off municipal and community WiFi, so I can certainly imagine that they would not like an expansion of open access via prime spectrum. To them, the fact that wireless data access is becoming a necessity is a business opportunity. From their point of view they are defending a market they developed against unfair competition; from my point of view, I would like to see some counterbalance to their dominance of the spectrum.
I can also imagine that the WiFi consortium is not happy about the FCC calling this SuperWifi. We recently FCC-qualified a device that uses 802.11x for communications. We originally described the device as using WiFi, until we realized how much more that would cost us. It would be difficult for the consortium to complain about the FCC, but I would bet that if anyone else called their non-compliant protocol anything-WiFi, they would be in court before the first press release hit the wires.
Let's be clear about this. The 4.54 to 22.69 Mb/s figure is not per user; it is per 6 MHz channel. Just like in WiFi or the original shared-coax Ethernet, the figure is the total available to everyone sharing that medium.
So in an urban or suburban environment, it's nothing to write home about.
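A quick back-of-envelope division shows why a shared channel is "nothing to write home about" in a dense area. The user counts below are my own illustrative assumptions; the 22.69 Mb/s figure is the thread's best-case (least robust, gross) 802.22 rate.

```python
# The 802.22 rate is per 6 MHz channel, shared by everyone on that
# channel, so per-user throughput collapses as user count grows.
def per_user_mbps(channel_rate_mbps, users):
    """Even split of one shared channel's gross rate (illustrative)."""
    return channel_rate_mbps / users

BEST_CASE = 22.69  # Mb/s, least robust 802.22 mode, gross not net
for users in (10, 100, 1000):
    print(users, "users:", round(per_user_mbps(BEST_CASE, users), 3), "Mb/s each")
```

Even before protocol overhead, a thousand urban users on one channel would each see tens of kilobits per second, dial-up territory.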
Look at it another way. The amount of spectrum used by the new super-fast WiFi networks is aggregated from multiple 20 MHz channels. So we are talking about as much as 80 to 100 MHz of aggregated channel capacity for the WiFi nets faster than 802.11n. Well, guess what? That amount of spectrum is equal to the entire TV UHF broadcast spectrum left, in Ch 14-31, if the FCC takes away the 600 MHz band from TV broadcasting.
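The spectrum comparison above is easy to check. Treating Ch 14-31 as eighteen 6 MHz channels, the remaining UHF broadcast swath comes to 108 MHz, in the same ballpark as the 80-100 MHz a fast WiFi network can aggregate from bonded 20 MHz channels (the five-channel bonding below is just one illustrative configuration).

```python
# Checking the spectrum arithmetic: UHF TV channels 14-31 at 6 MHz each,
# versus the ~80-100 MHz a fast WiFi network aggregates from 20 MHz channels.
uhf_channels = list(range(14, 32))       # Ch 14 through Ch 31, inclusive
uhf_total_mhz = len(uhf_channels) * 6    # 6 MHz per US TV channel
print(len(uhf_channels), "channels =", uhf_total_mhz, "MHz")

wifi_aggregate_mhz = 5 * 20              # e.g. five bonded 20 MHz channels
print("WiFi aggregate:", wifi_aggregate_mhz, "MHz")
```

In other words, one fast WiFi network can consume roughly as much spectrum as the entire remaining UHF TV band would offer.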
So what I'm saying is, the numbers don't stack up, except for use in rural communities, where a few farmhouses would be sharing a couple of 6 MHz channels (or perhaps three or four). The digital capacity you can get out of whatever TV white spaces there are in urban and suburban areas will be a drop in the bucket, compared with what people need.
Now, the scheme can be used to create small cells in a cellular setup, of course, so that the spectrum is reused. But doing it that way, the 470-500 MHz frequencies will work against you: they propagate farther than higher-frequency signals, which tends to exacerbate intercell interference.
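The propagation disadvantage for small cells can be sketched with the free-space path loss formula. The 1 km cell radius and 485 MHz / 2450 MHz frequencies below are my own illustrative choices; real deployments would see terrain and clutter effects on top of this.

```python
from math import log10

def fspl_db(distance_km, freq_mhz):
    """Free-space path loss: 20*log10(d_km) + 20*log10(f_MHz) + 32.44 dB."""
    return 20 * log10(distance_km) + 20 * log10(freq_mhz) + 32.44

# At the same 1 km cell edge, low-band UHF loses ~14 dB less than 2.4 GHz,
# so a 485 MHz signal bleeds that much more strongly into neighboring cells.
delta_db = fspl_db(1, 2450) - fspl_db(1, 485)
print(round(delta_db, 1), "dB less path loss at 485 MHz")
```

That ~14 dB is great for rural coverage, and exactly what you don't want when trying to reuse channels across tightly packed cells.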
The incompatibility of "SuperWiFi" with "WiFi" makes me wonder about how it will be used. Is the concept that the "SuperWiFi" carries a signal to many hotspots which will receive "SuperWiFi" and then rebroadcast in the conventional "WiFi" protocol to users with conventional "WiFi" devices? If so, perhaps multiple unused channels could be ganged together to provide a wider bandwidth for the "WiFi" hotspots. Unfortunately, a college population of wired students can put a lot of pressure on a "WiFi" hotspot ... even without an upstream bottleneck.
If any user wants to connect over a future white-space network, he or she will probably need something like a USB form-factor modem for "Super Wi-Fi."
Alternatively, there could be a wireless router that connects to the white-space network and provides Wi-Fi connections as a hot spot. But of course, in that case, users will face the same challenges as today -- congestion on WiFi -- especially on a college campus.
Interesting question. My thinking would be that individual homes, mostly out in the boonies, would erect an indoor or outdoor TV antenna to receive the signal. We're talking about TV frequencies here, at potentially a range of 100 km. So for those farmhouses far from the base station, not much different from a distant UHF TV station.
The antenna downlead would go to an in-home modem, just like your cable or ADSL modem, and then Ethernet or WiFi inside.
Conceivably, a USB stick could connect to an antenna, for direct 802.22 interface with your laptop, just as you can do now with 3G.
Aggregating multiple channels should be doable, but now you're talking about a college town? Hmmm. Fugetaboutit.