What I wonder is: if the use of these TV white spaces is unregulated, then what happens to your data system when somebody sticks a TV expansion system in that frequency range, or some other piece of consumer junk? Working in the regulated zones means there is a bit of control over who is interfering with you, and perhaps they would be mandated not to interfere with your signal. A system that always works is a much better value than a system that is really cheap.
....following my previous post.
Regarding the range, we'd generally deploy a system, just like the cellular operators, with smaller cells in cities and larger ones outside. In a simulation for the UK, our cells in London typically have a range of 500 m to 1 km, and we use 5 km-radius cells in suburban and rural areas where we still need indoor penetration (e.g. to reach smart meters indoors). Having the flexibility of greater range means we can get 99% UK coverage (better than the cellular operators) with about 5,000 cells, compared to 15,000 per cellular operator. So this keeps our network costs down. We can easily curtail the range by reducing the transmit power.
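As a crude sanity check on the ~5,000-cell figure, here is a back-of-envelope calculation. The UK land area (~244,000 km²) and the ~60% effective coverage fraction per cell (to account for overlap and irregular cell shapes) are my assumptions, not figures from the post, and it treats every cell as a 5 km-radius one:

```python
import math

# Rough check of the ~5,000 cells / 5 km radius figures quoted above.
# Assumptions (not from the post): UK land area ~244,000 km^2, and each
# cell effectively covers ~60% of its ideal disc due to overlap and terrain.
uk_area_km2 = 244_000
cell_radius_km = 5
effective_fraction = 0.6

cell_area = math.pi * cell_radius_km ** 2        # ~78.5 km^2 ideal disc
effective_area = cell_area * effective_fraction  # ~47 km^2 usable per cell

cells_needed = math.ceil(uk_area_km2 / effective_area)
print(f"Roughly {cells_needed} cells of 5 km radius")
```

Under those assumptions this lands around 5,200 cells, in the same ballpark as the post's figure.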
Where we do have long range, you are also correct that there is a chance of self-interference. We generally use different frequencies in neighbouring cells (there are typically enough white-space channels to enable this) and then use orthogonal spreading factors to further reduce any interference effects. But we will still need to plan and deploy the network carefully.
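The "different frequencies in neighbouring cells" step is essentially a graph-colouring problem. Here is a minimal greedy sketch; the cell names and adjacency are invented for illustration, and a real deployment would of course use measured interference data rather than a toy adjacency map:

```python
# Minimal sketch: assign white-space channels so that no two neighbouring
# cells share a frequency, using greedy graph colouring.
def assign_channels(neighbours):
    """neighbours: dict mapping each cell to the set of adjacent cells."""
    channel = {}
    for cell in neighbours:
        used = {channel[n] for n in neighbours[cell] if n in channel}
        # pick the lowest-numbered channel not already used next door
        c = 0
        while c in used:
            c += 1
        channel[cell] = c
    return channel

# Toy four-cell cluster (hypothetical layout)
adj = {
    "A": {"B", "C"},
    "B": {"A", "C", "D"},
    "C": {"A", "B", "D"},
    "D": {"B", "C"},
}
print(assign_channels(adj))
```

With enough white-space channels available per market, a greedy scheme like this rarely needs more than a handful of distinct frequencies.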
But perhaps the key point to emphasise is that, compared to cellular, this system trades capacity for range and low transmit power, which gives us low cost and long battery life. We think this is what most M2M applications require. It would be dire for humans!
I hope this helps. I'd be very happy to comment further to your very insightful remarks.
Many thanks for these thoughtful remarks. May I respond as one of the designers of Weightless?
So you are totally correct that (1) large cells are not always a good thing and (2) a large cell in a metropolitan area would likely end up congested. Although actually, I'd come at your capacity numbers slightly differently. In fact we typically use only one TV channel (6 MHz in the US) per cell, and because of the spreading we use to extend the range, we often get only around 1 Mbit/s net throughput. So let's take your case of 3 million devices: that leads to 0.33 bit/s per device. But that's fine. A smart electricity meter might send a 50-byte reading (400 bits) every, say, 3 hours. That's 0.04 bit/s. So we can handle 10 times as many machines: 30 million. Of course, if some machines have much higher data rates that will change the equation, but we believe most are ultra-low data rate.
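The arithmetic above can be reproduced directly from the quoted figures (note the post rounds 0.037 bit/s up to 0.04, giving "10 times"; the exact ratio is 9x):

```python
# Reproducing the capacity arithmetic from the post, using its own figures:
# ~1 Mbit/s net per cell after spreading, 3 million devices per cell,
# and a 50-byte meter reading every 3 hours.
cell_throughput_bps = 1_000_000
devices = 3_000_000

per_device_bps = cell_throughput_bps / devices
print(f"{per_device_bps:.2f} bit/s per device")      # 0.33 bit/s

meter_bps = 50 * 8 / (3 * 3600)                      # 400 bits / 10,800 s
print(f"{meter_bps:.3f} bit/s per smart meter")      # 0.037 bit/s

headroom = per_device_bps / meter_bps                # exactly 9x
print(f"~{headroom:.0f}x headroom, i.e. ~{devices * headroom / 1e6:.0f}M meters")
```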
I'll comment on the range in the next post.
@Bert: Thanks for the substantive reply. It raises the question of what will be the transport of choice for IoT if not TV white spaces: Bluetooth, cellular, Wi-Fi, Zigbee, all of the above, and some of the proprietary stuff out there?
There's no shortage of choices it seems.
In case the numbers aren't clear, let me elaborate a little.
Let's say that use of TV white spaces is advocated because of this long-range capability. And let's say that in the densest parts of a market like NYC, 3 million IoT devices would share this TV white space spectrum. Seems like a reasonable number, no? And let's say that the entire TV spectrum is used for this purpose, none left for TV.
Creating the large cells they tout, as I stated previously, limits the scheme to perhaps 240 Mb/s raw per large cell. This gets reduced to something around 160 Mb/s by the error correction scheme. So this 160 Mb/s capacity has to be shared among those 3 million devices, which gives each device a maximum of 53 b/s. That's bits per second, not kbits or Mbits. That's nothing! That load can be distributed among the multiple small cells already in place, no?
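Spelled out, the per-device figure follows from the numbers above (the 2/3 code rate is my inference from the 240 to 160 Mb/s reduction; the post doesn't state it explicitly):

```python
# Per-device throughput under the post's assumptions.
raw_capacity_bps = 240e6   # ~240 Mb/s raw per large cell (from the post)
fec_rate = 2 / 3           # inferred code rate giving ~160 Mb/s net
devices = 3e6              # devices sharing the cell

net_capacity = raw_capacity_bps * fec_rate    # 160 Mb/s
per_device = net_capacity / devices
print(f"{per_device:.1f} bit/s per device")   # 53.3 bit/s
```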
Okay, so I didn't consider the use of MIMO techniques. Sure, but MIMO works best when constrained to very short range, so that the reflected paths are highly uncorrelated *and* reliable, something easier to achieve at higher RF frequencies than those used for TV.
I'm not saying that using TV white spaces, or all of the TV spectrum, won't help some. I'm instead trying to make the point that used in a 2-way wireless network, this long range that keeps getting touted as a benefit is actually working against you. You're sharing a very limited resource among a lot of devices. What works with you in TV broadcast works against you in a 2-way network.
So once again, a company that plans to use TV white spaces makes a pitch, and we're to believe that this so-called IoT depends on TV white spaces.
Not so. The two are orthogonal topics. And for that matter, the up-to-10 km range cited in the article is likely to work very much against you if the IoT application is anywhere other than the boonies.
Here's the fundamental issue:
The way you achieve the huge density of IP connection points that this IoT envisions is to REDUCE the air link range as much as possible, reuse the RF spectrum efficiently (meaning frequent geographical reuse of RF channels), and rely on a cabled backhaul network for the actual "heavy lifting."
If the TV spectrum is better suited for long range propagation, and it is, then this works completely AGAINST the goals you should be pursuing for wireless IoT applications.
Put it another way: the entire TV spectrum, VHF included, is currently about 300 MHz. To prevent interference while providing constant coverage, any given market can only use a small fraction of that: say, up to maybe 80 MHz for the big markets.
If you intend to brag about the long-range possibilities of TV white space IoT, you will similarly be constrained to a small fraction of that white-space slice of spectrum, to prevent co-channel interference. These folks mention 10 km range as if it were a good thing. Is it?
The article mentions at best 16-QAM. That's 4 bits per symbol. So now tell me, what's so impressive about offering perhaps 240 Mb/s of network capacity over a large swath of an urban area? That's all you'd get. Imagine this 240 Mb/s solution covering most of the densely populated NYC market. Is that a great thing?
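For what it's worth, the ~240 Mb/s figure is roughly what you'd expect from 16-QAM over the ~80 MHz usable slice mentioned above. The ~0.75 symbols/s per Hz spectral efficiency (filter roll-off, guard overhead) is my assumption to make the numbers meet, not something stated in the article:

```python
# Hedged reconstruction of the ~240 Mb/s raw figure.
bandwidth_hz = 80e6      # usable white-space slice in a big market (from the post)
sym_per_hz = 0.75        # assumed symbols/s per Hz (roll-off + guard overhead)
bits_per_symbol = 4      # 16-QAM

raw = bandwidth_hz * sym_per_hz * bits_per_symbol
print(f"{raw / 1e6:.0f} Mb/s raw")   # 240 Mb/s
```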