"I believe they were suggesting 10dB could be translated into higher data rates, lower power consumption or longer distances, not all three at once."
Well, perhaps the "power consumption" angle adds enough ambiguity to let this otherwise unsupportable claim slide. Perhaps. I'd have to do a survey to tell for sure. As to distance (really SNR, combined with receiver noise), data rate, and channel bandwidth, that *is* the tradeoff made by Shannon's equation. Whether you vary all three together, fix two and vary only one, or fix one and vary only two makes no difference. That's why the claim sounds wrong.
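A quick back-of-the-envelope check makes the point. The channel width and SNR figures below are illustrative assumptions (a 20 MHz channel at 20 dB SNR), not numbers from the claim: a 10 dB SNR improvement is one budget you can spend on rate or on range, and even spent entirely on rate it buys far less than a 10x gain.

```python
import math

def shannon_capacity(bandwidth_hz, snr_db):
    """Shannon capacity C = B * log2(1 + S/N), with SNR given in dB."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

B = 20e6                              # assumed 20 MHz channel
base = shannon_capacity(B, 20)        # ~133 Mb/s at 20 dB SNR
boosted = shannon_capacity(B, 30)     # ~199 Mb/s with 10 dB more SNR

print(f"rate gain from +10 dB SNR: {boosted / base:.2f}x")
```

Spent on capacity, the 10 dB is worth about a 1.5x rate gain here; spent instead as link margin it buys range. Either way it is one tradeoff, not three independent wins.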
This sort of truth bending happens all the time. In an interview I heard on a supposedly revolutionary engine design, the interviewee implied that his engine's efficiency was way higher than the norm because it had a lot more "working area" than standard piston engines.
Too bad the interviewer didn't think to ask, "What does working area have to do with efficiency? It's all about compression ratio. Tell me your engine has higher compression ratio, and I might start to believe. And then I'll ask, how do you prevent detonation?" The problem is just accepting as fact the implied significance of some irrelevant measurement.
These claims of supposed "breakthroughs" are very rarely credible.
Elsissi - See the list of 15 published patents I posted already. I have not had time to review them much; I am starting with 8,526,523.
One thing that does stand out is that the new transmission method uses a metric of 10^-1 to 10^-3 symbol error rate, which seems in stark contrast to the 10^-6 figure normally applied to BER for data transmissions.
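For a sense of why the error-rate target matters when comparing systems, here is the textbook BPSK-over-AWGN formula (an illustrative baseline of my choosing, not the modulation in the patents): the SNR gap between a 10^-3 and a 10^-6 operating point is only a few dB, so quoting gains at different error-rate targets can quietly flatter a comparison.

```python
import math

def bpsk_ber(ebn0_db):
    """Theoretical BPSK bit error rate over AWGN: 0.5 * erfc(sqrt(Eb/N0))."""
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * math.erfc(math.sqrt(ebn0))

print(f"BER at  6.8 dB Eb/N0: {bpsk_ber(6.8):.1e}")   # roughly 1e-3
print(f"BER at 10.5 dB Eb/N0: {bpsk_ber(10.5):.1e}")  # roughly 1e-6
```

So for BPSK the difference between the two targets is under 4 dB of Eb/N0; a system benchmarked at 10^-3 symbol errors is presumably leaning on strong FEC to close the rest of the gap.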
It is also astounding to me that these patent applications could be filed in January 2013 and granted in September 2013, for what is claimed to be a revolutionary technology. There is a lot to take in for anyone, patent examiners included.
This article, and many similar others in the press in the last week, is an example of what you get when you have a marginally good idea and a lot of money to spend on unsubstantiated PR. There is only ONE patent issued in the company's name, plus several applications (not patents). The patent describes a partial-response signaling transmitter and a maximum-likelihood receiver implementation. It is NOT what is claimed in the article, which strung together all sorts of unsupported superlatives. What a shame!!
I personally believe 11ac's 256-QAM is already pushing things a little too far.
As Shannon's equation C = B·log2(1 + S/N) suggests, the straightforward way to increase data rate (C) is to use more bandwidth (B). This was tried and failed with UWB (initially 500 MHz/channel, with hopes of more than 5 GHz/channel later), and partly applied to 11ac (up to 160 MHz/channel).
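A small sketch of why bandwidth is the stronger lever, using assumed numbers (20 MHz at 20 dB SNR): at fixed transmit power, doubling the bandwidth also halves the S/N (noise power scales with B), yet capacity still grows substantially, because B enters the equation linearly while SNR only enters inside the log.

```python
import math

def capacity(b_hz, snr_db):
    """Shannon capacity C = B * log2(1 + S/N) with SNR in dB."""
    return b_hz * math.log2(1 + 10 ** (snr_db / 10))

B, snr_db = 20e6, 20                  # assumed starting point
c1 = capacity(B, snr_db)
# Doubling B at fixed transmit power costs 3 dB of SNR:
c2 = capacity(2 * B, snr_db - 10 * math.log10(2))

print(f"doubling bandwidth: {c2 / c1:.2f}x capacity")
```

Even after paying the 3 dB SNR penalty, doubled bandwidth yields roughly 1.7x the capacity here, which is why wider channels (and higher carrier frequencies that make them available) keep getting tried.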
Using higher carrier frequencies naturally increases the bandwidth available per channel, so there is high hope we can achieve more than 1 Gbps at 60 GHz (802.11ad). However, its laser-beam-like directionality and extremely low penetration (you can only use 60 GHz with a clear line of sight) will limit its application.
MIMO is another way to increase data rate, but more than 4x4 MIMO will be impractical, since we need a separate antenna/receiver/transmitter chain for each stream.
My most honest answer to "how we can get more datarate?" is "use wire!" :-)
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a little developer- and IT-skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically the guests will discuss sensors, security, and lessons from IoT deployments.