It would be interesting indeed. However, we do have Shannon's equation to compare it to, and we do know how close to the Shannon limit other existing techniques come. And we also know that MIMO, under conditions in which multiple propagation paths are very uncorrelated, can give the appearance of violating the Shannon limit, but actually does not.
So, when we get more details, a fair comparison can be made. Anything that is 10 dB better than existing possibilities, aside from MIMO, sounds to me like it violates the Shannon limit. Aren't we already capable of getting within a couple of dB of Shannon?
It would be a true breakthrough if it did legitimately violate Shannon's limit, but that wasn't mentioned.
Friis's equation is C = B·log2(1 + S/N), where C is the data rate (bit/sec), B is the bandwidth (Hz), and S/N is the plain signal-to-noise ratio (not in dB).
While it is easy to define B, estimating S/N is the tricky part. A typical WiFi use case has a signal level around -70dBm with a background noise floor around -90dBm, so SNR is about 20dB.
Per the 802.11ac standard, the 256QAM MCS9 data rate is 200Mbps per stream with a 40MHz channel, short GI, and coding rate 5/6; the raw (pre-FEC) rate is 240Mbps. At 240Mbps in 40MHz that is 6 bit/s/Hz, which at the Shannon limit requires an SNR of 2^6 - 1 = 63, or about 18dB.
So 802.11ac MCS9 fits within the typical use case, even though a ~2dB margin below the typical 20dB SNR is pretty slim. Of course, the "typical" use case varies: you'll see -50 to -60dBm of signal if your PC is close to the AP (within 10ft), so MCS9 becomes much more practical there.
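A quick back-of-envelope check of these numbers (a sketch using the figures quoted above, not anything from the 802.11ac spec itself):

```python
import math

def shannon_capacity_bps(bandwidth_hz, snr_db):
    """Shannon limit: C = B * log2(1 + S/N), with S/N as a linear ratio."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear)

# Typical WiFi case: -70 dBm signal over a -90 dBm noise floor -> 20 dB SNR
capacity = shannon_capacity_bps(40e6, 20.0)
print(f"Shannon limit at 20 dB SNR, 40 MHz: {capacity / 1e6:.0f} Mbps")

# Minimum SNR for the 240 Mbps raw MCS9 rate (6 bit/s/Hz) in 40 MHz
snr_needed_db = 10 * math.log10(2 ** (240e6 / 40e6) - 1)
print(f"Minimum SNR for 240 Mbps in 40 MHz: {snr_needed_db:.1f} dB")
```

So the 240Mbps raw rate sits inside the ~266Mbps limit at 20dB SNR, with roughly 2dB to spare.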
Theoretically, 4096QAM (a 64x64 constellation) is 16bit/symbol, so it should have 2x the data rate of 256QAM (a 16x16 constellation, 8bit/symbol). Thus, we can assume 802.11ac 4096QAM would have 2x the data rate of 256QAM MCS9.
That shows "4096QAM WiFi" would be impractical (so I don't think WiFi will adopt more-than-256QAM modulation per stream). Even if their claim of a +10dB advantage is correct, 26dB SNR will still be tough to find in a public wireless network (WiFi or LTE).
However, it would make sense for backhaul, where a dedicated frequency band is used with much higher TX power and highly tuned directional antennas.
I personally believe 11ac 256QAM is already pushing things a little too far.
As Shannon's equation (C = B·log2(1 + S/N)) suggests, the straightforward way to increase data rate (C) is to use more bandwidth (B). This was tried and failed with UWB (initially 500MHz/channel, with hopes of more than 5GHz/channel in the future), and partially applied to 11ac (up to 160MHz/channel).
Using a higher carrier frequency naturally increases the bandwidth per channel, so there is high hope we can achieve more than 1Gbps at 60GHz (802.11ad); however, its laser-beam-like characteristics and extremely low penetration capability (you can only use 60GHz with a clear line of sight) will limit its application.
MIMO is another way to increase data rate, but more than 4x4 MIMO will be impractical, since we need a separate antenna, receiver, and transmitter chain for each stream.
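As a rough sketch of why MIMO scales the rate: with N well-separated, uncorrelated spatial streams, the ideal aggregate capacity is close to N times the single-stream Shannon limit. This is an idealized upper bound (it assumes per-stream SNR is unchanged; real channels are correlated and share TX power):

```python
import math

def mimo_capacity_bps(n_streams, bandwidth_hz, snr_db):
    """Idealized MIMO capacity: N parallel uncorrelated streams, each at the
    single-stream Shannon limit with unchanged per-stream SNR (upper bound)."""
    snr_linear = 10 ** (snr_db / 10)
    return n_streams * bandwidth_hz * math.log2(1 + snr_linear)

# Scaling for 1x1, 2x2, and 4x4 in a 40 MHz channel at 20 dB SNR
for n in (1, 2, 4):
    c = mimo_capacity_bps(n, 40e6, 20.0)
    print(f"{n}x{n} MIMO, 40 MHz, 20 dB SNR: {c / 1e6:.0f} Mbps")
```

This is why MIMO can appear to beat Shannon's single-channel limit without actually doing so: each spatial path is its own channel.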
My most honest answer to "how we can get more datarate?" is "use wire!" :-)
Thanks. You are right, 4096QAM is a 64x64 constellation, 2^6 x 2^6 = 12bit/symbol. I also wrote "Friis equation" instead of Shannon... Maybe because I often refer to both equations when sanity-checking the "revolutionary communication methods" in the news :-)
This article, and many similar ones in the press in the last week, is an example of what you get when you have a marginally good idea and a lot of money to spend on unsubstantiated PR. There is only ONE patent issued in the company's name, plus several applications (not patents). The patent describes a partial signaling transmitter and a maximum-likelihood receiver implementation. It is NOT what is claimed in the article, which strung together all sorts of unsupported superlatives. What a shame!!
Elsissi - See the list of 15 published patents I posted already. I have not had time to review them much, I am starting with 8,526,523.
One thing that does stand out is that the new transmission method uses a metric of 10^-1 to 10^-3 symbol error rate, which seems in stark contrast to the 10^-6 BER metric normally applied to data transmissions.
It is also astounding to me that these patent applications could be filed in Jan 2013 and granted in Sept 2013, for what is claimed to be a revolutionary technology. There is a lot to take in for anyone, patent examiners included.
"I believe they were suggesting 10dB could be translated into higher data rates, lower power consumption or longer distances, not all three at once."
Well, perhaps the "power consumption" angle adds enough ambiguity to let this otherwise unsupportable claim slide. Perhaps. I'd have to do a survey to tell for sure. As for distance (SNR, actually, combined with receiver noise), data rate, and channel bandwidth, that *is* the tradeoff governed by Shannon's equation. Either all three together, or fix two and vary only one, or fix one and vary only two: it makes no difference. That's why the claim sounds wrong.
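To put a number on what a genuine 10 dB SNR gain would buy near the Shannon limit (my own illustration, not a figure from the article):

```python
import math

def capacity_bit_per_hz(snr_db):
    """Spectral efficiency at the Shannon limit for a given SNR in dB."""
    return math.log2(1 + 10 ** (snr_db / 10))

# Compare a 20 dB baseline against a hypothetical +10 dB improvement
base = capacity_bit_per_hz(20.0)
improved = capacity_bit_per_hz(30.0)
print(f"20 dB SNR: {base:.2f} bit/s/Hz")
print(f"30 dB SNR: {improved:.2f} bit/s/Hz")
print(f"Rate gain from +10 dB: {improved / base:.2f}x")
```

Even a legitimate 10 dB of extra SNR only buys about 1.5x the data rate at these operating points, which is another reason a blanket "10 dB better" claim deserves scrutiny.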
This sort of truth bending happens all the time. In an interview I heard on a supposedly revolutionary engine design, the interviewee implied that his engine's efficiency was way higher than the norm because it had a lot more "working area" than standard piston engines.
Too bad the interviewer didn't think to ask, "What does working area have to do with efficiency? It's all about compression ratio. Tell me your engine has higher compression ratio, and I might start to believe. And then I'll ask, how do you prevent detonation?" The problem is just accepting as fact the implied significance of some irrelevant measurement.
These claims of supposed "breakthroughs" are very rarely credible.
It seems like a major challenge that companies face: having to standardize their communications technology before they can deploy it, and having to reveal through disclosure much of their advantage to their competitors in the process. It may be worth aiming for smaller markets in that case.
This sounds very interesting, but this is a long road with the corpses of a lot of startups alongside of it. Until they go public with details it is going to be hard to see whose ox is going to be gored by this and if it will actually make it out into the world, and if it does, in what form. There are a lot of players in spectrum that have the capability to buy and bury inconvenient technologies or the political wherewithal to block them.
Reminds me of the UWB (Ultra Wide Band) fiesta back in 2002-2005: more than 20 modulation schemes were proposed by a number of startup companies with pretty smart people, but sadly, practically none of them survived. The most popular UWB scheme was so-called MB-OFDM, similar to WiFi but using a faster keying speed (312.5ns symbols vs. WiFi's 4us), and thus wider subcarrier bandwidth. Anyway, even MB-OFDM UWB did not make a breakthrough as the proposed Wireless USB standard.
For comparison, the US digital TV standard requires about 15.1 dB of SNR to achieve reception in a Gaussian channel. US DTV uses a channel bandwidth of 5.3 MHz (guard bands bring this up to the 6 MHz channel width) and a net capacity of 19.29 Mb/s. The Shannon limit for a 5.3 MHz channel carrying 19.3 Mb/s is a minimum of 10.6 dB of SNR.
So this now-20-year-old standard is already only 4.5 dB from the Shannon limit. Unless Shannon's limit can be proved to have been violated, there ain't any 10 dB gains to be had here.
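Checking that arithmetic (a sketch using only the figures quoted in the comment above):

```python
import math

# US DTV figures quoted above: 19.29 Mb/s net capacity in a 5.3 MHz channel,
# with about 15.1 dB of SNR required for reception in a Gaussian channel.
spectral_eff = 19.29e6 / 5.3e6                    # bit/s/Hz
shannon_snr_db = 10 * math.log10(2 ** spectral_eff - 1)
margin_db = 15.1 - shannon_snr_db                 # distance from the limit
print(f"Shannon minimum SNR: {shannon_snr_db:.1f} dB")
print(f"Margin above the Shannon limit: {margin_db:.1f} dB")
```

The margin comes out to about 4.5 dB, consistent with the claim that a further 10 dB gain would have to come from somewhere other than better modulation.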
DVB-T2, the new European DTV standard, gets even closer. Last time I checked, it was ~ 3 dB from the Shannon limit.
So, all of this tells me that we're not looking at any "breakthrough in modulation." We're looking at refinements, much like DVB-T2 refined DVB-T1. Marginally better FEC codes, clever tricks on twisting the constellation, better interleaving, and so on. Small improvements that provide a small but measurable improvement.
Also, a significant point here. The purpose of OFDM is NOT to improve spectral efficiency. It is to improve resistance to multipath distortions. There's no such thing as a free lunch. What you pay, with OFDM, is moving away from the Shannon limit. So if a new modulation standard goes back to a single carrier approach, with improved equalizers, no one should be surprised. Equalizers benefit from Moore's law, after all. They are bound to improve over the decades.
Today's codes come very close to this (I have heard), so coded QAM4096 is roughly at capacity. It would be interesting to see which way this company goes, especially if they decide to toss the popular OFDM-QAM combination. Who would want them for a 1-2 dB improvement?
I guess the catch is that QAM4096 may be near impossible to implement once you take the practical issues (EVM, etc.) into account. I think the benefit must be that Magnacom has a scheme theoretically almost equivalent to coded QAM4096. That's a big deal.