Today's codes come very close to this (so I have heard). So coding + QAM-4096 ~= capacity. It would be interesting to see which way this company goes, especially if they decide to toss the popular OFDM-QAM combination. Who would want them for a 1-2 dB improvement?
I guess the catch is that QAM-4096 may be near impossible to implement once you take the practical implementation issues into account (EVM, etc.). I think the benefit must be that Magnacom has a scheme theoretically almost equivalent to coded QAM-4096. That's a big deal.
For comparison, the US digital TV standard requires about 15.1 dB of SNR to achieve reception in a Gaussian channel. US DTV uses a channel bandwidth of 5.3 MHz (with guard bands bringing this up to the 6 MHz channel width) and a net capacity of 19.29 Mb/s. The Shannon limit for a 5.3 MHz channel carrying 19.3 Mb/s is a minimum required SNR of 10.6 dB.
So this now-20-year-old standard is already only 4.5 dB from the Shannon limit. Unless Shannon's limit can be proved to have been violated, there ain't any 10 dB gains to be had here.
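The arithmetic above is easy to check by inverting Shannon's formula. A minimal sketch in Python, using only the numbers quoted in the post (the 15.1 dB reception threshold is taken as stated):

```python
import math

# Shannon capacity: C = B * log2(1 + SNR).
# Invert it to find the minimum SNR for a given rate and bandwidth.
B = 5.3e6    # US DTV occupied bandwidth, Hz
C = 19.29e6  # net payload rate, b/s

snr_min = 2 ** (C / B) - 1
snr_min_db = 10 * math.log10(snr_min)

atsc_snr_db = 15.1  # reported reception threshold in AWGN (as quoted above)
gap_db = atsc_snr_db - snr_min_db

print(f"Shannon minimum SNR: {snr_min_db:.1f} dB")  # ~10.6 dB
print(f"Gap to Shannon:      {gap_db:.1f} dB")      # ~4.5 dB
```

This reproduces both figures: roughly 10.6 dB at the limit, leaving about a 4.5 dB gap.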
DVB-T2, the new European DTV standard, gets even closer. Last time I checked, it was ~ 3 dB from the Shannon limit.
So, all of this tells me that we're not looking at any "breakthrough in modulation." We're looking at refinements, much like DVB-T2 refined DVB-T1: marginally better FEC codes, clever tricks for twisting the constellation, better interleaving, and so on. Small refinements that add up to a modest but measurable gain.
Also, a significant point here. The purpose of OFDM is NOT to improve spectral efficiency. It is to improve resistance to multipath distortions. There's no such thing as a free lunch. What you pay, with OFDM, is moving away from the Shannon limit. So if a new modulation standard goes back to a single carrier approach, with improved equalizers, no one should be surprised. Equalizers benefit from Moore's law, after all. They are bound to improve over the decades.
Thanks. You are right: 4096-QAM is a 64x64 constellation, 2^6 x 2^6 = 12 bits/symbol. I also wrote "Friis equation" instead of Shannon... Maybe because I often refer to both equations to verify the "revolutionary communication methods" in the news :-)
I personally believe 11ac's 256-QAM is already pushing it a little too far.
As Shannon's equation, C = B*log2(1 + S/N), suggests, the straightforward way to increase data rate (C) is to use more bandwidth (B). This was tried and failed with UWB (initially 500 MHz/channel, with hopes of more than 5 GHz/channel in the future), and was partly applied in 11ac (up to 160 MHz/channel).
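That bandwidth-versus-SNR tradeoff can be sketched in a few lines of Python. The 20 dB SNR figure and the 20 MHz baseline channel are illustrative assumptions, not measurements:

```python
import math

def capacity_bps(bandwidth_hz, snr_linear):
    """Shannon capacity C = B * log2(1 + S/N)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

snr = 10 ** (20 / 10)  # 20 dB SNR (assumed for illustration)

base = capacity_bps(20e6, snr)   # 20 MHz baseline channel
wide = capacity_bps(160e6, snr)  # 11ac-style 160 MHz channel

# Capacity scales linearly with bandwidth: 8x the channel, 8x the rate.
print(wide / base)               # 8.0
# Doubling the SNR (+3 dB) instead buys only ~1 extra bit/s/Hz.
print(capacity_bps(20e6, 2 * snr) / base)
```

This is why widening the channel is so attractive: bandwidth pays off linearly, while SNR pays off only logarithmically.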
Using a higher carrier frequency naturally increases the bandwidth per channel, so there is high hope that we can achieve more than 1 Gbps data rates at 60 GHz (802.11ad). However, its laser-beam-like characteristics and extremely low penetration capability (you can only use 60 GHz with a clear line of sight) will limit its applications.
MIMO is another way to increase data rate, but more than 4x4 MIMO will be impractical, since we need a set of antenna / receiver / transmitter for each stream.
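A back-of-the-envelope sketch of why MIMO helps, and why each stream costs an RF chain. This uses the idealized rich-scattering rule of thumb that capacity scales with min(tx, rx) antennas; the SNR and channel width are illustrative assumptions:

```python
import math

def mimo_capacity_bps(bandwidth_hz, snr_linear, num_tx, num_rx):
    """Idealized spatial-multiplexing capacity: min(tx, rx) parallel
    streams, with total transmit power split evenly across them."""
    streams = min(num_tx, num_rx)
    return streams * bandwidth_hz * math.log2(1 + snr_linear / streams)

snr = 10 ** (25 / 10)  # 25 dB SNR (assumed)
b = 80e6               # 80 MHz channel (assumed)

for n in (1, 2, 4, 8):
    # Each NxN configuration needs n full antenna/radio chains.
    print(f"{n}x{n}: {mimo_capacity_bps(b, snr, n, n) / 1e6:.0f} Mb/s")
```

Note the diminishing return per added chain: splitting the power across more streams erodes the per-stream SNR, which is part of why going beyond 4x4 buys less than the hardware cost suggests.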
My most honest answer to "how we can get more datarate?" is "use wire!" :-)
This article, like many similar ones in the press over the last week, is an example of what you get when you have a marginally good idea and a lot of money to spend on unsubstantiated PR. There is only ONE patent issued in the company's name, plus several applications (not patents). The patent describes a partial signaling transmitter and a maximum-likelihood receiver implementation. It is NOT what the article claims, which strings together all sorts of unsupported superlatives. What a shame!!