I guess it would. I don't know why DOCSIS 3.1 uses OFDM; DVB-C2 does, too. It may be six of one and half a dozen of the other: since a lot of other systems use OFDM these days, it's easier to design a similar system. E.g. DVB-T2 (and T) use OFDM; DVB-C is single carrier, but DVB-C2 is OFDM and very similar to DVB-T2. That makes designing a chip to do both easier and cheaper.
"The addition of OFDM modulation and other coding techniques results in a 50 percent increase in throughput versus QAM channels across an equivalent bandwidth."
Is this only because the guard bands between 6 MHz channels can be slightly narrower?
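A rough back-of-envelope suggests guard bands alone can't explain it. The sketch below compares how much of the spectrum each scheme actually occupies versus the gain from higher-order modulation; the OFDM block size and guard figures are illustrative assumptions, not exact DOCSIS 3.1 spec values.

```python
# Back-of-envelope check: how much of the ~50% gain could come from
# narrower guard bands alone? Numbers below are illustrative assumptions.

# SC-QAM: each 6 MHz channel carries ~5.36 Msym/s (ITU-T J.83 Annex B),
# so the raised-cosine rolloff leaves roughly 11% of the channel unused.
scqam_symbol_rate = 5.360537   # Msym/s
scqam_channel     = 6.0        # MHz
scqam_occupancy   = scqam_symbol_rate / scqam_channel

# OFDM: one wide block with narrow edge guards; assume a 192 MHz block
# losing 2 MHz total to edge guard bands (assumed figure).
ofdm_block     = 192.0   # MHz
ofdm_guard     = 2.0     # MHz, assumed
ofdm_occupancy = (ofdm_block - ofdm_guard) / ofdm_block

guard_gain = ofdm_occupancy / scqam_occupancy - 1
print(f"Gain from spectrum occupancy alone: {guard_gain:.0%}")   # ~11%

# The bigger lever: LDPC coding lets OFDM subcarriers run 4096-QAM
# (12 bits/symbol) where SC-QAM tops out at 256-QAM (8 bits/symbol).
modulation_gain = 12 / 8 - 1
print(f"Gain from 4096-QAM vs 256-QAM:      {modulation_gain:.0%}")  # 50%
```

So on these assumptions the narrower guard bands buy you only on the order of 10%; most of the quoted gain would have to come from higher-order modulation enabled by the stronger LDPC coding.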
Use of OFDM is only really called for in environments with a lot of multipath interference. Are they having multipath problems in cable systems? Otherwise, a single-carrier scheme like QAM should give a measurably lower SNR threshold, so I don't see this unqualified statement that OFDM is desirable in cable systems as a slam dunk. Sounds like another tradeoff that needs to be mentioned, at least in passing.
Jim, it looks like there is a significant increase in upstream bandwidth and a smaller increase in downstream. Am I reading that correctly? Does that indicate a departure from the assumption that most traffic goes to the cable customers instead of from them?
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.