For the 40G/100G Ethernet standard (IEEE 802.3ba), the PAR was approved in late 2007 and the standard ratified in mid 2010 — so about two and a half years. Three years later, 100G switching units are still only selling in small numbers.
Some of that may have been the economy throttling capital investment. But to some extent it might also be that streaming media, a big consumer of bandwidth, wasn't so popular back in 2010. All of that is changing, and the pace is sure to pick up.
As for what counts as fast track: the bandwidth study indicated that 400G would be needed by 2015. I would say that two years to get a standard and first products out would be the fast track.
Yes, it's a lot of data. Most of it goes optical over any real distance, but inside the central offices wired Ethernet is also used. The 40G copper PHYs are for the 1 to 10 m range, so rack to rack. I think 100G is strictly optical, but I'm not certain.
My opinion is that we'd better get the hell on it as quickly as possible. But how do you define fast track? What's the normal timeframe and process for establishing standards? What are the obstacles to speeding up the process?
That's a staggering amount of data flowing around there. Is that all optical, or are there wired systems capable of some of those data rates? I haven't paid much attention to the newer generations of Ethernet standards, so I'm not really sure where the cutover from copper to fiber occurs.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, but with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.