At OFC/NFOEC, I had an opportunity to hear both Dr. Henry Dardy, a senior technologist and Navy Chief Scientist at the DoD, and Dr. Larry Smarr, professor of computer science and engineering at UCSD and Director of its CAL-IT2 project.
Putting a challenge to the industry, to achieve terabits on a single wavelength and to eliminate the “best effort” mentality, Dardy predicted that we will reach 100 Gigabit Ethernet within a year or so, 500 Gigabit Ethernet by 2010, and a terabit by 2015. He argued that vendors are hamstrung by Wall Street, and that the market will come along only when the concepts are completely proven in optical testbeds.
Smarr, however, brought it home. He showed an array of actual optical applications: digital cinema, which is four times high-definition resolution; remote underwater observatories designed to study deep-sea vents and their behavior in real time; and the ability for US scientists to use the CERN particle accelerator remotely. All of these depend on interactive visualization of massive data sets.
A great example of how the technology works, and yet is stopped short in practice, is aerial imagery at 1-foot resolution. The technology was ready when Katrina hit, and the ability to see the damage at that resolution was amazing. Yet it took 10 days to receive the data; over a 10 Gigabit link, it would have taken an hour.
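The arithmetic behind those figures is worth a quick sanity check. Here is a rough back-of-the-envelope sketch in Python; the dataset size is inferred from the article’s numbers (an hour at 10 Gb/s implies roughly 4.5 terabytes), not a published figure.

```python
# Back-of-the-envelope check of the Katrina imagery figures.
# Assumption (inferred, not from the talk): "an hour at 10 Gigabit"
# implies a dataset of about 10 Gb/s * 3600 s = 36 terabits (~4.5 TB).

GBPS = 1e9  # bits per second in one gigabit

def transfer_hours(dataset_bits: float, link_bps: float) -> float:
    """Idealized transfer time in hours, ignoring protocol overhead."""
    return dataset_bits / link_bps / 3600.0

dataset_bits = 10 * GBPS * 3600  # ~36 Tb, the size implied by the article

print(f"At 10 Gb/s: {transfer_hours(dataset_bits, 10 * GBPS):.1f} hours")
# The 10-day delivery implies an effective rate of only ~42 Mb/s:
print(f"Effective rate over 10 days: "
      f"{dataset_bits / (10 * 86400) / 1e6:.0f} Mb/s")
```

In other words, the 10-day delivery amounted to an effective throughput of about 42 megabits per second, a small fraction of what a single 10 Gigabit wavelength could have carried.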
So close, and yet so far.