I'm the "DARPA person" in question. They pay us to be paranoid about national defense, and there really are people in the world who do not wish the United States well. (I hope that doesn't surprise anyone here.) So keep that bias in mind.
But I do think there's a real issue here, and it's the main reason I finally gave in and decided to perform some government service for the past couple of years. The issue is that for several decades, if you wanted to field military electronics, you developed it at great cost, but once it was complete, only peer nation states could afford to do likewise. Nowadays, commercial off-the-shelf (COTS) electronics are very high performance, readily available, and inexpensive. So many more players besides peer nation states can make electronics with military implications. We still do develop electronics beyond COTS for U.S. military purposes, but there are times when using COTS is just the best we or anyone else can do.
When Moore's Law finally grinds to a halt, further advances in COTS will continue but at a far slower rate. Yes, there is low hanging fruit in SW, algorithms, 3D stacking, specialized processors, and other items I mentioned in my talk. But you cannot sustainably combine lots of onesies and replace an underlying exponential. One of the motivating ideas for my talk was that the U.S. must plan for the end of Moore's Law as though it will cause all players, not just peer nation states, to end up with the same HW capabilities. That would drastically reduce some of the advantages the U.S. has long enjoyed in certain militarily relevant arenas. -Bob Colwell
I find it hard to accept the DARPA person's opinion that US national security is threatened if Moore's Law comes to an end. There is plenty more to do beyond scaling that has not received the same attention! Someone already commented about "More Than Moore," and there are quite a few challenges remaining in circuit boards and substrates.
One argument for it all coming to an end is that FinFETs get us from 20 nm, which was the end of the line for planar transistors, to 5 nm, before they too break down, and there is no new switch even on the drawing board. It took FinFETs 20 years to go from initial studies to first manufacture. That could mean it will take 20 or more years to get beyond 5 nm.
Of course, brute-force scaling is not the only way forward. We still have all manner of "More than Moore" ways to improve microelectronic devices. And the end of Moore's Law doesn't mean a total end to scaling, just a slowing of the rate: the next node may take a few years instead of just 18 months.
When the number of process steps suddenly has to double, it may no longer make economic sense to advance to the next node with the traditional 30% linear shrink: a 30% shrink only roughly halves the area per transistor (0.7^2 ≈ 0.49), so a doubling of process cost would leave the cost per transistor about flat. The shrink may have to be at least 35%; the maximum, of course, is 50%. So 28 nm should be followed by 18 nm instead of 20 nm, for example.
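The shrink arithmetic in that comment can be sketched in a few lines of Python. This is purely illustrative (the function name and the 28 nm starting point are just from the comment's own example): a "30% shrink" means the next node's linear dimension is 70% of the current one.

```python
def next_node(current_nm, shrink):
    """Linear dimension of the next node after a given fractional shrink.

    E.g. a 30% shrink from 28 nm gives 28 * 0.70 = 19.6, i.e. the ~20 nm node.
    """
    return current_nm * (1 - shrink)

# From 28 nm: a 30% shrink lands near 20 nm, a 35% shrink near 18 nm,
# and the maximum 50% shrink at 14 nm.
for shrink in (0.30, 0.35, 0.50):
    print(f"{shrink:.0%} shrink: 28 nm -> {next_node(28, shrink):.1f} nm")
```

Note that density scales with the square of the linear shrink: at 30%, area per device falls to 0.7^2 ≈ 0.49 of the previous node, which is why a doubled process cost cancels the economic benefit unless the shrink is more aggressive.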
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act on data in real time, 24/7. Are the design challenges the same as with embedded systems, just with some developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.