@ jaybus0 "How tightly packed are the CNTs? How does the extreme heat of vaporizing a metallic CNT not affect adjacent semiconductor CNTs?"
I queried the researchers about this, and the short answer is that density varies, and the metallic ones break down so quickly that they don't appear to damage adjacent ones.
Here is the long answer in their emailed response: "CNT density depends on growth recipes and/or transfer techniques. Typical CVD growth could range from 1-10 CNTs/um, and there have been reports of up to 100 CNTs/um. CNT density may be increased after growth through multiple CNT transfers. CNT sorting can also result in high CNT density. The breakdown temperature of a CNT is approximately 600C. Due to the extremely high thermal conductivity of the CNTs and extremely low thermal mass, the metallic CNTs break down very rapidly, greatly reducing the amount of heat which dissipates from the CNT and thus warms their surroundings. Ideal CNT density would be 100-200 CNTs/um, resulting in much more closely spaced CNTs. When the CNTs are brought closer together, heating effects from adjacent CNTs will increase. However, even with current CNT density, some of the CNTs still grow very close to each other, and we do not experimentally see a significant effect from adjacent CNTs in the breakdown process."
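To put those density figures in perspective, the average tube-to-tube pitch follows directly from the quoted CNTs-per-micrometer numbers. This is just a back-of-envelope conversion, not anything from the researchers:

```python
# Average CNT pitch implied by a given linear density.
# 1 um = 1000 nm, so pitch (nm) = 1000 / density (CNTs/um).
def pitch_nm(cnts_per_um):
    return 1000.0 / cnts_per_um

for density in (10, 100, 200):
    print(f"{density:>3} CNTs/um -> {pitch_nm(density):.0f} nm average pitch")
```

So the "ideal" 100-200 CNTs/um they mention works out to tubes roughly 5-10 nm apart, which is why adjacent-tube heating becomes a concern at that density.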
@rpcy "This demo runs one and only one instruction, the SUBNEG instruction, from which all other instructions can, in principle, be synthesized"
Thanks for the clarification. I guess we could say this proof-of-concept demo is the ultimate reduced instruction set computer. It reminds me of early Cray supercomputers, which used NAND gates to synthesize all their instructions.
The part about running 20 instructions from the MIPS instruction set is incorrect, or at least very misleading. This demo runs one and only one instruction, the SUBNEG instruction, from which all other instructions can, in principle, be synthesized. What the Stanford guys have done is really cool, but let's be clear about exactly what it was.
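For anyone unfamiliar with one-instruction-set computers, here is a minimal sketch of how a SUBNEG machine works and how an ordinary ADD can be synthesized from it. The memory layout and register choices are purely illustrative, not the Stanford team's implementation:

```python
# SUBNEG one-instruction-set computer (OISC) sketch.
# Each instruction is a triple (a, b, c):
#   mem[b] -= mem[a]; if the result is negative, jump to c,
#   otherwise fall through to the next instruction.
def run_subneg(mem, prog, pc=0, max_steps=1000):
    """Execute SUBNEG triples until pc leaves the program."""
    steps = 0
    while 0 <= pc < len(prog) and steps < max_steps:
        a, b, c = prog[pc]
        mem[b] -= mem[a]
        pc = c if mem[b] < 0 else pc + 1
        steps += 1
    return mem

# Synthesizing ADD (y += x) from two SUBNEGs, with t as scratch:
#   t = t - x  (t starts at 0, so t = -x)
#   y = y - t  (y = y + x)
mem = [7, 5, 0]          # mem[0]=x, mem[1]=y, mem[2]=t (scratch, zeroed)
prog = [
    (0, 2, 1),           # t -= x; jump target 1 is the next line either way
    (2, 1, 2),           # y -= t, i.e. y = y + x
]
run_subneg(mem, prog)
print(mem[1])            # prints 12
```

Multiplication, comparisons, and the rest of a conventional instruction set can be built up the same way, at the cost of many SUBNEGs per "real" instruction.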
10nm is already well into development at Intel, with all candidate process tools in place or set to be installed before the end of the year, and something like this would take many years to become viable. First, equipment vendors would have to be working on it for at least a couple of quarters. There are several steps involved, and working with quartz substrates may lead to issues.
Intel seems to think it can extend "traditional" CMOS to the 5nm node, which should be ramping up in Hillsboro in six years. However, this technology may actually qualify as traditional, so I cannot really comment on anything that far off. But 10nm is not going to bring CNTs to the desktop.
A single carbon nanotube could form a transistor channel as narrow as a single nanometer, but this technique uses many in parallel to form a single transistor channel by patterning at the lithographic limit of whatever process is being used. The researchers did not speculate on the node at which it would be prudent to implement their technology. Their next step is to characterize the speed and energy efficiency of their technique.
The metallic nanotube removal process is performed before the etching step which defines the standard cells. Here is what they told me about VLSI-compatible Metallic CNT Removal (VMR) in an email: "The process begins by depositing a special interdigitated layout structure on the wafer containing a mixture of metallic and semiconducting CNTs. These interdigitated fingers are patterned at the minimum lithographic pitch (parts of it will become the final source and drain contacts in the circuit). Electrical breakdown is performed once on the entire VMR structure, removing all metallic CNTs within the entire structure...After breakdown, sections of the VMR structure are etched out, leaving the contacts which will remain for the final circuit."
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.