See, for example, the work of Steve Furber at the University of Manchester (UK) on the AMULET asynchronous ARM processors. There is also Prof. Hava Siegelmann's work on computing "beyond the Turing limit", so-called analog field computing. Other examples of clock-free computing exist. These days, the asynchronous emphasis seems to be on minimizing EMI rather than just average power consumption. I suppose if you had a low-noise, scrambled-spectrum asynchronous uC running your hot beverage maker, you could have a TEMPEST in a teapot.
Based on this article, I received an email from TODpix, which has a film coming out about Turing titled Codebreaker. A trailer for it can be found here: http://www.websandbox.co/codebreaker/index.html
Additionally, TODpix's new crowdsourced "Theater-on-Demand" platform allows individuals and groups to sign up their local theater for a viewing of select premiere movies, like Codebreaker (25+ seats guarantee a new screening).
In this world it is often a fact that people are treated according to how well they fit in. If one chooses a lifestyle that others find offensive, those others may take offense, and most folks choose their friends from among people who don't offend them. It just works that way. The trite truism "Birds of a feather flock together" is an example. So if individuals act in a manner that offends most people, they will probably not be treated as well.
It is true that asynchronous logic can be much faster than clocked logic, but in my experience it is much harder to design, for a number of reasons. Component variability is one very real and obvious obstacle, and variability with temperature is a particular pain.
One useful option is using "blocks", each asynchronous internally, but with clocking at the boundaries to keep all the paths in step. It really does work, and has the added advantage of leveling the power draw quite a bit. Unfortunately, those who paid for those designs are quite secretive about what they do.
But picture an elaborate software flow chart, with the different boxes working nearly instantly, but with a clock to tell the system to pass data. That is a gross simplification, but a valid explanation.
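That flow-chart picture can be sketched in software. The toy model below, in Python, is purely illustrative (all names are mine, not from any real design): each "block" is a function that does its work at its own pace, and a global clock edge is the only moment data moves between blocks, via registers.

```python
# Toy model of "asynchronous blocks, clocked handoff": blocks compute
# freely within a cycle; registers between them capture values only
# on the clock edge. Names and structure are illustrative only.

def clocked_pipeline(stages, samples):
    """Feed samples through a register-separated pipeline.

    stages  -- list of functions; each models one asynchronous block
    samples -- input values, one consumed per clock tick
    Returns the values emerging from the final register, in order.
    """
    regs = [None] * (len(stages) + 1)   # registers between blocks
    outputs = []
    ticks = len(samples) + len(stages)  # extra ticks to drain the pipe
    for t in range(ticks):
        # Clock edge: every register captures what its upstream block
        # produced during the previous cycle, all at once.
        new_regs = regs[:]
        new_regs[0] = samples[t] if t < len(samples) else None
        for i, stage in enumerate(stages):
            v = regs[i]
            new_regs[i + 1] = stage(v) if v is not None else None
        regs = new_regs
        if regs[-1] is not None:
            outputs.append(regs[-1])
    return outputs

# Example: two blocks -- square a value, then add one.
result = clocked_pipeline([lambda x: x * x, lambda x: x + 1], [1, 2, 3])
# result == [2, 5, 10]
```

However fast or slow each block runs internally, the system's behavior is set entirely by the clocked handoffs, which is exactly why this style is so much easier to verify than fully clock-free logic.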
It's good to remember Alan Turing for his genius and contribution, and for the sad way he was treated. And it's interesting to reflect on how the vast majority of designers are trapped in the synchronous design box. That's the only approach I learned about in school!
Alan Turing was a bright star indeed. We are all poorer for the way he was treated. He applied his formidable intellect to help his government win a war, and look how he was rewarded. This crazy world -- yes, pretty much the whole world -- still struggles to value humanity when it doesn't exactly fit some rigid, antiquated notion. It makes me sad and angry that such a gentle soul would be driven to end his own life by the very country he served. Thank you, Mr. Bailey, for reminding us.
In the 1960s I represented Dyad Systems of Columbia, MD. They were designing source data acquisition systems based on asynchronous circuits using "Dyads". Their technology had been patented by Carlo Fuastini of Silver Spring, MD. I believe I got the first and only order they ever received (from NSA). The CMOS Dyad ICs used were built by Solid State Scientific of Montgomeryville, PA.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.