Hardware considerations also seem to play a part in this evolution, one that is sometimes brushed aside in favor of theoretical equivalences. How a stable noise floor/zero is achieved could have a big influence: establishing a stable zero, rather than always trying to theoretically beat it, could be important.
Perhaps it depends a bit on intended use, but compared as discrete units, even an op amp of some kind processes more information, and in parallel, than a serial digital CPU and memory. In wave functions and Fourier analysis there is implicate and explicate order: all building-block operations are available at once and can be performed instantaneously together, without discretely building up a function one element at a time and iterating like a conventional computer. Total temporal serialization in a Turing system also has some drawbacks; the assertion that a working technology couldn't be many times more powerful than a Turing configuration sometimes seems rooted in the idea that analog and digital processing units are of equivalent power.
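As a loose illustration of the contrast being drawn (not a model of analog hardware), the same discrete Fourier transform can be written either as explicit element-by-element iteration or as one simultaneous matrix application. The function names `dft_serial` and `dft_matrix` are hypothetical, chosen just for this sketch:

```python
import cmath

# Serial formulation: build each output sample one element at a time,
# iterating the way a conventional stored-program computer does.
def dft_serial(x):
    n = len(x)
    out = []
    for k in range(n):              # one frequency bin at a time
        s = 0j
        for t in range(n):          # one input sample at a time
            s += x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
        out.append(s)
    return out

# "All at once" formulation: the whole transform written as a single
# matrix-vector product, loosely analogous to an analog network in
# which every basis operation is applied simultaneously.
def dft_matrix(x):
    n = len(x)
    basis = [[cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n)]
             for k in range(n)]
    return [sum(b * xt for b, xt in zip(row, x)) for row in basis]

# Both formulations compute the same transform; only the ordering of
# the work (serial iteration vs. one simultaneous application) differs.
signal = [1.0, 2.0, 0.0, -1.0]
assert all(abs(a - b) < 1e-9
           for a, b in zip(dft_serial(signal), dft_matrix(signal)))
```

Of course, a digital machine still evaluates the matrix form step by step under the hood; the point is only that the mathematical description collapses the iteration into one simultaneous operation, which is closer to how an analog network behaves.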
A deeper question for analog versus digital computation is whether there are things an analog computer can do that a digital computer cannot. In my experience, what matters to a formal bookkeeping framework of rules is tracking and recording the history of physically recursive structures. I have found no way to track such a history in relative real time without using both forms of computing, where analog and digital become a language of rational complements.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of them need to be smart enough to act upon data in real time, 24/7. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.