Back in the 1990s, some interesting work was done on hardware that could evolve to perform a function using genetic algorithms and feedback. So what happened? And could it have a role to play today?
The free market and many other natural systems are supposedly about the survival of the fittest. The survival of the best companies, the success of the best products, the best processors and ICs, and so on.
But what about a circuit that evolves through thousands of iterations, influenced by feedback, until it is optimized for a particular function?
I remember being excited back in the mid-1990s, reading a description of work by Adrian Thompson of the University of Sussex that used a Xilinx FPGA to perform the genetic design of evolvable hardware.
Thompson chose the task of evolving a circuit in the corner of an unclocked XC6216 FPGA to discriminate between 1-kHz and 10-kHz tones presented at an input: one or the other tone in, and a 1 or 0 out. The method was to treat the 1,800-bit string that forms the configuration word for the FPGA as a genotype, and to test randomly chosen genotypes for their fitness at performing the task. Evolution comes from creating "offspring" genotype words using a genetic algorithm. After 3,500 iterations the circuit defined by the genotype had evolved to the point where it performed the discrimination well (see Thompson's paper presented at the 1st International Conference on Evolvable Systems in 1996).
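As a rough illustration of the loop involved, here is a minimal sketch in Python of a genetic algorithm operating on an 1,800-bit genotype. The fitness function, population size, and selection scheme are placeholders I have assumed so the sketch runs stand-alone; in the real experiment, fitness was measured by loading each configuration into the FPGA and observing how well its output separated the two tones.

```python
import random

GENOTYPE_BITS = 1800      # size of the configuration bit-string used as the genotype
POPULATION_SIZE = 50      # illustrative only, not Thompson's exact setup
MUTATION_RATE = 1.0 / GENOTYPE_BITS
GENERATIONS = 3500

def random_genotype():
    """A random configuration bit-string."""
    return [random.randint(0, 1) for _ in range(GENOTYPE_BITS)]

def fitness(genotype):
    """Placeholder. In the real experiment the bits were loaded into the FPGA
    and fitness came from how well the output discriminated 1-kHz from 10-kHz
    tones; here an arbitrary stand-in metric lets the loop run on its own."""
    return sum(genotype)

def crossover(parent_a, parent_b):
    """Single-point crossover producing one offspring genotype."""
    point = random.randrange(1, GENOTYPE_BITS)
    return parent_a[:point] + parent_b[point:]

def mutate(genotype):
    """Flip each bit with a small probability."""
    return [bit ^ 1 if random.random() < MUTATION_RATE else bit
            for bit in genotype]

def evolve():
    population = [random_genotype() for _ in range(POPULATION_SIZE)]
    for _ in range(GENERATIONS):
        # Rank the population by fitness and keep the better half as parents.
        population.sort(key=fitness, reverse=True)
        parents = population[: POPULATION_SIZE // 2]
        # Breed mutated offspring from randomly paired parents.
        offspring = [mutate(crossover(random.choice(parents), random.choice(parents)))
                     for _ in range(POPULATION_SIZE - len(parents))]
        population = parents + offspring
    return max(population, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("Best fitness after", GENERATIONS, "generations:", fitness(best))
```

The essential point is that the configuration bits are never designed directly: candidate bit-strings are ranked, recombined, and mutated, generation after generation, until a fit individual emerges.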
And with automated iteration of the generations, it produced a circuit that occupied far less circuitry, and did so in far less time, than could be achieved by human design.
That is spectacular or, if you are an IC designer, perhaps concerning. The idea of evolvable hardware is clearly very powerful for design, but also for adaptive and self-repairing circuits.
Having just revisited the article at https://www.damninteresting.com/on-the-origin-of-circuits/ to validate my memory that the story contained aspects of "solutions without explanation", I next searched for a follow-up and found this page.
For me, the answers I see here are a case of not seeing the trees for the forest. My interest in what happened relates to the following:
1) If this AI produced working solutions that humans cannot understand, as to why or how they work, isn't that itself extremely interesting with regard to the knowability of the consequences of AI?
2) From the article "Five individual logic cells were functionally disconnected from the rest— with no pathways that would allow them to influence the output— yet when the researcher disabled any one of them the chip lost its ability to discriminate the tones."
It would seem to me that those two points trump any issues relating to cost or speed of alternate design and manufacturing options.
Evolvable hardware seems like a fantastic idea. Yet it can also be scary. Hardware that is able to change itself to fit its environment seems to mimic humans. With help from neural networks in the software world, an intelligent being made of metal and plastic may soon be built.
Nonetheless, moving forward is inevitable. We just need to learn along the way and be cautious and responsible about what we do.
I think Erebus is on the right track in terms of what happened with this. Conventional hardware has just advanced so rapidly that the need to adapt to newer and faster processors has kept ahead of the need to optimize.
At some point, hardware may very well become so complex as to be unmanageable by humans. Advances may then slow or stop, and optimization will become the top priority. At that point, techniques like self-evolvable hardware will become viable and possibly even necessary.
I think the whole concept was simply overcome by events. Regardless of the versatility of your evolvable technology, it just could not compete with the pace of standard component improvements. Look at the power you gain each year, and then think about holding onto hardware for five or ten years. It just doesn't make any sense, and it is clearly not cost-effective.
It's like reusable software. It's a great idea, but few people do it because of the rapid changes in language options and extensions.
Just my opinion.