The free market and many other natural systems are supposedly about the survival of the fittest. The survival of the best companies, the success of the best products, the best processors and ICs, and so on.
But what about a circuit that evolves through thousands of iterations, influenced by feedback, until it is optimized for a particular function?
I remember being excited back in the mid-1990s, reading a description of work by Adrian Thompson of the University of Sussex that made use of a Xilinx FPGA to perform the genetic design of evolvable hardware.
Thompson chose the task of evolving a circuit in the corner of an unclocked XC6216 FPGA to discriminate between 1-kHz and 10-kHz tones presented at an input — one or other tone in and a 1 or 0 out. The method was to treat the 1,800-bit string that is the configuration word for the FPGA as a genotype and then test random choices of genotype for their fitness to perform the task. The evolution comes from creating "offspring" genotype words using a genetic algorithm. After 3,500 iterations the circuit defined by the genotype had evolved and was performing the discrimination well (see Thompson's paper presented at the 1st International Conference on Evolvable Systems in 1996).
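The loop Thompson describes can be sketched in a few lines. This is a minimal, generic genetic algorithm over a 1,800-bit genotype, not Thompson's actual code: the population size, mutation rate, and crossover scheme here are assumptions, and the fitness function is a runnable toy stand-in for his real hardware-in-the-loop measurement, in which each genotype was loaded into the FPGA and scored on how well the output separated the two tones.

```python
import random

GENOTYPE_BITS = 1800        # size of the XC6216 configuration word in the experiment
POP_SIZE = 50               # hypothetical population size
MUTATION_RATE = 1.0 / GENOTYPE_BITS

def fitness(genotype):
    """Toy stand-in for the real test: in Thompson's setup, fitness was
    measured by configuring the FPGA with this genotype, applying the
    1-kHz and 10-kHz tones, and scoring the output. Here we simply
    reward set bits so the sketch runs on its own."""
    return sum(genotype)

def mutate(genotype):
    # Flip each bit independently with a small probability.
    return [b ^ 1 if random.random() < MUTATION_RATE else b for b in genotype]

def crossover(a, b):
    # Single-point crossover between two parent genotypes.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def evolve(generations):
    pop = [[random.randint(0, 1) for _ in range(GENOTYPE_BITS)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]          # keep the fittest half unchanged
        offspring = [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(POP_SIZE - len(parents))]
        pop = parents + offspring
    return max(pop, key=fitness)

best = evolve(20)
```

Because the fittest parents survive unmutated into each generation, the best fitness in the population can only improve over time, which is what drives the incremental optimization the article describes.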
And with automated iteration of the generations, it produced a circuit that occupied far less silicon, and was arrived at in far less time, than human design could achieve.
That is spectacular or, if you are an IC designer, perhaps a little worrying. The idea of evolvable hardware is clearly very powerful in design, but also in adaptive and self-repairing circuits.
Evolvable hardware seems like a fantastic idea. Yet it can also be scary. Hardware that is able to change itself to fit its environment seems to mimic humans. With help from neural networks in the software world, an intelligent being made of metal and plastic may soon be built.
Nonetheless, moving forward is inevitable. We just need to learn along the way and be cautious and responsible about what we do.
I think Erebus is on the right track in terms of what happened with this. Conventional hardware has just advanced so rapidly that the need to adapt to newer and faster processors has kept ahead of the need to optimize.
At some point, hardware may very well become so complex as to be unmanageable by humans. These advances may well slow or stop, and then optimization will become the top priority. At that point, techniques like self-evolvable hardware will become viable and possibly even necessary.
I think the whole concept was simply overtaken by events. Regardless of how versatile your evolvable technology is, it just could not compete with the pace of standard component improvements. Look at the performance you gain each year and then think about holding onto hardware for five or ten years. It just doesn't make sense and is clearly not cost-effective.
It's like reusable software. It's a great idea, but few people do it because of the rapid changes in language options and extensions.
Just my opinion.