If Micron marketing does its job, in 3 or 4 years there will be an AP chip in every cell phone and tablet offloading facial and speech recognition. There will be multiple AP chips in every new automobile for collision avoidance. Finally, every robot, drone, and autonomous device will have a handful. These things will cost about $2 each, if the die sizes are to be believed.
The AP is not a computer, even though Micron calls it a processor. It is a computer peripheral, more like a GPU for patterns. There will never be thousands of programmers coding for it; rather, there will be a set of state-machine libraries to perform common API functions.
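To make the "GPU for patterns" analogy concrete: the AP's hardware effectively runs many nondeterministic finite automaton (NFA) states against the input stream at once. The sketch below is a toy software simulation of that idea, not Micron's actual SDK; all names here are illustrative.

```python
# Toy software model of the pattern-matching style the Automata Processor
# parallelizes in hardware: every currently active NFA state reacts to each
# input symbol in the same step. In the AP that step is a single memory cycle;
# here it is an ordinary loop.

def match_anywhere(transitions, start, accepting, data):
    """transitions maps (state, symbol) -> set of next states."""
    active = {start}
    for symbol in data:
        nxt = {start}  # keep the start state active: a match may begin anywhere
        for state in active:
            nxt |= transitions.get((state, symbol), set())
        active = nxt
        if active & accepting:
            return True
    return False

# An NFA that reports any occurrence of the substring "ab"
nfa = {
    ("s0", "a"): {"s1"},
    ("s1", "b"): {"s2"},
}
print(match_anywhere(nfa, "s0", {"s2"}, "xxabyy"))  # True
```

A state-machine library of the kind described above would hand application programmers prebuilt automata like this, so they never program the fabric directly.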
They do seem to have adopted the processor-on-DIMM technique we proposed about 6 years ago for our CPU-in-DRAM. http://www.venraytechnology.com/Implementations.htm
I don't know much about the Automata Processor (AP)... please forgive a novice question.
"Its design is based on an adaptation of memory array architecture"... that sounds more like the architecture of a CPLD. How does the AP architecture compare to that of a CPLD or FPGA?
I've lost count of how many novel and groundbreaking parallel processors I've written about in 20 years at EE Times that have died quiet deaths because no one could write code for them. Is this any different?
According to the article, the AP architecture is dataflow based. In the past, dataflow architectures were also proposed for network processing applications (e.g., the Xelerated dataflow network processor), but the main challenge has been the programming complexity and the programming restrictions of these architectures.
It also looks similar to the Transputer architecture, which targeted parallel computing.
However, this Automata Processor seems to include both an efficient programming framework/SDK and an efficient silicon implementation.
Processing big data sounds like an ideal application for parallel processing: huge quantities of data, each item of which needs to be processed through the same algorithms. I used a massively parallel processor (32,000 processors, as I recall) in 1979 for image processing and was amazed at the work that could be done with just a 1 MHz clocked system. The pixels in images are just a special case of big data.
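The data-parallel pattern described above, the identical rule applied independently to every pixel, is what lets thousands of simple 1 MHz processors outrun a fast serial machine. A minimal plain-Python sketch of that pattern (no SIMD hardware assumed, values are illustrative):

```python
# Same-operation-per-element processing: each pixel is handled by the
# identical rule, so on a massively parallel machine every element could
# be processed simultaneously rather than in this sequential loop.

def threshold(pixels, cutoff):
    """Binarize a row of grayscale pixel values."""
    return [255 if p >= cutoff else 0 for p in pixels]

image_row = [12, 200, 97, 180, 33]
print(threshold(image_row, 128))  # [0, 255, 0, 255, 0]
```

With no data dependence between elements, the work divides perfectly across processors, which is exactly why image pixels, and big data generally, suit parallel hardware.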
At most it is a new way of accessing memory; in digital systems, dealing with data is nothing but dealing with memory. Big data problems still demand a universal way to handle whatever is associated with them, but if this solution is claiming good results in certain directions, let's see how effective it actually becomes.