Why does CAST cheat on their results? I noticed that their 8051 partner is very young: they started their business in 2013, and they claim to have the world's fastest 8051, which (if it's true) seems to be less than 1% faster than DCD's 8051. It looks like young boys' boasting.
@Tomeq: Hi Max, could you tell the CAST folks that they made at least a few mistakes in their calculations...
Hi Tomeq -- looking at your comment, I don't think I need to say anything -- it seems to me that you've detailed everything nicely -- I look forward to seeing the response from the guys and gals at CAST (Happy Friday, by the way :-)
Could you tell the CAST folks that they made at least a few mistakes in their calculations? They wrote to you:
The Dhrystone score rates this new core at 26.85x the original at the same frequency, which just beats DCD's best 8-bit 8051 at 26.62x (..) Furthermore, our CPU size is about 7K gates, while DCD's is about 10K gates.
But the reality is different, as they made a few mistakes in this statement:
5. CAST: "Our partners achieved this by starting from scratch and applying several modern processor architecture design techniques". Good to know that CAST's partner, which has been on the IP Core market for just two years (since 2013; source: http://silesia-devices.com/company), started their work from scratch, while DCD builds on more than 15 years on the IP Core market (since 1999), having the world's fastest 8051 and 80251 (source: http://dcd.pl/page/147/about/)
Competition is always good, because it pushes us to achieve better results and thus to offer better products. We believe in the power of innovation; that's why we don't compare completely different competitors' IP Cores. I believe that the above examples explain the truth.
It's a good question. A bit like in the article, where the quote from CAST says it is 26.85x faster and only 7K gates. Yet if you look at CAST's web site, the 7K-gate version of the S8051XC3 is only 9.4x faster, and the 26.85x-faster version is actually 11.5K gates "minimum". That is, it will be much larger when synthesized to a higher frequency, or in an older process. It's a game all CPU vendors play.
@Sanjib.A: The 8051 cores are not only used for replacing microcontrollers. They are heavily used in modern designs (e.g., sensors, RF or analog front-end control and calibration, lightweight packet processing, housekeeping in complex SoCs, etc.) due to their small silicon footprint, energy efficiency, ease of use, and lack of royalties. The increased performance makes them suitable for a wider range of applications/tasks.
True, BUT the eSi-1600, being a 16-bit CPU based on a proprietary ISA, cannot run 8051 code. The nice thing about 8051 cores is that you can run your legacy code and exploit the fairly broad ecosystem (software stacks, IDEs, dev kits, etc.).
Also, I would be careful with publicly available specs for processors, especially when they are in the form of "up to 1.02 DMIPs/MHz" and "from 8.5K gates" (these are taken from the eSi-1600 page). I wonder what the area is when the performance is 1.02 DMIPs/MHz, or what the performance is when the area is 8.5K gates...
In the case of legacy software with software timing loops (that cannot be replaced for whatever reason), the best choice is an IP core that is explicitly designed to match the timing (i.e., the number of cycles per instruction) of the original architecture. CAST has such a core, the L8051XC1, if anyone is interested. It allows configuring the number of cycles per instruction, to allow timing compliance with more or less any 8051 architecture.