In an entry on the RocketBlog, Dave Orecchio of GateRocket notes that embedded processor cores for DSP, microcontroller, and microprocessor functions are becoming increasingly common in FPGAs.
Both Xilinx and Altera have long offered their own embedded soft IP cores, and a growing trend is to also support the integration of third-party hard IP cores, such as the PowerPC or ARM families. Gartner estimates that approximately 40 percent of all FPGA designs include embedded processors.
The combination of a flexible FPGA platform and a proven, high-performance core saves designers time and effort. And the FPGA suppliers are making it even more efficient through offerings such as Xilinx's Extensible Processing Platform for the ARM Cortex-A9, which makes it easier to customize the FPGA device for specific functional or design requirements.
But with the addition of larger and more complex embedded processors, combined with the complexity of leading-edge FPGA architectures overall, designers face unprecedented verification challenges. A 'fully loaded' FPGA with multiple embedded cores and millions of logic cells can bring a traditional simulation approach to its knees.
The density of these devices alone is daunting. But add the need to execute and verify software along with the hardware, and designers are looking at multi-million-cycle simulation challenges.
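To get a feel for the scale, consider a rough, illustrative calculation (the cycle counts and simulator throughput below are assumptions for the sake of the example, not measurements): even a minimal software boot sequence can take on the order of 10^8 processor cycles, while RTL simulation of a large, fully loaded FPGA design may only advance a few thousand cycles per second of wall-clock time.

```python
# Back-of-envelope: wall-clock time to simulate a software boot sequence
# at RTL. All figures are illustrative assumptions, not benchmarks.

boot_cycles = 100_000_000      # assumed ~10^8 CPU cycles for a minimal boot
sim_cycles_per_sec = 1_000     # assumed RTL-sim throughput for a large design

sim_seconds = boot_cycles / sim_cycles_per_sec
sim_hours = sim_seconds / 3600
print(f"Simulated boot time: ~{sim_hours:.0f} hours of wall-clock simulation")
```

Under these (hypothetical) numbers, a single boot takes on the order of a day of simulation, which is why running the software in simulation at the full-system level quickly becomes impractical.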
I agree with DrDSP. The merging of embedded software and FPGA domains should not automatically increase verification challenges.
When debugging an embedded system that uses an OTS processor, we don't try to verify the operation of the silicon. So why do we think we need to verify the operation of IP in an embedded FPGA system? Sure, if you are developing the IP in parallel with the software, then it may be necessary to perform some co-verification of software and hardware, but that should be done at a local level, not at the system level.
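The 'local' co-verification described above can be as simple as exercising the driver code against a behavioral model of the IP block, with no system-level simulation involved. A minimal sketch, assuming a hypothetical timer peripheral (the register map, bit fields, and names here are invented purely for illustration):

```python
# Toy HW/SW co-verification: a software driver exercised against a
# behavioral model of a hypothetical timer peripheral.
# Register offsets and bit semantics are invented for illustration.

class TimerModel:
    """Behavioral stand-in for the timer IP's register interface."""
    CTRL, LOAD, COUNT = 0x0, 0x4, 0x8   # hypothetical register offsets

    def __init__(self):
        self.regs = {self.CTRL: 0, self.LOAD: 0, self.COUNT: 0}

    def write(self, addr, value):
        self.regs[addr] = value
        if addr == self.CTRL and value & 0x1:   # ENABLE bit loads the counter
            self.regs[self.COUNT] = self.regs[self.LOAD]

    def read(self, addr):
        return self.regs[addr]

    def tick(self):
        """Advance one clock: decrement COUNT while enabled."""
        if self.regs[self.CTRL] & 0x1 and self.regs[self.COUNT] > 0:
            self.regs[self.COUNT] -= 1

def driver_start_timer(hw, ticks):
    """The 'software' side: program and start the timer via register writes."""
    hw.write(TimerModel.LOAD, ticks)
    hw.write(TimerModel.CTRL, 0x1)

# Local co-verification: run the driver against the model, then clock it.
hw = TimerModel()
driver_start_timer(hw, 5)
for _ in range(5):
    hw.tick()
assert hw.read(TimerModel.COUNT) == 0
print("driver/model co-verification passed")
```

The point is that the driver and the IP's register contract get checked against each other in seconds, without dragging the rest of the system into the simulation.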
Once the IP has been appropriately constrained and verified, it should be treated the same as any other piece of hard IP.
I think we need to be very careful about imposing the verification regimes of traditional FPGA flows on the embedded space.
I guess a related question is what level of simulation is required for an embedded FPGA. Since it is an FPGA, you can just verify the 'logic' to make sure it is working and then use your friendly evaluation board (or a prototype board) to run the software in real time. Maybe you even use a hardware emulator (remember those?). Are you going to find any 'software' bugs that impact the board-level design? Maybe I'm being old fashioned, but one of the advantages of using FPGAs is getting a prototype up and working quickly to iron out those last few bugs, instead of spending so much time verifying everything.
Is this the wrong approach?