Every year DAC seems to take on a theme or two. Sometimes it is because of a recent product announcement that creates buzz, or a new kid on the block that attracts attention.
Last year Cadence kind of stole the show with their EDA360. Not all of the talk was positive, but it is what everyone was talking about and how company XYZ had already solved all of the problems that Cadence had outlined. Another theme was assertions, with several new companies displaying interesting tool offerings.
For this year, one area that is definitely going to be a theme is FPGA-based prototyping. This is heating up into a must-have tool in everyone's arsenal, and there are several ways to attack the problem, which makes it interesting to see how each of the vendors will battle for their turf.
In addition, some of the problems that have restricted its adoption in the past are being solved, and there will be plenty of companies showing new boards; new software that adds capabilities such as debug to those boards; new integrations between prototyping and emulation, or between prototyping and virtual prototypes; and better or different tradeoffs between quality of results, turn-around time, extensibility and many other factors.
FPGA-prototyping board from the Dini Group
So what are the basics to look for? Some of the things are fairly common between the various prototyping boards, be they the low-end cards or the high-end offerings. They will all have several of the recent generation of FPGAs on them. Most will have Xilinx, but a few go with Altera. Some will be fixed, and others will be flexible in that you can plug in however many FPGAs you need.
But the biggest differences are in the ways that the FPGAs are connected together, and this makes a world of difference when it comes to the software that gets the design ready for execution. As soon as the design no longer fits in one FPGA, it has to be partitioned across however many FPGAs exist or are needed.
But what do you put in each FPGA, and where do you draw the partition lines? Part of the consideration is how many wires go between each pair of FPGAs and the way in which they are configured. Some solutions connect only nearest neighbors; others run connections between every pair of FPGAs, but in doing so leave fewer wires available between any two. If you do not fully populate the board, do you lose the ability to use various pins on any of the FPGAs? The list of possibilities goes on.
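To make the partitioning tradeoff concrete, here is a toy sketch of the cost function the partitioning software is wrestling with. All block names, net names and partitions below are invented for illustration; real tools work on multi-million-gate netlists, but the quantity being minimized is the same: the number of nets that must cross between chips.

```python
# Toy illustration of multi-FPGA partitioning cost: count the nets
# whose endpoints land in more than one FPGA. Every cut net consumes
# scarce board-level wires between the chips.

def cut_nets(nets, partition):
    """Count nets that span more than one FPGA under a given assignment."""
    return sum(1 for blocks in nets.values()
               if len({partition[b] for b in blocks}) > 1)

# Hypothetical netlist: each net connects a set of design blocks.
nets = {
    "bus":  {"cpu", "mem"},
    "gfx":  {"gpu", "mem"},
    "pcie": {"cpu", "io"},
    "dma":  {"gpu", "io"},
}

# Two candidate assignments of blocks to FPGA 0 and FPGA 1.
partition_a = {"cpu": 0, "mem": 0, "gpu": 1, "io": 1}
partition_b = {"cpu": 0, "gpu": 0, "mem": 1, "io": 1}

print(cut_nets(nets, partition_a))  # → 2 (gfx and pcie cross chips)
print(cut_nets(nets, partition_b))  # → 4 (every net crosses chips)
```

Both partitions put two blocks on each FPGA, yet one needs twice as many inter-chip wires as the other, which is why the board's interconnect topology and the partitioner's quality matter so much.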
Why is this so important? It is related to Rent's rule, which describes the relationship between the number of interconnections that a block of functionality has and the number of logic gates necessary to implement the function. Basically, many designs are interconnect constrained. One solution to this problem is to employ time-division multiplexing so that multiple signals can be sent across the same physical wire, but this can slow down the execution speed of the board and may make the board more expensive.
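As a rough illustration of why designs become interconnect constrained, Rent's rule is commonly written T = t · g^p, where T is the number of terminals a block of g gates needs, and t and p are empirical constants. The sketch below plugs in typical textbook values for t and p — these are assumptions for illustration, not figures from any particular design or board — to estimate a partition's I/O demand and the time-division-multiplexing factor needed when the board supplies fewer physical wires.

```python
import math

# Rent's rule sketch: T = t * g^p estimates how many terminals (I/O
# signals) a block of g gates wants. The constants t and p below are
# typical published ranges (t around 3-5, p around 0.6-0.75), assumed
# here for illustration only.

def rent_terminals(gates, t=4.0, p=0.65):
    """Estimated terminal count for a block of `gates` logic gates."""
    return t * gates ** p

def tdm_factor(gates, physical_wires, t=4.0, p=0.65):
    """Logical signals that must share each physical board wire."""
    return math.ceil(rent_terminals(gates, t, p) / physical_wires)

demand = rent_terminals(2_000_000)   # a ~2M-gate partition
print(round(demand))                 # on the order of tens of thousands
print(tdm_factor(2_000_000, 1200))   # many signals per wire if the
                                     # board offers only ~1200 wires
```

With these assumed constants, a couple of million gates wants far more terminals than any realistic inter-FPGA connection provides, so dozens of logical signals end up multiplexed onto each wire, and every extra multiplexing step eats into the prototype's clock rate.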
Performance is one of the primary reasons to use an FPGA prototype, as it enables earlier execution of software. How does IP, such as a processor, get integrated into the prototype? You should closely examine the tradeoffs that a solution has made between performance, cost, flexibility and many other issues. There is no one right solution for every situation. Also make sure that the extensibility of the solution will work for you both today and at least for your next design, in terms of the gate counts it can handle and its ability to plug in custom pieces, such as a radio receiver or a GPS unit.
Debug has been one of the constraints on adoption. Adding debug slows down the prototype and generally means that the design has to be recompiled, which in turn means a long wait while place and route is performed. Basically, changing debug instrumentation has not been an interactive process. Several companies are now releasing debug solutions, and many of these have reduced the turn-around time considerably. I know that there will be even more announcements before DAC on this subject.
The other thing that I believe is going to be vital, as we put together complete ESL flows, is the emergence of hybrid prototypes. A hybrid prototype is one in which some parts of the design run in an FPGA prototype and other parts in a virtual prototype. End users would see no difference in how the design was distributed across the prototype, except perhaps in the types of debug available in certain areas of the design. So we could expect to see IP and other fairly stable items running in the FPGAs, while the architecture and the new capabilities run in a virtual model. Another possibility is running the verification model virtually and using it to compare against the implementation model running in the FPGA prototype. Some vendors are pushing into this area, but there are still issues that have to be solved to make it fully viable.
Synopsys has been the technology leader in this field for a while, based on a number of acquisitions it made in both the hardware and software areas. Cadence recently talked about some of its new and upcoming offerings, which make a different set of tradeoffs. At DesignCon there were new offerings both in prototyping boards and in the software that programs them, debugs them and turns them into more flexible solutions. Other companies will be making announcements in the weeks leading up to DAC. One thing is for sure: you will see more FPGA-prototype solutions being talked about on the show floor this year than ever before, and I am sure there will be at least one solution that fits your needs.
Brian Bailey (http://brianbailey.us) – keeping you covered