Good blog content. FPGA-based prototyping has become mainstream, but using these platforms is still very difficult. I'm sorry you were disappointed with what you saw at DAC, but there is much to improve in this area. Unfortunately, not enough information gets out to the engineering community about who's got what. As Mick posts above, Synopsys has had co-simulation capability on their Chip-It (HAPS 6000) platform for a few years...at least.
At InPA we too have co-simulation capability, where the user's RTL test bench drives the design in the FPGAs. In fact, we interface to all popular RTL simulators. In our flow, this is used to help verify that your design running in the FPGAs functions as the test bench expects, addressing your items #2 and #4. In our methodology, co-simulation is important in that it transfers checkpoints from the simulation test bench to our Embedded Micro Machines (EMMs), giving the in-circuit debug flow a more reusable and qualified test plan.
What engineers tell us they'd really like is a debug capability that looks at the system view of the design... and not just the individual logic states. By system view they mean debug technology that can track overall datapath activity, stimulated by I/O and controlled and monitored by the firmware.
Hi Brian -- I also noted that FPGA-based prototyping debug was a focus at this year's DAC. I wanted to mention that Synopsys offers a co-simulation mode designed just as you described; they were demoing it in their suite. It enables a DUT running in the HAPS FPGA-based prototyping hardware to be validated against its original simulation testbench. Doing this block by block will obviously reduce the number of surprises later as you integrate the blocks with each other in the system-level prototype. It's easier to debug as well, since you've already validated each block's operation individually before the integration.