It sounds like they have aimed for a level of abstraction that is above RTL but below untimed, algorithmic HLS. This could be very appealing, but will designers warm up to the idea of capturing their designs in a new, proprietary language?
Thank you for your comment; this is an interesting question. We hope to find some early adopters for whom this won't be an issue, and to work with them to evolve the language until we feel comfortable it is ready for standardization.
It looks very appealing for small to medium projects in companies that can't afford big tool chains from Synopsys or others. Is the simulator included? Is it possible to simulate a mixed design with a C~ top level and some VHDL/Verilog IPs? What about co-simulation?
I see three factors that might afflict Synflow but I think you are right to highlight the language issue. Proprietary languages often struggle for adoption.
The question has to be how C~ compares with offerings such as SystemVerilog and, perhaps to a lesser degree, SystemC, but also how the IDE can connect with descriptions in those languages.
A second factor is graphical design. Textual languages have dominated the IC design area for decades, despite previous attempts to introduce graphical design front-ends, which failed to gain traction.
A third factor is Synflow's avowed intent to steer clear of software. This is a bit of a double-edged sword. On the one hand, they are probably right that there are two distinct audiences: system/software developers who are not interested in hardware, and hardware engineers. However, high-level hardware engineers have long had to keep an eye on the software; often they are choosing whether something should be software-programmable or accelerated in hardware.
So there are fundamental questions to be addressed, but all could be resolved in Synflow's favor if this tool produces high-QoR designs in a fraction of the time of conventional design AND it can pass designs over to an SoC RTL-to-gate-level design process.
I run a large research program that standardized on Bluespec. We still use it, but when we have collaborators working from home it is tough to get them licenses, and Chisel's open-source nature works well in that scenario.
Chisel also lets you use multiple models of computation, which is convenient.
Both generate Verilog, so the existing tool flow does not change. Frankly, you need to be a hard-core masochist to use Verilog, VHDL, or SystemVerilog! The HDL community has been a bit resistant to the idea of higher levels of abstraction.
By the way, most research programs use one of these newer languages. I almost never see Verilog or SystemVerilog.
U Penn as part of the Crash Safe DARPA project uses Bluespec. Link to CPU source.
I agree there has been a reluctance to move up to higher levels of abstraction, but that is partly because HLS does not really work. It is definitely not sufficiently automated.
So you end up modeling at a high level, then designing at a lower level, and then spending a lot of effort trying to gain confidence that what you modeled at the high level is equivalent to what you are designing at the lower level.
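To make that model-versus-design gap concrete, here is a minimal, hypothetical sketch in Python (not a real EDA flow): a behavioural adder "model" checked exhaustively against a gate-level ripple-carry "design". Exhaustive checking is trivial at this toy width, which is exactly what stops scaling to real designs.

```python
# Toy illustration of the high-level vs low-level equivalence problem:
# a behavioural model of a 4-bit adder vs a gate-level ripple-carry
# design, checked exhaustively. All names here are hypothetical.

WIDTH = 4

def model_add(a, b):
    """High-level model: integer addition, modulo 2^WIDTH."""
    return (a + b) % (1 << WIDTH)

def full_adder(a, b, cin):
    """One-bit full adder built from Boolean operations."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    return s, cout

def design_add(a, b):
    """Low-level design: ripple-carry chain of full adders."""
    carry, result = 0, 0
    for i in range(WIDTH):
        s, carry = full_adder((a >> i) & 1, (b >> i) & 1, carry)
        result |= s << i
    return result

# Exhaustive equivalence check: feasible at 4 bits, infeasible at SoC scale.
assert all(model_add(a, b) == design_add(a, b)
           for a in range(1 << WIDTH) for b in range(1 << WIDTH))
print("model and design are equivalent")
```

At real design sizes this exhaustive loop is replaced by formal equivalence checking, which is where the effort the comment describes goes.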
And why has HLS not worked?
Here is my argument.
Because unlike RTL-to-gate-level synthesis, where designers were willing to give up messing about with transistors, gates, and cell dimensions, there was no obvious set of constraints designers would accept (even at the IP core level).
And that is because there is no equivalent to the NAND completeness of logic at the gate level. NAND (and NOR) gates are functionally complete: any Boolean function can be built from NAND gates alone, with DeMorgan's theorems supplying the conversions between AND/OR forms. Being constrained to a limited set of gates still allows all digital logic to be created, and it is precisely the constraint of using those predetermined gates that allows synthesis to work.
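The universality this argument leans on is easy to check by hand. A minimal Python sketch (truth-table verification only, nothing to do with any synthesis tool) builds NOT, AND, OR, and XOR from NAND alone:

```python
# NAND is a universal gate: every basic Boolean operator can be
# expressed using only NAND. Verified exhaustively below.

def nand(a, b):
    return not (a and b)

def not_(a):          # NOT x  ==  NAND(x, x)
    return nand(a, a)

def and_(a, b):       # AND    ==  NOT(NAND(a, b))
    return nand(nand(a, b), nand(a, b))

def or_(a, b):        # OR via DeMorgan:  a OR b == NAND(NOT a, NOT b)
    return nand(nand(a, a), nand(b, b))

def xor(a, b):        # XOR from four NANDs
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

# Exhaustively compare against Python's built-in operators.
for a in (False, True):
    for b in (False, True):
        assert not_(a) == (not a)
        assert and_(a, b) == (a and b)
        assert or_(a, b) == (a or b)
        assert xor(a, b) == (a != b)
print("all gates match")
```

The point of the comment is that nothing analogous exists one level up: there is no small "universal" set of IP cores playing the role NAND plays for gates.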
There is no equivalent limited set of IP cores that could create all imaginable logic circuits, so engineers are reluctant to accept any constraints on either IP cores or their dimensions, and there is no obvious way to guide a synthesis engine.
Until we have automated, provable synthesis of as-yet-unimagined logic, EDA is stuck working at multiple levels of abstraction and struggling to prove their equivalence.