A certain woman is a diabetic. She is now skilled at working with syringes, and at finding new places to give herself those shots. She is very grateful for the science that keeps her alive, and is diligent about following the rules.
Then science (an art) became technology (an engineering discipline), and she received an implanted insulin pump.
Now, at restaurants, she pulls out her wireless (yes, that's W I R E L E S S) remote pump controller, and uses it to prick her finger. She taps in chicken, potatoes, and ice cream, and the calculations are done. No more charts, no more disposing of needles, no more remembering to just do it.
And you have colored syntax and lint!!?!!
@RonCollett : we found your lost productivity!! It's spinning counterclockwise in the northern hemisphere!
@antiquus -- There are more tools available to help out at the RTL level than just text editors feeding into simulators & logic synthesis tools. There are decent linting tools, for example, that will warn you of all sorts of bad coding practices, and will also let you graphically view your hierarchy and track objects from one level to another.
I like your idea of an editor that would automatically add the mandatory (and mundane) stuff like wire & reg declarations. But you should be aware that even for old standby editors like vim, there are add-ons for Verilog syntax awareness -- nothing too fancy, but at least they display keywords in one color, numbers in another color, operators in yet another and so on.
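To make that concrete, here is roughly what such an add-on setup looks like for vim -- a minimal sketch only (vim ships a Verilog syntax file, so very little is needed; the `*.vlog` extension below is a made-up example of a nonstandard suffix you might need to map by hand):

```vim
" Minimal ~/.vimrc fragment: enable coloring and filetype detection.
" vim ships with a Verilog syntax file, so this alone gets keywords,
" numbers, and operators displayed in distinct colors.
syntax on
filetype plugin indent on

" *.v files are detected as Verilog out of the box; map nonstandard
" extensions (the *.vlog pattern here is just an example) by hand:
autocmd BufRead,BufNewFile *.vlog set filetype=verilog
```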
Not nearly up to the level of software development tools, but not entirely stuck in the Stone Age either.
And for real productivity gains in hardware design, certain types of designs lend themselves nicely to ESL tools -- even some tools in which the design entry method is drawing block diagrams rather than typing code in a text editor.
VHDL was modeled on Ada, and Verilog seems to be some combination of C and Cobol. I did not say that these languages are at the end of their usefulness (although some tweaks would be helpful), but proper tools would enhance the productivity of using them.
Why are there no compilers that work in real-time, as the code is entered? For modern C, C#, Java and other systems, all syntax is checked and errors displayed directly in the editor. Cross references can be checked, so that an "input" to a module is instantly tracked to its source in the calling module. This greatly reduces the number of fundamental errors, and can ensure consistent naming.
Another example would be that the editor should recognize that a (Verilog) "wire" declaration is needed that corresponds to an "assign", and simply insert it, or a proper "reg" declaration when the assignment happens inside an "always" -- and delete them when necessary as well.
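The wire-for-assign idea doesn't need much machinery to prototype. Here is a toy regex-based sketch in Python (my own illustration, not any shipping tool): it assumes simple single-bit nets and one statement per line; a real editor plugin would use a proper Verilog parser.

```python
import re

def missing_wire_decls(verilog_src: str) -> list[str]:
    """Return 'wire' declarations for assign targets that lack one.

    Toy sketch: treats anything already named in a wire/reg/input/output
    declaration as declared, and every 'assign <name> =' as a target.
    """
    declared = set(re.findall(r'\b(?:wire|reg|input|output)\s+(\w+)',
                              verilog_src))
    targets = set(re.findall(r'\bassign\s+(\w+)\s*=', verilog_src))
    return [f"wire {name};" for name in sorted(targets - declared)]

src = """
module top(input a, input b, output y);
  assign t1 = a & b;   // t1 is never declared
  assign y  = t1;      // y is a declared output
endmodule
"""
print(missing_wire_decls(src))  # ['wire t1;']
```

The same scan run in reverse (declared wires with no remaining assign) would drive the deletion side of the idea.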
Multiple files in a design should be displayed in a tree or network format. "dir" or "ls" just don't lend themselves to identifying dominant and subordinate files in a hierarchical design, and especially won't help in culling out just the call strings and linkage information. This is currently a form of tribal knowledge in the design team.
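As a rough illustration of that tree view, the hierarchy can be recovered from the instantiations themselves. A minimal Python sketch (the file names, regexes, and simple-Verilog assumptions are all mine; it ignores generate blocks, primitives, and parameterized `#(...)` instantiations):

```python
import re

def module_tree(sources):
    """Map each defined module to the modules it instantiates.

    Rough sketch: regex-matches 'Name instance_name (' at line starts
    and keeps only names that are themselves defined modules.
    """
    defined = {re.search(r'\bmodule\s+(\w+)', src).group(1): src
               for src in sources.values()}
    tree = {}
    for name, src in defined.items():
        insts = re.findall(r'^\s*(\w+)\s+\w+\s*\(', src, re.MULTILINE)
        tree[name] = [m for m in insts if m in defined and m != name]
    return tree

def print_tree(tree, root, indent=0):
    """Print the hierarchy as an indented tree, one instance per line."""
    print("  " * indent + root)
    for child in tree[root]:
        print_tree(tree, child, indent + 1)

files = {
    "top.v":   "module top(input c);\n  alu u0();\n  alu u1();\nendmodule",
    "alu.v":   "module alu();\n  adder a0();\nendmodule",
    "adder.v": "module adder();\nendmodule",
}
tree = module_tree(files)
print_tree(tree, "top")
```

Even this crude pass shows dominant and subordinate files at a glance, which is exactly the information "dir" and "ls" throw away.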
The fundamental problem here is that software developers make their own tools. Hardware developers are stuck with vi or gvim or (Heaven help us) edit, and don't know what they are missing. You have a billion-instruction-per-second machine there (thanks to the hardware guys that came before us), and it spends its day waiting for keystrokes (sort of reminiscent of Marvin the robot). Incremental simulation and synthesis should be a research topic at all the major vendors.
ahh, you can't even buy ICs any more.
We almost ALWAYS have to make changes because even 'standard' components are not available.
What a joke.
all these new fantastic 'press releases' for some pie-in-the-sky chip, but you can hardly find ANY inventory any more.
WHAT A JOKE.
Time to redesign using basic discrete components.
take your 'fancy' chip sets and go fly a kite !!!
So you are saying that your early design efforts can run down a rabbit hole, leaving the synthesis step to materially start over? Reliance on the designer's wisdom in this matter does not allow for scalable productivity. How many iterations do young designers require before they learn these quirks? Getting the bugs out during synthesis and validation when they should have been gone during simulation and verification is certainly a hindrance to better productivity.
I tend to disagree. Verilog/VHDL, though modelled on traditional software languages like C, were meant for hardware description, and there is only so much abstraction any language can support before it stops serving its actual purpose. I think even software engineers would agree with this!
Don't forget that VHDL and Verilog are languages that are used to describe physical logic. Many of the techniques used by logic designers help prevent problems in the physical logic. Coding that is good by software standards can cause problems when it is synthesized into logic. You may genuinely need a block in your chip that has 50 inputs, and that is okay.
It does not strike me as odd that the synthesizer and the simulator have different compilers, since they are used for two different objectives. Many things work fine in simulation that cannot be easily implemented in silicon. Sorting that out is the designer's job.
Much of the stagnation can be attributed to the toolsets. Yes, there have been major strides in tool development, but mostly in the area of broad integration and backend support, and not at the level of helping to understand the code itself. (Don't get me wrong -- some of the backend pieces are absolutely amazing!) At the front end, where the coder sits, there has been little effort at improving code entry. At both places where I worked with VHDL/Verilog, the code was edited by standard text editors. In my position as reviewer, I find myself sorting through multiple vi windows tracking this or that variable. I don't see why syntax-aware and otherwise helpful editors (like from www.scitools.com), or even compile-on-the-fly (like Visual Studio), are not more welcome in the hardware domain.
Coming from a software background (mostly C and C#), my impression of VHDL/Verilog is a bunch of ill-organized and poorly factored modules, and coding style/methods based as much on tradition and superstition as the worst software I've ever seen. Modules with 50 inputs are routinely accepted, and no one thinks this is hindering productivity? We won't even get to ideas like information hiding and other advanced concepts.
The languages require a schizophrenic combination of brute force and "oh, the compiler will optimize that away" thinking. Those coders that use the advanced features of the languages are clearly in the minority. Does it strike anyone else as odd that the simulator and synthesizer use different compiler _front_ ends? Syntax that works in one throws warnings and errors in the other! Duh!
Perhaps my impressions are wrong: my sample set is exactly '2' employers. But I see much the same on the comment boards.
When I refer to design productivity, the term "design" includes verification, as well as design (digital, analog, RF, etc.), layout, test, validation, qualification, etc. So my use of the term "design productivity" is a misnomer. (My bad.) It's really IC Development Productivity, which is the term our tools use to report the productivity metric. Development Productivity measures the aggregate productivity of an entire IC development project, from start-of-concept milestone to release-to-production milestone, and includes every activity occurring during that interval (except SW development, which has its own productivity metric -- SW Development Productivity).
But to your point, it is indeed true that verification productivity is not keeping pace with verification complexity. The evidence is in the size of verification teams -- they continue to grow, which means that verif. productivity is not keeping pace with verification complexity. In addition, verif. team size is growing non-linearly relative to other design activities on the project, which reflects that it is indeed becoming a more significant "bottleneck." Thanks for your comment. Ron