Some people are quick to pronounce EDA dead. In fact, it is in a period of transition: it can either grow larger than ever or shrink to a quarter of its size.
In a viewpoint published on March 16 (www.edadesignline.com/news/198001821), Paul McLellan, vice president of marketing at Virtutech, starts his dissertation with a three-word sentence: "EDA is dead." I agree with most of what he says in the first half of the article. I have said the same things before, both on this website and, even earlier, on the gabeoneda site. But I do not agree with his conclusion. Many analysts and industry observers are suddenly speaking about multiprocessing as if it were a new invention. The fact that both Intel and AMD have introduced dual- and quad-core processor chips has them resurrecting the parallel programming issue once again. Paul comes to the conclusion that multiprocessor chips will be ubiquitous and will require the invention of new development environments and, although he does not specifically state this in the article, new programming methods.
The problem is that parallel execution gains little from doing the same thing on many identical processors; it is best employed by doing a number of different things concurrently. So the most widely used system of the future will not be a chip containing a number of equivalent processors; it will be a network on chip containing a number of functionally specific processors. A floating-point unit, a matrix-solving processor, a number of graphics engines coordinated by a display management chip, one or more DSP engines, a number of memory management units, and a CPU can all coexist on a network on a chip.
The application software required for each type of processor is different but fairly sequential in nature, so programming methodology does not need to change. I agree that someone with a brain that can handle concurrent events must develop the system management software (or hardware, for that matter), but this task is much simpler than writing an entire application for parallel execution. In addition, special-purpose processors are more efficient at what they do than a general-purpose CPU.
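The argument above can be made concrete with a small sketch: each "processor" runs its own sequential, functionally different job, and only a thin coordination layer has to reason about concurrency. This is an illustrative analogy only; the task names and functions are hypothetical stand-ins, not anything from the article.

```python
# Task-level (functional) parallelism: independent, functionally different
# jobs run concurrently, the way a network on chip would dispatch work to
# specialized processors. Each worker function is plain sequential code.
from concurrent.futures import ThreadPoolExecutor

def dsp_filter(samples):
    # stand-in for a DSP engine: a crude two-tap moving average
    return [sum(samples[i:i + 2]) / 2 for i in range(len(samples) - 1)]

def matrix_trace(matrix):
    # stand-in for a matrix-solving processor
    return sum(matrix[i][i] for i in range(len(matrix)))

def render_stats(pixels):
    # stand-in for a graphics engine
    return {"min": min(pixels), "max": max(pixels)}

with ThreadPoolExecutor() as pool:
    # the coordination layer: the only place concurrency appears
    f1 = pool.submit(dsp_filter, [1.0, 3.0, 5.0, 7.0])
    f2 = pool.submit(matrix_trace, [[2, 0], [0, 3]])
    f3 = pool.submit(render_stats, [12, 200, 45])

print(f1.result())  # [2.0, 4.0, 6.0]
print(f2.result())  # 5
print(f3.result())  # {'min': 12, 'max': 200}
```

Note that none of the worker functions had to be rewritten for parallel execution; they stay sequential, which is exactly the point.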
What the EDA companies need to develop in the near future is an understanding of system-level design that has so far eluded them. Systems are not about hardware expressed at a higher level of abstraction, nor are they only about hardware/software co-design. System architecture is about application domains, so ESL is the wrong label. What functions do I need to control the environment inside a building? What functions do I need to fly an airplane or to control traffic on a highway? In the latter case, just as an example, I would like to point out that we have so far approached the problem in the least creative way possible. Don't just try to solve the problem by controlling the highway or the car: you must control both concurrently, as parts of an integrated system.
So I agree with what Paul implies but does not state: the times when EDA companies realized most of their gross margins on back-end tools are rapidly coming to an end, and Cadence, Magma, and Synopsys would be well served to understand this quickly. Significant customers for 45 nm tools will number fewer than three dozen. But EDA has room to grow. As an industry, we just like to procrastinate. The industry motto is: do not solve a problem before the market is ready for the solution. The market is ready for the new system-level design approach. The MathWorks and National Instruments are telling us so with their revenue numbers. Now the slumbering giants had better hear the alarm clock, or they will sleep through the remaking of the industry.