CamilleK, a good point on the jobs issue ... while resistance would be the natural reaction, the ability to scale resources on both ends, the EDA user (design and/or verification engineers) and the cloud computing provider, can actually increase employment levels. :)
KarlS: You are pointing out a methodology shortcoming, or a sequencing problem that, if solved, could provide a faster way of doing something or perhaps remove some iterative steps. True. Regardless, even if that were somehow implemented, you would still need to split the problem across many compute servers, or make a given step run faster on bigger/faster machines, and you do not want the added cost and latency of deploying, provisioning, and maintaining that hardware, not to mention the logistics of negotiating added license access, even if you are a large company with resources. So the cost saving would come from focusing engineering resources on creation rather than upkeep, and from not having to continuously buy hardware. I realize this is also a touchy topic in terms of job preservation, but skill retooling could be phased in, with an expansion of jobs, if the companies involved (vendors and users alike) save enough to stay profitable and keep hiring for projects and other productive endeavors.
Too much cost is involved in generating RTL before anything useful can be done. The design has to be compiled, synthesized, placed, and routed before simulation, and each iteration repeats those unnecessary steps. This is apparently because EDA only understands stuff that "looks" like program code. Boolean algebra is a very concise way to define logic, and it can be simulated easily and quickly; do the compile steps only once the logic is stable. Take a good look at where the cost is generated.
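To make the point concrete, here is a minimal sketch (in Python, not tied to any EDA tool) of simulating logic defined purely as Boolean equations: a full adder is expressed as two equations and exhaustively verified over all inputs in microseconds, with no compile/synthesize/place/route step involved. The function names and structure are illustrative, not from any vendor flow.

```python
from itertools import product

def full_adder(a, b, cin):
    """A full adder defined purely as Boolean equations."""
    s = a ^ b ^ cin                    # sum bit
    cout = (a & b) | (cin & (a ^ b))   # carry out
    return s, cout

# Exhaustive "simulation": check every input combination
# against ordinary integer addition.
for a, b, cin in product([0, 1], repeat=3):
    s, cout = full_adder(a, b, cin)
    assert (cout << 1) | s == a + b + cin

print("all 8 input combinations verified")
```

Only once equations like these are stable would the expensive synthesis and layout steps need to run.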
It is undeniable that the driver for EDA providers to move to a cloud computing model is the achievement of a capex and opex reduction. The attraction for the end user is elasticity and capacity on demand, paid for only when needed to scale, without having to reserve unneeded capacity through traditional procurement or skimping on licenses and suffering a schedule impact. What is needed is a standardized transacting model that, while allowing each vendor pricing flexibility, provides a consistent interface and experience to the end user. Collaboration, interoperation, a common taxonomy, best-practice sharing, and usage statistics among the vendors are key to bringing about even bigger efficiencies and to enabling big and small users alike to focus on design creation, with no fear on the part of EDA vendors of side-by-side exposure of big and small providers launching from one optimized GUI environment. A side benefit would be enabling comprehensive metrics analysis of which key features and tools are being used, and how.
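One way to picture such a standardized transacting model is a small vendor-neutral interface that every provider implements, while pricing stays vendor-specific behind it. This is purely a hypothetical sketch: the class names, methods, and the illustrative rate are all assumptions, not any existing standard or API.

```python
from dataclasses import dataclass
from typing import Protocol

@dataclass
class LicenseRequest:
    """Hypothetical vendor-neutral request for on-demand tool capacity."""
    tool: str      # e.g. "simulator" or "synthesis"
    seats: int     # number of concurrent licenses
    hours: float   # expected usage window

class EDAProvider(Protocol):
    """Common interface each vendor implements; pricing stays their own."""
    def quote(self, req: LicenseRequest) -> float: ...
    def provision(self, req: LicenseRequest) -> str: ...

class ExampleVendor:
    """One vendor's implementation behind the shared interface."""
    RATE_PER_SEAT_HOUR = 2.50  # illustrative pricing only

    def quote(self, req: LicenseRequest) -> float:
        return req.seats * req.hours * self.RATE_PER_SEAT_HOUR

    def provision(self, req: LicenseRequest) -> str:
        return f"granted {req.seats}x {req.tool} for {req.hours}h"

req = LicenseRequest(tool="simulator", seats=4, hours=10)
vendor: EDAProvider = ExampleVendor()
print(vendor.quote(req))       # 100.0
print(vendor.provision(req))
```

Because every vendor would answer the same `quote`/`provision` calls, a single front end could compare and dispatch across providers, which is also exactly the point where the usage metrics mentioned above would naturally be collected.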