Getting engineers to listen to proposals about tools that would increase their productivity is like trying to get a mule's attention, says a self-proclaimed "sales weasel."
While not a very religious-minded individual, I was tempted to shout "Amen, brother!" after reading "The value proposition unfulfilled." I have worked in industry as an engineering manager. Nowadays I sell software tools for a company whose name many of your readers would recognize. To immunize myself against charges of using this forum to shamelessly shill my product, I won't say which one, nor will I use my real name.
Based on my experience in industry and, more recently, talking to embedded systems developers, I would say that Jack has hit the nail on the head. Getting engineers to listen to proposals about tools that would increase their productivity reminds me of the old saw about how you get a mule's attention: hit it over the head with a 2x4! A lot of engineers these days have hunkered down into their foxholes, sticking to the methods they are familiar with because they are too afraid to try something new. Fear of real or perceived failure, and the possibility of having their next project contracted out to Bangalore, seems to be driving decisions more than any analysis of what's good for the company's bottom line. An unfortunate reality is that many engineers do not share in the benefits of improving a company's bottom line, but that is a topic for another day.
I believe that there are tools on the market that can help developers create better embedded applications. Using them requires the willingness to learn something new and the expenditure of some precious engineering hours. Feedback I have gotten from the brave souls who have undergone the process indicates that it is worth the trouble. One guy even went to the trouble of looking me up at our booth at ESC in San Francisco to tell me so. To me, this is encouraging news. It is worth a lot more than any marketing hype I might lay on a customer while trying to secure an order.
Among the engineers who believe in the concept, many have expressed their frustration at not even being able to have a conversation about productivity tools once management learns that some additional $$$ will have to change hands. Without proper metrics to assess productivity, we will inevitably resort to qualitative arguments, in an era when capital equipment proposals must be carefully constructed with quantitative payback estimates.
I wonder how many companies did a study to gauge the productivity gain from switching from assemblers to C compilers for embedded software development. Yet it remains an article of faith that this approach is more productive. Why can't some effort be devoted to objectively evaluating the newer productivity tools currently on the market? The results may be surprising. Aren't the goals of fewer bugs and less expensive-to-maintain software at least sufficient to justify a little evaluation effort?
You can tell this article comes from a salesman, even if he was an engineer in a past life. There are two main reasons why engineers don't buy new tools. The first is that they don't hold the purse strings; that's what management does, and, as "Sales Weasel" implies, management likes to keep hold of the purse and doesn't like to spend from it.
The other reason is that all I've ever really seen to tell me that these tools are better is hype. Salespeople are generally there to sell a product, so they will naturally hype the benefits and conveniently forget the problems. What we need is access to tech support in advance of the sale, so we can find out what really happens when the tool is installed and in use. After all, we know what's wrong with the tools we have now, and we can work around their foibles. But we don't know whether we'll be able to do that with a new tool, because we don't yet know what will be wrong with it (although we can be pretty certain there will be something).
In short, we aren't going to switch from what we know (even if it's not perfect) to something unknown. But we might switch from something irreparably broken to something we can be convinced will do at least some of what we need to get done.
It's back to the first law of software coding (unless you're into XP): If it ain't broke, don't fix it.
A note on evaluation. Back in the old days, when we were coding in assembler and looking to move to C, there was, on the one hand, less time and cost pressure on us to get the code shipped and, on the other hand, less complexity in the tool set to be evaluated. We could pick a representative set of assembler functions, code them in C, compare the code size, run time, and RAM usage, decide which method was best, and have the whole thing done in a matter of days. The increasing pressures on schedule (and cost and resources) and the increasing complexity of tools mean that in most cases there isn't the resource, in people or time, to do an evaluation that's likely to take weeks or months to come to a conclusion with holes in it you could reasonably expect an elephant to walk through.
If you take your evaluation offline (thus removing the time pressure), the resource will get pulled when the "real" projects get bogged down or someone moves the end point three months closer. And if you do your evaluation on a "real" project (thus removing the resource pressure), you have to be pretty damn sure it'll work, and how to make it work, before you start, or you can kiss your project goodbye.
There's no such thing anymore as "a little justification effort."
Principal Software Engineer
TRW Automotive
I don't think this is an "era where capital equipment proposals must be carefully constructed with quantitative payback estimates." I think this is just Business 101. A business simply can't afford to invest in a new tool unless that tool can pay for itself in a reasonable amount of time, or that business will cease to exist before long.
If you are going to try to sell me a tool that will increase productivity, prove that it will. How do I know that the potential productivity increase is even worth the resources necessary for evaluation, much less the cost of the tool itself and the support contracts?
Don't be so greedy. Jack Ganssle argues in his article "The value proposition unfulfilled" that "Hardware designers have little trouble securing funding for budget-busting items like oscilloscopes and logic analyzers" for a number of reasons. One reason he did not mention is that a scope or a logic analyzer can typically be shared by a number of hardware engineers. Software productivity tools are often priced per seat, which makes it wildly more expensive to outfit a team of programmers with the latest and greatest value-adding tool than it does to replace the old oscilloscope with the latest and greatest model.
VP Product Development
I think that the major obstacle most engineers deal with is quantifying the ROI of a software product. When you are in the position of having to get approval for expenditures over a certain amount, and the approval form concentrates almost exclusively on quantifying how much money will be saved by buying the product (you mean a lousy CD-ROM costs $3,000!?), then you either lie through your teeth, because you cannot make even a SWAG as to how it will affect program costs, or you just drop the whole idea. I'm trying to figure out how to justify a $3,500 ICE right now, and I'm running into the exact same issue. My advice? Either lower the prices for the products (right!) or offer a 30- or 60-day trial period: you get the product, you try it; if it works for you, you buy it!