"More haste, less speed."
"Months of coding can save hours of planning."
Yes, I have seen contract programmers display both of the above. They write code, do some testing, get paid, then move on. Management pats itself on the back, thinking the job's done.
Only later, when a new software feature has to be added, is the true nature of the beast revealed. Suddenly it becomes clear that coding and documentation standards were not followed. Management was too hasty in accepting the "product," or the person allocated the task of checking "product compliance" was not up to the task, or was put under extreme pressure not to hold the project up. The software contract has no clause for redress, and the original coder is nowhere to be found.
But that is months down the track, and becomes somebody else's problem... usually mine.
Don't I know it. A lot of the work I do is troubleshooting serious design problems in different systems. It has to be right, because the issues are usually critical in nature.
My real projects get delayed but no one pushes me.
I've got to be the slowest designer in the company, but stuff works when I'm done.
It almost seems like even the people at Microsoft designing the Windows libraries forget that a task already has a library function written and just write a new one. Or maybe it's just easier to create a new one than to update an old one and go through all of the compatibility testing that would be required.
I had a junior engineer wait a month until he had the latest and greatest RF design software before he would even attempt a prototype of a new power amp design.
He wanted to do a harmonic balance analysis, along with several other competing analyses, before he would even attempt a bench prototype.
The only software we had was an older RF package that was simple, easy to use, and just plain worked, albeit with a bit of bench tweaking to fine-tune the circuit, but it never let us down.
It was this minimal tweaking on the bench that the new grad wanted to eliminate; bench tweaking was out of the question. His thought was to push a button and have a working design come out the other end, without any knowledge of what could go wrong, which you only gain from actual bench experience.
Unfortunately, when you spend $60,000 to replace your existing EDA software and the new software requires a couple of months to learn, you also lose valuable time in which competitors get their product to market.
You can train a pilot to take off and land in a simulator, but you sure as heck are not going to give him or her a pilot's license to fly a commercial airliner until they take the seat of something simple and basic to start out on.
Bill, are you describing SCRUM design methodology? Oh, that's Schwaber, not Schweber! ;-)
Good point, and this is not just a time issue, either.
I admit, I sometimes embrace the dark side and just tweak and throw the code at the FPGA compiler again. This only works, though, if the basic architecture is sound, and it's a double-edged sword if it's meant to save time. What if there is another error behind the error? And one behind that? Time to stop and think!
If I jump in without proper thinking up front and hope to do the checks later, the architecture will resemble the proverbial dog's breakfast. When the design finally appears to work, the obvious temptation is not to look too hard for possible errors still hiding, especially with increasing time pressure on the project. It's just human nature.
The person sitting in front of their computer appearing to be doing nothing may actually be thinking about how to solve a problem or how to best organize a process. On the other hand, the person frantically hacking away at their keyboard may be making a host of mistakes and bad, quick decisions that will affect a product for a long time down the road.
That's what a lot of folks say about C++: you get choice/decision paralysis because there are too many ways to code something, and it's not at all obvious which will be best. A great language would point obviously toward the best approach.