I can't even imagine a single test doing anything to accurately indicate competence in software development. It sounds like this test is more about the process than about the coding, but being disciplined at following a process doesn't really indicate competence or lack thereof.
I think it could possibly help reveal someone's attitude toward software development, but so can a good interview. Even if it were a good test, there are so many divisions of software development that understanding one methodology would not necessarily say anything about the particular methodology I might apply in my company. It's also quite possible that a person could do a good job of passing a test about the process but not really be able to code.
From the perspective of someone who might be required to take the test: I didn't go into civil engineering for a reason. I can't imagine how spending a lot of time studying other engineering disciplines could make me a better software developer. (Except, of course, that I think knowledge in general is a good thing.)
Given that there is actually no shortage of engineers in the US, I would suggest that H-1B visas should only be granted if the holders are paid at the very least the average local salary of an engineer in that field and at that level, not one cent less. Employers would then realise they are better off hiring locals, who generally better understand what they're working on. I worked for Delco years ago, and they ran development programs in Singapore to design engine management systems. It was a disaster: people who may well have been good engineers were developing vehicle systems software and had never driven a car. It just didn't work.
OK, laugh away. You've probably been flying commercially for quite a while, so on multiple occasions you've trusted your life to software that was developed to extremely rigorous specifications that the IEEE had absolutely nothing to do with. I wouldn't take the risk of riding a bicycle at night with an IEEE-approved taillight, because their first interest is making a buck and their second is "locking in" the need for the approval of academicians, most of whom have never had to make anything work in their lives (and have nothing but contempt for those who do - that's how it was my entire time in school, anyway; isn't it interesting that no one there ever tried to dispel that impression?). The notion that "here comes the IEEE, THEY'RE authoritative, now throw away everything else you thought you knew about software safety engineering" is just an arrogant power and money grab, pure and simple, whether YOU think so or not.
I laugh at the many claims that some organization or other (notably the IEEE) is promoting this exam in order to make money. The IEEE Computer Society and IEEE USA got involved at the request of the licensing community and the PE community because they were viewed as the most authoritative source of information on what constitutes genuine software engineering (developing software using engineering principles and techniques). They responded because doing so is consistent with their mission.
One motive for this exam and the prospect of licensing that underlies it is that software engineering has been struggling to become accepted in the engineering community. One of the criteria that many engineers cite for legitimacy of an emerging field of engineering is the existence of a PE licensing exam. This is why you can now get such exams for aerospace engineering, for example.
The fact that almost half the software exam takers flunked is one indication that the exam is not something to scoff at.
There's some misinformation in the article, Rick. Regarding "The exam gives a license that could open doors and provide job security for engineers working in utilities, traffic control, automotive, wastewater management and other critical infrastructure areas, backers say." The exam does no such thing. A license to become a Professional Engineer is granted by a licensing board (there's one for each US state plus a few more for US territories and DC). The boards that recognize software engineering (and not all do yet) will typically use a passing grade on the exam as one of the qualifications for licensing, but it's not the only qualification.
Bert, for once I'm even worse than you are! I see this as just an IEEE-sponsored turf war: they're upset that the "gold standard" for safety-critical development is RTCA DO-178C, and they know that historically the professional organization for FAA-related work is SAE, not IEEE. They're also "striking while the iron is hot," because they're just as acutely aware as I am that there hasn't been any serious new avionics development going on since the '07 crash. As far as medical is concerned, the AMA may have good intentions, but on the only FDA software development project I was involved in, I was threatened with being fired if I didn't sign testing documents fraudulently; I of course refused, and they carried out their threat. So you start out by realizing that the FAA carries a "big stick" but the FDA is a paper tiger, where the monetary clout of the players ensures there will never be any credibility in that field; therefore the IEEE had to set up shop elsewhere. And this is all just as I was starting to get over companies using H-1B visas to bring in minimum-wage engineers (low pay levels being, of course, FAR more important than product quality), so please excuse my horrendous level of cynicism!
Bert, I don't think that is a cynical view! I would say you are being realistic! Any certification or licensing scheme that is brand new can and should be questioned as to its motives. That said, I was also wondering about the correctness of the test, and about the credentials of those who created it.