Close but not quite. You must also factor in Java's well-developed libraries, its very well implemented library-utilization and documentation schemes, and the speed with which you can build "hello world". Java includes tracking of the original source and author, and the ability to load libraries at run time.
The embedded world has no similar construct. I take your code, and the only traceability is in the comments; I erase the header and it's mine. If I link to a Java library, I get a wealth of metadata and version control. I don't even need to bind your code into my distributable, but instead can download it from any of multiple sources.
That programmer you just hired is spoiled, and you want him to return to stone knives and bearskins.
I will disagree with Thomas: I got my Computer Engineering degree in 1980, so the discipline has been around for a while, but it has been in flux as CS, CE, and EE departments merged and split over the years. I have a first-year EE student taking a first programming class in assembler and C. While the materials are good, the class got lost by trying to teach too much in too little time to people who had never programmed before. The last midterm had an average score of 25%. This is not the way to encourage people to take up embedded programming.

One of the problems with many university classes is that the instructors understand what needs to be taught but have no idea how people learn. The class has a strong "no-copying" requirement, but most people learn programming by seeing and copying examples, and from each other. I agree that blind copy/paste isn't learning anything, but seeing how someone else solved a problem makes it easier to solve that same problem again later.
Unfortunately, as hardware manufacturing has dwindled and moved to SE Asia, so has the demand for assembly and, to some extent, the C language. I was a BIOS engineer and then a network driver engineer for a long time, and all that work has now gone overseas and to CA. What's left is application development, some embedded work, and plenty of teachers with Java and C# experience, never mind the hardware. Once we start manufacturing here again, assembly language will be back in demand; I don't see that on the horizon. I have since grasped C++ and am working for an employer writing user-mode applications.
All the above entries are saying the same thing: newbies need to be able to get in at a high level but be able to drop down, deep to the hardware, as quickly as they want; they need tools that let them explore what is going on; and those tools have to be cheap, quick, simple, and effective. And embedded employers want kids who have done that kind of stuff.
After bruising my knuckles toggling in 8085 machine code, I discovered that Forth was the cheap, complete IDE that showed me assembler, macros, the basic Dijkstra high-level constructs, pointer abstraction, interpreters, compilers, cross-compilation and meta-compilation, dynamic storage, dynamic programming, data structures of every persuasion, graphics from pixel to JPEG, disks from LBNs to RAID spanning... yada yada...
In other words the ultimate teaching environment - language-neutral but DEEP - able to make a robust, sandboxed user interface, or slice in a single line of source from the most abstract script on a host PC to a single port bit on an embedded target. All with the same language, tools and principles, and all for free...
You can bridge from Forth to any other language because you will have met everything another language can throw at you; you just have to learn what they call the construct, and how much worse or better it is than the Forth version.
OK, so I never (officially) worked in Forth, but I have _thought_ in Forth ever since, because it is the leanest, cleanest, most uncluttered expression of human control over hardware through software.
From a lifetime's experience, think on this: what if Forth actually IS what you get if you relentlessly apply Real Engineering to the problem of getting a machine to do what you want?
Finally, before you get out the RPN and "write-only" jokes, just think on this: would you prefer to hunt out a tricky bug in 20 megabytes of beautiful C, C++, or Java source, or in 2 kB of dense Forth text doing the same job?
No experience with a language like C is a disadvantage, but I think it goes deeper than that. With students doing their programming on laptops running multi-GHz clocks, with at least 1 GB of RAM and essentially infinite hard drive space, there has been little incentive to learn about efficiency in either execution speed or code space.
The last bright young guy I brought in to work on an embedded 8-bit project had a lot of experience with C#, so he was at least familiar with the basic C syntax. His code (at first) was not very efficient. The concept of things like using logical operations to mask off bits was alien to him and his peers. The tools for 8-bit development are also a lot less sophisticated than the Visual Studio and similar environments he was used to. He was very frustrated by this.
There are a lot of gaps between web and PC programming and embedded development.
Excellent point, betajet. A background in assembly language really does provide a good basis for a more efficient toolset such as C. Computer Engineering students and embedded-systems programmers really should have this level of understanding.
I would, however, understand if Computer Science majors did not necessarily have this background. Virtually all of the "modern" computer languages (Java and .Net in particular) have abstracted away these details in most instances.
"Computer Engineering" is a relatively new discipline, and the curriculum varies widely among schools. There is at least one school I know of that still mandates C programming for its ECE majors and even offers a course entitled "Embedded Real-Time Systems". However, the author's point is valid: many other schools have moved away from this embedded-systems focus. This trend needs reversing; C programming needs to be offered at least as an elective.
In my career as an embedded-systems programmer, and later as a project leader who had to recruit people who could do embedded-systems programming, I found that engineers with hardware knowledge could easily take to assembly language programming. I turned many a hardware engineer into a good assembly language programmer who later mastered high-level language skills as well.
So I feel this is the best route for universities to take. Groom electronics engineers into embedded programming and you can get smart all-in-one designers and programmers out of them.
I've been in embedded for about 20 years, almost always programming in C or C++. Many years ago I did use assembler, and I very infrequently have to read it now. So I'd disagree with many comments here on the overriding importance of knowing assembler to a high level. To me, while some knowledge is necessary, you really don't need much to be successful. I think some teaching of the principles is good, but forcing students to learn assembler in great detail will put off more people than it attracts, especially since current instruction sets are so complicated.
C, though, is vital. I noted one comment in the article that it's principles that are important, not the language. The problem there is that a language like Java does not give you the principles, certainly not where memory management is concerned. Moving from Java/C#/Python to C is harder than the other way round because of this.