@Duane - you're not supposed to ask.....I just had 2 weeks off.....had about 10 000 things to do, of which this was about no 5000....only got about the first 2000 done so this got left on the list....... I need one of those Round Tuit things....
@Duane - I have heard the PIC configuration can be a real pig.
I signed up for the MCU course with the ST ARM board and just got the board - so this is going to be a baptism of fire I think. Plenty of little problems there too by all accounts. Good thing I'm a bit of a masochist :-)
That's probably overestimating my skill. :-) What always seems to delay me the most is the fuse/configuration bits. PICs are notorious (at least to me) for being difficult to get the fuse bits right on a new model of MCU. It's only something in the range of 8 - 16 settings to get, but they drive me nuts. The worst part is that it seems like the same fuse (same label anyway) can work differently between two otherwise closely related MCUs.
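For what it's worth, with Microchip's XC8 compiler the fuse/configuration bits are usually set with `#pragma config` lines rather than raw hex words. A rough sketch below - the symbol names and legal values here (FOSC, WDTE, etc.) are examples only and vary between even closely related devices, which is exactly the trap described above; always check the specific device's data sheet:

```c
/* Hypothetical config-bit setup for a PIC16-class part (XC8 style).
   These particular settings are illustrative, not from any one device. */
#pragma config FOSC = INTOSC   /* internal oscillator; CLKIN pin is I/O */
#pragma config WDTE = OFF      /* watchdog timer disabled               */
#pragma config PWRTE = ON      /* power-up timer enabled                */
#pragma config MCLRE = ON      /* MCLR pin acts as reset input          */
#pragma config LVP = OFF       /* low-voltage programming disabled      */
```

The advantage over hand-assembled config words is that the compiler rejects a symbol the device doesn't have, which catches some (though not all) of the "same label, different behavior" surprises.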
As far as the usability of the C, I think MCU C tends to be a lot closer to assembly than C on other platforms. I still have to directly manipulate the I/O registers in most cases. C obviously takes care of bank switching and adds in a lot of structure, but I still think it leads to a decent understanding of the architecture.
@Betajet: Sorry for the error. My contention is that the typical third stage of compilation takes a nice compact intermediate language and breaks it into so many loads, stores, and whatever else RISC is noted for that memory accesses become excessive and too many cycles are wasted. This leads to the "memory wall" and not being able to simply clock faster to overcome the waste. Bit-significant microcode is a good way to control CPUs as well as FPGAs, and embedded block RAM can provide more registers than any RISC ever dreamed of, with the performance benefit that brings.
Assembler is fun for puzzle-like entertainment, but it takes years to build a system with it. IBM OS/360 was a great example.
It's the PDP-11 that maps well from C. The PDP-8 is a 12-bit accumulator machine with paged addressing and not a good C target.
The PDP-8 "microcoded" instructions do not use the usual meaning of the word. The OPR (operate) instruction doesn't have the usual 9-bit operand field and instead uses those bits for various functions like "complement accumulator" (1's complement) and "increment accumulator", but you can combine multiple functions in the same instruction: "complement and increment accumulator" does both functions in sequence, giving you 2's complement.
If you're interested, there's a good description at Wikipedia. The PDP-8 instruction set is so simple that it only takes a few minutes to learn it -- though much longer to master its tricks. It's a fun ASM if you enjoy puzzles. If you just want to get work done, use an architecture that maps well from C like PDP-11, 680X0, or PowerPC.
@Betajet: Thanks for pointing out the microcode (horizontal microcode?) aspect of the PDP-8 and how C maps so well.
Either the terminology or my understanding is so superficial that it is nearly impossible to discuss a new concept, but maybe the ease of mapping C to the PDP-8 can help.
C structures a program into statements and compound statements delimited by curly braces. Statements are either assignments or flow control, using relational evaluation for loops, switch, and if statements.
The flow is sequential or the non-sequential target of the control statement.
That means that C can be parsed and microcode a la PDP-8 (bit significant) can be generated to execute C directly without compiling to a conventional ISA.
It's just an extension of the bit-significant microcode idea, and I am having fun doing it.
I agree with you about PIC machine language. I took one look at it a long time ago and once was enough. If you're going to have an accumulator machine, at least do something clever like the PDP-8 and have just 8 instructions (well, the 8th instruction is actually multiple micro-coded instructions, but then that's pretty clever too).
ARM is problematic. At one time it was a RISC machine, back when "A" stood for "Acorn", but each architectural version has crammed new instructions into irregular gaps left by the previous versions, so that it's now well into the realm of rococo. If you like complex railroad timetables with myriad footnotes, you'll love the ARM "quick reference card".
I did most of my early ASM programming in PDP-11, which was a delight. C was designed to map easily and efficiently into PDP-11 machine language, and it shows. Machines based on PDP-11 are generally very pleasant, such as 68000/Coldfire and TI MSP430. For RISC machines, I really like PowerPC -- very regular and well-planned.
It sounds like you've studied enough machine language to understand what computers are doing. Once you've done that, it's not necessary to keep programming in ASM, particularly if you can visualize what the computer is doing as you write in C.
Regarding your experience with recent CS grads -- and their lack of experience with programming unless they were doing it on their own -- I would make the following comment: It's very time-consuming to grade programming assignments -- you can't just see if the answer is the correct number or the correct formula. Each answer is different, so it's like grading essays. Grading this sort of assignment takes time away from other professorial activities like publishing papers and getting research grants, so any idealistic teacher who spends his or her time doing it is not going to last long in modern academia.
That said, when I was a student I spent at least an order of magnitude more time writing programs than what was required by classes, because I loved it and still do. Your candidates who did lots of programming on their own likewise love it, and you should hire them, because people who enjoy their jobs generally do much better work. The ones who only did programming when it was assigned will most likely treat a programming job as "just a job".
@Duane - I'd agree with both you and Betajet. I did some assembly stuff in the good old days (8080 / Z80 etc.) and since then have done little more than observe how things have gone, since my job has not been in that field. However, I've recently started playing with PICAXEs (PICs with a BASIC interpreter). Lots of fun, and REALLY easy, but with some limitations. I think that having done assembler stuff in the past does help a lot with understanding how MCUs work, but I marvel at the ease with which guys like you and Caleb jump into pretty well any processor from any manufacturer using C. I think that's the way I'd like to go. I just need to win the Lotto and retire to get the time.....
Betajet - It's fair to say that I have a bad attitude about PIC assembly programming. I have studied the architecture a bit, but that's not the same thing. I really enjoy C and can get a lot more done in a lot less time. With ARM processors, I'm afraid I'll probably never allocate the time to learn assembly. Again, I'm studying the architecture a bit, but my programming will probably go no lower than C.
What's really astounding to me is that some CS graduates I've interviewed recently barely have any actual coding experience - assembly or otherwise. I really can't wrap my head around that one. The good candidates have done a lot of programming on their own, but within the degree program, more than one that I talked to had done some Java and little else. I don't get it.
What are the engineering and design challenges in creating successful IoT devices? These devices are usually small, resource-constrained electronics designed to sense, collect, send, and/or interpret data. Some of the devices need to be smart enough to act upon data in real time, 24/7. Are the design challenges the same as with embedded systems, but with a few developer and IT skills added in? What do engineers need to know? Rick Merritt talks with two experts about the tools and best options for designing IoT devices in 2016. Specifically, the guests will discuss sensors, security, and lessons from IoT deployments.