No. 1: Design reliability -- We did, of course, cover reliability as part of my university course; however, it was at a very academic and abstract level. It would have been great to have had someone explain failure rates; redundancy architectures (hot spares, cold spares, triple modular redundancy, and so on); fault propagation, etc., in a little more detail and with real-world examples. If a system has an MTBF (mean time between failures) of one year, does that mean it will work fault-free for the year? And just how much will a product recall cost (financially and reputationally) if you do not consider reliability?
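On that MTBF question: under the usual constant-failure-rate (exponential) model -- an assumption I'm adding here, not something stated above -- the answer is no. A quick sketch:

```python
import math

def reliability(t, mtbf):
    """Probability a unit is still working at time t, assuming a
    constant failure rate (exponential model): R(t) = exp(-t / MTBF)."""
    return math.exp(-t / mtbf)

# With an MTBF of one year, the chance of surviving that whole
# first year fault-free is only about 37%, not 100%.
print(round(reliability(1.0, 1.0), 3))  # ~0.368
```

MTBF is an average over a population, not a guarantee for any single unit -- exactly the kind of distinction a guest lecture with real field data would drive home.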
I agree. While the academic side is very important, a few guest lectures from people in industry can really give students great insight into what their engineering careers will consist of.
I remember in my first year we had a professional engineering foundation module which looked at PCB design, manufacture, and test: building simple digital and analog circuits and even doing semiconductor manufacture in the on-site fab ;)
Design for manufacturing is taught in many universities, but I think the curriculum and instruction should be developed by those who have solid experience in manufacturing. Professionals working in the industry make excellent adjunct professors for this!
I wish we had more practical exposure while doing the engineering course -- like make your own PCB, fabricate it, assemble it, load the code, and test it. It should be a standard part of the curriculum rather than a one-off semester project.
You make a good point. I use FFTs a lot, along with filters and signal processing. I keep meaning to write something on FFTs.
I did write a primer on basic digital filters, which EE Times reprinted not long ago, I think. The interesting thing, I think, is that sometimes the concepts need to be explained simply -- for instance, where does the sinc response come from?
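For anyone curious, here's the short version of where the sinc comes from, as a quick numerical sketch of my own (not from the primer): a moving average is convolution with a rectangle, and the Fourier transform of a rectangle is a sinc -- in discrete time, the periodic (Dirichlet) form of it.

```python
import numpy as np

# N-point moving average: h[n] = 1/N for n = 0..N-1.
# Its frequency-response magnitude is |sin(pi*f*N) / (N*sin(pi*f))|,
# the aliased-sinc (Dirichlet) shape -- which is why a simple averager
# has those characteristic nulls and rolling sidelobes.
N = 8
h = np.ones(N) / N
f = np.linspace(1e-6, 0.5, 500)  # normalized frequency (cycles/sample)

# Evaluate the DTFT of h directly at each frequency...
H = np.abs(np.exp(-2j * np.pi * np.outer(f, np.arange(N))) @ h)

# ...and compare against the closed-form aliased sinc.
dirichlet = np.abs(np.sin(np.pi * f * N) / (N * np.sin(np.pi * f)))
print(np.max(np.abs(H - dirichlet)))  # agrees to numerical precision
```

The nulls land at multiples of 1/N, which is also a handy way to see why averaging over one full period of an interfering tone rejects it completely.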
I so wish that when I was at university in the late 1970s I'd paid more attention to the signal processing part and things like FFTs. As I recall, it was all analog back then -- I don't remember them talking about digital signal processing (of course, I don't remember a lot of things these days), but I wish I'd spent more time listening to the lecturers and less time noodling on my own projects.
David Patterson, known for his pioneering research that led to RAID, clusters and more, is part of a team at UC Berkeley that recently made its RISC-V processor architecture an open source hardware offering. We talk with Patterson and one of his colleagues behind the effort about the opportunities they see, what new kinds of designs they hope to enable and what it means for today’s commercial processor giants such as Intel, ARM and Imagination Technologies.