In his dotage, my old man (a chemistry Ph.D.) used to say, as he slowly stirred his martini with a crooked finger, "The more we know, the less we know."
As we overcome old technological hurdles, systems become increasingly complex, and they also become more unpredictable.
One recent example is the report that NHTSA and the NASA Engineering and Safety Center (NESC) published regarding unintended acceleration in Toyota automobiles. Michael Barr has an excellent report on it. In short, NASA said it couldn't rule in, but also couldn't rule out, software problems as a culprit in the unintended acceleration problem.
And Stanford, via its Facebook page, has described how engineers are addressing the "aeroelastic flutter" problem, a complicated, unpredictable phenomenon. (P.S. don't watch this video if you happen to be on a plane with WiFi).
The more complex the software (and hardware), the harder it is to model and find corner cases. We seem to be falling behind in assessing the known unknowns and we’re completely in the dark about how to approach unknown unknowns.
We race up the abstraction ladder to try to keep our arms around design complexity, but that creates other issues. I attended the annual meeting of college engineering departments recently in Phoenix, and one questioner from industry stood before a panel of academics shaking his head. It's great to turn out really smart kids who know theory and can deal with abstraction, but if they struggle with basic engineering concepts, companies will need to train (or perhaps retrain) them.
How are we going to get better at anticipating the unknown unknowns? Is it formal methods? Is it impossible?
Models are just that - models. It's like going into battle according to the model(s) on the planning-room table and wondering why things didn't actually work out as planned. Not only is the real world different from the models; real people in real-world, real-time situations don't give a damn about a model when reality is staring them in the face.
Please forgive the following snide comment: "What? Are there known unknowns?"
OK, I'm over it! Seriously - my experience has been that understanding basic concepts and seeing those concepts in action, or better yet, working with them in a "real world" manner, will probably be of greatest benefit. Theoretical exercises certainly can work to demonstrate the fundamental concepts, but the real world is full of nth-order effects and dependencies that often seem to contradict theory, because the theoretical model is incomplete or incorrect. Read National Semi guru Bob Pease's work if you want to know how often theory (specifically SPICE models and simulation) does not match real-world fact.