Extracting insight from mounds of data may benefit from old techniques
I just saw a fascinating story in a recent issue of Physics Today on “listening” to your data, literally; see “Shhhh. Listen to the data.” The article showed how making real-world data audible lets the ear and brain sense patterns, extract features, and spot events that conventional data-analysis packages might otherwise miss.
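The basic idea behind sonification is simple enough to sketch in a few lines: map each value in a data series to a short tone, so that low values sound low-pitched and high values sound high-pitched, and an outlier becomes an unmistakable jump in pitch. The sketch below (a minimal illustration; the function name, pitch range, and tone length are my own choices, not anything from the article) writes such a mapping to a mono WAV file using only the Python standard library.

```python
import math
import struct
import wave

def sonify(samples, filename="data.wav", rate=8000, tone_ms=80,
           f_lo=220.0, f_hi=880.0):
    """Map each data value to a short sine tone: the lowest value in the
    series gets frequency f_lo, the highest gets f_hi, everything else
    is scaled linearly in between. Writes a 16-bit mono WAV file and
    returns the total number of audio samples written."""
    lo, hi = min(samples), max(samples)
    span = (hi - lo) or 1.0          # avoid divide-by-zero on flat data
    n = int(rate * tone_ms / 1000)   # samples per tone
    frames = bytearray()
    for x in samples:
        freq = f_lo + (x - lo) / span * (f_hi - f_lo)
        for i in range(n):
            amp = int(20000 * math.sin(2 * math.pi * freq * i / rate))
            frames += struct.pack("<h", amp)  # 16-bit little-endian PCM
    with wave.open(filename, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)
        w.setframerate(rate)
        w.writeframes(bytes(frames))
    return len(frames) // 2

# A spike buried in otherwise steady data becomes an obvious pitch jump:
# sonify([1.0, 1.1, 0.9, 9.0, 1.0, 1.05])
```

Play the resulting file and the anomaly announces itself to the ear immediately, with no thresholds or detection logic written in advance, which is exactly the point the article makes.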
This may seem a counterintuitive throwback to quaint, ancient methods in our software-intensive world, but the reality is that the brain can extract things that even our most impressive computers and algorithms can’t, or can do so only with significant computing power. The brain is also good at dealing with the unexpected, while even the best data-analysis package can find only what it has been “programmed” to expect.
A few years ago, I spoke with some of the people developing software for the DARPA autonomous-vehicle road race and asked them about the biggest challenges they faced. The answer was quick and unambiguous: having the vehicle “see” where the actual road was, without being misled by trees, signs, fences, obstacles, road irregularities, and the almost countless other distractions the vehicle’s cameras could capture. Many lines of code and corresponding MIPS were dedicated to image recognition and feature extraction, they added.
The irony is that seeing and knowing where the road is turns out to be easy for almost anyone, even those with otherwise poor driving skills. Yet the brain does it without executing millions of lines of code or grinding through megaflops of processing. Whether working through hearing, sight, or the other senses, the brain is amazingly good at spotting patterns and anomalies. And don’t kid yourself: we have almost no idea how the brain does this, despite what the neuroresearchers would like you to think.
Experienced engineers use all their senses when designing, assessing what’s going on, and finding out what isn’t going as expected. Good design and debugging combine formal tools with human ones: sight, sound, feel, and yes, smell. The best debugging methods I have seen and used are also the oldest: look, listen, expect the unexpected, and then stop and think before jumping to your next step.
Have you ever used the informal tools of human senses, individually or in combination, to find the source of your problems or to assess your designs? Did you do so intentionally or accidentally?