Imaging pressure with the piezo-phototronic effect: nanowire-LED sensors arrayed on a sapphire substrate (A) let a touch (B) turn on zinc-oxide LEDs (bottom) in a character pattern.
(Source: Georgia Tech)
You may be right that training should not be necessary. In fact, one trend aimed at taking robots mainstream is called "co-robots"--robots that are supposed to accept instruction from an untrained human who simply demonstrates the task while the robot watches. That is one goal, at least, of Obama's National Robotics Initiative.
The only thing that differentiates a human from a robot is feelings and emotions. If robots can get those, we are getting close to humans--but yes, the way a robot would behave must already be programmed, whereas humans may behave differently in the same situation.
Looking at how robots are being visualized and used in many different applications, it is worthwhile to have them feel.
I don't think we'll be able to consider robots to truly be mainstream until they're easy enough to use without a training class. It really needs to be an appliance.
Of course, computers are mainstream and are still challenging enough for a lot of people that training classes would be a big help. But the number of people who operate them without training suggests that most people would not bother with training for a robot. I think the only reason people get training in cars before getting a license is that the government requires it.
Right on, Larry. Sensitivity already rivals humans and is only getting better. You are also spot-on regarding applications--robotics is just the headline grabber, but any touch-enabled application today is a candidate.
Yes, there will definitely need to be user training classes included with general purpose robots, but we will probably see turnkey special purpose robots long before that (like a "cooking" robot whose arms extend from the top of your stove) where just pushing a few buttons will control its behaviors.
I'm no expert on the biology, but 6,300 dpi and 90 millisec are far beyond the specs that I would expect to see if they did the same tests on me. On the other hand, maybe I'm just insensitive and a little on the slow side... :-)
It seems like this could also go beyond robotics. There are other places where a touch-sensitive surface can be used for input. A touchpad for a laptop that could distinguish a user by fingerprints, for example.
All of the current robot makers have training programs for operators, although most don't call it certification (for instance, the maker of the DaVinci surgical robot goes to lengths to say it's not certification--but I think that is to deflect lawsuits :)
The authors claim their prototype offers sensitivity "comparable" to humans, and say their next step is to improve it by growing even smaller nanowires. However, the big boon for robotics will be the ability to assemble things robots can't today. In particular, robots are not good at putting in screws, because their fingers cannot sense when the threads are properly lined up. Robotic skin with built-in touch sensors like these could solve that problem.
Not being a specialist in biology, I'm curious about how these specs stack up to human sensitivity: "Wang's team grew 2.7 micron diameter zinc-oxide nanowires into arrays that were able to sense touch with a resolution of 6,300 dots per inch and a switching time of 90 milliseconds." Anyone?
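Some back-of-the-envelope arithmetic on those quoted specs: 6,300 dots per inch works out to a pixel pitch of roughly 4 microns. The human figures below (fingertip two-point discrimination around 2-3 mm, mechanoreceptor response on the order of a millisecond) are my own rule-of-thumb assumptions, not numbers from the article:

```python
# Compare the reported sensor specs (6,300 dpi, 90 ms) against
# ballpark human-fingertip figures. Human values are assumptions.

MICRONS_PER_INCH = 25_400

sensor_dpi = 6_300                                  # reported array resolution
sensor_pitch_um = MICRONS_PER_INCH / sensor_dpi     # center-to-center spacing
print(f"sensor pixel pitch: {sensor_pitch_um:.1f} microns")

# Assumed fingertip two-point discrimination: ~2.5 mm = 2,500 microns.
fingertip_pitch_um = 2_500
ratio = fingertip_pitch_um / sensor_pitch_um
print(f"spatially ~{ratio:.0f}x finer than a fingertip")

# Assumed mechanoreceptor response time: ~1 ms, vs. the reported 90 ms switch.
sensor_ms, receptor_ms = 90, 1
print(f"temporally ~{sensor_ms // receptor_ms}x slower than a receptor")
```

Under those assumptions the array is spatially far denser than human skin (hundreds of times finer), while the 90 ms switching time is the slower of the two specs.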