Portland, Ore. - Robots have mastered picking and placing, welding, and similar tasks that can be precalibrated, but they cannot perform tasks that require a sense of touch, such as "feeling" when a bolt's threads mesh before screwing it in. Even the most accurate robots today will strip the threads on bolts and otherwise damage items that require a sensitive tactile sense.
Electrical engineers at the University of Illinois (Urbana-Champaign) say they are on the way to solving this problem. The team has created a prototype robot "skin" from a flexible polymer with multiple sensors that simultaneously assess shape, force, hardness, motion, temperature and thermal conductivity.
"People can easily feel to line up the threads on a screw, because they evaluate many different simultaneous data streams from sensors on their skin. To give robots the same abilities, we have used photolithography to create 4 x 4 sensor arrays in a polymer-based microelectromechanical system (MEMS), plus the software algorithms that someday may enable a robot to perform the same kinds of tasks," said professor Chang Liu.
Liu performed the work with fellow EE professor Douglas Jones and their graduate students Jonathan Engel and Sung-Hoon Kim.
Robots with a sense of touch are rare, but even those with touch sensors usually just have a single strain gauge, making it impossible for them to determine the hardness of an object or even how hard they are squeezing it. Applying the same pressure to different objects may cause the robot to drop one that is very hard and slippery or break another that is soft and fragile. The challenge is to enable the robot to sense the material from which the object is made so that it can adjust its grip accordingly.
"Most people, when they close their eyes and put their fingers on an object, can make a pretty good guess at what that material is, and the reason is because you get multiple modalities of sensing. Namely, you measure the hardness, the roughness, the friction, the thermal conductivity and the temperature of the object," said Liu. The sensors built into the new tactile robotic skin perform the same functions, measuring hardness, surface roughness, contact force, temperature and thermal conductivity. "This is the basis for our tactile sensor array: to adequately inform the system of the nature of an object. Our sensor can tell the difference between metal, wood, plastic and other types of materials just by touching them."
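The multimodal idea Liu describes can be sketched as a simple nearest-signature lookup: combine several readings into a feature vector and find the closest known material. The reference values and distance metric below are illustrative placeholders, not data from the Illinois sensor.

```python
import math

# Illustrative material signatures: (relative hardness, thermal conductivity in W/(m*K)).
# These numbers are assumptions for the sketch, not measured values.
REFERENCE = {
    "steel":   (1.00, 50.0),
    "wood":    (0.30, 0.15),
    "plastic": (0.45, 0.25),
}

def classify(hardness: float, conductivity: float) -> str:
    """Return the reference material nearest to the measured readings."""
    def distance(signature):
        ref_h, ref_k = signature
        # Compare conductivity on a log scale, since it spans orders of magnitude.
        return ((hardness - ref_h) ** 2
                + (math.log10(conductivity) - math.log10(ref_k)) ** 2)
    return min(REFERENCE, key=lambda name: distance(REFERENCE[name]))

print(classify(0.95, 45.0))  # steel
print(classify(0.32, 0.2))   # wood
```

A real system would fuse more modalities (roughness, friction) and learn the signatures from training touches, but the cancellation of ambiguity by combining independent measurements is the same.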
The array of many different sensor types is integrated with sophisticated software algorithms that intelligently determine the characteristics of the object, then apply the correct amount of force. To do that, the researchers used photolithography to pattern 200-micron sensors side by side into 4 x 4 arrays inside a flexible polymer that could someday become the skin of a robot's hand.
"Our approach is to integrate the readings from many different sensors in the skin of the robot's hand so that they apply enough force to keep it from slipping, but without so much force that it breaks," said Liu. The sensors are fabricated using MEMS.
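The grip strategy Liu outlines can be illustrated as a minimal control loop: tighten when the skin reports slip, but never exceed a safe limit inferred from the material. All thresholds and step sizes here are illustrative assumptions, not the group's algorithm.

```python
# Toy slip-driven grip controller: increase force when slip is detected,
# clamped below a breakage limit for the sensed material.
def adjust_grip(force: float, slipping: bool, max_safe_force: float,
                step: float = 0.1) -> float:
    """One control step: tighten on slip, never exceed the safe limit."""
    if slipping:
        force += step
    return min(force, max_safe_force)

force = 0.5  # newtons, illustrative
for slip_detected in [True, True, False, True]:
    force = adjust_grip(force, slip_detected, max_safe_force=0.7)
print(round(force, 2))  # 0.7 (clamped at the safe limit)
```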
The researchers made the sensors using a polymer substrate instead of a silicon substrate, thereby taking advantage of the former material's flexibility, robustness and cheaper cost for large-area lithography. "For example, to measure thermal conductivity, we place a small heater near a temperature sensor so that we can measure the heat-transfer properties of a material, which helps us to tell whether an object is steel or wood," Liu explained.
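The heater-and-thermometer principle can be sketched with the simplest lumped thermal model: at constant heater power, a conductive object such as steel carries heat away faster than an insulator such as wood, so the nearby temperature sensor settles at a lower reading. The conductance values below are illustrative, not the device's specifications.

```python
def steady_temperature_rise(power_w: float, thermal_conductance: float) -> float:
    """Steady-state temperature rise above ambient for a constant-power heater.

    Lumped model: delta_T = P / G, where G (W/K) is the thermal
    conductance from the heater into the touched object.
    """
    return power_w / thermal_conductance

# Hypothetical conductances for the heater-object contact.
rise_on_steel = steady_temperature_rise(0.01, 0.005)   # good conductor -> small rise
rise_on_wood  = steady_temperature_rise(0.01, 0.0005)  # insulator -> large rise

print(f"rise on steel: {rise_on_steel:.1f} K")  # 2.0 K
print(f"rise on wood:  {rise_on_wood:.1f} K")   # 20.0 K
```

Discriminating steel from wood then reduces to comparing the settled sensor reading against per-material expectations.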
The tiny strain gauges in the sensor arrays are constructed from thin films of 200-nanometer-thick metal on a 50-micron Kapton substrate, which is etched with spin-coated photoresist. Then a 2-micron-thick layer of polyimide is spin-coated on top to form a diaphragm.
Next, photoresist is used to pattern 100-nm-thick nickel chromium films, thereby forming a strain gauge that changes in resistance when stretched. The other sensors in the array are likewise fashioned from a combination of low-temperature polymers and metal films.
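A metal-film strain gauge encodes stretch as a resistance change: the fractional change is the gauge factor times the strain, dR/R = GF * epsilon. A gauge factor near 2 is typical for nickel-chromium films; the specific numbers below are illustrative, not the Illinois device's parameters.

```python
def gauge_resistance(r_nominal: float, gauge_factor: float, strain: float) -> float:
    """Resistance of a strain gauge under the given strain: R = R0 * (1 + GF * e)."""
    return r_nominal * (1.0 + gauge_factor * strain)

# A hypothetical 1000-ohm NiCr gauge stretched by 0.1% strain:
r = gauge_resistance(1000.0, 2.0, 0.001)
print(r)  # 1002.0 ohms
```

Readout electronics typically place such a gauge in a Wheatstone bridge to convert this small resistance change into a measurable voltage.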
"For hardness, we have a dual-membrane sensor that measures the differential displacement based on the same contact. It's basically two strain gauges with different strain constants so that you don't even need to calibrate the contact force; you only need the differential output from the two sensors to derive the hardness of an object," said Liu.
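One way to see why the contact force drops out, under an assumed springs-in-series model (not necessarily the group's exact mechanics): each membrane and the object share the same indentation depth d, each membrane deflects by x = d * k_obj / (k_membrane + k_obj), and the ratio of the two membranes' deflections cancels d entirely, leaving a quantity that depends only on the object's stiffness.

```python
def membrane_deflection(depth: float, k_membrane: float, k_object: float) -> float:
    """Deflection of one membrane in a springs-in-series contact model."""
    return depth * k_object / (k_membrane + k_object)

def hardness_ratio(depth: float, k_soft: float, k_stiff: float, k_object: float) -> float:
    """Differential reading: the depth (and hence contact force) cancels out."""
    return (membrane_deflection(depth, k_soft, k_object)
            / membrane_deflection(depth, k_stiff, k_object))

# Illustrative stiffnesses: the same object probed lightly and firmly
# yields the same ratio, so no force calibration is needed.
r_light = hardness_ratio(0.1, k_soft=1.0, k_stiff=5.0, k_object=10.0)
r_firm  = hardness_ratio(0.5, k_soft=1.0, k_stiff=5.0, k_object=10.0)
print(round(r_light, 6) == round(r_firm, 6))  # True
```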
For the future, the EEs plan to distribute large numbers of these arrays in a polymer fabric that can form the skin of future robots, with many local processors to break down the translation task between raw data and the physical characteristics of the object being handled. Then higher-level processors can decide how much pressure needs to be applied to safely hold an object.
"Today most engineers would not be very trusting about shaking hands with a robot, but we want to build more trust between humans and robots by making reliable sensor systems that analyze environmental objects accurately and reliably," said Liu.