A seemingly simple job like selecting the right screw is too much for today’s typical industrial robot. Even after recognizing and picking up the screw, the robot can’t be sure where it is holding the object, which way the object faces, or whether it is slipping. Robots that “see” often can’t cope with such fine details, or they turn out to be very complex and expensive. Researchers are beginning to realize that instilling a sense of touch may be a cheaper way of getting a blind robot to recognize and handle an object properly. This has prompted interest in developing “hands” with a subtle touch and sophisticated means of making sense of the data gathered by sensors. At Carnegie-Mellon University in Pittsburgh, Melvin W. Siegel and Gregory M.
Toto have developed a tactile sensor using a specially treated polyvinylidene fluoride film. When this polymer film is compressed, it generates an electrical signal, but only while the pressure is changing. To compensate for this limitation, the researchers focused on electronic and computational methods for capturing the transient signal and for assembling and understanding tactile images. Their prototype device, which is small enough for a human hand to grasp comfortably, consists of 16 separate sensor pads. Each sensor has its own microprocessor, which relays signals by way of a “supervising” microprocessor to a host computer for analysis. Future plans call for producing the polymer film in the form of a “skin” with a large number of pressure-sensitive dots. Such “artificial skins” have already been the subject of considerable research.
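Because the film reports only changes in pressure, a sustained grip has to be reconstructed by integrating the transient signal over time. The sketch below illustrates that idea in Python; the function name and sample data are hypothetical, not the Carnegie-Mellon group’s actual code.

```python
def reconstruct_pressure(rate_samples, dt):
    """Integrate dP/dt samples (the kind of signal a piezoelectric
    film produces) to recover the sustained pressure on one pad."""
    pressure, p = [], 0.0
    for rate in rate_samples:
        p += rate * dt          # accumulate the change in pressure
        pressure.append(p)
    return pressure

# A press lasting 5 samples, then a steady hold: the film's raw
# output drops to zero during the hold, but the integrated
# pressure estimate persists.
rates = [1.0] * 5 + [0.0] * 5
trace = reconstruct_pressure(rates, dt=0.1)
```

During the hold the raw signal is silent, yet the integrated trace stays at its peak, which is why capturing the brief transient before it vanishes matters so much in this design.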
Scientists at the Massachusetts Institute of Technology have designed a flexible rubber sensor that fits on the tip of a finger and actually has a skinlike texture. This touch sensor has three layers. The base is a printed circuit board etched with parallel lines that conduct electricity. The top layer is a sheet of silicone rubber that has been treated to conduct current and is oriented so that its lines are at right angles to those on the circuit board. A nylon mesh or a fine film of sprayed lacquer separates these two layers.
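A crossed grid of this kind is typically read by energizing each row in turn and noting which columns conduct, yielding a binary tactile image. Here is a minimal sketch of that matrix-scanning scheme, with invented data; it is an illustration of the principle, not the MIT design itself.

```python
def scan_grid(closed_contacts, n_rows, n_cols):
    """Build a binary tactile image from a crossed-conductor grid.
    `closed_contacts` is the set of (row, col) intersections where
    pressure has pushed the two conducting layers together."""
    image = [[0] * n_cols for _ in range(n_rows)]
    for r in range(n_rows):
        for c in range(n_cols):
            if (r, c) in closed_contacts:  # circuit closes -> signal
                image[r][c] = 1
    return image

# A small object pressing on the fingertip closes a cluster
# of neighboring contacts.
touch = scan_grid({(1, 1), (1, 2), (2, 1)}, n_rows=4, n_cols=4)
```

Each scan produces one frame of the “picture” that later software must assemble into a description of the object.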
The conducting layers meet only when pressure is applied, and each intersection where contact is made sends a signal to a computer. But rubber tends to deform with repeated use, and more work is needed on transforming these signals into one global picture of an object. A simpler answer may be to let robots “read” objects in the same way that blind people read Braille. Gale Nevill and Robert Patterson of the University of Florida in Gainesville have invented an inexpensive sensor that mimics the ridges of a human fingerprint. Rubbed across a surface, ridges of silicone rubber create vibrations.
A small sensor picks up the vibrations and transmits them to a computer, where the pattern of vibrations is analyzed. This sensor can now read Braille, identify grades of sandpaper and tell in which direction the slot in the head of a screw is pointing. But the day when a robot’s steely grip softens enough to handle mundane but delicate tasks like picking up a coffee-filled Styrofoam cup or juggling a raw egg still seems far away.
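One plausible way such a system could distinguish textures is by the frequency of the vibrations the ridges produce: a finer surface excites the ridges more often per second. The sketch below is a guess at that principle, not the Florida group’s actual method; it estimates frequency by counting zero crossings in a synthetic signal.

```python
import math

def dominant_frequency(signal, sample_rate):
    """Estimate vibration frequency by counting zero crossings.
    Each full cycle of vibration produces two crossings."""
    crossings = sum(
        1 for a, b in zip(signal, signal[1:])
        if a < 0 <= b or b < 0 <= a
    )
    return crossings * sample_rate / (2 * len(signal))

def classify_grit(signal, sample_rate, threshold_hz=200.0):
    """Finer texture -> more ridge strikes per second -> higher
    frequency. The 200 Hz threshold is an illustrative value."""
    if dominant_frequency(signal, sample_rate) > threshold_hz:
        return "fine"
    return "coarse"

# Synthetic one-second recordings at 8000 samples/s: a 400 Hz
# vibration (fine surface) and a 50 Hz vibration (coarse surface).
fine = [math.sin(2 * math.pi * 400 * t / 8000) for t in range(8000)]
coarse = [math.sin(2 * math.pi * 50 * t / 8000) for t in range(8000)]
```

Real surfaces produce far messier signals than these pure tones, which is part of why interpreting the vibration pattern requires a computer at all.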