Researchers have spent years trying to teach robots how to grip different objects without crushing or dropping them. They could be one step closer, thanks to this low-cost, sensor-packed glove. In a paper published in Nature, a team of MIT scientists shares how they used the glove to help AI recognize objects through touch alone. That information could help robots better manipulate objects, and it may aid in prosthetics design.
The “scalable tactile glove,” or STAG, is a simple knit glove packed with about 550 tiny sensors. The researchers wore STAG while handling 26 different objects, including a soda can, scissors, a tennis ball, a spoon, a pen and a mug. As they did, the sensors gathered pressure-signal data, which a neural network then interpreted. The system identified objects from touch alone with up to 76 percent accuracy, and it predicted the weight of most objects to within about 60 grams.
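To make that pipeline a little more concrete, here is a minimal sketch of how a tactile classifier along these lines could be wired up: a single pressure frame from the glove goes into a small convolutional network that outputs one of 26 object labels. The 32x32 frame shape, the use of PyTorch and every layer size below are illustrative assumptions, not the model described in the Nature paper.

```python
# Illustrative sketch (not the authors' published model): a small
# convolutional classifier that maps one tactile pressure frame to
# one of 26 object classes. Frame shape and architecture are assumed.
import torch
import torch.nn as nn

NUM_CLASSES = 26  # soda can, scissors, tennis ball, spoon, pen, mug, ...

class TactileClassifier(nn.Module):
    def __init__(self, num_classes: int = NUM_CLASSES):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),  # 1-channel pressure map
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 32x32 -> 16x16
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # 16x16 -> 8x8
        )
        self.classifier = nn.Linear(32 * 8 * 8, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 32, 32) pressure readings, normalized to [0, 1]
        h = self.features(x)
        return self.classifier(h.flatten(start_dim=1))

if __name__ == "__main__":
    model = TactileClassifier()
    frame = torch.rand(1, 1, 32, 32)   # one synthetic pressure frame
    logits = model(frame)
    print(logits.argmax(dim=1))        # predicted object index (0-25)
```

In practice the accuracy figures above come from training on many grasps per object, so a real system would also handle sequences of frames rather than the single snapshot shown here.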