researchers at MIT have announced that, for the first time, they have enabled a soft robotic arm to be aware of its configuration in 3D space by analyzing motion and position data from its own sensorized skin. traditionally, this was achieved with large systems of multiple motion-capture cameras that provided the robots with feedback on their 3D movement and position.
‘soft robots constructed from highly compliant materials, similar to those found in living organisms, are being championed as safer, more adaptable, resilient, and bioinspired alternatives to traditional rigid robots,’ comments the official MIT release. ‘but giving autonomous control to these deformable robots is a monumental task because they can move in a virtually infinite number of directions at any given moment. that makes it difficult to train planning and control models that drive automation.’
once a robot is covered with it, MIT’s sensorized skin gives it proprioception, or awareness of the motion and position of its own body. this information feeds into a deep-learning model that sifts through the noise and captures clear signals to estimate the robot’s 3D configuration. the researchers tested the system on a soft robotic arm resembling an elephant trunk.
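the article does not detail the model itself, but the pipeline it describes, noisy skin-sensor readings in, an estimated 3D configuration out, can be sketched as a small feedforward network. everything below (sensor count, segment count, layer sizes, the `estimate_configuration` helper) is a hypothetical illustration, not the researchers’ actual architecture:

```python
# minimal sketch of the described pipeline: noisy sensor readings from
# the skin are mapped to an estimate of the arm's 3D configuration.
# all sizes and names here are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)

N_SENSORS = 16    # sensor patches on the skin (assumed)
N_SEGMENTS = 4    # trunk segments whose 3D positions we estimate (assumed)
HIDDEN = 32       # hidden-layer width (assumed)

# randomly initialized weights stand in for trained parameters
W1 = rng.normal(0.0, 0.1, (N_SENSORS, HIDDEN))
b1 = np.zeros(HIDDEN)
W2 = rng.normal(0.0, 0.1, (HIDDEN, N_SEGMENTS * 3))
b2 = np.zeros(N_SEGMENTS * 3)

def estimate_configuration(sensor_readings: np.ndarray) -> np.ndarray:
    """map one frame of noisy sensor data to an (x, y, z) per segment."""
    h = np.tanh(sensor_readings @ W1 + b1)  # nonlinearity filters noisy input
    out = h @ W2 + b2
    return out.reshape(N_SEGMENTS, 3)

# one noisy frame of readings from the skin
frame = rng.normal(0.0, 1.0, N_SENSORS)
positions = estimate_configuration(frame)
print(positions.shape)  # one 3D position estimate per segment
```

in practice such a model would be trained against ground-truth poses (e.g. from the motion-capture rigs the skin is meant to replace), which is how the deep-learning model learns to extract clear signals from the noise.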
the sensors are fabricated using materials found in every lab, meaning any lab can develop its own system, says ryan truby, a postdoc in the MIT computer science and artificial intelligence laboratory (CSAIL) who is co-first author on the paper along with CSAIL postdoc cosimo della santina. ‘we’re sensorizing soft robots to get feedback for control from sensors, not vision systems, using a very easy, rapid method for fabrication,’ he continues. ‘we want to use these soft robotic trunks, for instance, to orient and control themselves automatically, to pick things up and interact with the world. this is a first step toward that type of more sophisticated automated control.’
name: sensorized skin for soft robots
developed by: MIT computer science and artificial intelligence laboratory (CSAIL)