Sorting recycling can be a pain for workers, both because of the safety risks and the sheer monotony of the job. But how do you get robots to do it when they can’t always tell the difference between a can and a cardboard tube? For MIT CSAIL, it’s simple: give the robots a sense of touch. Its researchers have developed a recycling robot, RoCycle, that uses sensors in its hand to determine the nature of an item and sort it accordingly. A strain sensor gauges an object’s size, while two pressure sensors determine how squishy that object may be, whether it’s easily crushed paper or more rigid plastic. It can even detect the presence of metal, since the sensors are conductive.
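To get a feel for how those three signals could drive a sorting decision, here is a minimal sketch in Python. The sensor names, thresholds, and categories are illustrative assumptions for this article, not MIT CSAIL's actual implementation:

```python
# Hypothetical RoCycle-style sorting logic. All names and thresholds
# below are illustrative assumptions, not the real system.

from dataclasses import dataclass

@dataclass
class GripReading:
    strain: float      # strain-sensor estimate of object size (arbitrary units)
    stiffness: float   # pressure-sensor estimate of rigidity, 0 (squishy) to 1 (rigid)
    conductive: bool   # True if the conductive sensors detect metal

def classify(reading: GripReading) -> str:
    """Map a single grasp's sensor readings to a recycling category."""
    if reading.conductive:
        return "metal"    # conductivity overrides the other signals
    if reading.stiffness < 0.3:
        return "paper"    # easily crushed -> paper or cardboard
    return "plastic"      # rigid and non-conductive -> plastic

# A rigid, non-conductive object gets sorted as plastic.
print(classify(GripReading(strain=0.5, stiffness=0.8, conductive=False)))  # plastic
```

The simple priority ordering (check conductivity first, then stiffness) mirrors the idea that a metal reading is unambiguous, while the paper/plastic split depends on how much the object gives under pressure.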

The hands themselves are made out of custom auxetics (materials that get wider when stretched) that twist when cut. Each finger in the robot’s hand includes “left-handed” and “right-handed” auxetics that counter each other’s rotation, allowing for more dynamic movement than a typical robot hand without having to resort to the air pumps and compressors of soft robots.

You’re not about to see RoCycle take over at the local recycling facility. It’s 85 percent accurate when objects are stationary, but only 63 percent accurate on a simulated conveyor belt. That also doesn’t factor in the complexities of sorting recycling in real life; what if somebody puts their empty soft drink cans back into a cardboard box, for example? While refinements are underway, the next big leap is likely to come from a planned combination of the touch system with camera-based computer vision.

To read the full story, visit https://www.engadget.com/2019/04/11/mit-recycling-robot/.