    Meta AI researchers give robots a sense of touch and we’re getting all the creepy feels


    AI has given robots the ability to ‘hear’ and ‘see’ the world so they can understand human orders and carry out tasks, but Meta’s AI researchers are now testing ways to let robots mimic the sense of touch, too. Meta’s Fundamental AI Research (FAIR) division has just debuted a set of tools that could let robotic systems detect, decipher, and react to what they touch. That sensitivity could make even the most basic robot arm gentle enough to handle delicate objects without damaging them, opening it up to far more settings.

    Meta showcased a combination of new technologies and features that work together to give robots the ability to feel. The touch-sensing model Sparsh gives AI a way to identify qualities like pressure, texture, and movement without needing a huge set of labeled examples. It’s like an AI version of feeling something in the dark: you can describe how an object feels even if you don’t know what you’re touching.




    Source: Eric Hal Schwartz (erichs211@gmail.com)
