One of Facebook’s first moves as Meta: Teaching robots to touch and feel

A defining feature of this metaverse will be creating a feeling of presence in the virtual world. Presence could mean simply interacting with other avatars and feeling immersed in a foreign landscape. Or it could even involve engineering some sort of haptic feedback for users when they touch or interact with objects in the virtual world. (A primitive form of this is the vibration your controller gave after you hit a ball in Wii tennis.)

As part of all this, a division of Meta called Meta AI wants to help machines learn how humans touch and feel by using a robot finger sensor called DIGIT, and a robot skin called ReSkin.

“We designed a high-res touch sensor and worked with Carnegie Mellon to create a thin robot skin,” Meta CEO Mark Zuckerberg wrote in a Facebook post today. “This brings us one step closer to realistic virtual objects and physical interactions in the metaverse.”

Meta AI sees robot touch as an interesting research domain that can help artificial intelligence improve by receiving feedback from the environment. By working on this research centered around touch, Meta AI wants both to push the field of robotics further and, possibly, to incorporate a sense of touch into the metaverse down the road.

“Robotics is the nexus where people in AI research are trying to get the full loop of perception, reasoning, planning and action, and getting feedback from the environment,” says Yann LeCun, chief AI scientist at Meta. Going further, LeCun thinks that understanding how real objects feel might be vital context for AI assistants (like Siri or Alexa) if they are one day to help humans navigate an augmented or virtual world.

That’s different from how a human works. “From a human perspective, we extensively use touch,” Meta AI research scientist Roberto Calandra, who works with DIGIT, says. “Historically, in robotics, touch has always been considered a sense that would’ve been extremely useful to have. But because of technical limitations, it has been hard to have a widespread use of touch.”

The goal of DIGIT, the research team wrote in a blog post, is to create a compact robot fingertip that is cheap and able to withstand wear and tear from repeated contact with surfaces. It also needs to be sensitive enough to measure properties like surface features and contact forces.

When humans touch an object, they can gauge the general shape of what they’re touching and recognize what it is. DIGIT tries to mimic this through a vision-based tactile sensor.

DIGIT consists of a gel-like silicone pad, shaped like the tip of your thumb, that sits on top of a plastic square. That plastic housing contains sensors, a camera, and PCB-mounted lights that illuminate the silicone. Whenever you touch an object with the silicone gel, which resembles a disembodied robot fingertip, the contact creates shadows and shifts in color hue in the image recorded by the camera. In other words, the touch it is sensing is expressed visually.

“What you really see is the geometrical deformation and the geometrical shape of the object you are touching,” Calandra says. “And from this geometrical deformation, you can also infer the forces that are being applied on the sensor.”
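To make that concrete, here is a minimal sketch of how a vision-based tactile reading could be processed, assuming only NumPy and OpenCV: compare the camera's image of the untouched gel against a frame taken during contact, and treat the changed pixels as deformation. The function names and the area-based force proxy are illustrative assumptions, not Meta's actual pipeline.

    # A minimal sketch of the idea, assuming only NumPy and OpenCV; this
    # illustrates baseline-image differencing, not Meta's actual pipeline.
    import cv2
    import numpy as np

    def contact_map(reference: np.ndarray, frame: np.ndarray,
                    thresh: int = 12) -> np.ndarray:
        """Binary mask of where the gel is deformed.

        reference: camera image of the untouched gel
        frame:     camera image while an object presses into the gel
        """
        # Deformation shows up as local changes in shading and color hue.
        diff = cv2.absdiff(frame, reference)
        gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
        _, mask = cv2.threshold(gray, thresh, 255, cv2.THRESH_BINARY)
        return mask

    def force_proxy(mask: np.ndarray) -> float:
        # Crude stand-in: a harder press deforms more of the gel, so the
        # fraction of pixels in contact is a rough proxy for applied force.
        return float(np.count_nonzero(mask)) / mask.size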

As a companion to the DIGIT sensor, the team is also open-sourcing a machine learning library for touch processing called PyTouch.
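The snippet below sketches what touch detection with PyTouch looks like, loosely following the usage example in the facebookresearch/PyTouch repository; the class and task names (PyTouch, DigitSensor, TouchDetect, ImageHandler) are taken from that example and may have changed since.

    import pytouch
    from pytouch.handlers import ImageHandler
    from pytouch.sensors import DigitSensor
    from pytouch.tasks import TouchDetect

    # Load a saved DIGIT camera frame and ask a pretrained model whether
    # the gel is in contact with an object.
    frame = ImageHandler("digit_frame.png")
    pt = pytouch.PyTouch(DigitSensor, tasks=[TouchDetect])
    is_touching, certainty = pt.TouchDetect(frame)
    print(f"touching={is_touching} certainty={certainty}")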

DIGIT’s touch sensor can reveal a lot more than simply looking at the object can. It can provide information about the object’s contours, textures, elasticity or hardness, and how much force can be applied to it, says Mike Lambeta, a hardware engineer working on DIGIT at Meta AI. An algorithm can combine that information into feedback that tells a robot how best to pick up, manipulate, move, and grasp different objects, from eggs to marbles.
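As a hypothetical illustration of that feedback loop (none of these names come from Meta's code), a grasp controller might tighten its grip in small steps, using a tactile force estimate to stop as soon as the hold is firm and to abort before a fragile object is crushed. A toy gripper stands in for real hardware so the loop is runnable:

    # Hypothetical sketch of tactile-feedback grasping; a simulated
    # gripper stands in for real hardware and a DIGIT-style sensor.

    class SimGripper:
        """Toy gripper: force grows the further it closes past contact."""
        def __init__(self, contact_mm: float = 5.0):
            self.closed_mm = 0.0
            self.contact_mm = contact_mm

        def close_by(self, mm: float) -> None:
            self.closed_mm += mm

        def read_force(self) -> float:
            # Pretend force rises linearly once the fingers squeeze the object.
            return max(0.0, self.closed_mm - self.contact_mm)

    def grasp(gripper: SimGripper, hold_force: float, max_force: float,
              step_mm: float = 0.5, max_steps: int = 100) -> bool:
        """Tighten in small steps, using tactile force estimates as feedback."""
        for _ in range(max_steps):
            force = gripper.read_force()
            if force > max_force:       # fragility ceiling: back off (an egg)
                return False
            if force >= hold_force:     # firm enough to lift without slipping
                return True
            gripper.close_by(step_mm)
        return False                    # never reached a stable grip

    print(grasp(SimGripper(), hold_force=2.0, max_force=3.0))  # True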

Source: https://www.popsci.com/technology/meta-ai-metaverse-robot-finger-skin/
