
Meta AI has created new tools that will allow robots to touch and feel like humans.

Meta’s artificial intelligence research group, known as FAIR (Fundamental Artificial Intelligence Research), is pushing robotics forward with new tools that aim to give robots the ability to “feel,” move skillfully, and work alongside humans. These advances aim to create robots that are not only technically capable, but can also perform real-world tasks in a way that makes them feel natural and safe around humans. Here’s a simple breakdown of what they announced and why it’s important.

Imagine the everyday tasks people perform: grabbing a cup of coffee, setting out plates, or even shaking hands. All of these require a sense of touch and careful control that people take for granted. Robots, however, lack such abilities. They typically rely on vision or programmed instructions, making it difficult for them to handle fragile objects, understand textures, or adapt to changes in their environment.

Meta’s latest tools help robots overcome these limitations. By giving robots a sense of “touch” through advanced sensors and systems, these tools can enable robots to perform tasks with the same sensitivity and adaptability that humans use. This could open up a whole world of possibilities for robots in areas such as healthcare, manufacturing and even virtual reality.

Meta has released three new technologies to improve robots' sense of touch, dexterity, and collaboration with humans:

1. Meta Sparsh:
Sparsh is a sensor technology that helps AI recognize textures, pressure, and even movement through touch rather than just vision. Unlike many AI systems that require labeled data for each task, Sparsh learns from raw data, making it more adaptable and accurate across different tasks. In effect, it gives a robot a general-purpose sense of touch that can work with many different touch sensors.
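To make the "works with many different touch sensors" idea concrete, here is a minimal, entirely hypothetical sketch (not Meta's actual method): tactile sensors produce pressure grids of different resolutions, and pooling each grid into one fixed-size format is the simplest way to give downstream code a sensor-agnostic representation. All names and shapes below are invented for illustration.

```python
# Hypothetical sketch: a sensor-agnostic touch representation.
# Different tactile sensors output pressure grids of different sizes;
# average-pooling every grid to a fixed 4x4 layout yields one shared
# format, loosely illustrating how a general-purpose touch model can
# accept input from many sensors. (Names and shapes are invented.)

def embed_touch(frame, out_rows=4, out_cols=4):
    """Average-pool a 2D pressure grid of any size into a fixed-length vector."""
    rows, cols = len(frame), len(frame[0])
    embedding = []
    for i in range(out_rows):
        for j in range(out_cols):
            # Source-grid boundaries covered by this output cell.
            r0, r1 = i * rows // out_rows, (i + 1) * rows // out_rows
            c0, c1 = j * cols // out_cols, (j + 1) * cols // out_cols
            cells = [frame[r][c] for r in range(r0, r1) for c in range(c0, c1)]
            embedding.append(sum(cells) / len(cells))
    return embedding

# Two "sensors" with different resolutions map to the same 16-value format.
small = [[0.1] * 8 for _ in range(8)]     # e.g., an 8x8 taxel array
large = [[0.1] * 32 for _ in range(24)]   # e.g., a 24x32 tactile image
assert len(embed_touch(small)) == len(embed_touch(large)) == 16
```

A real system like Sparsh learns such representations from raw data rather than hand-coding them, but the payoff is the same: one format that many sensors can feed.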

2. Meta Digit 360:
Digit 360 is an advanced artificial fingertip with human-level touch sensitivity. It can sense subtle differences in texture and detect very small forces, picking up details of touch much like a human finger. It features a powerful lens that covers the entire fingertip, allowing it to “see” in all directions and even respond to different temperatures. This makes it useful for sensitive tasks such as medical procedures or detailed virtual interactions.
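Detecting "very small forces" usually comes down to turning a stream of raw readings into contact events once a calibrated threshold is crossed. The sketch below is a hypothetical illustration of that idea; the threshold, units, and data format are invented, and a real sensor like Digit 360 would expose its own calibrated interface.

```python
# Hypothetical sketch: turning raw fingertip force readings into contact
# events. The threshold value and data format are invented for
# illustration only.

def detect_contacts(forces, threshold=0.02):
    """Return (index, force) pairs where the reading exceeds a small threshold (newtons)."""
    return [(i, f) for i, f in enumerate(forces) if f >= threshold]

readings = [0.0, 0.005, 0.03, 0.5, 0.01]   # made-up samples, in newtons
print(detect_contacts(readings))            # [(2, 0.03), (3, 0.5)]
```

The finer the sensor, the lower that threshold can be set before noise drowns out real contact, which is why human-level sensitivity matters for delicate tasks.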

3. Meta Digit Plexus:
Plexus is a system that connects multiple touch sensors across a robotic hand, providing a sense of touch from the fingertips to the palm. This makes it easier to build robotic hands that move with the finely tuned control humans have, helping researchers create robots that can handle fragile or unusual objects.

Meta is not developing these tools alone. It is partnering with two companies, GelSight Inc. and Wonik Robotics, to produce and distribute these sensory tools:

– GelSight Inc. will produce and distribute the Digit 360 fingertip sensor. Researchers can apply for early access and explore new uses of this technology in areas such as healthcare, robotics, and more.

– Wonik Robotics is integrating Plexus technology into its existing Allegro Hand, expanding its sensor capabilities and making it available to researchers who want to study or build on the technology.

These partnerships help ensure that researchers and developers around the world have access to these cutting-edge tools, allowing them to explore and expand Meta’s work in the areas of sensory perception and robotic dexterity.

New PARTNR tool

For robots to be useful in our daily lives, they need more than just physical skills; they also need to work well with people. To solve this problem, Meta created a tool called PARTNR that helps test and improve how robots handle tasks that require interaction with humans.

PARTNR is a simulation platform in which robots perform a variety of household tasks alongside "virtual" human partners. This is a safer and more scalable way to train robots before deploying them in real-life situations, allowing AI to acquire important social and spatial skills, such as moving through shared spaces or adapting to human instructions.

With PARTNR, researchers can evaluate how robots and artificial intelligence plan, track tasks, and cope with unexpected events. This is the key to creating robots that not only work on their own, but also collaborate effectively with humans.
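To illustrate the shape of such an evaluation, here is a toy sketch of a human-robot collaboration episode: a robot and a simulated human partner draw from a shared task list, and the run is scored by who completed what. Everything here, including the names and the simple turn-taking scheme, is invented for illustration and is far simpler than what a benchmark like PARTNR actually measures.

```python
# Hypothetical sketch of a collaboration episode: a robot and a
# "virtual" human partner alternate turns on a shared task list, and
# the episode records which agent handled each task. All names and the
# turn-taking scheme are invented for illustration.

def run_episode(tasks, max_steps=10):
    pending = list(tasks)
    done = []
    for step in range(max_steps):
        if not pending:
            break
        # Alternate turns between the robot and the virtual human partner.
        agent = "robot" if step % 2 == 0 else "human"
        task = pending.pop(0)
        done.append((agent, task))
    return done, pending

done, pending = run_episode(["set table", "wash cup", "fetch plate"])
print(done)     # each task tagged with the agent that handled it
assert pending == []
```

A real benchmark would replace the scripted turn-taking with learned planning and add metrics for task tracking and recovery from unexpected events, but the evaluation loop, run an episode, log who did what, score the outcome, has this basic structure.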

Meta’s advances in robotic touch and human-robot interaction could have a major impact on various industries:

– Healthcare: Robots with fine sensory skills could assist in surgery or provide caregiving support with gentle, precise movements.
– Manufacturing and Warehousing: Robots could handle fragile items or perform complex assembly tasks.
– Virtual Reality: Robots and virtual-reality devices could "feel" objects, creating a more immersive experience.