Zero-shot approach allows robots to manipulate articulated objects

Credit: Zhao et al.

To help humans complete everyday manual tasks, robots should be able to reliably manipulate everyday objects that vary in shape, texture and size. Many conventional approaches to robotic manipulation rely on extensive training and precise programming, and require detailed specification of the properties of the objects the robots will be manipulating.

These approaches have significant limitations, as they allow robots to excel only at specific tasks and in controlled settings. When tested in unpredictable real-world settings and on tasks that require manipulating objects they have never encountered before, the robots tend to perform poorly.

Researchers at Peking University, the Beijing Institute for General Artificial Intelligence and Queen Mary University of London have developed Tac-Man, a new tactile-informed approach that could improve the ability of robots to manipulate articulated objects, such as doors, drawers and appliances, without prior experience with these objects or knowledge of their underlying mechanisms.

This approach, published in IEEE Transactions on Robotics, uses tactile feedback collected by sensors to guide the movements of robots, ensuring stable contact between robots and the objects they are holding throughout a manipulation task.

“Traditional approaches to robotic manipulation rely heavily on prior knowledge of object mechanics,” Zihang Zhao, the study’s co-first author, told Tech Xplore.

“However, this becomes problematic when robots encounter unfamiliar objects or when object properties change unexpectedly. Our approach, in contrast, mimics how humans naturally interact with objects—through touch and real-time adjustment. When we open a drawer or door, we don’t consciously calculate its mechanical properties; instead, we intuitively feel our way through the motion and adjust as needed.”

Objects that are very similar in nature, such as cabinet doors or appliances, can have entirely different internal mechanisms. Conventional approaches to enhance a robot’s object manipulation skills require feeding robots detailed information about the objects they will be manipulating during training, yet this information might not apply to similar objects with different mechanisms.


In addition, some properties of objects and their underlying mechanisms can sometimes be difficult to model or can change over time. Tac-Man, the approach developed by Zhao and his colleagues, is designed to overcome the limitations of conventional techniques for robotic object manipulation, allowing robots to perform better in dynamic environments and on tasks that involve objects that they were not programmed to engage with.

“These limitations have long hindered robots’ ability to operate autonomously in dynamic environments,” said Yixin Zhu, co-corresponding author of the study.

“Our goal was to develop a system capable of handling articulated objects without relying on prior knowledge of their internal mechanisms. We aimed to create a more intuitive and adaptive approach that could work with any articulated object, regardless of its design or complexity.”

The Tac-Man approach relies on a sensing system that continuously monitors changes in the contact patterns between robots and objects. Using this collected tactile data, the system can detect and adapt to any deviations from intended movements in real-time.

The key advantage of this approach is that it allows robots to adapt their actions on the fly, rapidly adjusting their grip and movements based on the tactile information they pick up. Unlike most methods introduced in the past, Tac-Man prioritizes tactile feedback over visual feedback, which can produce more natural and adaptive interactions.
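As a rough illustration of this idea (a sketch, not code from the study), the contact check can be thought of as comparing the current pattern of tracked markers on a GelSight-style gel against the pattern recorded when the grasp was stable; the marker arrays, coordinate units and threshold below are assumptions made for the example:

```python
import numpy as np

def contact_deviation(initial_markers: np.ndarray, current_markers: np.ndarray) -> float:
    """Mean displacement of tactile markers relative to the initial, stable grasp.

    GelSight-style sensors track an array of markers on the gel surface; large
    displacements suggest the contact is slipping or being strained.
    Both inputs are (N, 2) arrays of marker positions in the gel plane.
    """
    return float(np.mean(np.linalg.norm(current_markers - initial_markers, axis=1)))

def is_contact_stable(initial_markers, current_markers, threshold: float = 0.5) -> bool:
    # The threshold is in the same (assumed) units as the marker coordinates.
    return contact_deviation(initial_markers, current_markers) < threshold
```

A controller can run such a check at every step and treat any large deviation as a cue to pause the motion and restore a firm grip before continuing.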

Credit: Zhao et al.

“Think of it like using your hand to open a drawer without looking,” explained Lecheng Ruan, co-corresponding author.

“You don’t need to know exactly how the drawer mechanism works—you can feel the correct direction through touch and adjust your movement accordingly. Tac-Man employs this same principle, using advanced tactile sensors to continuously monitor and adjust its interactions with objects.”


The approach developed by this research team mimics the process by which humans interact with objects in their environment. After touching an object, the robot adjusts its grip to ensure stable contact with it, then slowly and tentatively starts moving it, gradually figuring out how best to manipulate it.

While completing a task, the robot continues to collect tactile feedback, using this feedback to further refine its grip and movements. This results in more natural movements, while also eliminating the robot’s need for object-specific programming.
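In very simplified form, this behavior can be sketched as an explore-and-recover loop: take a small tentative step, read the tactile sensor, and back off along whatever direction reduces the contact deviation whenever the grip starts to strain. The move_gripper and read_markers interfaces, step size and threshold below are hypothetical stand-ins, not the authors' implementation:

```python
import numpy as np

def explore_and_recover(move_gripper, read_markers,
                        max_steps=200, step_size=0.005, threshold=0.5):
    """Simplified explore-and-recover loop (illustrative only).

    move_gripper(delta): hypothetical callback commanding a small Cartesian
        displacement of the gripper (length-3 vector, metres).
    read_markers(): hypothetical callback returning the current tactile marker
        positions as an (N, 2) array in the gel plane, assumed to be aligned
        with the gripper's x-y axes.
    """
    reference = read_markers()                 # contact pattern of the stable grasp
    direction = np.array([1.0, 0.0, 0.0])      # initial guess for the motion direction

    for _ in range(max_steps):
        move_gripper(step_size * direction)    # tentative exploratory step
        displacement = read_markers() - reference
        deviation = float(np.mean(np.linalg.norm(displacement, axis=1)))

        if deviation > threshold:
            # The contact is straining: step back along the direction that
            # reduces marker displacement. That direction also indicates how
            # the object's joint actually allows it to move.
            tangential = -np.mean(displacement, axis=0)
            correction = np.array([tangential[0], tangential[1], 0.0])
            correction /= np.linalg.norm(correction) + 1e-9
            move_gripper(step_size * correction)
            direction = correction             # keep exploring along the new direction
```

Because such a loop reasons only about the contact pattern, the same logic applies whether the hidden mechanism turns out to be a slider, a hinge or something more complex.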

“Touch is an essential part of how we interact with the world,” said Wanlin Li, co-author of the study. “In our research, we’ve focused on developing robust tactile sensing solutions for real-world applications.

“By working with GelSight-type sensors, we’ve found ways to balance sensitivity with practicality, ensuring that the system can detect even the slightest variations in pressure and texture while remaining durable enough for everyday tasks.”

The researchers evaluated Tac-Man in a series of real-world experiments and found that it allowed robots to manipulate a wide range of articulated objects, adapting to their properties over time. The objects the robots were tested on included drawers with distinct sliding mechanisms, cabinet doors with hinges at different locations and other furniture items with complex rotational mechanisms.

“In real-world scenarios, our GelSight-type sensors use a deformable silicone gel layer that provides detailed contact information,” said Yuyang Li, co-first author of the study. “To replicate this in simulation, we developed a specialized tactile simulation model that closely approximates the behavior of physical tactile sensors.”

The team also assessed their approach in simulations run on NVIDIA Isaac Sim. These simulations further highlighted the potential of their approach, showing that it produced more adaptable behavior and yielded higher task completion rates than other well-established computational methods for robot manipulation.


“The beauty of our approach lies in its simplicity and adaptability,” said Kaspar Althoefer, co-author of the study. “By relying on tactile feedback rather than complex pre-programming, we’ve created a more robust and practical solution for real-world robotics. This could significantly reduce the cost and complexity of deploying robots in new and dynamic environments.”

The new approach developed by this research group could soon be advanced further and tested on a broader range of robotic systems. In the future, it could help to boost the performance of robots on tasks that require carefully interacting with objects under uncertain conditions.

For instance, it could improve the performance of robots on various household chores, while also allowing them to carefully guide a patient’s limb movements during rehabilitation and even move rubble or rescue survivors after natural disasters. Meanwhile, the researchers are working to further improve specific aspects of their system, to produce increasingly natural movements and facilitate its large-scale deployment.

“Our work demonstrates that prior knowledge—once considered essential for manipulating articulated objects—may not be necessary,” added Zhu.

“This opens up exciting new possibilities for developing more autonomous and adaptable robotic systems. We’re eager to see how this technology will evolve and positively impact various industries in the years ahead.”

More information:
Zihang Zhao et al, Tac-Man: Tactile-Informed Prior-Free Manipulation of Articulated Objects, IEEE Transactions on Robotics (2024). DOI: 10.1109/TRO.2024.3508134.

© 2024 Science X Network

