
Artificial Intelligence has been honed to master the senses of touch and vision, enabling robots to manipulate objects with human-like dexterity.

Robots are now advancing to integrate both vision and tactile sensing, allowing more precise object manipulation that mimics human capabilities.

Artificial Intelligence-Powered Devices Master Physical Sensations for Object Manipulation Similar to Humans


In a groundbreaking development, a new integrated tactile-vision robotic system named TactileAloha has been introduced. Built upon the existing Aloha platform, the system carries a tactile sensor mounted on the gripper, enabling it to systematically adapt its generated manipulation sequence based on tactile sensing.

The research behind TactileAloha was conducted by a collaborative team with members from Tohoku University's Graduate School of Engineering, the Centre for Transformative Garment Production, Hong Kong Science Park, and the University of Hong Kong. The work was published in IEEE Robotics and Automation Letters.

The system introduces two bimanual tasks, zip tie insertion and Velcro fastening, both of which require tactile sensing to perceive object texture and to align the orientations of two objects held in the two hands. To enhance action precision, an improved temporal aggregation scheme is employed at deployment, and a weighted loss function that emphasizes near-future actions is used during training.
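The two mechanisms above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the exponential-decay form, the decay constants, and which end of the sequence is down-weighted are assumptions based on common action-chunking practice.

```python
import math

def aggregate_actions(predictions, m=0.1):
    """Temporal aggregation at deployment (illustrative sketch).
    `predictions` holds the actions proposed for the *current* timestep by
    successive overlapping policy queries, oldest first; each prediction is
    weighted exp(-m * i) and the weighted average is executed."""
    weights = [math.exp(-m * i) for i in range(len(predictions))]
    total = sum(weights)
    return sum(w * a for w, a in zip(weights, predictions)) / total

def near_future_loss_weights(chunk_size, beta=0.01):
    """Training-time weights that emphasize near-future actions: step k in
    the predicted action chunk contributes with weight exp(-beta * k), so
    errors on imminent actions are penalized more than distant ones."""
    return [math.exp(-beta * k) for k in range(chunk_size)]
```

For example, if every overlapping query agrees on the same action, the aggregate is unchanged, while disagreements are smoothed toward a weighted consensus.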

TactileAloha supports real-time visualization during teleoperation, facilitating efficient data collection. The system leverages tactile information to handle texture-dependent tasks that camera-only vision methods often struggle with, addressing a long-standing challenge in robotic manipulation.

The tactile signals are encoded with a pre-trained ResNet and fused with visual and proprioceptive features; the combined observations are then processed by a transformer-based policy with action chunking to predict future actions.
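The data flow described above can be illustrated with a toy sketch. The stand-in encoder, the feature dimensions, and the trivial policy below are all hypothetical placeholders; the real system uses a pre-trained ResNet for tactile encoding and a transformer for the policy.

```python
def encode(signal, dim=4):
    """Stand-in per-modality encoder: fold a variable-length signal into a
    fixed-size feature vector (a ResNet plays this role in the real system)."""
    feat = [0.0] * dim
    for i, x in enumerate(signal):
        feat[i % dim] += float(x)
    return feat

def fuse_and_predict(tactile, vision, proprio, chunk_size=3):
    """Fuse tactile, visual, and proprioceptive features and emit an action
    chunk: several future actions predicted in one forward pass (a trivial
    averaging stand-in for the transformer policy)."""
    fused = encode(tactile) + encode(vision) + encode(proprio)
    base = sum(fused) / len(fused)
    return [base] * chunk_size
```

The design point is that every modality is reduced to a fixed-size feature vector before fusion, so the downstream policy sees one concatenated observation regardless of the raw signal shapes.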

The research, titled "TactileAloha: Learning Bimanual Manipulation with Tactile Sensing", demonstrates impressive performance. The proposed system achieves an average relative improvement of approximately 11.0% compared to state-of-the-art methods with tactile input.

The potential applications of this innovative system across many fields of robotics are vast and promising.
