AnyRotate: Gravity-Invariant In-Hand Object Rotation with Sim-to-Real Touch

Max Yang*, Chenghua Lu, Alex Church, Yijiong Lin, Chris Ford, Haoran Li, Efi Psomopoulou, David A.W. Barton, Nathan F. Lepora

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

Human hands are capable of in-hand manipulation in the presence of different hand motions. For a robot hand, harnessing rich tactile information to achieve this level of dexterity remains a significant challenge. In this paper, we present AnyRotate, a system for gravity-invariant multi-axis in-hand object rotation using dense featured sim-to-real touch. We tackle this problem by training a dense tactile policy in simulation and introducing a sim-to-real method for rich tactile sensing to achieve zero-shot policy transfer. Our formulation allows training a unified policy to rotate unseen objects about arbitrary rotation axes in any hand direction. In our experiments, we highlight the benefit of capturing detailed contact information when handling objects with varying properties. Interestingly, we found that rich multi-fingered tactile sensing can detect unstable grasps and provide a reactive behavior that improves the robustness of the policy.
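The gravity-invariant formulation described in the abstract can be illustrated with a short sketch. The following Python snippet is a minimal illustration, not the authors' implementation: the function names, the (w, x, y, z) quaternion convention, and the observation layout are assumptions made for the example. The key idea it shows is that expressing the goal rotation axis in the hand frame, rather than the world frame, makes the policy input invariant to how the hand is oriented with respect to gravity, so a single policy can rotate objects about arbitrary axes in any hand direction.

import numpy as np

def hand_frame_rotation_axis(world_axis, hand_quat_wxyz):
    """Express a desired world-frame rotation axis in the hand frame.

    Rotating the goal axis into the hand frame removes the hand's
    world orientation from the observation, which is one way to make
    the policy gravity-invariant. Quaternion convention assumed here
    is (w, x, y, z), unit-normalized.
    """
    w, x, y, z = hand_quat_wxyz
    # Rotation matrix mapping hand-frame vectors to the world frame.
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ])
    axis = np.asarray(world_axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    return R.T @ axis  # transform world frame -> hand frame

def policy_observation(joint_pos, tactile_features, world_axis, hand_quat_wxyz):
    """Assemble one observation vector (layout is an assumption):
    proprioception, dense per-fingertip tactile features, and the
    goal rotation axis expressed in the hand frame."""
    goal_axis = hand_frame_rotation_axis(world_axis, hand_quat_wxyz)
    return np.concatenate([
        np.asarray(joint_pos, dtype=float),
        np.asarray(tactile_features, dtype=float).ravel(),
        goal_axis,
    ])

# Example: rotate about the world z-axis while the palm faces down
# (hand rotated 180 degrees about world x, quaternion (0, 1, 0, 0)).
obs_axis = hand_frame_rotation_axis([0.0, 0.0, 1.0], (0.0, 1.0, 0.0, 0.0))
# -> approximately [0, 0, -1]: the same world goal appears differently
#    in the hand frame, which is what a unified policy conditions on.

Note that the same hand-frame convention would apply to the tactile features themselves (e.g., contact poses per fingertip), so that the full observation stays consistent across hand orientations.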

Original language: English
Pages (from-to): 4727-4747
Number of pages: 21
Journal: Proceedings of Machine Learning Research
Volume: 270
Publication status: Published - 2024
Externally published: Yes
Event: 8th Conference on Robot Learning, CoRL 2024, Munich, Germany
Duration: 6 Nov 2024 - 9 Nov 2024

Keywords

  • In-hand Object Rotation
  • Reinforcement Learning
  • Tactile Sensing
