MagicGripper: A Mini-MagicTac Integrated Gripper Enabling Multimodal Perception in Contact-Rich Manipulation

Wen Fan, Haoran Li, Qingzheng Cong, Dandan Zhang*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Contact-rich robotic manipulation in unstructured environments demands reliable multimodal perception. Here, we present MagicGripper, a multimodal robotic gripper built around mini-MagicTac, a compact variant of the MagicTac sensor. Mini-MagicTac embeds multi-layer grid structures in a 3D-printed elastomer, enabling visual, proximity, and tactile sensing in a gripper-compatible form factor. In this paper, we introduce the design and multimodal perception capabilities of mini-MagicTac, as well as two algorithmic frameworks for proximity and contact detection. Experimental evaluations show that mini-MagicTac achieves high spatial resolution, accurate contact localisation, and robust force estimation under mechanical and manufacturing variations. Autonomous grasping trials further validate MagicGripper's reliable multimodal perception and adaptability to complex manipulation scenarios. These results demonstrate MagicGripper as a compact and versatile platform for embodied intelligence in contact-rich environments. Note to Practitioners - Robotic end-effectors often break down when a task calls for both 'eyes' and 'skin': adding multiple sensors usually makes the gripper bulky, fragile, and expensive to build. MagicGripper shows one practical way around that trade-off. Each finger is 3D-printed, with no casting or post-assembly required; inside the soft skin, a multi-layer grid acts as the sensing feature, letting an embedded camera read visual, proximity, and tactile cues simultaneously.
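The abstract does not detail the paper's contact-detection framework, but the general idea behind camera-based tactile sensing can be sketched as comparing the in-finger camera's current frame against a no-contact reference frame. The function, threshold values, and centroid-based localisation below are illustrative assumptions, not the authors' actual algorithm:

```python
import numpy as np

def detect_contact(reference, current, threshold=12.0, min_pixels=50):
    """Flag contact when enough pixels deviate from the no-contact reference.

    reference, current: grayscale frames (H, W) from an in-finger camera.
    threshold, min_pixels: illustrative tuning constants (assumed, not from
    the paper). Returns (contact_flag, (x, y) centroid or None).
    """
    diff = np.abs(current.astype(np.float32) - reference.astype(np.float32))
    changed = diff > threshold
    if changed.sum() < min_pixels:
        return False, None
    # Centroid of changed pixels gives a crude contact-location estimate.
    ys, xs = np.nonzero(changed)
    return True, (float(xs.mean()), float(ys.mean()))
```

In practice, a real sensor pipeline would add lighting compensation and temporal filtering; this sketch only shows the frame-differencing principle.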

Original language: English
Pages (from-to): 24311-24332
Number of pages: 22
Journal: IEEE Transactions on Automation Science and Engineering
Volume: 22
DOIs
Publication status: Published - 2025

Keywords

  • multi-modality sensing
  • robotic manipulation
  • vision-based tactile sensor
