TY - JOUR
T1 - Visual-Tactile Robot Grasping Based on Human Skill Learning From Demonstrations Using a Wearable Parallel Hand Exoskeleton
AU - Lu, Zhenyu
AU - Chen, Lu
AU - Dai, Hengtai
AU - Li, Haoran
AU - Zhao, Zhou
AU - Zheng, Bofang
AU - Lepora, Nathan F.
AU - Yang, Chenguang
N1 - Publisher Copyright:
© 2023 IEEE.
PY - 2023/9/1
Y1 - 2023/9/1
AB - Soft fingers and strategic grasping skills enable human hands to grasp objects stably. This letter models human grasping skills and transfers the learned skills to robots to improve grasping quality and success rate. First, we designed a wearable tool-like parallel hand exoskeleton equipped with optical tactile sensors to acquire multimodal information, including hand positions and postures, the relative distance of the exoskeleton claws, and tactile images. From the demonstration data, we summarized three characteristics observed in human demonstrations: varying-speed actions, the grasping effect read from tactile images, and grasping strategies for different positions. These characteristics were then utilized in the robot skill modeling to achieve a more human-like grasp. Since no force sensors are fixed to the claws, we introduced a new variable, called 'grasp depth', to represent the grasping effect on the object. The robot grasping strategy is constructed as follows. First, grasp quality is predicted using a linear array network (LAN) with global visual images as inputs; conditions such as grasp width, depth, position, and angle are also predicted. Second, with the grasp width and depth of the object determined, dynamic movement primitives (DMPs) are employed to mimic human grasp actions with varying velocities. To further enhance grasp quality, a final action adjustment based on tactile detection is performed near grasp time. The proposed strategy was validated through experiments with a Franka robot fitted with a self-designed gripper. The results show that the grasping success rate increased from 82% to 96% compared with the results obtained by a pure LAN and constant grasp depth.
KW - data-driven human modeling
KW - exoskeleton
KW - force and tactile sensing
KW - learning from demonstration
KW - robot grasping
UR - http://www.scopus.com/inward/record.url?scp=85164788089&partnerID=8YFLogxK
U2 - 10.1109/LRA.2023.3295296
DO - 10.1109/LRA.2023.3295296
M3 - Article
AN - SCOPUS:85164788089
SN - 2377-3766
VL - 8
SP - 5384
EP - 5391
JO - IEEE Robotics and Automation Letters
JF - IEEE Robotics and Automation Letters
IS - 9
ER -