Title: GS7-3 Robot Motion Generation by Hand Demonstration

Publication: ICAROB2021
Volume: 26
Pages: 768-771
ISSN: 2188-7829
DOI: 10.5954/ICAROB.2021.GS7-3
Author(s): Sakmongkon Chumkamon, Umaporn Yokkampon, Eiji Hayashi, Ryusuke Fujisawa
Publication Date: January 21, 2021
Keywords:
Abstract: Since traditional robot teaching requires time and explicit instruction of the robot's motion, we present a systematic framework, based on deep learning and experiment, for generating robot motion trajectories from human hand demonstration. With this system, a worker can teach the robot more easily than by manually programming instructions into the robot controller, and the robot can imitate the action in a new situation instead of being taught directly through the robot arm. Our contributions are threefold: 1) a real-time, markerless method for extracting human hand movement using 3D hand detection; 2) generalization of the demonstrated hand trajectories; 3) robot path planning for grasping the object and placing it at the target. We also present an experiment conducted with real user-movement data and evaluate the system on a manipulator robot. The investigation demonstrates a robot pick-and-place task for food items taught by hand demonstration.
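The abstract mentions generalizing demonstrated hand trajectories before path planning. As a minimal, hypothetical illustration only (this is not the paper's deep-learning method), a common preprocessing step for such pipelines is arc-length resampling of the recorded 3D hand path to a fixed number of evenly spaced waypoints:

```python
# Illustrative sketch: resample a recorded 3D hand trajectory to n evenly
# spaced waypoints by arc length. Function name and interface are assumptions
# for illustration, not from the paper.
import math
from typing import List, Tuple

Point = Tuple[float, float, float]

def resample_trajectory(points: List[Point], n: int) -> List[Point]:
    """Linearly resample a polyline of 3D points to n evenly spaced waypoints."""
    if n < 2 or len(points) < 2:
        raise ValueError("need at least 2 input points and n >= 2")
    # Cumulative arc length along the recorded trajectory.
    dists = [0.0]
    for p0, p1 in zip(points, points[1:]):
        dists.append(dists[-1] + math.dist(p0, p1))
    total = dists[-1]
    out: List[Point] = []
    for i in range(n):
        target = total * i / (n - 1)
        # Find the segment that contains this arc-length position.
        j = 1
        while j < len(dists) - 1 and dists[j] < target:
            j += 1
        seg = dists[j] - dists[j - 1] or 1.0
        t = (target - dists[j - 1]) / seg
        a, b = points[j - 1], points[j]
        # Interpolate linearly within the segment.
        out.append(tuple(ai + t * (bi - ai) for ai, bi in zip(a, b)))
    return out
```

Resampling like this makes demonstrations of different speeds and lengths comparable before any learning or planning stage consumes them.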
PDF File: https://alife-robotics.co.jp/members2021/icarob/data/html/data/GS/GS7/GS7-3.pdf
Copyright: © The authors.
This article is distributed under the terms of the Creative Commons Attribution-NonCommercial 4.0 License (CC BY-NC 4.0), which permits non-commercial use, distribution, and reproduction in any medium, provided the original work is properly cited.
See for details: https://creativecommons.org/licenses/by-nc/4.0/

ALife Robotics Corporation Ltd.
