Transferring human skills to a Baxter robot
Tutor / director / evaluator: Aranda López, Juan
Document type: Master thesis
Rights access: Open Access
Learning by demonstration is a machine learning technique which, in the robotics field, can be used to teach robots new tasks without manually programming them. These kinds of methods require a model from which the robot acquires the new knowledge. In the present work, learning by demonstration is applied to a Baxter cobot using human movement as the model. To achieve this, a Kinect V2 is used to record the Cartesian trajectory of a moving human hand. The framework developed captures the coordinates of the right hand relative to the subject’s torso, avoiding the need for a precise camera setup for each recording. After choosing an end-effector orientation, the Cartesian data is processed into a valid 7-DOF trajectory for the Baxter arm, which is used as the model trajectory. The model trajectory is fed into a DMP (Dynamic Movement Primitives) algorithm, with new start and goal trajectory positions as parameters. Once the DMP trajectories are generated, they are tested both in simulation and on the real Baxter robot. For this, a set of experiments has been designed to prove that the generated DMP trajectories can be used as approach moves for different pickup operations. The setup is a kitchen environment with common objects placed on tables and shelves. The results show that the framework developed is able to create a valid model trajectory for the Baxter robot from human data. The generated DMP trajectories comply with the velocity and arm-configuration limits of the robot, are smooth and continuous, and fulfill the newly requested start and goal positions for movements that do not require major orientation changes.
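To illustrate the central step described above, the following is a minimal sketch of a one-dimensional discrete DMP: weights of a forcing term are learned from a single demonstrated trajectory, and the primitive is then replayed toward a new start and goal. This is not the thesis code; the class name `DMP1D`, all gain values, and the basis-function layout are illustrative assumptions, and in the actual framework one such primitive would be fitted per DOF of the 7-DOF Baxter arm trajectory.

```python
import numpy as np


class DMP1D:
    """Illustrative 1-D discrete Dynamic Movement Primitive.

    Transformation system: tau^2 * ydd = a_z*(b_z*(g - y) - tau*yd) + f(x)
    Canonical system:      tau * xd  = -a_x * x
    All parameter values below are assumptions, not the thesis settings.
    """

    def __init__(self, n_basis=30, alpha_z=25.0, alpha_x=3.0):
        self.alpha_z = alpha_z            # spring gain
        self.beta_z = alpha_z / 4.0       # critically damped choice
        self.alpha_x = alpha_x            # phase decay rate
        self.n_basis = n_basis
        # Gaussian basis centres spaced along the decaying phase variable
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        self.h = 1.0 / np.gradient(self.c) ** 2   # heuristic widths
        self.w = np.zeros(n_basis)

    def _psi(self, x):
        return np.exp(-self.h * (x - self.c) ** 2)

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        T = len(y_demo)
        self.tau = (T - 1) * dt
        y0, g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(T) * dt / self.tau)  # phase
        # Forcing term that would exactly reproduce the demonstration
        f_target = (self.tau ** 2 * ydd
                    - self.alpha_z * (self.beta_z * (g - y_demo)
                                      - self.tau * yd))
        scale = x * (g - y0)              # standard goal scaling
        for i in range(self.n_basis):     # locally weighted regression
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = (np.sum(scale * psi * f_target)
                         / (np.sum(scale ** 2 * psi) + 1e-10))

    def rollout(self, y0, g, dt):
        """Generate a trajectory toward a new start y0 and new goal g."""
        T = int(self.tau / dt) + 1
        y, yd, x = y0, 0.0, 1.0
        out = np.empty(T)
        for t in range(T):
            out[t] = y
            psi = self._psi(x)
            f = (psi @ self.w) / (psi.sum() + 1e-10) * x * (g - y0)
            ydd = (self.alpha_z * (self.beta_z * (g - y) - self.tau * yd)
                   + f) / self.tau ** 2
            yd += ydd * dt                # Euler integration
            y += yd * dt
            x += -self.alpha_x * x * dt / self.tau
        return out
```

Because the spring-damper term pulls toward the goal regardless of the learned weights, the replayed trajectory keeps the demonstrated shape while smoothly meeting the new endpoints, which is the property the experiments in this work rely on for the approach moves.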