Robust Door Operation with the Toyota Human Support Robot. Robotic perception, manipulation and learning

Cite as: hdl:2117/168802
Author's e-mail: miguel.arduengo@gmail.com

Document type: Bachelor thesis
Date: 2019-02-21
Rights access: Open Access
This work is protected by the corresponding intellectual and industrial property rights.
Except where otherwise noted, its contents are licensed under a Creative Commons license: Attribution-NonCommercial-NoDerivs 3.0 Spain.
Abstract
Robots are progressively spreading to urban, social and assistive domains. Service robots operating in domestic environments typically face a variety of objects they must deal with to fulfill their tasks. Some of these objects are articulated, such as cabinet doors and drawers. The ability to deal with such objects is relevant, for example, to navigate between rooms or to assist humans in their mobility. The exploration of this task raises interesting questions in some of the main robotics threads, such as perception, manipulation and learning. In this work, a general framework to robustly operate different types of doors with a mobile manipulator robot is proposed. To push the state of the art, a novel algorithm is proposed that fuses a Convolutional Neural Network with point cloud processing to estimate the end-effector grasping pose in real time, for multiple handles simultaneously, from single RGB-D images. Also proposed is a Bayesian framework that endows the robot with the ability to learn the kinematic model of the door from observations of its motion, as well as from previous experiences or human demonstrations. Combining this probabilistic approach with state-of-the-art motion planning…
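To make the abstract's idea of learning a door's kinematic model from observed motion concrete, the following is a minimal sketch (not taken from the thesis; all function names are hypothetical): given 2D trajectory points of the handle, it fits a prismatic model (straight line, e.g. a drawer) and a revolute model (circular arc, e.g. a hinged door) and selects between them with a BIC-style penalized score, a common simplification of Bayesian model selection over articulation models.

```python
import numpy as np

def fit_prismatic(points):
    """Fit a straight line (prismatic joint) via SVD; return mean squared residual."""
    center = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - center)
    direction = vt[0]                      # principal direction of motion
    proj = center + np.outer((points - center) @ direction, direction)
    return np.mean(np.sum((points - proj) ** 2, axis=1))

def fit_revolute(points):
    """Fit a circle (revolute joint) by algebraic least squares (Kasa fit)."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    (cx, cy, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(max(c + cx ** 2 + cy ** 2, 0.0))      # hinge radius
    dist = np.sqrt((x - cx) ** 2 + (y - cy) ** 2)
    return np.mean((dist - r) ** 2)

def select_model(points):
    """BIC-style comparison: n*log(residual) + k*log(n); lower score wins."""
    n, eps = len(points), 1e-12
    bic_p = n * np.log(fit_prismatic(points) + eps) + 2 * np.log(n)
    bic_r = n * np.log(fit_revolute(points) + eps) + 3 * np.log(n)
    return "prismatic" if bic_p < bic_r else "revolute"
```

For example, handle positions sampled along a quarter arc would be classified as `revolute`, while positions along a straight slide would be classified as `prismatic`. A full treatment (as in the thesis) would maintain posterior probabilities over candidate models and update them with each new observation or demonstration.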
Degree: GRAU EN ENGINYERIA FÍSICA / GRAU EN ENGINYERIA EN TECNOLOGIES INDUSTRIALS
| Files | Description | Size | Format | View |
|---|---|---|---|---|
| TFG_Miguel_Arduengo.pdf | | 2,781Mb | PDF | View/Open |