Visual feedback for humans about robots' perception in collaborative environments

Document type: Research report
Defense date: 2020-07-20
Rights access: Open Access
Except where otherwise noted, content on this work is licensed under a Creative Commons license: Attribution-NonCommercial-NoDerivs 3.0 Spain
Abstract
In recent years, major advances in artificial intelligence have allowed robots to perceive their environment, which includes not only static but also dynamic objects such as humans. Indeed, robotic perception is a fundamental requirement for safe robot autonomy in human-robot collaboration. However, true collaboration requires that both robots and humans perceive each other's intentions and interpret which actions the other is performing.
In this work, we developed a visual representation tool that illustrates the robot's perception of the space it shares with a person. Specifically, we adapted an existing system to estimate the human pose, and we created a visualisation tool that represents the robot's perception of human-robot closeness.
We also performed a first evaluation of the system under realistic conditions, using the Tiago robot and a person as a test subject. This work is a first step towards allowing humans to better understand robots' perception in collaborative scenarios.
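As a rough illustration of the kind of closeness visualisation the abstract describes, the sketch below maps the minimum distance between the robot and the estimated human keypoints to a red-to-green marker colour. This is not the authors' implementation: the use of ROS (which the Tiago robot runs on), the topic names, and the distance thresholds are assumptions for illustration only.

```python
# Minimal sketch (not the report's code): colour an RViz marker by the
# closeness between the robot and the estimated human pose keypoints.
# Topic names and thresholds are illustrative assumptions.
import rospy
import numpy as np
from geometry_msgs.msg import PoseArray
from visualization_msgs.msg import Marker

CLOSE = 0.5  # metres: assumed "too close" threshold
FAR = 2.0    # metres: assumed "comfortably far" threshold

def keypoints_callback(msg, pub):
    # Minimum distance from the robot frame origin to any human keypoint.
    pts = np.array([[p.position.x, p.position.y, p.position.z] for p in msg.poses])
    if pts.size == 0:
        return
    d = float(np.min(np.linalg.norm(pts, axis=1)))

    # Map distance to a colour: red when close, green when far.
    t = float(np.clip((d - CLOSE) / (FAR - CLOSE), 0.0, 1.0))
    m = Marker()
    m.header = msg.header
    m.type = Marker.SPHERE
    m.action = Marker.ADD
    m.scale.x = m.scale.y = m.scale.z = 0.3
    m.color.r, m.color.g, m.color.b, m.color.a = 1.0 - t, t, 0.0, 0.8
    m.pose.orientation.w = 1.0
    pub.publish(m)

if __name__ == "__main__":
    rospy.init_node("closeness_marker")
    pub = rospy.Publisher("closeness_marker", Marker, queue_size=1)
    rospy.Subscriber("human_keypoints", PoseArray,
                     keypoints_callback, callback_args=pub)
    rospy.spin()
```

The marker can then be displayed in RViz alongside the robot model, so the person can see at a glance whether the robot considers them within its safety distance.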
Citation: Gassó, J.; Olivares, A.; Alenyà, G. Visual feedback for humans about robots' perception in collaborative environments. 2020.
Is part of: IRI-TR-20-03
URL (other repository): https://www.iri.upc.edu/publications/show/2351
Files | Size | Format
---|---|---
IRI-TR-20-03.pdf | 15.21 MB | PDF