Anticipating human activities from object interaction cues
16 ROMAN final 0209.pdf (1,311 Mb) (Restricted access)
Document type: Conference report
Publisher: Institute of Electrical and Electronics Engineers (IEEE)
Rights access: Restricted access - publisher's policy
A real-time approach to the recognition of human activity, based on interaction with objects in domestic settings, is presented. It captures partial images of the area where the activity takes place with a color camera and processes them to recognize the objects present and changes in their location. For object description and recognition, a histogram over a normalized chromaticity space was selected. Interaction with objects is classified into four possible actions according to change in location (add, remove, move, or unchanged). Activities are defined as recipes, in which objects play the role of ingredients, tools, or substitutes. Sensed objects and actions are then used to estimate, in real time, the probability of the human activity being performed at each moment within a continuous activity sequence. Tests were carried out in an automated kitchen scenario where proactive assistance is to be implemented based on early recognition of the user's activity.
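As a rough illustration of the object descriptor mentioned in the abstract, the sketch below computes a histogram over the normalized chromaticity plane (r = R/(R+G+B), g = G/(R+G+B)), which discards intensity and so gains some robustness to illumination changes. The bin count and the histogram-intersection matcher are assumptions for illustration, not details taken from the paper.

```python
import numpy as np

def chromaticity_histogram(image, bins=32):
    """2-D histogram over normalized chromaticity (r, g).

    image: H x W x 3 uint8 RGB array.
    Intensity is factored out, so the descriptor depends mainly on color.
    """
    rgb = image.astype(np.float64).reshape(-1, 3)
    s = rgb.sum(axis=1)
    valid = s > 0                       # skip pure-black pixels (division by zero)
    r = rgb[valid, 0] / s[valid]
    g = rgb[valid, 1] / s[valid]
    hist, _, _ = np.histogram2d(r, g, bins=bins, range=[[0, 1], [0, 1]])
    return hist / hist.sum()            # normalize to a probability distribution

def intersection(h1, h2):
    """Histogram intersection: 1.0 for identical distributions."""
    return np.minimum(h1, h2).sum()
```

Two views of the same object under different lighting would then be matched by comparing their chromaticity histograms with the intersection score.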
CitationAranda, J., Vinagre, M. Anticipating human activities from object interaction cues. A: IEEE International Symposium on Robot and Human Interactive Communication. "The 25th IEEE International Symposium on Robot and Human Interactive Communication: August 26 to August 31, 2016 Teachers College, Columbia University New York, USA". New York: Institute of Electrical and Electronics Engineers (IEEE), 2016, p. 58-63.
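The abstract models activities as recipes whose ingredients and tools are objects, with activity probabilities updated as objects and actions are sensed. A minimal sketch of that idea, assuming a simple completion-fraction score (the recipe contents, names, and scoring rule are hypothetical, not the authors' actual model):

```python
# Hypothetical recipe definitions: each activity lists required objects.
RECIPES = {
    "make_coffee": {"ingredients": {"coffee", "water"}, "tools": {"kettle", "mug"}},
    "make_salad":  {"ingredients": {"lettuce", "tomato"}, "tools": {"knife", "bowl"}},
}

def activity_probabilities(observed_objects):
    """Probability of each candidate activity given the objects seen so far,
    proportional to the fraction of each recipe already observed."""
    scores = {}
    for name, recipe in RECIPES.items():
        required = recipe["ingredients"] | recipe["tools"]
        scores[name] = len(required & observed_objects) / len(required)
    total = sum(scores.values())
    if total == 0:
        # No evidence yet: fall back to a uniform distribution.
        return {name: 1 / len(RECIPES) for name in RECIPES}
    return {name: s / total for name, s in scores.items()}
```

Running this on the partial observation `{"kettle", "mug"}` already concentrates probability on `make_coffee`, which is the kind of early recognition the abstract targets for proactive assistance.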
All rights reserved. This work is protected by the corresponding intellectual and industrial property rights. Without prejudice to any existing legal exemptions, reproduction, distribution, public communication, or transformation of this work is prohibited without permission of the copyright holder.