Show simple item record
Activities of daily living monitoring via a wearable camera: toward real-world applications
dc.contributor.author | Cartas Ayala, Alejandro |
dc.contributor.author | Radeva, Petia |
dc.contributor.author | Dimiccoli, Mariella |
dc.contributor.other | Institut de Robòtica i Informàtica Industrial |
dc.date.accessioned | 2020-11-03T13:57:20Z |
dc.date.available | 2020-11-03T13:57:20Z |
dc.date.issued | 2020 |
dc.identifier.citation | Cartas, A.; Radeva, P.; Dimiccoli, M. Activities of daily living monitoring via a wearable camera: toward real-world applications. "IEEE access", 2020, vol. 8, p. 77344-77363. |
dc.identifier.issn | 2169-3536 |
dc.identifier.uri | http://hdl.handle.net/2117/331216 |
dc.description.abstract | Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its widespread use in real-world applications, a high level of generalization needs to be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants to state-of-the-art deep learning architectures, and we show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarcely available, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset can easily be transferred to other domains with a very small amount of labeled data. Taken together, those results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications. |
dc.description.sponsorship | This work was supported in part by the TIN2018-095232-B-C21, in part by the SGR-2017 1742, in part by the Nestore ID: 769643, in part by the Validithi EIT Health Program, in part by the CERCA Programme/Generalitat de Catalunya, in part by the Spanish Ministry of Economy and Competitiveness, and in part by the European Regional Development Fund (MINECO/ERDF, EU) through the program Ramon y Cajal. The work of Alejandro Cartas was supported by a doctoral fellowship from the Mexican Council of Science and Technology (CONACYT) under Grant 366596. |
dc.format.extent | 20 p. |
dc.language.iso | eng |
dc.publisher | Institute of Electrical and Electronics Engineers (IEEE) |
dc.rights | Attribution-NonCommercial-NoDerivs 3.0 Spain |
dc.rights.uri | http://creativecommons.org/licenses/by-nc-nd/3.0/es/ |
dc.subject | Àrees temàtiques de la UPC::Informàtica::Automàtica i control |
dc.subject.other | Pattern recognition |
dc.subject.other | Wearable cameras |
dc.subject.other | Activity recognition |
dc.subject.other | Domain adaptation |
dc.title | Activities of daily living monitoring via a wearable camera: toward real-world applications |
dc.type | Article |
dc.identifier.doi | 10.1109/ACCESS.2020.2990333 |
dc.description.peerreviewed | Peer Reviewed |
dc.subject.inspec | Classificació INSPEC::Pattern recognition |
dc.relation.publisherversion | https://ieeexplore.ieee.org/document/9078767 |
dc.rights.access | Open Access |
local.identifier.drac | 28989731 |
dc.description.version | Postprint (published version) |
local.citation.author | Cartas, A.; Radeva, P.; Dimiccoli, M. |
local.citation.publicationName | IEEE access |
local.citation.volume | 8 |
local.citation.startingPage | 77344 |
local.citation.endingPage | 77363 |
Files in this item
This item appears in the following collections
Journal articles (Articles de revista) [376]