Show simple item record

dc.contributor.author: Cartas Ayala, Alejandro
dc.contributor.author: Radeva, Petia
dc.contributor.author: Dimiccoli, Mariella
dc.contributor.other: Institut de Robòtica i Informàtica Industrial
dc.date.accessioned: 2020-11-03T13:57:20Z
dc.date.available: 2020-11-03T13:57:20Z
dc.date.issued: 2020
dc.identifier.citation: Cartas, A.; Radeva, P.; Dimiccoli, M. Activities of daily living monitoring via a wearable camera: toward real-world applications. "IEEE access", 2020, vol. 8, p. 77344-77363.
dc.identifier.issn: 2169-3536
dc.identifier.uri: http://hdl.handle.net/2117/331216
dc.description.abstract: Activity recognition from wearable photo-cameras is crucial for lifestyle characterization and health monitoring. However, to enable its widespread use in real-world applications, a high level of generalization needs to be ensured on unseen users. Currently, state-of-the-art methods have been tested only on relatively small datasets consisting of data collected by a few users that are partially seen during training. In this paper, we built a new egocentric dataset acquired by 15 people through a wearable photo-camera and used it to test the generalization capabilities of several state-of-the-art methods for egocentric activity recognition on unseen users and daily image sequences. In addition, we propose several variants of state-of-the-art deep learning architectures, and we show that it is possible to achieve 79.87% accuracy on users unseen during training. Furthermore, to show that the proposed dataset and approach can be useful in real-world applications, where data can be acquired by different wearable cameras and labeled data are scarce, we employed a domain adaptation strategy on two egocentric activity recognition benchmark datasets. These experiments show that the model learned with our dataset can easily be transferred to other domains with a very small amount of labeled data. Taken together, these results show that activity recognition from wearable photo-cameras is mature enough to be tested in real-world applications.
dc.description.sponsorship: This work was supported in part by the TIN2018-095232-B-C21, in part by the SGR-2017 1742, in part by the Nestore project (ID 769643), in part by the Validithi EIT Health Program, in part by the CERCA Programme/Generalitat de Catalunya, in part by the Spanish Ministry of Economy and Competitiveness, and in part by the European Regional Development Fund (MINECO/ERDF, EU) through the Ramon y Cajal program. The work of Alejandro Cartas was supported by a doctoral fellowship from the Mexican Council of Science and Technology (CONACYT) under Grant 366596.
dc.format.extent: 20 p.
dc.language.iso: eng
dc.publisher: Institute of Electrical and Electronics Engineers (IEEE)
dc.rights: Attribution-NonCommercial-NoDerivs 3.0 Spain
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/3.0/es/
dc.subject: Àrees temàtiques de la UPC::Informàtica::Automàtica i control
dc.subject.other: Pattern recognition
dc.subject.other: Wearable cameras
dc.subject.other: Activity recognition
dc.subject.other: Domain adaptation
dc.title: Activities of daily living monitoring via a wearable camera: toward real-world applications
dc.type: Article
dc.identifier.doi: 10.1109/ACCESS.2020.2990333
dc.description.peerreviewed: Peer Reviewed
dc.subject.inspec: Classificació INSPEC::Pattern recognition
dc.relation.publisherversion: https://ieeexplore.ieee.org/document/9078767
dc.rights.access: Open Access
local.identifier.drac: 28989731
dc.description.version: Postprint (published version)
local.citation.author: Cartas, A.; Radeva, P.; Dimiccoli, M.
local.citation.publicationName: IEEE access
local.citation.volume: 8
local.citation.startingPage: 77344
local.citation.endingPage: 77363
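The abstract above reports that a model trained on the proposed dataset can be transferred to other wearable-camera domains using only a small amount of labeled data. The sketch below is a purely illustrative aid, not the authors' implementation: it shows one common way such a transfer step can be set up in PyTorch, by fine-tuning only a replaced classification head on an ImageNet-pretrained backbone. The class count, backbone choice, and hyperparameters are hypothetical placeholders.

```python
# Illustrative sketch only (assumptions, not the paper's method): adapt a
# pretrained CNN to a small labeled target-domain set of egocentric photos.
import torch
import torch.nn as nn
from torchvision import models

NUM_ACTIVITY_CLASSES = 21  # hypothetical number of activity classes

# Start from an ImageNet-pretrained backbone and replace the classifier head.
backbone = models.resnet50(weights=models.ResNet50_Weights.DEFAULT)
backbone.fc = nn.Linear(backbone.fc.in_features, NUM_ACTIVITY_CLASSES)

# Freeze everything except the new head, so only a small amount of labeled
# target-domain data is needed to adapt the model.
for name, param in backbone.named_parameters():
    param.requires_grad = name.startswith("fc.")

optimizer = torch.optim.Adam(
    (p for p in backbone.parameters() if p.requires_grad), lr=1e-4
)
criterion = nn.CrossEntropyLoss()

def adapt_one_epoch(model, loader, device="cpu"):
    """Run one fine-tuning epoch over a (small) labeled target-domain DataLoader."""
    model.train().to(device)
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images.to(device)), labels.to(device))
        loss.backward()
        optimizer.step()
```

Freezing the backbone and training only the head is just one simple adaptation strategy; the paper's own domain adaptation setup may differ.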


Files in this item


This item appears in the following collection(s)
