FuCiTNet: improving the generalization of deep learning networks by the fusion of learned class-inherent transformations
Document type: Article
Publication date: 2020-10
Access conditions: Open access
It is widely known that very small datasets produce overfitting in Deep Neural Networks (DNNs), i.e., the network becomes highly biased to the data it has been trained on. This issue is often alleviated using transfer learning, regularization techniques and/or data augmentation. This work presents a new approach, independent of but complementary to the previously mentioned techniques, for improving the generalization of DNNs on very small datasets in which the involved classes share many visual features. The proposed model, called FuCiTNet (Fusion Class inherent Transformations Network), inspired by GANs, creates as many generators as there are classes in the problem. Each generator, k, learns the transformations that bring the input image into the k-class domain. We introduce a classification loss in the generators to drive the learning of specific k-class transformations. Our experiments demonstrate that the proposed transformations improve the generalization of the classification model on three diverse datasets.
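The core mechanism described in the abstract — one generator per class, each steered by a classification loss toward its own class domain — can be illustrated with a minimal sketch. This is a hypothetical simplification for intuition only, not the authors' implementation: the generators here are toy residual linear transforms, the classifier is a single linear layer, and all names (`generators`, `W_cls`, `classification_loss`) are invented for this example.

```python
import numpy as np

rng = np.random.default_rng(0)
num_classes, dim = 3, 8

# One "generator" per class: G_k(x) = x + W_k @ x (toy residual transform
# standing in for the paper's GAN-inspired generator networks).
generators = [rng.normal(scale=0.01, size=(dim, dim)) for _ in range(num_classes)]
# Shared linear classifier weights (hypothetical stand-in for the CNN classifier).
W_cls = rng.normal(scale=0.1, size=(num_classes, dim))

def softmax(z):
    z = z - z.max()          # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classification_loss(x, k):
    """Cross-entropy of the classifier on generator k's output, with target
    class k -- the loss that drives generator k to learn k-class transformations."""
    xk = x + generators[k] @ x       # image transformed into the k-class domain
    p = softmax(W_cls @ xk)
    return -np.log(p[k] + 1e-12)

x = rng.normal(size=dim)             # a stand-in "input image" feature vector
losses = [classification_loss(x, k) for k in range(num_classes)]
# Training would minimise losses[k] w.r.t. generators[k]; at inference the
# per-class transformed outputs are fused and passed to the classifier.
```

Each generator is penalised only when the classifier fails to recognise its output as class k, so gradient descent on `losses[k]` pushes generator k toward class-inherent transformations.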
© 2020. This manuscript version is made available under the CC-BY-NC-ND 4.0 license http://creativecommons.org/licenses/by-nc-nd/4.0/
Citation: Rey-Arena, M. [et al.]. FuCiTNet: improving the generalization of deep learning networks by the fusion of learned class-inherent transformations. "Information Fusion", October 2020, vol. 63, p. 188-195.
Publisher's version: https://www.sciencedirect.com/science/article/abs/pii/S1566253520303122