Generative Topographic Mapping (GTM) is a
manifold learning model for the simultaneous visualization and clustering of multivariate data. It was originally formulated as a constrained mixture of distributions, for which the adaptive parameters were determined by Maximum Likelihood (ML), using the Expectation-Maximization (EM) algorithm. In this formulation, GTM is prone to data overfitting unless a regularization mechanism is included. The theoretical principles of Variational GTM, an approximate method that provides a full
Bayesian treatment to a Gaussian Process (GP)-based variation of the GTM, were recently introduced as an alternative way to
control data overfitting. In this paper we assess in some detail the generalization capabilities of Variational GTM and compare
them with those of alternative regularization approaches in terms of test log-likelihood, using several artificial and real datasets.
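As background to the abstract, the standard (unregularized) GTM it refers to is a constrained Gaussian mixture: the component centres are the images of a regular latent grid under an RBF mapping, and the weights and shared inverse variance are fitted by ML via EM. The sketch below is a minimal, illustrative NumPy implementation under assumed toy settings (1-D latent grid, 2-D data, grid/basis sizes chosen arbitrarily); it is not the authors' code, and it omits the regularization mechanisms the paper compares.

```python
# Minimal GTM-style constrained Gaussian mixture trained by EM.
# Illustrative sketch only: grid sizes, data, and iteration count are
# arbitrary assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: a noisy 1-D curve embedded in 2-D.
t = rng.uniform(-1, 1, size=200)
X = np.column_stack([t, t**2]) + 0.05 * rng.normal(size=(200, 2))
N, D = X.shape

K, M = 20, 5                        # latent grid points, RBF basis functions
z = np.linspace(-1, 1, K)           # regular 1-D latent grid
mu = np.linspace(-1, 1, M)          # RBF centres in latent space
sigma = mu[1] - mu[0]
Phi = np.exp(-(z[:, None] - mu[None, :])**2 / (2 * sigma**2))   # K x M basis

W = rng.normal(scale=0.1, size=(M, D))   # mapping weights (the ML parameters)
beta = 1.0                               # shared inverse noise variance

for _ in range(50):
    # E-step: responsibilities of each constrained component for each point.
    Y = Phi @ W                                          # K x D mixture centres
    d2 = ((X[:, None, :] - Y[None, :, :])**2).sum(-1)    # N x K squared dists
    log_r = -0.5 * beta * d2
    log_r -= log_r.max(axis=1, keepdims=True)            # numerical stability
    R = np.exp(log_r)
    R /= R.sum(axis=1, keepdims=True)

    # M-step: weighted least squares for W, then update beta.
    G = np.diag(R.sum(axis=0))
    W = np.linalg.solve(Phi.T @ G @ Phi + 1e-8 * np.eye(M),
                        Phi.T @ (R.T @ X))
    Y = Phi @ W
    d2 = ((X[:, None, :] - Y[None, :, :])**2).sum(-1)
    beta = N * D / (R * d2).sum()
```

Because ML fitting of W and beta has no penalty term, the mapping can bend to chase noise in X; this is the overfitting that the regularization approaches compared in the paper (including the variational formulation) are designed to control.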
Citation: Olier, I.; Vellido, A. On the benefits for model regularization of a variational formulation of GTM. In: IEEE World Congress on Computational Intelligence / International Joint Conference on Artificial Neural Networks. "IEEE International Joint Conference on Neural Networks 2008". IEEE, 2008, pp. 1569-1576.