An adaptive gradient-search based algorithm for discriminative training of HMM's
Document type: Conference paper
Publication date: 1998
Editors: Robert H. Mannel and Jordi Robert-Ribes
Access conditions: Restricted access by publisher's policy
Abstract: Although discriminative training has proved to be a very powerful tool in acoustic modelling, it has a major drawback: there is no formulation that guarantees convergence regardless of the initial conditions, as the Baum-Welch algorithm does for maximum likelihood training. For this reason, gradient descent is usually employed for this kind of problem. Unfortunately, standard gradient descent algorithms depend heavily on the choice of the learning rates. This dependence is especially cumbersome because it means that, at each run of the discriminative training procedure, a search must be carried out over the parameters governing the algorithm. In this paper we describe an adaptive procedure for determining the optimal value of the step size at each iteration. While the computational and memory overhead of the algorithm is negligible, results show less dependence on the initial learning rate than standard gradient descent, and, when the same idea is used to apply self-scaling, the proposed method clearly outperforms it.
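The abstract does not reproduce the paper's exact adaptation rule, but the general idea of adapting the step size at each iteration to reduce sensitivity to the initial learning rate can be sketched with a simple "bold driver" heuristic (grow the rate after a successful step, shrink it after an overshoot). The function name and constants below are illustrative assumptions, not the authors' algorithm:

```python
def adaptive_gradient_descent(grad, loss, x0,
                              lr0=0.1, up=1.1, down=0.5, iters=100):
    """Gradient descent with an adaptive step size ('bold driver'
    heuristic, an assumption for illustration): grow the learning
    rate after a step that lowers the loss, shrink it and retry
    after a step that raises the loss."""
    x, lr = x0, lr0
    prev = loss(x)
    for _ in range(iters):
        candidate = x - lr * grad(x)
        cur = loss(candidate)
        if cur <= prev:          # successful step: accept it and speed up
            x, prev = candidate, cur
            lr *= up
        else:                    # overshoot: reject the step and slow down
            lr *= down
    return x

# Minimise f(x) = (x - 1)^2 from widely different initial rates;
# the adaptation makes convergence largely insensitive to lr0.
loss = lambda x: (x - 1.0) ** 2
grad = lambda x: 2.0 * (x - 1.0)
for lr0 in (0.001, 0.1, 10.0):
    x = adaptive_gradient_descent(grad, loss, 0.0, lr0=lr0)
    print(lr0, abs(x - 1.0) < 1e-3)
```

On this toy quadratic, all three initial rates converge to the minimum, illustrating the reduced dependence on the initial learning rate that the abstract claims.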
Citation: Nogueiras, A.; Mariño, J.; Monte, E. An adaptive gradient-search based algorithm for discriminative training of HMM's. In: 5th International Conference on Spoken Language Processing (ICSLP'98). "ICSLP'98 Proceedings". Sydney: Robert H. Mannel and Jordi Robert-Ribes, 1998, p. 2979-2982.
ISBN: 1 876346 17 5
Publisher's version: http://nlp.lsi.upc.edu/papers/nogueiras98.pdf