The normalized backpropagation and some experiments on speech recognition
Document type: Conference report
Rights access: Open Access
In this paper we present the theoretical development of normalized backpropagation and compare it with other algorithms presented in the literature. The proposed algorithm is based on the idea of normalizing the adaptation step in the gradient search by the variance of the input. The algorithm is simple and gives good results in comparison with other algorithms that accelerate learning, and it has the additional advantage that its parameters are computed by the algorithm itself, so the user does not have to run several trials to tune the adaptation step and the momentum until the best combination is found. The task we designed to compare the algorithms is the recognition of digits in the Catalan language, using a database of 1000 items spoken by 10 speakers. The algorithms compared with normalized backpropagation are those of D. E. Rumelhart and J. L. McClelland, Franzini, Suddhard, Fahlman, and Monte.
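The core idea stated in the abstract, scaling the gradient step by the variance of the input so the step size is set by the data rather than by hand-tuned parameters, can be sketched as follows. This is an illustrative reconstruction, not the paper's exact update rule; the function name, the per-weight normalization, and the `eps` stabilizer are assumptions for the example (shown for a single linear unit rather than a full multilayer network).

```python
import numpy as np

def normalized_backprop_step(weights, inputs, targets, eps=1e-8):
    """One gradient step for a linear unit where the adaptation step is
    normalized by the variance of each input component.

    Sketch of the idea in the abstract; the exact rule in the paper
    (and its extension to hidden layers) may differ.
    """
    # Forward pass: linear unit, batch of inputs (n_samples, n_features).
    outputs = inputs @ weights
    errors = outputs - targets
    # Gradient of the mean squared error with respect to the weights.
    grad = inputs.T @ errors / len(inputs)
    # Normalize each weight's step by the variance of its input, so the
    # effective step size adapts to the data instead of requiring the
    # user to trim a global learning rate.
    return weights - grad / (inputs.var(axis=0) + eps)
```

On inputs whose components have very different scales, this normalization equalizes convergence across weights, which is the acceleration effect the abstract describes.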
Citation: Monte, E., Mariño, J.B. The normalized backpropagation and some experiments on speech recognition. A: Neural Networks - EURASIP Workshop. "Neural Networks - EURASIP Workshop 1990: Sesimbra, Portugal: February 15-17, 1990: proceedings". Sesimbra: Springer, 1990, p. 1515-1519.