This paper presents a hybrid algorithm with a globally optimized dynamic learning rate for multilayer feedforward neural networks (MLFNN). The effect of inexact line search on conjugacy was studied, and on that basis a generalized conjugate gradient method was proposed that guarantees global convergence of error backpropagation for MLFNNs. It overcomes the drawback of the conventional BP and Polak-Ribière conjugate gradient algorithms, which may plunge into local minima. On test data, the hybrid algorithm's recognition rate is higher than that of the Polak-Ribière algorithm and of convergent BP; its training time is less than that of the Fletcher-Reeves algorithm and far less than that of convergent BP; and it has lower complexity and stronger robustness on real speech data.
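To make the idea concrete, the following is a minimal sketch of conjugate gradient training with a Polak-Ribière update and an inexact (Armijo backtracking) line search for a tiny feedforward network. All names, the network size, the toy XOR data, and the PR+ clipping are illustrative assumptions, not the authors' actual hybrid algorithm or dynamic learning-rate rule:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 2-2-1 feedforward net on XOR (illustrative data, not the paper's speech data)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
Y = np.array([[0], [1], [1], [0]], float)

def unpack(w):
    # Flat parameter vector -> weight matrices and biases
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8:9]
    return W1, b1, W2, b2

def loss_grad(w):
    # Forward pass, half-MSE loss, and backpropagated gradient
    W1, b1, W2, b2 = unpack(w)
    a1 = np.tanh(X @ W1 + b1)
    out = 1.0 / (1.0 + np.exp(-(a1 @ W2 + b2)))
    err = out - Y
    loss = 0.5 * np.mean(err ** 2)
    d2 = err * out * (1 - out) / len(X)
    gW2 = a1.T @ d2; gb2 = d2.sum(0)
    d1 = (d2 @ W2.T) * (1 - a1 ** 2)
    gW1 = X.T @ d1; gb1 = d1.sum(0)
    return loss, np.concatenate([gW1.ravel(), gb1, gW2.ravel(), gb2])

def armijo(w, d, f, g, alpha=1.0, c=1e-4, tau=0.5):
    # Inexact backtracking line search: shrink alpha until the
    # Armijo sufficient-decrease condition holds
    gd = g @ d
    while loss_grad(w + alpha * d)[0] > f + c * alpha * gd:
        alpha *= tau
        if alpha < 1e-12:
            break
    return alpha

w = rng.normal(scale=0.5, size=9)
f0, g = loss_grad(w)
f, d = f0, -g
for k in range(500):
    alpha = armijo(w, d, f, g)
    w = w + alpha * d
    f_new, g_new = loss_grad(w)
    # Polak-Ribiere beta, clipped at zero (PR+), which restarts the
    # search along steepest descent when conjugacy is lost
    beta = max(0.0, g_new @ (g_new - g) / (g @ g + 1e-12))
    d = -g_new + beta * d
    if g_new @ d >= 0:   # fall back to steepest descent if not a descent direction
        d = -g_new
    f, g = f_new, g_new
```

The PR+ clipping and descent-direction check above are one common way to keep an inexact line search from destroying conjugacy; the paper's generalized method addresses the same failure mode with its own convergence guarantee.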