In this paper, a constrained optimization technique is explored for a substantial problem: accelerating the training of a globally recurrent neural network. Unlike most previous methods, which target feedforward neural networks, the authors adopt the constrained optimization technique to improve the gradient-based algorithm of the globally recurrent neural network, adapting the learning rate during training. Using the recurrent network with the improved algorithm, experiments on two real-world problems have been performed: filtering additive noise in acoustic data, and classification of temporal signals for speaker identification. The experimental results show that the recurrent neural network with the improved learning algorithm trains significantly faster and achieves satisfactory performance.
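The abstract does not give the details of the authors' algorithm, but the general idea of adapting a learning rate through a constraint on each update can be sketched generically. The following is a minimal illustration, not the paper's method: gradient descent in which every weight update is rescaled so its magnitude never exceeds a bound, which effectively adapts the step size at each iteration (the function names, the quadratic toy objective, and the `max_step` constraint are all hypothetical choices for this sketch).

```python
import math

def constrained_gd(grad_fn, w, lr=1.0, max_step=0.5, iters=100):
    """Gradient descent with a per-iteration constraint |delta w| <= max_step.

    Rescaling an oversized step is a simple trust-region-style way to
    adapt the effective learning rate; it is only an illustration of the
    general principle, not the algorithm from the paper.
    """
    for _ in range(iters):
        step = lr * grad_fn(w)
        # Enforce the constraint by clipping the step to max_step,
        # which shrinks the effective learning rate when gradients are large.
        if abs(step) > max_step:
            step = math.copysign(max_step, step)
        w -= step
    return w

# Toy objective f(w) = (w - 3)^2 with gradient 2(w - 3); the raw lr=1.0
# would oscillate, but the constrained steps converge to the minimum.
w_star = constrained_gd(lambda w: 2.0 * (w - 3.0), w=10.0)
```

In a recurrent network the same principle applies per weight vector, with the gradient obtained by backpropagation through time; the constraint keeps early, large gradients from destabilizing training while still permitting long effective steps overall.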