Abstract:
Neural networks are now widely used and have achieved considerable success in many fields, yet their theoretical analysis remains limited. This paper analyzes the convergence of the back-propagation algorithm with momentum for three-layer feed-forward neural networks. In our model, the learning rate is a constant, while the momentum coefficient is an adaptive variable chosen to accelerate and stabilize the training of the network parameters. The corresponding convergence results and detailed proofs are given. Compared with existing results, ours are more general.
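To make the setting concrete, the following is a minimal sketch of back-propagation with momentum for a three-layer (input-hidden-output) network, with a constant learning rate and an adaptive momentum coefficient. The particular adaptive rule used here (mu shrinks as the gradient norm grows) is an illustrative assumption for the sketch, not the paper's exact scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny training set (XOR) for a 2-3-1 feed-forward network
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
Y = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Network parameters: one hidden layer of 3 units
W1 = rng.normal(size=(2, 3)); b1 = np.zeros(3)
W2 = rng.normal(size=(3, 1)); b2 = np.zeros(1)

eta = 0.5       # constant learning rate, as in the abstract
mu_max = 0.9    # cap on the adaptive momentum coefficient (assumed)
vel = [np.zeros_like(p) for p in (W1, b1, W2, b2)]

def forward(X):
    H = sigmoid(X @ W1 + b1)
    O = sigmoid(H @ W2 + b2)
    return H, O

def loss():
    _, O = forward(X)
    return 0.5 * np.mean((O - Y) ** 2)

loss0 = loss()
for step in range(5000):
    H, O = forward(X)
    # Back-propagation of the mean-squared error through both layers
    dO = (O - Y) * O * (1 - O) / len(X)
    gW2 = H.T @ dO; gb2 = dO.sum(axis=0)
    dH = (dO @ W2.T) * H * (1 - H)
    gW1 = X.T @ dH; gb1 = dH.sum(axis=0)
    grads = [gW1, gb1, gW2, gb2]
    # Adaptive momentum coefficient: damp momentum when the gradient
    # is large, approach mu_max as training stabilizes (assumed rule)
    gnorm = np.sqrt(sum(np.sum(g * g) for g in grads))
    mu = mu_max / (1.0 + gnorm)
    # Momentum update: v <- mu*v - eta*g ; w <- w + v
    vel = [mu * v - eta * g for v, g in zip(vel, grads)]
    W1, b1, W2, b2 = [p + v for p, v in zip((W1, b1, W2, b2), vel)]
```

Running the sketch drives the training error down from its initial value, illustrating the stabilizing role of damping the momentum term when gradients are large.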