The Application of the BFGS Revised Algorithm to Learning in Feedforward Neural Networks
Abstract: This paper introduces the BFGS update of the quasi-Newton formula together with the Wolfe-Powell inexact line-search criterion, and states the conditions under which they guarantee global convergence. It then analyzes why local optima arise when the algorithm is applied to training multilayer feedforward neural networks.
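The two ingredients named in the abstract can be combined in a short sketch: a BFGS inverse-Hessian update driven by an inexact line search that enforces the Wolfe-Powell conditions. This is a minimal illustrative implementation, not the paper's code; the function names (`wolfe_line_search`, `bfgs`), the constants `c1`, `c2`, and the quadratic test problem at the end are my own choices.

```python
import numpy as np

def wolfe_line_search(f, grad, x, p, c1=1e-4, c2=0.9, max_iter=50):
    """Inexact line search returning a step that satisfies the Wolfe-Powell
    conditions: sufficient decrease (Armijo) and the curvature condition."""
    alpha, alpha_lo, alpha_hi = 1.0, 0.0, np.inf
    f0, g0 = f(x), grad(x) @ p          # value and directional derivative at alpha = 0
    for _ in range(max_iter):
        if f(x + alpha * p) > f0 + c1 * alpha * g0:
            alpha_hi = alpha            # sufficient-decrease condition fails: shrink
        elif grad(x + alpha * p) @ p < c2 * g0:
            alpha_lo = alpha            # curvature condition fails: grow
        else:
            return alpha                # both Wolfe-Powell conditions hold
        alpha = 0.5 * (alpha_lo + alpha_hi) if np.isfinite(alpha_hi) else 2.0 * alpha
    return alpha

def bfgs(f, grad, x0, tol=1e-8, max_iter=200):
    """Quasi-Newton minimization with the BFGS update of the inverse Hessian."""
    x = np.asarray(x0, dtype=float)
    H = np.eye(len(x))                  # initial inverse-Hessian approximation
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:
            break
        p = -H @ g                      # quasi-Newton search direction
        alpha = wolfe_line_search(f, grad, x, p)
        s = alpha * p                   # step taken
        x_new = x + s
        y = grad(x_new) - g             # gradient change along the step
        sy = s @ y
        if sy > 1e-12:                  # curvature guard keeps H positive definite
            rho = 1.0 / sy
            I = np.eye(len(x))
            H = (I - rho * np.outer(s, y)) @ H @ (I - rho * np.outer(y, s)) \
                + rho * np.outer(s, s)  # BFGS inverse-Hessian update
        x = x_new
    return x

# Illustrative use on a convex quadratic f(x) = 1/2 x^T A x - b^T x,
# whose minimizer solves A x = b (here x* = [0.2, 0.4]).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
x_star = bfgs(lambda x: 0.5 * x @ A @ x - b @ x,
              lambda x: A @ x - b,
              np.zeros(2))
```

On a nonconvex training objective such as a multilayer network's error surface, the same iteration satisfies only the convergence conditions discussed in the paper, which is why it can stop at a local optimum.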