Abstract:
Generally speaking, classical Bayesian classification methods must hypothesize the distribution of a random variable before analysis; a high classification accuracy cannot be achieved if the assumed model does not agree with the true one. Statistical approaches such as distance, Fisher, k-nearest-neighbor, and piecewise linear classifiers fail on multi-region distributions such as checkerboard-like problems. The two-spiral problem not only challenges these statistical methods again, but also casts doubt on the abilities of general feedforward multi-layer LBF neural networks. This paper presents an adaptive algorithm that optimally determines the structure of improved radial basis function (IRBF) neural networks: the number, positions, and widths of their kernel functions. The algorithm's computational complexity is comparable to that of the back-propagation algorithm used in general feedforward three-layer LBF networks. Whether a kernel function is created depends on the relationships between certain misclassified patterns and their neighboring classes. The great importance of the two-layer LBF networks is attested by many experiments. Whether a kernel is ultimately retained is determined by its contribution to the classification accuracy on the test set. Numerous applications show that this kind of IRBF network outperforms the feedforward three-layer RBF and LBF networks in convergence rate, classification accuracy, attainment of optimal structures, and the ability to escape local minima. Networks of this kind are also able to operate in real time.
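The grow-and-prune idea described above can be illustrated with a minimal sketch. Everything here is an illustrative assumption rather than the paper's actual algorithm: kernels are tentatively added at misclassified training patterns (a stand-in for the rule based on misclassified patterns and their neighboring classes), and a kernel is kept only if it does not reduce accuracy on a held-out test set.

```python
import numpy as np

def gaussian_kernel(x, center, width):
    """Gaussian radial basis function activation."""
    return np.exp(-np.sum((x - center) ** 2) / (2.0 * width ** 2))

class AdaptiveRBF:
    """Hypothetical sketch of an adaptively grown RBF classifier."""

    def __init__(self, width=1.0):
        self.width = width
        self.centers = []   # kernel centers
        self.labels = []    # class label attached to each kernel

    def predict(self, x):
        if not self.centers:
            return None     # no kernels yet: every pattern is misclassified
        acts = [gaussian_kernel(x, c, self.width) for c in self.centers]
        return self.labels[int(np.argmax(acts))]

    def accuracy(self, X, y):
        return float(np.mean([self.predict(x) == t for x, t in zip(X, y)]))

    def fit(self, X_train, y_train, X_test, y_test):
        for x, t in zip(X_train, y_train):
            if self.predict(x) != t:              # misclassified pattern
                before = self.accuracy(X_test, y_test)
                self.centers.append(x)            # tentatively add a kernel
                self.labels.append(t)
                # retention rule: keep the kernel only if it does not hurt
                # the classification accuracy on the test set
                if self.accuracy(X_test, y_test) < before:
                    self.centers.pop()
                    self.labels.pop()

# Usage on two synthetic, well-separated Gaussian blobs (illustrative data).
rng = np.random.default_rng(0)
X0 = rng.normal([-2.0, -2.0], 0.3, (20, 2))
X1 = rng.normal([2.0, 2.0], 0.3, (20, 2))
X_train = np.vstack([X0[:15], X1[:15]])
y_train = [0] * 15 + [1] * 15
X_test = np.vstack([X0[15:], X1[15:]])
y_test = [0] * 5 + [1] * 5

model = AdaptiveRBF(width=1.0)
model.fit(X_train, y_train, X_test, y_test)
acc = model.accuracy(X_test, y_test)
```

On such cleanly separated data the pruning rule keeps only a handful of kernels, which mirrors the abstract's claim of obtaining compact, near-optimal structures.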