Kernel orthonormalization in radial basis function neural networks

W. Kaminski, P. Strumillo
1997 IEEE Transactions on Neural Networks  
This paper deals with optimizing the computations involved in training radial basis function (RBF) neural networks. The main contribution of the reported work is a method for network weight calculation whose key idea is to transform the RBF kernels into an orthonormal set of functions (using standard Gram-Schmidt orthogonalization). This significantly reduces computing time, in comparison with other weight calculation methods such as direct matrix inversion, when the RBF training scheme that adds one kernel hidden node at a time to improve network performance is adopted. Another property of the method is that, after the RBF network weights are computed via the orthonormalization procedure, the original network structure (i.e., one containing RBFs in the hidden layer) can be restored. An additional strength of the method is that the proposed computing task can be decomposed into a number of parallel subtasks, yielding further savings in computing time. The proposed weight calculation technique also has low storage requirements. These features make the method very attractive for hardware implementation. The paper presents a detailed derivation of the proposed network weight calculation procedure and demonstrates its validity for RBF network training on a number of data classification and function approximation problems.
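The core idea described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's actual algorithm: it assumes Gaussian kernels, a toy 1-D approximation problem, and classical Gram-Schmidt on the columns of the design matrix. In the orthonormal basis each weight is a simple projection of the target onto one basis function, so adding a new hidden node does not change the weights already computed; the triangular transformation is then inverted to restore weights for the original RBF hidden layer.

```python
import numpy as np

def gaussian_kernel(x, c, sigma):
    # Gaussian RBF (one common kernel choice; assumed here for illustration)
    return np.exp(-np.linalg.norm(x - c, axis=-1) ** 2 / (2 * sigma ** 2))

# Toy 1-D function approximation data (illustrative, not from the paper)
X = np.linspace(0.0, 1.0, 50)[:, None]
y = np.sin(2 * np.pi * X[:, 0])

centers = X[::10]  # hidden-layer centres, assumed placed on data points
# Design matrix: column j holds kernel j evaluated on all training samples
Phi = np.stack([gaussian_kernel(X, c, 0.2) for c in centers], axis=1)

# Gram-Schmidt orthonormalization of the kernel columns.
# Q holds the orthonormal functions; R records the transformation so the
# original RBF structure can be restored afterwards.
n, m = Phi.shape
Q = np.zeros((n, m))
R = np.zeros((m, m))
for j in range(m):
    v = Phi[:, j].copy()
    for i in range(j):
        R[i, j] = Q[:, i] @ Phi[:, j]
        v -= R[i, j] * Q[:, i]
    R[j, j] = np.linalg.norm(v)
    Q[:, j] = v / R[j, j]

# Weights in the orthonormal basis: plain projections, computed one per
# added hidden node without revisiting earlier weights.
g = Q.T @ y

# Restore weights for the original RBF hidden layer by solving R w = g.
w = np.linalg.solve(R, g)

# Both parameterizations realize the same network output.
print(np.allclose(Phi @ w, Q @ g))
```

The triangular system makes the restoration step cheap, which matches the abstract's claim that the original network structure can be recovered after training in the orthonormal domain.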
doi:10.1109/72.623218 pmid:18255719 fatcat:3lub6t7wuzhohg6q4vaqde7orm