Orthogonal least squares algorithm for the approximation of a map and its derivatives with a RBF network

C. Drioli, D. Rocchesso
Signal Processing, 2003
Radial Basis Function Networks (RBFNs) are used primarily to solve curve-fitting problems and for non-linear system modeling. Several algorithms are known for approximating a non-linear curve from a sparse data set by means of RBFNs. Regularization techniques make it possible to impose smoothness constraints on the curve by using the gradient of the function during training. However, procedures that allow the values of the derivatives at the data points to be set arbitrarily are rarely found in the literature. In this paper, the Orthogonal Least Squares (OLS) algorithm for the identification of RBFNs is modified to provide the approximation of a non-linear single-input single-output map along with its derivatives, given a set of training data. Interest in the derivatives of non-linear functions arises in many identification and control tasks where system stability and robustness are addressed. The effectiveness of the proposed algorithm is demonstrated with examples in the fields of data interpolation and control of non-linear dynamical systems.

… networks in function learning and interpolation [16]. Moreover, in many applications the approximation of the mapping alone will not suffice, and learning constraints on differential data becomes as important as learning input-output relations. For example, in the simulation of mechanical systems or plants by physical modeling, the physical knowledge is given in the form of partial differential equations (PDEs), or of constraints on differential data [12]. In dynamical system modeling and control tasks, the stability of the identified system depends on the gradient of the map [17, 6]; hence, by exploiting the derivatives, the modeling can be improved and desired dynamical behaviors can be stabilized. Despite the importance of learning differential data, the problem of efficiently approximating a non-linear function along with its derivatives is rarely addressed. Some theoretical results, as well as application examples that apply to generic feedforward neural networks, are found in [12, 2, 9]. More emphasis on the procedural aspects of differential-data learning is found in [15], where a back-propagation-based algorithm for multilayer neural networks is proposed, and in [19], where an RBFN with raised-cosine kernels that can fit up to first-order differential data is introduced.
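The key observation underlying this line of work is that an RBFN output and its input derivative are both linear in the output-layer weights, so derivative targets simply add rows to the same linear system. A minimal sketch with a Gaussian kernel (the kernel choice and parameter names here are illustrative, not taken from the paper):

```python
import numpy as np

# Gaussian kernel phi(x; c) = exp(-(x - c)^2 / (2 sigma^2)) and its
# derivative with respect to the input x. Both matrices multiply the
# same weight vector, so y(x) and dy/dx share one linear parametrization.
def gaussian_design(x, centers, sigma):
    r = x[:, None] - centers[None, :]
    return np.exp(-r**2 / (2.0 * sigma**2))

def gaussian_design_dx(x, centers, sigma):
    r = x[:, None] - centers[None, :]
    return -(r / sigma**2) * np.exp(-r**2 / (2.0 * sigma**2))

# Sanity check of the analytic derivative against a central finite difference
x = np.linspace(-1.0, 1.0, 7)
c = np.array([-0.5, 0.0, 0.5])
h = 1e-5
fd = (gaussian_design(x + h, c, 0.4) - gaussian_design(x - h, c, 0.4)) / (2 * h)
print(np.max(np.abs(fd - gaussian_design_dx(x, c, 0.4))))  # close to machine precision
```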
In this paper, an extended version of the OLS algorithm for the training of single-input single-output RBFNs is presented, which makes it possible to approximate an unknown function from a set of data points together with desired values of its higher-order derivatives. The paper is organized as follows: in Section 2, the OLS algorithm is reviewed and modified to add control over the first derivative of the function to be approximated. The extension to higher-order derivatives is introduced in Section 3. Application examples in the fields of function interpolation and non-linear dynamics are given in Section 4. Conclusions are presented in Section 5.
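To illustrate the kind of joint fit the paper targets, the sketch below stacks the function-value and derivative design matrices of a Gaussian RBFN into one system and solves it by ordinary least squares. This is a simplification: the paper's contribution is an orthogonal least squares procedure that also selects centers, whereas here the centers, the kernel, and all parameter values are assumed for illustration.

```python
import numpy as np

def rbf_and_deriv(x, centers, sigma):
    """Gaussian kernel matrix and its derivative w.r.t. the input x."""
    r = x[:, None] - centers[None, :]
    phi = np.exp(-r**2 / (2.0 * sigma**2))
    return phi, -(r / sigma**2) * phi

def fit_rbfn(x, y, dy, centers, sigma):
    """Weights fitting the value targets y and derivative targets dy jointly."""
    phi, dphi = rbf_and_deriv(x, centers, sigma)
    A = np.vstack([phi, dphi])          # stacked linear system: values + derivatives
    b = np.concatenate([y, dy])
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    return w

# Fit sin(x) while also constraining the derivative to cos(x)
x = np.linspace(0.0, 2 * np.pi, 20)
centers = np.linspace(0.0, 2 * np.pi, 10)
w = fit_rbfn(x, np.sin(x), np.cos(x), centers, sigma=0.7)

phi, dphi = rbf_and_deriv(x, centers, sigma=0.7)
print(np.max(np.abs(phi @ w - np.sin(x))))   # residual on function values
print(np.max(np.abs(dphi @ w - np.cos(x))))  # residual on derivative values
```

The same construction extends to higher-order derivatives by appending further blocks of rows, one per derivative order, which is the direction the paper develops in Section 3.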
doi:10.1016/s0165-1684(02)00397-3