Novel Four-Layer Neural Network and Its Incremental Learning Based on Randomly Mapped Features

YANG Yue, WANG Shitong
2021 Jisuanji kexue yu tansuo  
This paper proposes a four-layer neural network based on random feature mapping (FRMFNN) and its fast incremental learning algorithms. First, FRMFNN transforms the original input features into randomly mapped features by a random mapping algorithm and stores them in the nodes of its first hidden layer. Then, FRMFNN generates the nodes of its second hidden layer by applying a non-linear activation function to all the randomly mapped features. Finally, the second hidden layer is linked to the output layer through the output weights. Since the weights of the first and second hidden layers are randomly generated according to a continuous sampling probability distribution and are never updated, and the output weights can be solved quickly by ridge regression, the time-consuming training process of traditional back-propagation neural networks is avoided. When FRMFNN cannot reach the prescribed accuracy, its performance can be continuously improved by its fast incremental algorithms, thereby avoiding retraining of the whole network. This paper provides a detailed introduction to the proposed FRMFNN and its incremental algorithms, and also gives a proof of the universal approximation property of FRMFNN. Compared with the broad learning system (BLS) and incremental learning of the extreme learning machine (ELM), experimental results on several popular classification and regression datasets demonstrate the effectiveness of the proposed FRMFNN and its incremental learning algorithms.
doi:10.3778/j.issn.1673-9418.2005028 doaj:451e4f59dcb445238c248956e2d22eef fatcat:ya3jzvsp2fgf5gvr2b3ncpaeqa
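As a rough illustration of the training procedure described in the abstract, the sketch below builds the two fixed random hidden layers and then solves the output weights in closed form with ridge regression. It is a minimal sketch, not the paper's implementation: the node counts, the Gaussian sampling of the random weights, the tanh activation, and the function names are all illustrative assumptions.

```python
import numpy as np

def train_frmfnn(X, Y, n_feature_nodes=100, n_enhance_nodes=200, reg=1e-3, seed=0):
    """Hypothetical FRMFNN-style training sketch; names and defaults are illustrative."""
    rng = np.random.default_rng(seed)

    # First hidden layer: random feature mapping of the original inputs;
    # the weights are drawn from a continuous distribution and never updated.
    W1 = rng.standard_normal((X.shape[1], n_feature_nodes))
    b1 = rng.standard_normal(n_feature_nodes)
    Z = X @ W1 + b1  # randomly mapped features

    # Second hidden layer: non-linear activation applied to all the
    # randomly mapped features, again with fixed random weights.
    W2 = rng.standard_normal((n_feature_nodes, n_enhance_nodes))
    b2 = rng.standard_normal(n_enhance_nodes)
    H = np.tanh(Z @ W2 + b2)

    # Output weights: closed-form ridge regression from the second hidden
    # layer to the targets; no back-propagation is involved.
    W_out = np.linalg.solve(H.T @ H + reg * np.eye(n_enhance_nodes), H.T @ Y)
    return W1, b1, W2, b2, W_out

def predict_frmfnn(params, X):
    W1, b1, W2, b2, W_out = params
    Z = X @ W1 + b1
    H = np.tanh(Z @ W2 + b2)
    return H @ W_out
```

In the same spirit, the incremental algorithms mentioned in the abstract would presumably append new random nodes to the hidden layers and update the output weights with a pseudoinverse update rather than re-solving the full ridge regression, which is what allows the network to improve without retraining from scratch; that step is not sketched here.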