Evolving fuzzy neural networks for supervised/unsupervised online knowledge-based learning

N. Kasabov
2001, IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics)
The paper introduces evolving fuzzy neural networks (EFuNNs) as a means for implementing the evolving connectionist systems (ECOS) paradigm, which aims at building on-line, adaptive intelligent systems whose structure and functionality evolve in time. EFuNNs evolve their structure and parameter values through incremental, hybrid supervised/unsupervised, on-line learning. They can accommodate new input data, including new features, new classes, etc., through local element tuning. New connections and new neurons are created during the operation of the system. EFuNNs can learn spatial-temporal sequences in an adaptive way through one-pass learning, and automatically adapt their parameter values as they operate. Fuzzy or crisp rules can be inserted and extracted at any time of the EFuNN operation. The characteristics of EFuNNs are illustrated on several case-study data sets for time-series prediction and spoken-word classification, and their performance is compared with that of traditional connectionist methods and systems. The applicability of EFuNNs as general-purpose on-line learning machines is discussed with respect to systems that learn from large databases, life-long learning systems, and on-line adaptive systems in different areas of engineering.

Key words: evolving connectionist systems; evolving fuzzy neural networks; on-line learning; knowledge-based neural networks; sleep learning.

The EFuNN model presented in the paper has elements from all the groups above. The model is called evolving because of the nature of the structural growth and structural adaptation of the whole evolving connectionist system (ECOS) it is part of. In terms of on-line neuron allocation, the EFuNN model is similar to the Resource Allocating Network (RAN) suggested by Platt [61] and improved in other related models [21,66]. The RAN model allocates a new neuron for a new input example (x,y) if the input vector x is not close in the input space to any of the already allocated radial basis neurons (centres), and also if the output error (y - y'), where y' is the output produced by the system for the input vector x, is above an error threshold. Otherwise, centres are adapted to minimise the error for the example (x,y) through a gradient descent algorithm. In terms of adaptive optimisation of many individual linear units, EFuNN is close to the Receptive Field Weighted Regression (RFWR) model by Schaal and Atkeson [71].
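The RAN allocation rule described above (allocate a new radial-basis unit only when the input is novel and the output error is large; otherwise adapt existing weights) can be sketched as follows. This is a minimal illustrative implementation, not the code from the paper; the threshold values, Gaussian width, and learning rate are assumed for demonstration.

```python
import numpy as np

class ResourceAllocatingNetwork:
    """Minimal sketch of Platt-style resource allocation (RAN):
    a new radial-basis unit is added only when the input is far from
    every existing centre AND the prediction error exceeds a threshold;
    otherwise the output weights are nudged by gradient descent.
    All numeric parameters here are illustrative assumptions."""

    def __init__(self, dist_thr=0.5, err_thr=0.1, width=0.5, lr=0.05):
        self.centres = []        # RBF centres (input vectors)
        self.weights = []        # one scalar output weight per centre
        self.dist_thr = dist_thr
        self.err_thr = err_thr
        self.width = width
        self.lr = lr

    def _activations(self, x):
        return [np.exp(-np.linalg.norm(x - c) ** 2 / self.width ** 2)
                for c in self.centres]

    def predict(self, x):
        if not self.centres:
            return 0.0
        return float(np.dot(self._activations(x), self.weights))

    def learn_one(self, x, y):
        err = y - self.predict(x)
        dists = [np.linalg.norm(x - c) for c in self.centres]
        novel = (not dists) or min(dists) > self.dist_thr
        if novel and abs(err) > self.err_thr:
            # allocate a new unit centred on this example;
            # its weight absorbs the residual error
            self.centres.append(np.asarray(x, dtype=float))
            self.weights.append(err)
        elif self.centres:
            # adapt existing output weights to shrink the error
            for i, a in enumerate(self._activations(x)):
                self.weights[i] += self.lr * err * a
```

A novel, badly predicted example grows the network by one unit; an example close to an existing centre only adjusts weights, which is the behaviour the text contrasts EFuNN's local element tuning against.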
EFuNNs also have similarities with Fritzke's Growing Cell Structures and Growing Neural Gas models [19], with other dynamic radial basis function networks (RBFN) [58,5,78], and with the counter-propagation networks [27], in terms of separating the unsupervised learning, which is performed first, from the supervised learning, applied next, in a two-tier structure. Creating new nodes is a feature also of SCONN [12] and the VC network [84]. The EFuNN learning algorithm differs from the above in many aspects, mainly in the local element tuning, in the employment of simpler and faster learning modes, in greater flexibility when evolving internal structures and representations, and in the knowledge-based orientation. A comparative analysis between EFuNNs and other similar models on benchmark problems shows that while EFuNNs are comparable with the other methods in terms of the accuracy of the obtained results, they are much faster, more controllable, and evolve meaningful internal representations. EFuNNs suggest a new neuro-fuzzy systemic approach that employs more sophisticated supervised/unsupervised, knowledge-based learning methods. The functionality of EFuNNs can be fully utilised when EFuNNs are used as elements of an ECOS framework for adaptive, intelligent, knowledge-based systems.

The ECOS framework

Evolving connectionist systems (ECOS) are systems that evolve their structure and functionality over time through interaction with the environment (fig. 1) [35]. They have some ("genetically") pre-defined parameters (knowledge), but they also learn and adapt as they operate. They emerge, evolve, develop, and unfold through learning, changing their structure in order to better represent incoming data.

[ Figure 1 ]

A block diagram of the ECOS framework is given in fig. 2. ECOS are multi-level, multi-modular structures where many neural network modules (denoted as NNM) are connected with inter- and intra-connections.
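The two-tier scheme mentioned earlier in this section (an unsupervised first stage followed by a supervised second stage, as in counter-propagation networks) can be sketched in a minimal form: cluster the inputs first, then store one supervised output per cluster. This is an illustrative toy, not any of the cited models; the cluster count and iteration budget are assumptions.

```python
import numpy as np

def two_tier_fit(X, y, n_clusters=3, iters=20, seed=0):
    """Unsupervised stage: simple competitive clustering of the inputs.
    Supervised stage: each cluster memorises the mean target of the
    examples it wins. Parameters are illustrative choices."""
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), n_clusters, replace=False)].astype(float)
    for _ in range(iters):                       # unsupervised tier
        labels = np.argmin(
            np.linalg.norm(X[:, None] - centres[None], axis=2), axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centres[k] = X[labels == k].mean(axis=0)
    # supervised tier: average target per winning cluster
    outputs = np.array([y[labels == k].mean() if np.any(labels == k) else 0.0
                        for k in range(n_clusters)])
    return centres, outputs

def two_tier_predict(x, centres, outputs):
    # the winning (nearest) centre's stored output is the prediction
    return outputs[np.argmin(np.linalg.norm(centres - x, axis=1))]
```

The point of the sketch is the strict separation of the two stages, which the text contrasts with EFuNN's interleaved, one-pass supervised/unsupervised learning.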
[ Figure 2 ]

The main blocks of ECOS are described below.

Feature selection part. It performs filtering of the input information, feature extraction, and forming of the input vectors.

Representation (memory) part, where information (patterns) is stored. It is a multi-modular, evolving structure of NNMs organised in groups. This is the most important part of ECOS. One realisation of an NNM is the EFuNN, presented in the next section.

Higher-level decision part. It consists of modules that receive inputs from the representation part and also feedback from the environment.

Action part. These are modules that take input values from the decision part and pass output information to the environment.

Knowledge-based part. This part extracts compressed abstract information from the representation modules and from the decision modules in different forms of rules, abstract associations, etc. It requires that the NNMs operate in a knowledge-based learning mode and provide knowledge about the problem under consideration.

Adaptation part. This part uses statistical, evolutionary (e.g. genetic algorithms [23,79]), and other techniques to evaluate and optimise the parameters of the ECOS during its operation.

The ECOS operation principles correspond to the seven requirements for intelligent systems presented in section 1 [35]. They are also based on biological facts and principles (see, for example, [55,62,68,72,82]). Implementing the NNMs of the ECOS framework requires connectionist models that support these principles. One such model is the evolving fuzzy neural network (EFuNN).

3. The Evolving Fuzzy Neural Network (EFuNN) Model

3.1. General principles of EFuNNs

Fuzzy neural networks are connectionist structures that implement fuzzy rules and fuzzy inference [25,51,63,83,38]. FuNNs represent one class of them [38,33,39,40].
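The block structure described above (feature selection feeding a representation part of NNMs, whose activations drive a decision part and then an action part) can be sketched as a simple composition. The class and parameter names below are hypothetical scaffolding for illustration; the internals of each block are placeholder callables, not the paper's modules.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class ECOS:
    """Minimal sketch of the ECOS block diagram: data flows from
    feature selection through the NNM representation part to the
    decision and action parts. All fields are placeholder callables."""
    feature_selector: Callable   # filters raw input into a feature vector
    nn_modules: List[Callable]   # representation part: the evolving NNMs
    decision: Callable           # higher-level decision part
    action: Callable             # passes the result to the environment

    def step(self, raw_input):
        x = self.feature_selector(raw_input)
        activations = [m(x) for m in self.nn_modules]  # one per NNM
        verdict = self.decision(activations)
        return self.action(verdict)
```

A toy wiring, e.g. `ECOS(lambda r: [v * 2 for v in r], [sum, max], lambda a: a[0] + a[1], lambda v: v)`, makes the data flow concrete; in the framework proper each NNM slot would hold an EFuNN, and the knowledge-based and adaptation parts would observe and tune these blocks.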
The EFuNN model presented here is principally different from all the fuzzy neural network models introduced so far, despite some structural similarities. EFuNNs evolve according to the ECOS principles. A brief first introduction of EFuNNs was given in [36]. Here the EFuNN architecture and functionality are further developed, illustrated, and analysed in detail. EFuNNs have a five-layer structure, similar to the structure of FuNNs (fig. 3a), but here nodes and connections are created and connected as data examples are presented. An optional
doi:10.1109/3477.969494 pmid:18244856