Significant Feature Set Driven and Optimized FFN for Enhanced Classification

Asha Gowda Karegowda, M Jayaram
International Journal of Computational Intelligence and Informatics
Neural networks augmented with back propagation learning are one of the most extensively used data classification tools. In this paper, a novel classification scheme is elaborated. The method has two steps: in the first step, significant feature selection is made by using a decision tree and GA-CFS (genetic algorithm based correlation based feature selection); in the second step, the connection weights of a feed forward network (FFN) are optimized using particle swarm optimization (PSO) and GA. To validate the efficacy of the method, it was applied to four benchmark datasets, namely diabetes, iris, ionosphere and heart statlog. PSO showed the best classification accuracy, in the range of 86%-97% for all the datasets considered, when compared with BPN and GA based networks. The topology of the PSO optimized FFN was also modest, with a few neurons in the hidden layer.
FFN is an information-processing paradigm that is inspired by the way biological nervous systems, such as the brain, process information. It is composed of a large number of highly interconnected processing elements (neurons) working in unison to solve specific problems. Developing a neural network involves first training the network to carry out the desired computations. In supervised learning, the network is trained by providing it with input and matching output patterns (training data). A common learning technique is back propagation, which tries to minimize the average squared error between the network's output and the expected output (a minimal training sketch is given at the end of this section). Neural networks have been criticized for their poor interpretability, since it is difficult for humans to extract the symbolic meaning behind the learned weights. Advantages of neural networks, however, include their high tolerance to noisy data and their ability to classify patterns on which they have not been trained.

The most common method adopted for training of FFN is the back propagation method (BPN). In the BPN model, each node is connected to all nodes in the adjoining layer and each connection has an unbounded positive or negative weight associated with it. Back propagation learning works by making modifications in the weight values, starting at the output layer and then moving backward through the hidden layers of the network [14]. BPN uses the gradient approach, which either trains slowly or may get stuck in a local minimum [10, 23, 26, 27, 30, 36]. There are several variants and extensions of BP used for training neural networks: gradient descent with momentum, scaled conjugate gradient (SCG), resilient propagation (RPROP), BFGS quasi-Newton and Levenberg-Marquardt (LM) algorithms [12]. In addition, one may use Genetic Algorithms (GAs), Particle Swarm Optimization (PSO), the Artificial Bee Colony (ABC) optimization algorithm and the Ant Colony optimization algorithm for determining not only the connection weights, but also various parameters of the NN such as the number of hidden layers, the number of nodes in hidden layers, relevant feature subsets, the learning rate and the momentum (a PSO-based weight-optimization sketch is given at the end of this section).

This paper presents the application of two evolutionary algorithms, namely PSO and GA, for optimizing the network connection weights of FFN. Computational work has been carried out on UCI machine learning benchmark datasets. Section 2 elaborates on GA and on optimizing the connection weights of FFN using GA. The application of PSO for optimizing the connection weights of FFN is explained in Section 3. Section 4 describes the two filters, GA-correlation based feature selection (GA-CFS) and decision tree, used to identify significant inputs for the FFN. Computational results and conclusions are presented in Sections 5 and 6, respectively.

GA is a stochastic general search method, capable of effectively exploring the large search space that is usually required in the case of attribute selection (a GA feature-selection sketch is given below). Further, unlike many search algorithms that perform a local, greedy search, GA performs a global search.
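To make the training process described above concrete, the following is a minimal sketch, not the authors' implementation, of a single-hidden-layer FFN trained by plain gradient-descent back propagation on the average squared error; the sigmoid activation, layer sizes and learning rate are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SimpleFFN:
    """Single-hidden-layer feed forward network trained with plain
    back propagation on the average squared error (illustrative sketch)."""

    def __init__(self, n_in, n_hidden, n_out, lr=0.1, seed=0):
        rng = np.random.default_rng(seed)
        # small random initial connection weights and zero biases
        self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
        self.b1 = np.zeros(n_hidden)
        self.W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
        self.b2 = np.zeros(n_out)
        self.lr = lr

    def forward(self, X):
        self.h = sigmoid(X @ self.W1 + self.b1)       # hidden activations
        self.o = sigmoid(self.h @ self.W2 + self.b2)  # network outputs
        return self.o

    def train_step(self, X, y):
        o = self.forward(X)
        err = o - y                                   # output error
        # error signals, starting at the output layer and moving backward
        delta_o = err * o * (1 - o)
        delta_h = (delta_o @ self.W2.T) * self.h * (1 - self.h)
        # gradient-descent weight updates
        self.W2 -= self.lr * self.h.T @ delta_o / len(X)
        self.b2 -= self.lr * delta_o.mean(axis=0)
        self.W1 -= self.lr * X.T @ delta_h / len(X)
        self.b1 -= self.lr * delta_h.mean(axis=0)
        return np.mean(err ** 2)                      # average squared error
```

Repeated calls to `train_step` on the training patterns drive the average squared error down, which is exactly the criterion the text describes.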
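The sketch below shows one common way, assumed here rather than taken from the paper, of using PSO to optimize FFN connection weights: each particle encodes a flattened weight vector, and its fitness is the network's error on the training data. The swarm size, inertia weight and acceleration coefficients are standard textbook defaults, and `fitness` is a caller-supplied function (for example one that loads the vector into the `SimpleFFN` above and returns its mean squared error or misclassification rate).

```python
import numpy as np

def pso_optimize_weights(fitness, dim, n_particles=30, iters=200,
                         w=0.729, c1=1.49445, c2=1.49445, bound=1.0, seed=0):
    """Generic PSO over a weight vector of length `dim`.
    `fitness(vec)` returns the error of the FFN whose connection
    weights are encoded by `vec` (lower is better)."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-bound, bound, size=(n_particles, dim))  # particle positions
    vel = np.zeros_like(pos)                                   # particle velocities
    pbest = pos.copy()                                         # personal bests
    pbest_val = np.array([fitness(p) for p in pos])
    gbest = pbest[pbest_val.argmin()].copy()                   # global best
    gbest_val = pbest_val.min()

    for _ in range(iters):
        r1 = rng.random((n_particles, dim))
        r2 = rng.random((n_particles, dim))
        # velocity update: inertia + cognitive + social components
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([fitness(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if vals.min() < gbest_val:
            gbest_val = vals.min()
            gbest = pos[vals.argmin()].copy()
    return gbest, gbest_val
```

A caller would decode the returned vector into the network's weight matrices and biases (e.g. the `W1`, `b1`, `W2`, `b2` of the earlier sketch) before using the network for classification.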
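Finally, a minimal sketch of GA-driven attribute selection in the spirit of the GA-CFS filter mentioned above, not the exact procedure of the paper: individuals are binary feature masks, and `merit` stands in for a correlation-based subset evaluator (such as the CFS merit) supplied by the caller; the population size, crossover and mutation rates are illustrative.

```python
import numpy as np

def ga_feature_selection(merit, n_features, pop_size=40, generations=100,
                         cx_rate=0.8, mut_rate=0.02, seed=0):
    """Evolve binary feature masks; `merit(mask)` scores a feature
    subset (higher is better), e.g. a CFS-style correlation merit."""
    rng = np.random.default_rng(seed)
    pop = rng.integers(0, 2, size=(pop_size, n_features))      # random initial masks

    def tournament(scores):
        # binary tournament selection
        i, j = rng.integers(0, pop_size, size=2)
        return pop[i] if scores[i] >= scores[j] else pop[j]

    for _ in range(generations):
        scores = np.array([merit(ind) for ind in pop])
        new_pop = [pop[scores.argmax()].copy()]                 # elitism: keep the best mask
        while len(new_pop) < pop_size:
            p1, p2 = tournament(scores), tournament(scores)
            if rng.random() < cx_rate:                          # one-point crossover
                cut = rng.integers(1, n_features)
                child = np.concatenate([p1[:cut], p2[cut:]])
            else:
                child = p1.copy()
            flip = rng.random(n_features) < mut_rate            # bit-flip mutation
            child = np.where(flip, 1 - child, child)
            new_pop.append(child)
        pop = np.array(new_pop)

    scores = np.array([merit(ind) for ind in pop])
    return pop[scores.argmax()]                                 # best feature mask found
```

The selected mask would then determine which attributes are fed to the FFN, mirroring the two-step scheme summarized in the abstract.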