Evolutionary Induction of Sparse Neural Trees

Byoung-Tak Zhang, Peter Ohm, Heinz Mühlenbein
Evolutionary Computation, 1997
This paper is concerned with the automatic induction of parsimonious neural networks. In contrast to other program induction situations, network induction entails parametric learning as well as structural adaptation. We present a novel representation scheme called neural trees that allows efficient learning of both network architectures and parameters by genetic search. A hybrid evolutionary method is developed for neural tree induction that combines genetic programming and the breeder genetic algorithm under the unified framework of the minimum description length principle. The method is successfully applied to the induction of higher order neural trees while still keeping the resulting structures sparse to ensure good generalization performance. Empirical results are provided on two chaotic time series prediction problems of practical interest.

Keywords: program induction, genetic programming, higher order neural networks, neural tree representation, minimum description length principle, time series prediction, breeder genetic algorithm.

© 1997 by the Massachusetts Institute of Technology. Evolutionary Computation 5(2): 213-236
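The abstract's core idea can be illustrated with a minimal sketch (not the paper's actual code): a neural tree whose internal nodes are sigmoidal units with weighted links to their subtrees and whose leaves are input variables, scored by an MDL-style fitness that adds a structural complexity penalty to the prediction error. The class names, the `tanh` activation, and the trade-off constant `alpha` are illustrative assumptions, not details taken from the paper.

```python
import math

class NeuralTree:
    """Sketch of a neural tree: internal nodes are weighted neuron units,
    leaves pass through one input variable. Names are illustrative."""

    def __init__(self, weights=None, children=None, input_index=None):
        self.weights = weights or []      # one weight per child subtree
        self.children = children or []    # subtrees (empty at a leaf)
        self.input_index = input_index    # set only at leaf nodes

    def size(self):
        """Node count: a simple proxy for structural complexity."""
        return 1 + sum(c.size() for c in self.children)

    def evaluate(self, x):
        if self.input_index is not None:  # leaf: return the input value
            return x[self.input_index]
        net = sum(w * c.evaluate(x)       # weighted sum over subtrees
                  for w, c in zip(self.weights, self.children))
        return math.tanh(net)             # sigmoidal unit

def mdl_fitness(tree, data, alpha=0.01):
    """MDL-style fitness: mean squared error plus a complexity penalty.
    `alpha` is an assumed trade-off constant, not the paper's value."""
    err = sum((y - tree.evaluate(x)) ** 2 for x, y in data) / len(data)
    return err + alpha * tree.size()

# Toy usage: a two-input tree scored on three data points.
leaves = [NeuralTree(input_index=0), NeuralTree(input_index=1)]
tree = NeuralTree(weights=[1.0, -1.0], children=leaves)
data = [((0.0, 0.0), 0.0), ((1.0, 0.0), 0.8), ((0.0, 1.0), -0.8)]
print(round(mdl_fitness(tree, data, alpha=0.01), 4))
```

In a genetic search, variation operators would mutate weights and subtrees, and selection would favor trees with lower `mdl_fitness`, so the complexity term steers the population toward the sparse structures the abstract emphasizes.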
doi:10.1162/evco.1997.5.2.213 pmid:10021759