Evolutionary Programming of Near-Optimal Neural Networks

D. Lock, C. Giraud-Carrier, Evolutionary Programming of Near-Optimal Neural Networks. Proceedings of the Fourth International Conference on Artificial Neural Networks and Genetic Algorithms (ICANNGA99). ISBN 3-211-83364-1, pp. 302–306. April 1999. No electronic version available.

Abstract

A genetic algorithm (GA) method is presented that evolves both the topology and the training parameters of backpropagation-trained, fully-connected, feed-forward neural networks. The GA uses a weak encoding scheme with real-valued alleles. One contribution of the proposed approach is to replace the necessary but potentially slow evolution of final weights with the more efficient evolution of a single weight-spread parameter used only to set the initial weights. In addition, the co-evolution of an input mask effects a form of automatic feature selection. Preliminary experiments suggest that the resulting system is able to produce networks that perform well under backpropagation.
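To make the kind of encoding described above concrete, the following minimal Python sketch shows one way such a genome could look: a hidden-layer size and learning rate (topology and training parameters), a single weight-spread value used only to draw the initial weights, and a binary input mask for feature selection. The names, parameter ranges, single-hidden-layer assumption, and mutation operator are illustrative assumptions, not the paper's actual encoding.

import random
from dataclasses import dataclass

@dataclass
class Genome:
    hidden_units: int      # topology: size of a single hidden layer (assumption)
    learning_rate: float   # backpropagation training parameter
    weight_spread: float   # initial weights drawn from [-spread, +spread]
    input_mask: list       # 1 = input feature used, 0 = feature ignored

def random_genome(n_inputs: int) -> Genome:
    # Illustrative ranges only.
    return Genome(
        hidden_units=random.randint(2, 20),
        learning_rate=random.uniform(0.01, 1.0),
        weight_spread=random.uniform(0.1, 5.0),
        input_mask=[random.randint(0, 1) for _ in range(n_inputs)],
    )

def initial_weights(g: Genome, n_outputs: int):
    # Decode the genome into initial weight matrices for backpropagation.
    # Only the starting point is evolved; the final weights come from
    # ordinary backpropagation training.
    n_in = sum(g.input_mask)  # masked-out inputs are simply dropped
    draw = lambda: random.uniform(-g.weight_spread, g.weight_spread)
    w_hidden = [[draw() for _ in range(n_in)] for _ in range(g.hidden_units)]
    w_output = [[draw() for _ in range(g.hidden_units)] for _ in range(n_outputs)]
    return w_hidden, w_output

def mutate(g: Genome, sigma: float = 0.1) -> Genome:
    # Gaussian perturbation of the real-valued alleles, bit flips on the mask.
    return Genome(
        hidden_units=max(1, g.hidden_units + random.choice([-1, 0, 1])),
        learning_rate=max(1e-4, g.learning_rate + random.gauss(0, sigma)),
        weight_spread=max(1e-3, g.weight_spread + random.gauss(0, sigma)),
        input_mask=[1 - b if random.random() < 0.05 else b for b in g.input_mask],
    )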
