A genetic algorithm (GA) method is presented that evolves both the topology and the training parameters of backpropagation-trained, fully connected, feed-forward neural networks. The GA uses a weak encoding scheme with real-valued alleles. One contribution of the approach is to replace the necessary but potentially slow evolution of final weights with the more efficient evolution of a single weight-spread parameter, which is used only to set the initial weights. In addition, the co-evolution of an input mask effects a form of automatic feature selection. Preliminary experiments suggest that the resulting system produces networks that perform well under backpropagation.
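The genome described above might be sketched as follows. This is a hypothetical illustration, not the paper's actual encoding: all field names, allele ranges, and the uniform initial-weight distribution are assumptions chosen only to make the roles of the weight-spread allele and the input mask concrete.

```python
import random
from dataclasses import dataclass

# Hypothetical sketch of a weak, real-valued encoding: each genome carries
# the topology, backprop training parameters, a single weight-spread allele,
# and an input mask. All names and ranges here are illustrative assumptions.

@dataclass
class Genome:
    hidden_sizes: list     # topology: sizes of the hidden layers
    learning_rate: float   # backprop training parameter
    momentum: float        # backprop training parameter
    weight_spread: float   # sets the range of the INITIAL weights only;
                           # final weights are still learned by backprop
    input_mask: list       # one flag per input feature (feature selection)

def random_genome(n_inputs, rng):
    """Sample a random genome with real-valued alleles (ranges assumed)."""
    return Genome(
        hidden_sizes=[rng.randint(1, 16) for _ in range(rng.randint(1, 3))],
        learning_rate=rng.uniform(0.01, 1.0),
        momentum=rng.uniform(0.0, 0.9),
        weight_spread=rng.uniform(0.1, 2.0),
        input_mask=[rng.random() < 0.8 for _ in range(n_inputs)],
    )

def initial_weights(genome, n_in, n_out, rng):
    """Draw initial weights uniformly in [-spread, +spread]; backprop then
    trains them, so the GA never has to evolve the final weight values."""
    s = genome.weight_spread
    return [[rng.uniform(-s, s) for _ in range(n_in)] for _ in range(n_out)]

rng = random.Random(0)
g = random_genome(n_inputs=8, rng=rng)
n_selected = sum(g.input_mask)  # features actually fed to the network
W = initial_weights(g, n_in=n_selected, n_out=g.hidden_sizes[0], rng=rng)
```

Because only the spread allele is evolved, fitness evaluation (a backpropagation run from those initial weights) remains the expensive step, while the genome itself stays short.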