This paper shows how meta-learning can be applied to decisions about pruning and representation adequacy. The technique uses unpruned decision trees and information gathered during decision-tree construction to characterize learning tasks. Based on previous experience, a meta-learner then decides which strategy is most appropriate for a new learning problem. The technique is applied to both simplicity and representation issues. On the simplicity side, it is used to decide when to prune, how much pruning is appropriate, and which pruning technique is best for a given learning task. In constructive induction, it is used to choose among a pool of alternative new-attribute constructors. Results suggest that inducing the connection between problems and simplicity or representation biases improves learning performance.
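The core idea above can be sketched in code. The following is a minimal illustration, not the paper's implementation: it assumes scikit-learn as the base learner, uses a few plausible meta-features drawn from an unpruned tree (depth, leaf count, task size), and invents hypothetical past-experience labels (`"none"` vs. `"ccp"`) purely for demonstration.

```python
# Hedged sketch of the meta-learning setup: describe each task by
# statistics of its unpruned decision tree, then train a meta-learner
# on past tasks to recommend a pruning strategy for a new task.
# Meta-features, strategy labels, and data here are all illustrative.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def meta_features(X, y):
    """Characterize a learning task via its unpruned decision tree."""
    tree = DecisionTreeClassifier(random_state=0).fit(X, y)
    return [
        tree.get_depth(),     # depth of the unpruned tree
        tree.get_n_leaves(),  # number of leaves
        X.shape[0],           # training-set size
        X.shape[1],           # number of attributes
    ]

# Hypothetical previous experience: meta-features of earlier tasks
# paired with the pruning strategy that worked best on each.
rng = np.random.default_rng(0)
past_tasks = [(rng.normal(size=(50, 4)), rng.integers(0, 2, 50))
              for _ in range(6)]
best_strategy = ["none", "ccp", "none", "ccp", "ccp", "none"]

meta_X = [meta_features(X, y) for X, y in past_tasks]
meta_learner = DecisionTreeClassifier(random_state=0).fit(meta_X, best_strategy)

# For a new learning problem, the meta-learner recommends a strategy.
X_new, y_new = rng.normal(size=(80, 4)), rng.integers(0, 2, 80)
recommendation = meta_learner.predict([meta_features(X_new, y_new)])[0]
print(recommendation)
```

The same scheme extends to the constructive-induction case by replacing the strategy labels with identifiers of candidate attribute constructors.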