Automated feature selection in neuroevolution

Maxine Tan, Michael Hartley, Michel Bister, Rudi Deklerck

Research output: Contribution to journal › Article › Research › peer-review

19 Citations (Scopus)


Feature selection is a task of great importance. Many feature selection methods have been proposed; they can be divided broadly into two groups according to their dependence on the learning algorithm/classifier. Recently, Whiteson et al. proposed Feature Selective NeuroEvolution of Augmenting Topologies (FS-NEAT), a method that selects features at the same time as it evolves the neural networks that use those features as inputs. In this paper, a novel feature selection method called Feature Deselective NeuroEvolution of Augmenting Topologies (FD-NEAT) is presented. FD-NEAT begins with fully connected inputs in its networks and drops irrelevant or redundant inputs as evolution progresses. Herein, the performance of FD-NEAT, FS-NEAT and traditional NEAT is compared on several mathematical problems and in a challenging race car simulator domain (RARS). On the whole, the results show that FD-NEAT significantly outperforms FS-NEAT in terms of network performance and feature selection, and evolves networks that offer the best compromise between network size and performance.
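The deselection idea summarized above can be illustrated with a toy evolutionary loop. This is not the authors' FD-NEAT implementation: real FD-NEAT evolves NEAT network topologies, whereas this sketch drops inputs of a fixed linear model; the data set, parsimony bonus, and all parameter values are illustrative assumptions.

```python
import random

random.seed(0)

N_FEATURES = 6
# Assumed ground truth: only features 0 and 1 are relevant; the rest are noise.
TRUE_W = [2.0, -3.0] + [0.0] * (N_FEATURES - 2)

def make_data(n=200):
    xs = [[random.gauss(0, 1) for _ in range(N_FEATURES)] for _ in range(n)]
    ys = [sum(w * x for w, x in zip(TRUE_W, row)) for row in xs]
    return xs, ys

def fitness(genome, xs, ys):
    """Negative mean squared error, plus a small bonus for each dropped input
    (a stand-in for FD-NEAT's pressure toward compact networks)."""
    mask, weights = genome
    mse = sum(
        (sum(m * w * x for m, w, x in zip(mask, weights, row)) - y) ** 2
        for row, y in zip(xs, ys)
    ) / len(xs)
    return -mse - 0.01 * sum(mask)

def mutate(genome):
    mask, weights = list(genome[0]), list(genome[1])
    i = random.randrange(N_FEATURES)
    if random.random() < 0.3:
        mask[i] = 1 - mask[i]               # drop (or restore) an input connection
    else:
        weights[i] += random.gauss(0, 0.3)  # perturb one connection weight
    return mask, weights

def evolve(generations=300, pop_size=30):
    xs, ys = make_data()
    # Like FD-NEAT, every genome starts with all inputs connected.
    pop = [([1] * N_FEATURES,
            [random.gauss(0, 1) for _ in range(N_FEATURES)])
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda g: fitness(g, xs, ys), reverse=True)
        elite = pop[: pop_size // 2]        # truncation selection with elitism
        pop = elite + [mutate(random.choice(elite)) for _ in elite]
    return max(pop, key=lambda g: fitness(g, xs, ys))

best_mask, best_weights = evolve()
print("retained inputs:", [i for i, m in enumerate(best_mask) if m])
```

Under these assumptions, the surviving masks tend to retain the two relevant inputs and shed most of the noise inputs, mirroring the paper's finding that deselection yields compact networks without sacrificing performance.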

Original language: English
Pages (from-to): 271-292
Number of pages: 22
Journal: Evolutionary Intelligence
Issue number: 4
Publication status: Published - 2009
Externally published: Yes


  • Genetic algorithms
  • Evolution
  • Learning
  • Neural networks
