TY - JOUR
T1 - Automated feature selection in neuroevolution
AU - Tan, Maxine
AU - Hartley, Michael
AU - Bister, Michel
AU - Deklerck, Rudi
N1 - Copyright:
Copyright 2014 Elsevier B.V., All rights reserved.
PY - 2009
Y1 - 2009
AB - Feature selection is a task of great importance. Many feature selection methods have been proposed; they can broadly be divided into two groups according to their dependence on the learning algorithm/classifier. Recently, Whiteson et al. proposed Feature Selective NeuroEvolution of Augmenting Topologies (FS-NEAT), a feature selection method that selects features at the same time as it evolves the neural networks that use those features as inputs. In this paper, a novel feature selection method called Feature Deselective NeuroEvolution of Augmenting Topologies (FD-NEAT) is presented. FD-NEAT begins with fully connected inputs in its networks and drops irrelevant or redundant inputs as evolution progresses. Herein, the performances of FD-NEAT, FS-NEAT and traditional NEAT are compared on several mathematical problems and in a challenging race car simulator domain (RARS). On the whole, the results show that FD-NEAT significantly outperforms FS-NEAT in terms of network performance and feature selection, and evolves networks that offer the best compromise between network size and performance.
KW - Genetic algorithms
KW - Evolution
KW - Learning
KW - Neural networks
UR - http://www.scopus.com/inward/record.url?scp=84865033087&partnerID=8YFLogxK
DO - 10.1007/s12065-009-0018-z
M3 - Article
AN - SCOPUS:84865033087
VL - 1
SP - 271
EP - 292
JO - Evolutionary Intelligence
JF - Evolutionary Intelligence
SN - 1864-5909
IS - 4
ER -