Abstract
The alternating decision tree (ADTree) is a special decision tree representation that brings interpretability to boosting, a well-established ensemble algorithm, and it has found success in a wide range of applications. However, existing ADTree variants implement only univariate decision nodes, so potential interactions between features are ignored; to date, there has been no multivariate ADTree. We propose a sparse multivariate ADTree that remains comprehensible. The proposed sparse ADTree is empirically evaluated on UCI datasets as well as spectral datasets from the University of Eastern Finland (UEF). We show that sparse ADTree is competitive against both univariate decision trees (the original ADTree, C4.5, and CART) and multivariate decision trees (Fisher's decision tree and a single multivariate decision tree from oblique Random Forest). It achieves the best average rank in prediction accuracy, ranks second in decision tree size, and has a faster induction time than the existing ADTree. In addition, it performs especially well on datasets with correlated features, such as the UEF spectral datasets. The proposed sparse ADTree therefore extends the applicability of ADTree to a wider variety of applications.
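To give a concrete sense of the idea described above, the sketch below shows a hypothetical sparse multivariate decision node: an L1-regularised linear split whose few non-zero weights keep the test readable, in contrast to a univariate threshold on a single feature. This is an illustrative assumption only; the paper's actual sparsity-inducing formulation for ADTree decision nodes is not reproduced here, and the function and parameter names are invented for the example.

```python
# Illustrative sketch (not the paper's method): a sparse multivariate split
# obtained with an L1-penalised linear model, so only a few features enter
# the node's condition w.x + b > 0.
import numpy as np
from sklearn.linear_model import LogisticRegression

def fit_sparse_split(X, y, C=0.1):
    """Fit a sparse linear condition on the samples reaching a node."""
    clf = LogisticRegression(penalty="l1", solver="liblinear", C=C)
    clf.fit(X, y)
    w = clf.coef_.ravel()
    b = float(clf.intercept_[0])
    used = np.flatnonzero(w)  # indices of features kept in the split
    return w, b, used

# Toy usage: only two of ten features are actually relevant,
# and the sparse split should retain roughly those two.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))
y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(int)
w, b, used = fit_sparse_split(X, y)
print("features kept in the split:", used)
```

A node of this form stays interpretable because the condition involves only the handful of features with non-zero weights, which is the comprehensibility argument the abstract makes for a sparse, rather than fully dense, multivariate ADTree.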
| Original language | English |
|---|---|
| Pages (from-to) | 57-64 |
| Number of pages | 8 |
| Journal | Pattern Recognition Letters |
| Volume | 60-61 |
| Issue number | August 2015 |
| DOIs | |
| Publication status | Published - 2015 |