Cross Split Decision Trees for pattern classification

Zahra Mirzamomen, Mohammad Navid Fekri, Mohammadreza Kangavari

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research

3 Citations (Scopus)


One of the most important problems of decision trees is instability: small changes in the training data can produce different trees and different predictions. In this paper we introduce the Cross Split Decision Tree (CSDT), a new decision tree learning algorithm with improved stability. Unlike classical decision tree learning algorithms, which test a single attribute at each internal node, the new algorithm uses multiple attributes as the split test. We employ a heuristic based on the Hoeffding bound to select the best attributes at the internal nodes. The experimental results show that, compared with the well-known C4.5 decision tree learning algorithm, the proposed algorithm builds shallower decision trees with comparable accuracy.
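The abstract does not spell out the selection heuristic, but the general idea of a Hoeffding-bound test is standard: with probability 1 − δ, the true mean of a variable with range R lies within ε = √(R² ln(1/δ) / 2n) of its mean over n observations. A minimal sketch of how such a bound could pick the set of near-best split attributes (the function names and the attribute-selection rule here are illustrative assumptions, not the paper's exact algorithm):

```python
import math

def hoeffding_bound(value_range: float, delta: float, n: int) -> float:
    """With probability 1 - delta, the true mean of a random variable with
    range `value_range` lies within this epsilon of its mean over n samples."""
    return math.sqrt((value_range ** 2) * math.log(1.0 / delta) / (2.0 * n))

def select_split_attributes(gains: dict, n: int, delta: float = 0.05,
                            value_range: float = 1.0) -> list:
    """Illustrative selection rule: keep the best-scoring attribute plus any
    attribute whose split score is within the Hoeffding bound of it, i.e.
    candidates statistically indistinguishable from the best at confidence
    1 - delta. These could then jointly form a multi-attribute split."""
    eps = hoeffding_bound(value_range, delta, n)
    best = max(gains.values())
    return [attr for attr, g in gains.items() if best - g < eps]

# Example: with 100 samples, attributes "a" and "b" are too close to
# distinguish, while "c" is clearly worse and is dropped.
selected = select_split_attributes({"a": 0.90, "b": 0.88, "c": 0.40}, n=100)
```

As n grows, ε shrinks, so the set of indistinguishable attributes narrows toward the single best one.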

Original language: English
Title of host publication: Proceedings of the 5th International Conference on Computer and Knowledge Engineering (ICCKE 2015)
Editors: Abbas Rasoolzadegan
Place of publication: Piscataway NJ USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 6
ISBN (Electronic): 9781467392808
Publication status: Published - 2015
Externally published: Yes
Event: International Conference on Computer and Knowledge Engineering 2015 - Mashhad, Iran
Duration: 29 Oct 2015 - 30 Oct 2015
Conference number: 5th


Conference: International Conference on Computer and Knowledge Engineering 2015
Abbreviated title: ICCKE 2015


Keywords:
  • Decision Tree
  • Hoeffding Bound
  • Stability
