Alternating decision trees

Melanie Po Leen Ooi, Hong Kuan Sok, Ye Chow Kuang, Serge Demidenko

    Research output: Chapter in Book/Report/Conference proceeding › Chapter (Book) › Other › peer-review

    2 Citations (Scopus)


    The Alternating Decision Tree (ADTree) is a special class of classification models. It is a generalization of classical decision trees, voted decision trees, and voted decision stumps, and it allows any boosting implementation to serve as the learning mechanism that extracts the ADTree model from the data. Boosting is a modern computational statistical tool for improving overall classification performance, and the ADTree is an attractive extension of boosting to the decision-tree setting: different boosting techniques can be adapted to design ADTree models with distinctive characteristics for a wide range of applications. Despite these attractive characteristics, the ADTree model has not received as much attention as other types of decision trees. This chapter presents a detailed review and discussion of some of the most powerful ADTree variants, together with their recent applications and future perspectives.
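    The abstract describes boosting over simple weak learners as the mechanism behind ADTree learning. As a minimal sketch of that idea (not the chapter's own implementation), the following uses scikit-learn's AdaBoostClassifier with depth-1 trees, i.e. decision stumps — the kind of weak rules an ADTree organizes into a single alternating tree rather than a flat voted ensemble. The dataset here is synthetic and purely illustrative.

    ```python
    from sklearn.datasets import make_classification
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.tree import DecisionTreeClassifier

    # Synthetic binary classification problem (illustrative only).
    X, y = make_classification(n_samples=200, n_features=5, random_state=0)

    # AdaBoost over decision stumps: each boosting iteration fits one
    # weak classifier to a reweighted sample; an ADTree would arrange
    # such stump-like decision rules into one tree structure instead
    # of keeping them as a flat weighted vote.
    model = AdaBoostClassifier(
        estimator=DecisionTreeClassifier(max_depth=1),
        n_estimators=50,
        random_state=0,
    )
    model.fit(X, y)
    print("training accuracy:", model.score(X, y))
    ```

    Each of the 50 fitted stumps corresponds to one boosting iteration; the weight-distribution update and the weak-classifier selection are the two ingredients the chapter's keyword list points to.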

    Original language: English
    Title of host publication: Handbook of Neural Computation
    Editors: Pijush Samui, Sanjiban Sekhar Roy, Valentina E. Balas
    Place of publication: London, UK
    Publisher: Academic Press
    Number of pages: 27
    ISBN (Electronic): 9780128113196
    ISBN (Print): 9780128113189
    Publication status: Published - 1 Jan 2017


    • AdaBoost
    • ADTree
    • Boosting iteration
    • LogitBoost
    • Optimal score vector
    • Penalization function
    • Precondition set
    • Root decision rule
    • Weak classifier
    • Weight distribution