SODE

Self-adaptive one-dependence estimators for classification

Jia Wu, Shirui Pan, Xingquan Zhu, Peng Zhang, Chengqi Zhang

Research output: Contribution to journal › Article › Research › peer-review

Abstract

SuperParent-One-Dependence Estimators (SPODEs) represent a family of semi-naive Bayesian classifiers which relax the attribute independence assumption of Naive Bayes (NB) to allow each attribute to depend on a common single attribute (superparent). SPODEs can effectively handle data with attribute dependency but still inherit NB's key advantages, such as computational efficiency and robustness for high-dimensional data. In reality, determining an optimal superparent for SPODEs is difficult. One common approach is to use weighted combinations of multiple SPODEs, each having a different superparent with a properly assigned weight value (i.e., a weight value is assigned to each attribute). In this paper, we propose a self-adaptive SPODE, namely SODE, which uses immunity theory from artificial immune systems to automatically and self-adaptively select the weight for each single SPODE. SODE needs to know neither the importance of individual SPODEs nor the relevance among SPODEs, and can flexibly and efficiently search for optimal weight values for each SPODE during the learning process. Extensive experiments and comparisons on 56 benchmark data sets, and validations on image and text classification, demonstrate that SODE outperforms state-of-the-art weighted SPODE algorithms and is suitable for a wide range of learning tasks. Results also confirm that SODE provides an appropriate balance between runtime efficiency and accuracy.
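For context, the weighted-SPODE ensemble the abstract describes scores a class c for an instance x as the weighted sum over superparents i of P(c, x_i) · Π_{j≠i} P(x_j | c, x_i). The sketch below is an illustrative implementation of that decision rule over discrete attributes, with the per-SPODE weights supplied by the caller; it does not reproduce SODE's immune-inspired weight search, and all function names here are our own, not from the paper.

```python
import numpy as np

def fit_counts(X, y, n_classes, n_vals):
    """Build count tables for a weighted-SPODE ensemble over discrete data.

    joint[c, i, v]      = #{instances of class c with attribute i = v}
    cond[c, i, v, j, u] = #{instances of class c with attr i = v and attr j = u}
    """
    A = X.shape[1]
    joint = np.zeros((n_classes, A, n_vals))
    cond = np.zeros((n_classes, A, n_vals, A, n_vals))
    for x, c in zip(X, y):
        for i in range(A):
            joint[c, i, x[i]] += 1
            for j in range(A):
                cond[c, i, x[i], j, x[j]] += 1
    return joint, cond

def predict_proba(x, joint, cond, weights, n, alpha=1.0):
    """Score each class c as sum_i w_i * P(c, x_i) * prod_{j != i} P(x_j | c, x_i),
    using Laplace smoothing alpha, then normalise to a distribution."""
    C, A, V = joint.shape
    scores = np.zeros(C)
    for c in range(C):
        for i in range(A):  # the SPODE with attribute i as superparent
            spode = (joint[c, i, x[i]] + alpha) / (n + alpha * C * V)
            for j in range(A):
                if j != i:
                    spode *= (cond[c, i, x[i], j, x[j]] + alpha) / \
                             (joint[c, i, x[i]] + alpha * V)
            scores[c] += weights[i] * spode
    return scores / scores.sum()
```

In SODE itself the weights would be tuned by the artificial-immune search rather than fixed by the caller; with uniform weights w_i = 1/A the rule reduces to essentially the familiar AODE averaging scheme.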

Original language: English
Pages (from-to): 358-377
Number of pages: 20
Journal: Pattern Recognition
Volume: 51
DOIs: 10.1016/j.patcog.2015.08.023
Publication status: Published - Mar 2016
Externally published: Yes

Keywords

  • Artificial immune systems
  • Attribute weighting
  • Classification
  • Evolutionary machine learning
  • Naive Bayes
  • Self-adaptive

Cite this

Wu, Jia; Pan, Shirui; Zhu, Xingquan; Zhang, Peng; Zhang, Chengqi. / SODE: Self-adaptive one-dependence estimators for classification. In: Pattern Recognition. 2016; Vol. 51, pp. 358-377.
@article{adf6cd5f206846b5a8dd8b180cec86b0,
title = "SODE: Self-adaptive one-dependence estimators for classification",
abstract = "SuperParent-One-Dependence Estimators (SPODEs) represent a family of semi-naive Bayesian classifiers which relax the attribute independence assumption of Naive Bayes (NB) to allow each attribute to depend on a common single attribute (superparent). SPODEs can effectively handle data with attribute dependency but still inherit NB's key advantages, such as computational efficiency and robustness for high-dimensional data. In reality, determining an optimal superparent for SPODEs is difficult. One common approach is to use weighted combinations of multiple SPODEs, each having a different superparent with a properly assigned weight value (i.e., a weight value is assigned to each attribute). In this paper, we propose a self-adaptive SPODE, namely SODE, which uses immunity theory from artificial immune systems to automatically and self-adaptively select the weight for each single SPODE. SODE needs to know neither the importance of individual SPODEs nor the relevance among SPODEs, and can flexibly and efficiently search for optimal weight values for each SPODE during the learning process. Extensive experiments and comparisons on 56 benchmark data sets, and validations on image and text classification, demonstrate that SODE outperforms state-of-the-art weighted SPODE algorithms and is suitable for a wide range of learning tasks. Results also confirm that SODE provides an appropriate balance between runtime efficiency and accuracy.",
keywords = "Artificial immune systems, Attribute weighting, Classification, Evolutionary machine learning, Naive Bayes, Self-adaptive",
author = "Jia Wu and Shirui Pan and Xingquan Zhu and Peng Zhang and Chengqi Zhang",
year = "2016",
month = "3",
doi = "10.1016/j.patcog.2015.08.023",
language = "English",
volume = "51",
pages = "358--377",
journal = "Pattern Recognition",
issn = "0031-3203",
publisher = "Elsevier",

}


TY - JOUR

T1 - SODE

T2 - Self-adaptive one-dependence estimators for classification

AU - Wu, Jia

AU - Pan, Shirui

AU - Zhu, Xingquan

AU - Zhang, Peng

AU - Zhang, Chengqi

PY - 2016/3

Y1 - 2016/3

N2 - SuperParent-One-Dependence Estimators (SPODEs) represent a family of semi-naive Bayesian classifiers which relax the attribute independence assumption of Naive Bayes (NB) to allow each attribute to depend on a common single attribute (superparent). SPODEs can effectively handle data with attribute dependency but still inherit NB's key advantages, such as computational efficiency and robustness for high-dimensional data. In reality, determining an optimal superparent for SPODEs is difficult. One common approach is to use weighted combinations of multiple SPODEs, each having a different superparent with a properly assigned weight value (i.e., a weight value is assigned to each attribute). In this paper, we propose a self-adaptive SPODE, namely SODE, which uses immunity theory from artificial immune systems to automatically and self-adaptively select the weight for each single SPODE. SODE needs to know neither the importance of individual SPODEs nor the relevance among SPODEs, and can flexibly and efficiently search for optimal weight values for each SPODE during the learning process. Extensive experiments and comparisons on 56 benchmark data sets, and validations on image and text classification, demonstrate that SODE outperforms state-of-the-art weighted SPODE algorithms and is suitable for a wide range of learning tasks. Results also confirm that SODE provides an appropriate balance between runtime efficiency and accuracy.

AB - SuperParent-One-Dependence Estimators (SPODEs) represent a family of semi-naive Bayesian classifiers which relax the attribute independence assumption of Naive Bayes (NB) to allow each attribute to depend on a common single attribute (superparent). SPODEs can effectively handle data with attribute dependency but still inherit NB's key advantages, such as computational efficiency and robustness for high-dimensional data. In reality, determining an optimal superparent for SPODEs is difficult. One common approach is to use weighted combinations of multiple SPODEs, each having a different superparent with a properly assigned weight value (i.e., a weight value is assigned to each attribute). In this paper, we propose a self-adaptive SPODE, namely SODE, which uses immunity theory from artificial immune systems to automatically and self-adaptively select the weight for each single SPODE. SODE needs to know neither the importance of individual SPODEs nor the relevance among SPODEs, and can flexibly and efficiently search for optimal weight values for each SPODE during the learning process. Extensive experiments and comparisons on 56 benchmark data sets, and validations on image and text classification, demonstrate that SODE outperforms state-of-the-art weighted SPODE algorithms and is suitable for a wide range of learning tasks. Results also confirm that SODE provides an appropriate balance between runtime efficiency and accuracy.

KW - Artificial immune systems

KW - Attribute weighting

KW - Classification

KW - Evolutionary machine learning

KW - Naive Bayes

KW - Self-adaptive

UR - http://www.scopus.com/inward/record.url?scp=84955724165&partnerID=8YFLogxK

U2 - 10.1016/j.patcog.2015.08.023

DO - 10.1016/j.patcog.2015.08.023

M3 - Article

VL - 51

SP - 358

EP - 377

JO - Pattern Recognition

JF - Pattern Recognition

SN - 0031-3203

ER -