
Exploiting the Cost (In)Sensitivity of Decision Tree Splitting Criteria

Full Text: icml2000.ps (PostScript)

This paper investigates how the splitting criteria and pruning methods of decision tree learning algorithms are influenced by misclassification costs or changes to the class distribution. Splitting criteria that are relatively insensitive to costs (class distributions) are found to perform, in terms of expected misclassification cost, as well as or better than splitting criteria that are cost sensitive. Consequently, there are two opposite ways of dealing with imbalance. One is to combine a cost-insensitive splitting criterion with a cost-insensitive pruning method to produce a decision tree algorithm little affected by cost or prior class distribution. The other is to grow a cost-independent tree which is then pruned in a cost-sensitive manner.
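The second strategy from the abstract can be sketched in a few lines: grow a tree with a cost-independent criterion, then prune bottom-up, collapsing a subtree whenever replacing it with a single leaf does not increase expected misclassification cost. This is an illustrative sketch only, not the paper's implementation; the names (`Node`, `prune_cost_sensitive`) and the two-class cost parameters `c_fp`/`c_fn` are assumptions.

```python
# Hypothetical sketch of cost-sensitive pruning of a cost-independently
# grown tree (two-class case). Not the paper's actual algorithm.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    pos: int                        # training positives reaching this node
    neg: int                        # training negatives reaching this node
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def leaf_cost(node: Node, c_fp: float, c_fn: float) -> float:
    """Expected misclassification cost if this node is made a leaf,
    labelled with whichever class is cheaper."""
    cost_if_predict_pos = node.neg * c_fp   # all negatives misclassified
    cost_if_predict_neg = node.pos * c_fn   # all positives misclassified
    return min(cost_if_predict_pos, cost_if_predict_neg)

def prune_cost_sensitive(node: Node, c_fp: float, c_fn: float) -> float:
    """Bottom-up pruning: collapse a subtree whenever a single leaf is
    no more costly than the subtree it replaces. Returns the cost of
    the (possibly pruned) subtree rooted at `node`."""
    if node.left is None and node.right is None:
        return leaf_cost(node, c_fp, c_fn)
    subtree = (prune_cost_sensitive(node.left, c_fp, c_fn) +
               prune_cost_sensitive(node.right, c_fp, c_fn))
    as_leaf = leaf_cost(node, c_fp, c_fn)
    if as_leaf <= subtree:          # pruning does not increase expected cost
        node.left = node.right = None
        return as_leaf
    return subtree
```

With equal costs, a split that cleanly separates the classes is kept; raising the false-negative cost can make collapsing the same split the cheaper option, which is how the pruning step injects cost sensitivity into an otherwise cost-independent tree.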

Citation

C. Drummond and R. Holte. "Exploiting the Cost (In)Sensitivity of Decision Tree Splitting Criteria". International Conference on Machine Learning (ICML), Stanford University, pp. 239-246, 2000.

Keywords: decision trees, splitting criteria, pruning, misclassification costs, machine learning
Category: In Conference

BibTeX

@inproceedings{Drummond+Holte:ICML00,
  author = {Chris Drummond and Robert Holte},
  title = {Exploiting the Cost (In)Sensitivity of Decision Tree Splitting
    Criteria},
  booktitle = {International Conference on Machine Learning (ICML)},
  pages = {239--246},
  year = {2000}
}

Last Updated: June 18, 2007
Submitted by Stuart H. Johnson
