A New Metric-Based Approach to Model Selection

Full Text: schuurmans97new.pdf

We introduce a new approach to model selection that performs better than the standard complexity-penalization and hold-out error estimation techniques in many cases. The basic idea is to exploit the intrinsic metric structure of a hypothesis space, as determined by the natural distribution of unlabeled training patterns, and use this metric as a reference to detect whether the empirical error estimates derived from a small (labeled) training sample can be trusted in the region around an empirically optimal hypothesis. Using simple metric intuitions we develop new geometric strategies for detecting overfitting and performing robust yet responsive model selection in spaces of candidate functions. These new metric-based strategies dramatically outperform previous approaches in experimental studies of classical polynomial curve fitting. Moreover, the technique is simple, efficient, and can be applied to most function learning tasks. The only requirement is access to an auxiliary collection of unlabeled training data.
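The core idea above can be sketched in code. The following is a minimal, hypothetical illustration (not the paper's exact algorithm) for the polynomial curve-fitting setting: the metric between two hypotheses is their RMS disagreement on unlabeled inputs, and complexity is increased only while a triangle-inequality-style check between each hypothesis's training error and its metric distance to simpler hypotheses holds; a violation signals that the training-error estimates can no longer be trusted. The data-generating function, sample sizes, and noise level are all made up for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_f(x):
    # Illustrative target function (assumption, not from the paper).
    return np.sin(2 * np.pi * x)

# Small labeled sample (noisy) plus a large auxiliary pool of
# unlabeled inputs, as the abstract requires.
n_labeled, n_unlabeled = 12, 2000
x_train = rng.uniform(0, 1, n_labeled)
y_train = true_f(x_train) + rng.normal(0, 0.3, n_labeled)
x_pool = rng.uniform(0, 1, n_unlabeled)  # unlabeled: inputs only, no targets

def dist(p, q, xs):
    """Metric between two hypotheses: RMS disagreement on unlabeled inputs."""
    return np.sqrt(np.mean((np.polyval(p, xs) - np.polyval(q, xs)) ** 2))

def train_err(p):
    """Empirical (training-sample) RMS error of a hypothesis."""
    return np.sqrt(np.mean((np.polyval(p, x_train) - y_train) ** 2))

# Fit polynomials of increasing degree. Keep increasing complexity only
# while d(h_j, h_k) <= err(h_j) + err(h_k) for every simpler hypothesis
# h_j: if two hypotheses sit far apart in the metric yet both claim small
# training error, those error estimates are untrustworthy (overfitting).
fits = [np.polyfit(x_train, y_train, d) for d in range(0, 10)]
chosen = 0
for k in range(1, len(fits)):
    ok = all(
        dist(fits[j], fits[k], x_pool) <= train_err(fits[j]) + train_err(fits[k])
        for j in range(k)
    )
    if not ok:
        break
    chosen = k

print("selected degree:", chosen)
```

The unlabeled pool enters only through `dist`, which is the point of the approach: the metric is estimated without any labels, so it stays reliable even when the small labeled sample no longer does.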

Citation

D. Schuurmans. "A New Metric-Based Approach to Model Selection". Proceedings of the National Conference on Artificial Intelligence (AAAI-97), Providence, Rhode Island, July 1997.

Keywords: metric-based, model selection, machine learning
Category: In Conference

BibTeX

@inproceedings{Schuurmans:AAAI97,
  author = {Dale Schuurmans},
  title = {A New Metric-Based Approach to Model Selection},
  booktitle = {National Conference on Artificial Intelligence (AAAI)},
  year = 1997,
}

Last Updated: August 16, 2007
Submitted by Russ Greiner
