Estimating the class prior and posterior from noisy positives and unlabeled data
- Shantanu Jain
- Martha White, University of Alberta
- Predrag Radivojac
We develop a classification algorithm for estimating posterior distributions from positive-unlabeled data that is robust to noise in the positive labels and effective for high-dimensional data. In recent years, several algorithms have been proposed to learn from positive-unlabeled data; however, many of these contributions remain theoretical, performing poorly on real high-dimensional data that is typically contaminated with noise. We build on this previous work to develop two practical classification algorithms that explicitly model the noise in the positive labels and utilize univariate transforms built on discriminative classifiers. We prove that these univariate transforms preserve the class prior, enabling estimation in the univariate space and avoiding kernel density estimation for high-dimensional data. The theoretical development and parametric and nonparametric algorithms proposed here constitute an important step towards widespread use of robust classification algorithms for positive-unlabeled data.
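The general pipeline the abstract describes (reduce high-dimensional inputs to univariate scores with a discriminative classifier, then estimate the class prior in that one-dimensional space) can be illustrated with a minimal sketch. Note this is not the paper's algorithm: the data, dimensions, and class prior below are hypothetical, the classifier is a plain logistic regression fit by gradient descent, and the prior estimate uses the simple Elkan-Noto ratio estimator in place of the paper's noise-robust parametric and nonparametric estimators.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic case-control PU data (hypothetical parameters, not from the paper):
# positives ~ N(2, I), negatives ~ N(-1, I) in 5 dimensions; the unlabeled
# sample is a mixture with true class prior alpha = 0.4.
d, alpha_true = 5, 0.4
n_pos, n_unl = 2000, 4000
pos = rng.normal(2.0, 1.0, size=(n_pos, d))
k = int(alpha_true * n_unl)
unl = np.vstack([rng.normal(2.0, 1.0, size=(k, d)),
                 rng.normal(-1.0, 1.0, size=(n_unl - k, d))])

# Univariate transform: a discriminative classifier (logistic regression,
# fit by gradient descent) scoring "labeled positive" vs "unlabeled".
X = np.vstack([pos, unl])
y = np.concatenate([np.ones(n_pos), np.zeros(n_unl)])
Xb = np.hstack([X, np.ones((len(X), 1))])       # append a bias column
w = np.zeros(Xb.shape[1])
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))
    w -= 0.3 * Xb.T @ (p - y) / len(y)          # mean log-loss gradient step

def score(A):
    """Map d-dimensional points to univariate classifier scores in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-np.hstack([A, np.ones((len(A), 1))]) @ w))

# Estimate the class prior in the 1-D score space, using the Elkan-Noto
# ratio estimator as a stand-in: c = E[score | labeled positive], and
# alpha_hat = E[score | unlabeled] / c.
c = score(pos).mean()
alpha_hat = score(unl).mean() / c
print(f"estimated class prior: {alpha_hat:.2f}")  # should land near alpha_true = 0.4
```

Because the labeled positives and the positive portion of the unlabeled sample are drawn from the same distribution, the ratio estimator recovers the mixing proportion without any density estimation in the original 5-dimensional space; the paper's contribution is making this style of estimate work when the positive labels are themselves noisy.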
Citation
S. Jain, M. White, and P. Radivojac. "Estimating the class prior and posterior from noisy positives and unlabeled data." Neural Information Processing Systems (NIPS), (eds: Daniel D. Lee, Masashi Sugiyama, Ulrike von Luxburg, Isabelle Guyon, Roman Garnett), pp. 2685-2693, December 2016.
Category: In Conference
Web Links: NeurIPS
BibTeX
@incollection{Jain+al:NIPS16,
  author    = {Shantanu Jain and Martha White and Predrag Radivojac},
  title     = {Estimating the class prior and posterior from noisy positives and unlabeled data},
  editor    = {Daniel D. Lee and Masashi Sugiyama and Ulrike von Luxburg and Isabelle Guyon and Roman Garnett},
  pages     = {2685--2693},
  booktitle = {Neural Information Processing Systems (NIPS)},
  year      = {2016},
}
Last Updated: February 25, 2020
Submitted by Sabina P