Technion, IEM faculty - Statistics Seminar
Speaker: Yair Wiener
Title: Selective Classification and Active Learning: Two Sides of the Same Coin?
An ongoing challenge in machine learning is the development of
formal performance guarantees for learning algorithms. Given a
labeled sample, our goal is to generate both a classifier and an
upper bound on its expected error. In this talk we explore a
method for improving the error bound by compromising its
coverage (a learning model termed selective classification or
classification with a reject option).
For the realizable case we will show that perfect classification
(zero test error) can be guaranteed even for finite sample
sizes. Using known results from statistical geometry we will
derive an upper bound on the rejection rate and show that for many
real-life problems a meaningful coverage can be guaranteed.
We will also show a strong relation between two known learning
models: stream-based active learning and perfect selective
classification (an extreme case of 'classification with a reject
option'). For these models, restricted to the realizable
case, we will show a reduction of active learning to selective
classification that preserves fast rates. Applying this
reduction to recent results for selective classification, we
will derive exponential label complexity speedup for actively
learning general (non-homogeneous) linear classifiers when the
data distribution is an arbitrary high dimensional mixture of
Gaussians. We will conclude by showing some recent results on
the performance guarantees of agnostic selective classification.
Inspired by the theory, we will show a simple rejection rule that
outperforms the well-known ambiguity reject method based on
distance from the SVM decision boundary.
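For readers unfamiliar with the baseline, the ambiguity reject method mentioned above can be sketched as follows: the classifier abstains on test points that lie too close to the SVM decision boundary. This is a minimal illustration assuming scikit-learn; the toy data, the linear kernel, and the threshold value 0.5 are arbitrary choices for the sketch, not parameters from the talk.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Toy two-class data: two Gaussian blobs (a stand-in for the
# mixture-of-Gaussians setting discussed in the abstract).
X = np.vstack([rng.normal(-1.0, 1.0, size=(50, 2)),
               rng.normal(1.0, 1.0, size=(50, 2))])
y = np.array([0] * 50 + [1] * 50)

clf = SVC(kernel="linear").fit(X, y)

def predict_with_reject(clf, X_test, threshold=0.5):
    """Ambiguity reject rule: abstain (label -1) on points whose
    (signed, up-to-scaling) distance from the decision boundary
    is below the threshold."""
    margins = clf.decision_function(X_test)
    preds = clf.predict(X_test)
    preds[np.abs(margins) < threshold] = -1  # reject low-margin points
    return preds

preds = predict_with_reject(clf, X)
coverage = np.mean(preds != -1)  # fraction of points the classifier accepts
```

Raising the threshold trades coverage for accuracy: more points are rejected, but the error on the accepted region typically drops, which is exactly the trade-off selective classification formalizes.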
Joint work with Ran El-Yaniv
Technion Math. Net (TECHMATH)
Editor: Michael Cwikel <email@example.com>
Announcement from: <firstname.lastname@example.org>