Technion, IEM faculty - Statistics Seminar
Speaker: Yair Wiener, Technion
Title: Selective classification
Date: 30/10/2011
Time: 14:30
Place: Bloomfield-527
Abstract:
An ongoing challenge in machine learning is the development of
formal performance guarantees for learning schemes and
algorithms. Given a labeled sample, our goal is to generate both
a classifier and an upper bound on its expected error. In this
talk we explore a method for improving the error bound by
compromising its coverage (a learning model termed selective
classification or classification with a reject option).
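To make the model concrete, here is a minimal sketch of a selective classifier with a confidence-threshold reject option; the function names and the threshold value are illustrative assumptions, not the speaker's actual scheme. Error is measured only over the accepted region, so lowering coverage can lower the error bound:

```python
def selective_predict(probs, threshold=0.9):
    """Return the predicted class index, or None to reject (abstain)."""
    best = max(range(len(probs)), key=lambda i: probs[i])
    if probs[best] >= threshold:
        return best
    return None  # reject: the point falls outside the accepted region

def risk_and_coverage(prob_list, labels, threshold=0.9):
    """Empirical error over accepted points only, plus the coverage."""
    accepted, errors = 0, 0
    for probs, y in zip(prob_list, labels):
        pred = selective_predict(probs, threshold)
        if pred is not None:
            accepted += 1
            errors += int(pred != y)
    coverage = accepted / len(labels)
    risk = errors / accepted if accepted else 0.0
    return risk, coverage
```

Raising the threshold shrinks the accepted region (lower coverage) in exchange for a smaller error on the points that are classified.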
For the noise-free case we will show that selectivity allows for
perfect classification (zero test error), which can be
guaranteed after observing a finite training sample. Using known
results from probabilistic geometry, we will derive an upper
bound on the rejection rate, ensuring that it diminishes in
practical scenarios.
For the agnostic case we will introduce a new performance
objective termed weak optimality. We will introduce a learning
scheme for learning a weakly optimal selective classifier that
does not err more than the optimal classifier in hindsight over
the accepted region. We will also analyze a practical technique
that approximates this scheme and present some empirical
results. We will conclude by showing a strong relation between
two known learning models: stream-based active learning and
perfect selective classification. For these models, restricted
to the realizable case, we will show a reduction of active
learning to selective classification that preserves fast rates.
Applying this reduction to recent results for selective
classification, we will derive exponential label complexity
speedup for actively learning general (non-homogeneous) linear
classifiers when the data distribution is an arbitrary high
dimensional mixture of Gaussians.
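The reduction in the realizable case can be sketched as follows: labels are queried only on stream points that the selective classifier rejects, since on accepted points a perfect selective classifier's prediction is guaranteed correct. This is a hedged illustration, not the speaker's construction; `fit_selective` is a hypothetical helper standing in for any perfect selective learner:

```python
def active_learn(stream, oracle, fit_selective):
    """stream: iterable of unlabeled points; oracle(x) -> true label.
    fit_selective(labeled) -> model whose predict(x) returns a label,
    or None to reject (assumed hypothetical interface)."""
    labeled = []  # (x, y) pairs obtained from the oracle
    model = fit_selective(labeled)
    queries = 0
    for x in stream:
        pred = model.predict(x)  # None means "reject"
        if pred is None:
            y = oracle(x)        # query the label only on rejected points
            queries += 1
            labeled.append((x, y))
            model = fit_selective(labeled)
        # accepted points need no query: pred is assumed correct
    return model, queries
```

Under this reduction, a fast-vanishing rejection rate for the selective classifier translates directly into few label queries, which is the source of the label-complexity speedup.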
Technion Math. Net (TECHMATH)
Editor: Michael Cwikel