Fidelity-Weighted Learning

Arash Mehrjou
Stephan Gouws
Jaap Kamps
Bernhard Schölkopf
ICLR (2018)

Abstract

Learning meaningful and useful task-dependent data representations requires many training instances, but training labels are expensive to obtain and may be of varying quality. This creates a fundamental quality-versus-quantity trade-off in the learning process: do we learn from the small amount of high-quality data or from the potentially large amount of weakly-labeled data (obtained from heuristics, crowd-sourcing, etc.)? We argue that if we could somehow know and take the label quality into account when learning the data representation, we could get the best of both worlds. To this end, we propose "fidelity-weighted learning" (FWL), a semi-supervised student-teacher approach for training deep neural networks using weakly-labeled data. FWL modulates the parameter updates to a student network (trained on the task we care about) on a per-sample basis, according to the posterior confidence in each label's quality as estimated by a teacher. Both the student and the teacher are learned from the data. We evaluate FWL on two real-world tasks in information retrieval and natural language processing, where it outperforms state-of-the-art semi-supervised alternatives, indicating that our approach makes better use of the label information and yields better task-dependent data representations.
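
To make the per-sample modulation concrete, the sketch below shows one way such fidelity weighting can be realized in PyTorch: each weakly-labeled sample's loss (and hence its contribution to the gradient update) is scaled by a confidence score attributed to a teacher. This is a minimal illustration, not the paper's implementation; the teacher_confidence function here is a hypothetical stand-in (random scores), whereas in FWL the teacher is itself learned from the data.

```python
# Minimal sketch of fidelity-weighted training (illustrative, not the
# paper's implementation). Assumes PyTorch; `teacher_confidence` is a
# hypothetical placeholder for the teacher's posterior label confidence.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy student: a small classifier trained on weakly-labeled data.
student = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.SGD(student.parameters(), lr=0.1)
criterion = nn.CrossEntropyLoss(reduction="none")  # keep per-sample losses


def teacher_confidence(x: torch.Tensor) -> torch.Tensor:
    """Stand-in for the teacher's confidence in each sample's label
    (1 = fully trusted, 0 = ignored). In FWL this comes from a teacher
    learned from the data; here it is random, purely for illustration."""
    return torch.rand(x.shape[0])


# One weakly-labeled mini-batch (random toy data).
x = torch.randn(64, 16)
weak_y = torch.randint(0, 2, (64,))

# Fidelity-weighted update: scale each sample's loss, and therefore its
# contribution to the parameter update, by the teacher's confidence.
weights = teacher_confidence(x)
loss = (weights * criterion(student(x), weak_y)).sum() / weights.sum()

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Scaling the per-sample loss is equivalent to modulating the size of each sample's gradient step, so low-confidence (likely noisy) labels move the student's parameters less than high-confidence ones.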