On the Sample Complexity of Weakly Learning

Sally A. Goldman, Washington University in St. Louis
Michael J. Kearns, AT&T Bell Laboratories
Robert E. Schapire, AT&T Bell Laboratories

Abstract

In this paper, we study the sample complexity of weak learning. That is, we ask how much data must be collected from an unknown distribution in order to extract a small but significant advantage in prediction. We show that it is important to distinguish between those learning algorithms that output deterministic hypotheses and those that output randomized hypotheses. We prove that in the weak learning model, any algorithm using deterministic hypotheses to weakly learn a class of Vapnik-Chervonenkis dimension d(n) requires Ω(√d(n)) examples.
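
For concreteness, a sketch of the weak-learning criterion underlying this bound, stated in the usual notation (the symbols D, c, h, p, and m below are our own illustrative choices and are not defined in the abstract itself):

    % Weak learning: the output hypothesis h need only beat random
    % guessing on the target concept c by an inverse-polynomial margin
    % under the example distribution D (standard definition; notation ours).
    \[
      \Pr_{x \sim D}\bigl[\,h(x) \neq c(x)\,\bigr] \;\le\; \frac{1}{2} - \frac{1}{p(n)}
    \]
    % The abstract's lower bound then says that any weak learner
    % restricted to deterministic hypotheses h, for a class of
    % Vapnik-Chervonenkis dimension d(n), must draw
    \[
      m \;=\; \Omega\!\bigl(\sqrt{d(n)}\bigr)
    \]
    % examples, regardless of its running time.
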