On Space-Bounded Learning and the Vapnik-Chervonenkis Dimension (Thesis)

Title: On Space-Bounded Learning and the Vapnik-Chervonenkis Dimension (Thesis)
Publication Type: Technical Report
Year of Publication: 1989
Authors: Floyd, S.
Other Numbers: 560

This thesis explores algorithms that learn a concept from a concept class of Vapnik-Chervonenkis (VC) dimension d while saving at most d examples at a time. The framework is the model of probably approximately correct (PAC) learning introduced by Valiant [V84]. A maximum concept class of VC dimension d is defined. For a maximum class C of VC dimension d, we give an algorithm for representing a finite set of positive and negative examples of a concept by a subset of d labeled examples of that set. This data compression scheme of size d is used to construct a space-bounded algorithm, called the iterative compression algorithm, that learns a concept from the class C while saving at most d examples at a time; these d examples represent the learning algorithm's current hypothesis. A space-bounded algorithm is called acyclic if a hypothesis that has been rejected as incorrect is never reinstated. We give a sufficient condition for the iterative compression algorithm to be acyclic on a maximum class C. Classes for which the iterative compression algorithm is acyclic include positive half-spaces in Euclidean space E^n, balls in E^n, and arbitrary rectangles and triangles in the plane. The iterative compression algorithm can be thought of as learning a boundary between the positive and the negative examples.
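To make the flavor of the iterative compression idea concrete, here is a minimal sketch for the simplest possible case: threshold concepts on the real line (positive half-lines x >= t), a maximum class of VC dimension 1. The learner stores at most one labeled example, which simultaneously compresses the sample seen so far and encodes the current hypothesis; the stored example is updated only when the current hypothesis misclassifies a new example. This is an illustrative sketch, not the thesis's general algorithm for arbitrary maximum classes; the function name and the (value, label) stream format are assumptions made for this example.

```python
def learn_threshold(stream):
    """Iterative-compression-style sketch for thresholds on the line (VC dim 1).

    The compression set holds at most one labeled example: the smallest
    positive example seen so far.  The current hypothesis predicts positive
    exactly when x >= saved (and all-negative while nothing is saved).
    """
    saved = None  # the single stored example (smallest positive seen)
    for x, label in stream:
        predict = saved is not None and x >= saved
        if predict != label:
            # A mistake on a positive example moves the boundary down.
            # If the stream is consistent with some threshold concept,
            # a stored positive never misclassifies a negative example,
            # so only this update is ever needed.
            if label:
                saved = x
    return saved
```

On a stream consistent with the threshold t = 5, such as [(7, True), (3, False), (6, True), (5, True), (2, False)], the stored example decreases monotonically and ends at 5, the smallest positive seen; the hypothesis is never revisited once rejected, matching the acyclic behavior described above.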

Bibliographic Notes

ICSI Technical Report TR-89-061
