Learning with Recursive Perceptual Representations

Title: Learning with Recursive Perceptual Representations
Publication Type: Miscellaneous
Year of Publication: 2012
Authors: Vinyals, O., Jia, Y., Deng, L., & Darrell, T.
Other Numbers: 3355
Abstract

Linear Support Vector Machines (SVMs) have become very popular in vision as part of state-of-the-art object recognition and other classification tasks, but require high-dimensional feature spaces for good performance. Deep learning methods can find more compact representations, but current methods employ multilayer perceptrons that require solving a difficult, non-convex optimization problem. We propose a deep non-linear classifier whose layers are SVMs and which incorporates random projection as its core stacking element. Our method learns layers of linear SVMs that recursively transform the original data manifold through a random projection of the weak prediction computed at each layer. It scales like linear SVMs, does not rely on any kernel computations or non-convex optimization, and exhibits better generalization ability than kernel-based SVMs. This is especially true when the number of training samples is smaller than the dimensionality of the data, a common scenario in many real-world applications. The use of random projections is key to our method: as we show in the experiments section, we observe a consistent improvement over previous, often more complicated, methods on several vision and speech benchmarks.
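The stacking idea the abstract describes can be sketched in a few lines: at each layer, fit a linear SVM, take its (weak) decision scores, and mix a fixed random projection of those scores back into the original input before a squashing nonlinearity. The sketch below is illustrative only, not the paper's exact formulation: the sigmoid transform, the `beta` mixing parameter, and all function names are assumptions, and `sklearn.svm.LinearSVC` stands in for a generic linear SVM solver.

```python
import numpy as np
from sklearn.svm import LinearSVC

def train_recursive_svm(X, y, n_layers=3, beta=0.5, seed=0):
    """Train a stack of linear SVMs; each layer's decision scores are
    randomly projected back into the input space to transform the data."""
    rng = np.random.RandomState(seed)
    d = X.shape[1]
    layers, projections = [], []
    X_cur = X
    for _ in range(n_layers):
        clf = LinearSVC(C=1.0, random_state=0).fit(X_cur, y)
        layers.append(clf)
        scores = clf.decision_function(X_cur)
        if scores.ndim == 1:                 # binary case: one score column
            scores = scores[:, None]
        W = rng.randn(d, scores.shape[1])    # fixed random projection (never learned)
        projections.append(W)
        # Transform: add the projected weak predictions to the ORIGINAL data,
        # then squash with a sigmoid (one plausible choice of nonlinearity).
        X_cur = 1.0 / (1.0 + np.exp(-beta * (X + scores @ W.T)))
    return layers, projections

def predict_recursive_svm(layers, projections, X, beta=0.5):
    """Replay the per-layer transforms, then predict with the final SVM."""
    X_cur = X
    for clf, W in zip(layers[:-1], projections[:-1]):
        scores = clf.decision_function(X_cur)
        if scores.ndim == 1:
            scores = scores[:, None]
        X_cur = 1.0 / (1.0 + np.exp(-beta * (X + scores @ W.T)))
    return layers[-1].predict(X_cur)
```

Because the projections are random and fixed, each layer only solves a convex linear-SVM problem, which is what lets the method avoid the non-convex optimization of multilayer perceptrons while still building a non-linear decision boundary.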

Acknowledgment

This work was partially supported by a Microsoft Research Fellowship.

URL: http://www.icsi.berkeley.edu/pubs/vision/ICSI_learningwith12.pdf
Bibliographic Notes

Presented at the 26th Annual Conference on Neural Information Processing Systems (NIPS 2012), Lake Tahoe, Nevada

Abbreviated Authors

O. Vinyals, Y. Jia, L. Deng, and T. Darrell

ICSI Research Group

Vision

ICSI Publication Type

Talk or presentation