Universal Learning Curves of Support Vector Machines

M. Opper and R. Urbanczik
Phys. Rev. Lett. 86, 4410 – Published 7 May 2001

Abstract

Using methods of statistical physics, we investigate the role of model complexity in learning with support vector machines (SVMs), which are an important alternative to neural networks. We show the advantages of using SVMs with kernels of infinite complexity on noisy target rules: in contrast to common theoretical beliefs, such SVMs are found to achieve optimal generalization error even though the training error does not converge to the generalization error. Moreover, we find universal asymptotics of the learning curves that depend only on the target rule, not on the SVM kernel.
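The key qualitative phenomenon in the abstract — an arbitrarily flexible classifier can fit noisy labels exactly (zero training error) yet still generalize, so the training error stays below the generalization error — can be illustrated numerically. The sketch below is not the paper's SVM analysis; it uses a 1-nearest-neighbour rule as a stand-in for an interpolating classifier of effectively infinite complexity, on a 1-D threshold target with 10% label noise (all parameter values here are illustrative assumptions):

```python
import random

random.seed(0)

NOISE = 0.10  # probability of flipping a label (illustrative choice)

def target(x):
    """Noiseless target rule: sign relative to a threshold at 0.5."""
    return 1 if x >= 0.5 else -1

def noisy_label(x, p=NOISE):
    """Target label, flipped with probability p."""
    y = target(x)
    return -y if random.random() < p else y

# Training set: 200 points with noisy labels.
train = [(x, noisy_label(x)) for x in (random.random() for _ in range(200))]

def nn_predict(x):
    """1-nearest-neighbour prediction: interpolates the training data,
    including its noise (a proxy for an infinitely complex kernel)."""
    return min(train, key=lambda t: abs(t[0] - x))[1]

# Training error is zero by construction: each point is its own neighbour.
err_train = sum(nn_predict(x) != y for x, y in train) / len(train)

# Generalization error, estimated on 2000 fresh noisy examples, stays
# near the noise level -- well above the training error.
test = [(x, noisy_label(x)) for x in (random.random() for _ in range(2000))]
err_test = sum(nn_predict(x) != y for x, y in test) / len(test)

print(f"training error: {err_train:.3f}")
print(f"estimated generalization error: {err_test:.3f}")
```

The gap between the two printed errors does not close as the sample grows, mirroring the abstract's point that convergence of training error to generalization error is not necessary for good generalization on noisy rules (the paper's stronger claim, optimality of the SVM's generalization error, requires the statistical-physics analysis itself).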

  • Received 12 February 2001

DOI:https://doi.org/10.1103/PhysRevLett.86.4410

©2001 American Physical Society

Authors & Affiliations

M. Opper1 and R. Urbanczik2

  • 1Department of Computer Science and Applied Mathematics, Aston University, Birmingham B4 7ET, United Kingdom
  • 2Institut für Theoretische Physik, Universität Würzburg, Am Hubland, D-97074 Würzburg, Germany

Issue

Vol. 86, Iss. 19 — 7 May 2001
