Training in the Primal

Following the recent interest in solving the primal optimization problem of SVM-like classifiers, we developed a library that allows the user to solve the primal problem of Laplacian SVMs (see the related JMLR paper). Training times and complexity are appreciably reduced with respect to the original dual formulation. This is due to early stopping conditions on the gradient descent that are based on the stability of the prediction; see our JMLR paper for all the details. Note that the current implementation requires the whole kernel matrix to fit in memory.
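The idea behind prediction-stability early stopping can be sketched as follows. This is not the library's code: it is an illustrative Python sketch that uses a plain kernel ridge objective as a stand-in for the primal LapSVM loss, and stops gradient descent once the predicted labels stop changing for a few consecutive iterations.

```python
import numpy as np

def gd_with_prediction_stability(K, y, lam=1e-2, lr=0.1,
                                 max_iter=1000, stable_iters=5):
    """Gradient descent on a kernel ridge objective (a stand-in for the
    primal LapSVM loss), stopped early once the predicted labels on the
    training points are unchanged for `stable_iters` consecutive steps."""
    n = K.shape[0]
    alpha = np.zeros(n)
    prev_labels = None
    stable = 0
    for it in range(max_iter):
        f = K @ alpha                         # current predictions
        labels = np.sign(f)
        if prev_labels is not None and np.array_equal(labels, prev_labels):
            stable += 1
        else:
            stable = 0
        if stable >= stable_iters:            # predictions are stable: stop
            break
        prev_labels = labels
        # gradient of 0.5*||K a - y||^2 + 0.5*lam*a' K a  w.r.t. a
        grad = K @ (f - y + lam * alpha)
        alpha -= lr * grad
    return alpha, it
```

In practice this kind of criterion can stop long before the gradient norm is small, because the decision function can stabilize while the objective is still decreasing.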

Our library is fairly generic, and it includes several other classifiers as by-products of the primal optimization framework.

The dual formulations are optimized by LibSVM (the included interface targets LibSVM 2.89, but it can be straightforwardly extended to the latest LibSVM; contact me if you update my code!).

NEWS: One-Class SVM and One-Class Laplacian SVM have been added (both solved in their primal formulations).

Details

The library is written in Matlab, and it has been tested with Matlab versions from 7.6 to 2012a. An example of how to use the code can be found in the "example.m" script, whereas a complete GUI for playing with Laplacian SVMs (and other classifiers) on some toy datasets can be run from the "demo.m" file (after moving to the "gui" folder, otherwise the built-in Matlab demo will start). It is very intuitive to use, so you should not have any problems with it.

First, run the "setpaths" command, and then use the Matlab help function to see the details of each script. If an unregularized bias is added to the kernel expansion of primal LapSVM, then the Laplacian is assumed to be unnormalized.
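For background, the distinction between the unnormalized and the normalized graph Laplacian can be sketched in a few lines. This is a generic illustration (not the library's graph-construction code): L = D - W versus L_sym = I - D^{-1/2} W D^{-1/2}, where W is the adjacency matrix and D the diagonal degree matrix.

```python
import numpy as np

def graph_laplacian(W, normalized=False):
    """Graph Laplacian of a symmetric weight matrix W.
    Unnormalized: L = D - W.
    Normalized:   L = I - D^{-1/2} W D^{-1/2}."""
    d = W.sum(axis=1)                              # node degrees
    if not normalized:
        return np.diag(d) - W
    d_inv_sqrt = 1.0 / np.sqrt(np.maximum(d, np.finfo(float).eps))
    return np.eye(W.shape[0]) - (W * d_inv_sqrt[:, None]) * d_inv_sqrt[None, :]
```

Both variants are positive semi-definite; the normalized one has eigenvalues bounded by 2, which is why some regularization settings assume one form or the other.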

In order to use the original dual LibSVM code, the included MEX interface must be compiled with "make_libsvm.m" (this requires a C/C++ compiler installed on your machine!). The binary for Windows Vista 64-bit is included.

If you use the early stopping conditions for Preconditioned Conjugate Gradient, be sure you have understood how they work, and remember that each condition has a corresponding parameter that must be tuned for best results (the default values are not necessarily the best choice!).
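To make the role of such stopping parameters concrete, here is a generic, hedged sketch of Jacobi-preconditioned conjugate gradient for a symmetric positive-definite system; the `tol` argument stands in for a tunable stopping parameter (too loose gives an inaccurate solution, too tight wastes iterations). It is not the library's PCG implementation.

```python
import numpy as np

def pcg(A, b, tol=1e-6, max_iter=200):
    """Preconditioned conjugate gradient for A x = b, with A symmetric
    positive definite and a diagonal (Jacobi) preconditioner.
    `tol` is the tunable stopping parameter: iterations stop when the
    residual norm drops below tol * ||b||."""
    M_inv = 1.0 / np.diag(A)          # Jacobi preconditioner
    x = np.zeros_like(b)
    r = b - A @ x                     # residual
    z = M_inv * r
    p = z.copy()
    rz = r @ z
    for it in range(max_iter):
        Ap = A @ p
        step = rz / (p @ Ap)
        x += step * p
        r -= step * Ap
        if np.linalg.norm(r) <= tol * np.linalg.norm(b):
            break
        z = M_inv * r
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, it + 1
```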

[Screenshots: Demo GUI]

Downloads

If you use this library, you are kindly requested to cite the reference paper, for which we provide the BibTeX code:

@article{melacci2011primallapsvm,
  title={{Laplacian Support Vector Machines Trained in the Primal}},
  author={Melacci, Stefano and Belkin, Mikhail},
  journal={Journal of Machine Learning Research},
  volume={12},
  month={March},
  year={2011},  
  issn={1532-4435},  
  pages={1149--1184},
  numpages={36},
  publisher={JMLR.org}
}

Primal LapSVM (ver. 0.2 - updated in November 2012)

Credits

The primal Laplacian SVM library has been developed by Stefano Melacci.

The code for building the graph Laplacian and computing the kernel matrix is a minor rewrite of the code written by Vikas Sindhwani for the paper "V. Sindhwani, P. Niyogi, M. Belkin. Beyond the Point Cloud: from Transductive to Semi-supervised Learning, International Conference on Machine Learning (ICML), 2005".

The general code structure of the original implementation was based on the primal SVM solver of Olivier Chapelle. The reference paper for primal SVM is "O. Chapelle, Training a support vector machine in the primal, Neural Computation, MIT Press, 2007".

The One-Class extension is a joint work with Salvatore Frandina.