This unit wraps a Support Vector Machine and is implemented through the LIBSVM Python interface. It functions somewhat like a Model or a Network, but combining it with other PyBrain Models is currently discouraged. Its main purpose is to serve as a comparison baseline for feed-forward network classifications. You cannot get or set individual model parameters, but you can load and save the entire model in LIBSVM format. Sequential data and backward passes are not supported. See the corresponding example code for usage.
Initializes as an empty module.
If model is given, the unit is initialized from this LIBSVM model instead. indim and outdim are for compatibility only and are ignored.
Produce the output from the current input vector, or process a dataset.
If values is False or ‘class’, the output is set to the index of the predicted class. If it is True or ‘raw’, decision values are produced instead; for a multi-class SVM these are stored in a dictionary. If it is ‘prob’, class probabilities are produced; this works only if the probability option was set when the SVM was trained.
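To make the relationship between the ‘raw’ and ‘class’ modes concrete, the following self-contained sketch (plain Python, not PyBrain or LIBSVM code; the function name and dictionary layout are illustrative assumptions) shows how pairwise one-vs-one decision values, as LIBSVM computes for a multi-class SVM, can be reduced to a single predicted class by voting.

```python
def vote_from_decision_values(dec_values):
    """Return the winning class given {(i, j): value} pairwise decisions.

    A positive decision value counts as a vote for class i,
    a negative one as a vote for class j.
    """
    votes = {}
    for (i, j), value in dec_values.items():
        winner = i if value > 0 else j
        votes[winner] = votes.get(winner, 0) + 1
    # Most votes wins; ties break toward the lower class index.
    return min(votes, key=lambda c: (-votes[c], c))

decisions = {(0, 1): 0.7, (0, 2): 0.4, (1, 2): 1.3}
print(vote_from_decision_values(decisions))  # → 0
```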
Run the module’s forward pass on the given dataset unconditionally and return the output as a list.
|Parameter:||dataset – A non-sequential supervised data set.|
|Key values:||Passed through to the forwardPass() method.|
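The data flow of a whole-dataset forward pass can be pictured with a minimal stand-in (plain Python; `forward_on_dataset` and `unit_forward` are hypothetical names, not the PyBrain API): apply the unit's forward pass to each input pattern and collect the outputs in a list.

```python
def forward_on_dataset(unit_forward, dataset):
    # Stand-in for the whole-dataset forward pass: run the unit's
    # forward function on every input pattern and return a list.
    return [unit_forward(sample) for sample in dataset]

# Toy "unit": predict class 1 if the first feature is positive.
toy_unit = lambda x: 1 if x[0] > 0 else 0
print(forward_on_dataset(toy_unit, [(0.5, 1.0), (-2.0, 0.3)]))  # → [1, 0]
```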
A class performing supervised learning of a DataSet by an SVM unit. See the remarks on SVMUnit above. This whole class is a bit of a hack and is provided mostly for the convenience of comparisons.
Initialize data and unit to be trained, and load the model, if provided.
The passed svmunit has to be an object of class SVMUnit that is going to be trained on the ClassificationDataSet object dataset. Unlike FNN training, no separate test data set is used; instead, 5-fold cross-validation is performed if needed.
If modelfile is provided, the model is loaded from that file instead of being trained. If plot is True, a grid search is performed and the resulting pattern is plotted.
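Since the trainer relies on 5-fold cross-validation rather than a held-out test set, the sketch below (plain Python; `k_fold_indices` is a hypothetical helper, not part of PyBrain or LIBSVM) shows how such folds are typically formed: the sample indices are split into five parts, and each part serves once as the validation fold while the remaining four are used for training.

```python
def k_fold_indices(n, k=5):
    """Split n sample indices into k roughly equal folds."""
    folds = [[] for _ in range(k)]
    for i in range(n):
        folds[i % k].append(i)   # deal indices out round-robin
    return folds

folds = k_fold_indices(12, 5)
# Every index lands in exactly one fold; fold sizes differ by at most 1.
```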
Train the SVM on the dataset. For RBF kernels (the default), an optional meta-parameter search can be performed.
|Key search:||optional name of grid search class to use for RBF kernels: ‘GridSearch’ or ‘GridSearchDOE’|
|Key log2g:||base 2 log of the RBF width parameter|
|Key log2c:||base 2 log of the slack parameter|
|Key searchlog:||filename into which to dump the search log|
|Key others:||...are passed through to the grid search and/or libsvm|
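The exhaustive grid search over the two exponents can be pictured as follows. This sketch is plain Python, not PyBrain's implementation: `grid_search` and the mock accuracy surface are illustrative assumptions, with LIBSVM's cross-validation replaced by an arbitrary scoring callable. It walks every (log2c, log2g) pair and keeps the best-scoring one.

```python
import itertools

def grid_search(evaluate, log2c_range, log2g_range):
    # Evaluate every (log2 C, log2 gamma) grid point and keep the best.
    # `evaluate` stands in for a cross-validation accuracy measurement.
    best = None
    for log2c, log2g in itertools.product(log2c_range, log2g_range):
        score = evaluate(2.0 ** log2c, 2.0 ** log2g)
        if best is None or score > best[0]:
            best = (score, log2c, log2g)
    return best

# Mock accuracy surface peaking at C = 2**3, gamma = 2**-5:
mock = lambda C, g: 1.0 / (1.0 + (C - 2 ** 3) ** 2 + (g - 2 ** -5) ** 2)
score, log2c, log2g = grid_search(mock, range(-5, 16, 2), range(-15, 4, 2))
print(log2c, log2g)  # → 3 -5
```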
Set parameters for SVM training. Apart from the ones below, you can use all parameters defined for the LIBSVM svm_model class; see their documentation.
|Key searchlog:||Save a list of coordinates and the achieved CV accuracy to this file.|
Helper class used by SVMTrainer to perform an exhaustive grid search, and plot the resulting accuracy surface, if desired. Adapted from the LIBSVM python toolkit.
Set up (log) grid search over the two RBF kernel parameters C and gamma.
step width for log2C and log2gamma (ignored for DOE search)
split dataset into this many parts for cross-validation
if True, plot the error surface contour (regular) or search pattern (DOE)
maximum window bisection depth (DOE only)
Save a list of coordinates and the achieved CV accuracy to this file
...are passed through to the cross_validation method of LIBSVM
Same as GridSearch, but implements a design-of-experiments-based search pattern, as described by C. Staelin, http://www.hpl.hp.com/techreports/2002/HPL-2002-354R1.pdf