
SVMLearnerEasy -- what's next

A place to ask questions about methods in Orange, how they are used, and other general support.


Postby bgbg » Wed Nov 25, 2009 9:06

According to the documentation, SVMLearnerEasy is equivalent to easy.py from libsvm. More specifically, SVMLearnerEasy is supposed to scale the data and to find the appropriate SVM parameters.

Assume I do the following:

Code: Select all

l = orngSVM.SVMLearnerEasy()
c = l(training)



When using different data sets for training and validation, am I supposed to scale the validation data before feeding it to the classifier c, or is the scaling performed by the classifier itself? Also, where are the scaling parameters stored?

Another issue is re-training the classifier. Say I would like to re-train the classifier with another data set while keeping the same SVM parameters. How is this task achieved?


Thank you

Postby JC » Thu Nov 26, 2009 17:08

I believe the data are scaled by default... that's what is done with your data when calling the classifier (i.e. classifier(testingData)):

example=orange.ExampleTable([example]).translate(self.domain)[0]

This line builds a new example table out of the single example and translates it into the domain stored in the classifier, which is the domain that contains the scaling.
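
To make this concrete, here is a minimal sketch in the usual Orange 2.x style (the data set name and the 70/30 split are only placeholders, not part of the original example): the test examples are passed to the classifier unscaled, and the scaling is applied internally through the domain translation.

Code: Select all

import orange, orngSVM

# Placeholder data set; any ExampleTable with continuous attributes will do.
data = orange.ExampleTable("iris")

# Split into training and testing sets (70/30).
indices = orange.MakeRandomIndices2(data, p0=0.7)
training = data.select(indices, 0)
testing = data.select(indices, 1)

learner = orngSVM.SVMLearnerEasy()
classifier = learner(training)

# The testing examples are not scaled here; the classifier translates each
# one into its own (scaled) domain before predicting.
for example in testing:
    print example.getclass(), classifier(example)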

Postby Ales » Mon Nov 30, 2009 11:05

Also, where are the scaling parameters stored?

Currently they are stored in the classifier's transformed domain.
You can access them, e.g., with
Code: Select all
c.examples.domain[0].getValueFrom.transformer.average
c.examples.domain[0].getValueFrom.transformer.span

but note that this may change in the near future (with the implementation of more general preprocessors).
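
For instance, a small sketch that lists the stored scaling constants for every attribute (it assumes the internal layout shown above and continuous attributes only, so it may break once the preprocessor changes land):

Code: Select all
# c is a classifier returned by SVMLearnerEasy, as in the examples above.
for attribute in c.examples.domain.attributes:
    transformer = attribute.getValueFrom.transformer
    print attribute.name, transformer.average, transformer.span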

Say I would like to re-train the classifier with another data set while keeping the same SVM parameters. How is this task achieved?


Use the SVMLearner.tuneParameters function.
Code: Select all
l.tuneParameters(examples, parameters=["nu", "gamma"])  # tunes nu and gamma and stores them on the learner
kwargs = dict(nu=l.nu, gamma=l.gamma)  # keep the tuned values

c = orngSVM.SVMLearner(examples, **kwargs)  # train a new classifier with the tuned parameters

Postby bgbg » Thu Dec 03, 2009 13:11

Dear Ales,
the tuneParameters function is not documented anywhere. From what I can see in the code, it is part of SVMLearner and is never called inside that class. How is this function related to SVMLearnerEasy? Do they perform the same, or a similar, job?


Thank you very much

Postby Ales » Mon Dec 07, 2009 10:15

They perform very similar jobs. In fact, the parameter optimization code in SVMLearnerEasy is identical to calling tuneParameters with folds=4 on SVMLearner.
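
So, as a rough sketch of that equivalence (the parameter list follows the earlier example; which parameters make sense to tune depends on the chosen svm_type):

Code: Select all
l = orngSVM.SVMLearner()
# Roughly what SVMLearnerEasy does internally: 4-fold tuning, then training
# with the tuned values stored on the learner.
l.tuneParameters(training, parameters=["nu", "gamma"], folds=4)
c = l(training)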

Postby bgbg » Tue Dec 08, 2009 14:01

Following your answer I briefly reviewed the relevant code in orngSVM. It seems that neither svm_type nor kernel_type is tuned by tuneParameters (and, hence, by SVMLearnerEasy). As far as I understand, easy.py from libsvm does optimize those parameters. Thus, if I am correct, please update the documentation of SVMLearnerEasy.

Postby Ales » Tue Dec 08, 2009 15:55

No. The easy.py script optimizes only the C and gamma parameters.

