source: orange/Orange/doc/modules/orngSVM.htm @ 9671:a7b056375472

Revision 9671:a7b056375472, 7.1 KB checked in by anze <anze.staric@…>, 2 years ago (diff)

Moved orange to Orange (part 2)

<html>
<head>
<link rel=stylesheet href="../style.css" type="text/css" media=screen>
</head>
<body>
<h1>orngSVM</h1>
<index name="classifiers+support vector machines">
<index name="modules/support vector machines">

<p>orngSVM provides access to Orange's Support Vector Machine functionality.</p>
<p>Important: on some datasets this learner can perform very badly. SVMs are known to be sensitive to the proper choice of parameters. If you are having problems with the learner's accuracy, try scaling the data and using different parameters, or take the easier route and use the <code>SVMLearnerEasy</code> class, which does this automatically.</p>
<h2>SVMLearner</h2>
<p><INDEX name="classes/SVMLearner (in orngSVM)">SVMLearner is a function that constructs an SVMLearner instance and optionally trains it on the provided examples.</p>
<p class=section>Arguments</p>
<dl class=arguments>
  <dt>svm_type</dt>
  <dd>Defines the type of SVM (can be SVMLearner.C_SVC, SVMLearner.Nu_SVC (default), SVMLearner.OneClass, SVMLearner.Epsilon_SVR, SVMLearner.Nu_SVR)</dd>
  <dt>kernel_type</dt>
  <dd>Defines the type of kernel to use for learning (can be SVMLearner.RBF (default), SVMLearner.Linear, SVMLearner.Polynomial, SVMLearner.Sigmoid, SVMLearner.Custom)</dd>
  <dt>degree</dt>
  <dd>Kernel parameter (Polynomial) (default 3)</dd>
  <dt>gamma</dt>
  <dd>Kernel parameter (Polynomial/RBF/Sigmoid) (default 1/number_of_examples)</dd>
  <dt>coef0</dt>
  <dd>Kernel parameter (Polynomial/Sigmoid) (default 0)</dd>
  <dt>kernelFunc</dt>
  <dd>Function that will be called if <code>kernel_type</code> is SVMLearner.Custom. It must accept two orange.Example arguments and return a float.</dd>
  <dt>C</dt>
  <dd>C parameter for C_SVC, Epsilon_SVR and Nu_SVR</dd>
  <dt>nu</dt>
  <dd>Nu parameter for Nu_SVC, Nu_SVR and OneClass (default 0.5)</dd>
  <dt>p</dt>
  <dd>Epsilon in the loss function for Epsilon_SVR</dd>
  <dt>cache_size</dt>
  <dd>Cache memory size in MB (default 100)</dd>
  <dt>eps</dt>
  <dd>Tolerance of the termination criterion (default 0.001)</dd>
  <dt>shrinking</dt>
  <dd>Determines whether to use shrinking heuristics (default True)</dd>
  <dt>probability</dt>
  <dd>Determines whether a probability model should be built (default False)</dd>
</dl>
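<p>To illustrate how the <code>gamma</code> parameter shapes the default RBF kernel, here is a minimal sketch in plain Python (not using orngSVM itself) of the standard LIBSVM formula K(x,y) = exp(-gamma*||x-y||&sup2;):</p>

```python
import math

def rbf_kernel(x, y, gamma):
    # Standard RBF kernel used by LIBSVM-based SVMs:
    # K(x, y) = exp(-gamma * ||x - y||^2)
    sq_dist = sum((a - b) ** 2 for a, b in zip(x, y))
    return math.exp(-gamma * sq_dist)

# A larger gamma makes the kernel fall off faster with distance,
# so the resulting classifier fits more locally.
x, y = (0.0, 0.0), (1.0, 1.0)   # squared distance is 2
print(rbf_kernel(x, y, 0.5))    # exp(-1)
print(rbf_kernel(x, y, 5.0))    # exp(-10), much smaller
```

<p>This is why scaling the data matters: unscaled attributes inflate the squared distances, which interacts badly with a fixed <code>gamma</code>.</p>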
<h2>SVMLearnerSparse</h2>
<p><INDEX name="classes/SVMLearnerSparse (in orngSVM)">Same as <code>SVMLearner</code>, except that it learns from the examples' meta attributes. Note that meta attributes need not be registered with the dataset domain or present in all examples. Use this if you are working with large sparse datasets.</p>

<h2>SVMLearnerEasy</h2>
<p><INDEX name="classes/SVMLearnerEasy (in orngSVM)">Same as above, except that it automatically scales the data and performs parameter optimization with <code>parameter_selection</code>, similar to the easy.py script in the LIBSVM package. Use this if <code>SVMLearner</code> performs badly.</p>

<h2>SVMLearnerSparseEasy</h2>
<p><INDEX name="classes/SVMLearnerSparseEasy (in orngSVM)">Same as <code>SVMLearnerEasy</code>, except that it learns from the examples' meta attributes. Note that meta attributes need not be registered with the dataset domain or present in all examples. Use this if you are working with large sparse datasets (and have absolutely no respect for the fourth dimension, commonly known as time).</p>

<h2>getLinearSVMWeights</h2>
<p>Returns a list of weights of the pairwise (class vs. class) classifiers for a linear multiclass SVM classifier. The list is ordered 1vs2, 1vs3, ..., 1vsN, 2vs3, ...</p>
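<p>The pairwise ordering of that weight list can be sketched with <code>itertools.combinations</code>, which enumerates class pairs in exactly this lexicographic order:</p>

```python
from itertools import combinations

def pairwise_order(n_classes):
    # The weight list follows lexicographic pair order:
    # 1vs2, 1vs3, ..., 1vsN, 2vs3, ..., (N-1)vsN
    return ["%dvs%d" % (i, j)
            for i, j in combinations(range(1, n_classes + 1), 2)]

print(pairwise_order(4))
# ['1vs2', '1vs3', '1vs4', '2vs3', '2vs4', '3vs4']
```

<p>For N classes there are N*(N-1)/2 such pairwise classifiers, hence N*(N-1)/2 weight vectors in the list.</p>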
<h2>KernelWrapper (DualKernelWrapper)</h2>
<p><INDEX name="classes/KernelWrapper (in orngSVM)">KernelWrapper (DualKernelWrapper) is an abstract wrapper class that takes one kernel function (two kernel functions) as initialization parameters and uses them to compute a new kernel function. The available kernel wrappers are RBFKernelWrapper, PolyKernelWrapper, AdditionKernelWrapper and MultiplicationKernelWrapper.</p>
<p class=section>Methods</p>
<dl class=methods>
    <dt>__call__(example1, example2)</dt>
    <dd>Computes the kernel function for the two examples</dd>
</dl>
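<p>The wrapper pattern is easy to see in a plain-Python sketch. The class names below mirror the idea but are not the actual orngSVM classes; they show how a single-kernel and a dual-kernel wrapper each expose <code>__call__</code> and derive a new kernel from the wrapped one(s):</p>

```python
import math

class KernelWrapper:
    # Wraps a single kernel function; subclasses derive a new kernel from it.
    def __init__(self, wrapped):
        self.wrapped = wrapped

class DualKernelWrapper:
    # Wraps two kernel functions and combines them into a new kernel.
    def __init__(self, wrapped1, wrapped2):
        self.wrapped1 = wrapped1
        self.wrapped2 = wrapped2

class AdditionWrapper(DualKernelWrapper):
    # Illustrative analogue of AdditionKernelWrapper: K = K1 + K2
    def __call__(self, e1, e2):
        return self.wrapped1(e1, e2) + self.wrapped2(e1, e2)

dot = lambda x, y: sum(a * b for a, b in zip(x, y))
rbf = lambda x, y: math.exp(-sum((a - b) ** 2 for a, b in zip(x, y)))
combined = AdditionWrapper(dot, rbf)
print(combined((1.0, 0.0), (0.0, 1.0)))  # 0 + exp(-2)
```

<p>Because each wrapper is itself callable with two examples, wrappers compose freely, which is how the custom-kernel example below nests an RBFKernelWrapper inside a CompositeKernelWrapper.</p>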
<h2><INDEX name="classes/RBFKernelWrapper (in orngSVM)">RBFKernelWrapper</h2>
<p>Takes one kernel function (K1) in initialization and uses it to compute a new kernel function: K(x,y)=exp(-K1(x,y)^2/gamma)</p>
<p class=section>Attributes</p>
<dl class=attributes>
    <dt>gamma</dt>
    <dd>gamma to use in the kernel function</dd>
</dl>
<h2><INDEX name="classes/PolyKernelWrapper (in orngSVM)">PolyKernelWrapper</h2>
<p>Takes one kernel function (K1) in initialization and uses it to compute a new kernel function: K(x,y)=K1(x,y)^degree</p>
<p class=section>Attributes</p>
<dl class=attributes>
    <dt>degree</dt>
    <dd>degree to use in the kernel function</dd>
</dl>
<h2><INDEX name="classes/AdditionKernelWrapper (in orngSVM)">AdditionKernelWrapper</h2>
<p>Takes two kernel functions (K1 and K2) in initialization and uses them to compute a new kernel function: K(x,y)=K1(x,y)+K2(x,y)</p>
<h2><INDEX name="classes/MultiplicationKernelWrapper (in orngSVM)">MultiplicationKernelWrapper</h2>
<p>Takes two kernel functions (K1 and K2) in initialization and uses them to compute a new kernel function: K(x,y)=K1(x,y)*K2(x,y)</p>
<h2><INDEX name="classes/CompositeKernelWrapper (in orngSVM)">CompositeKernelWrapper</h2>
<p>Takes two kernel functions (K1 and K2) in initialization and uses them to compute a new kernel function: K(x,y)=&lambda;*K1(x,y)+(1-&lambda;)*K2(x,y)</p>
<p class=section>Attributes</p>
<dl class=attributes>
    <dt>_lambda</dt>
    <dd>lambda to use in the kernel function</dd>
</dl>
<h2><INDEX name="classes/SparseLinKernel (in orngSVM)">SparseLinKernel</h2>
<p>A linear kernel function computed over the examples' meta attributes (which must be floats and need not be present in all examples)</p>
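<p>The idea behind a sparse linear kernel can be sketched in plain Python (this is an illustration of the dot product over sparse attribute dicts, not the orngSVM implementation): only attributes present in both examples contribute to the product.</p>

```python
def sparse_lin_kernel(meta1, meta2):
    # Linear (dot-product) kernel over sparse float-valued attribute dicts.
    # Attributes missing from either example contribute nothing,
    # so absent meta attributes behave as zeros.
    return sum(v * meta2[k] for k, v in meta1.items() if k in meta2)

a = {"word:svm": 2.0, "word:kernel": 1.0}
b = {"word:svm": 3.0, "word:margin": 4.0}
print(sparse_lin_kernel(a, b))  # only "word:svm" overlaps: 2.0 * 3.0 = 6.0
```

<p>This is why sparse datasets need not register every attribute in the domain: the kernel never touches attributes an example does not carry.</p>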
<h2>Examples</h2>
<p class="header">part of <a href="svm-custom-kernel.py">svm-custom-kernel.py</a>
(uses <a href="iris.tab">iris.tab</a>)</p>
<xmp class=code>import orange, orngSVM
data=orange.ExampleTable("iris.tab")
l1=orngSVM.SVMLearner()
l1.kernelFunc=orngSVM.RBFKernelWrapper(orange.ExamplesDistanceConstructor_Euclidean(data), gamma=0.5)
l1.kernel_type=orange.SVMLearner.Custom
l1.probability=True
c1=l1(data)
l1.name="SVM - RBF(Euclidean)"

l2=orngSVM.SVMLearner()
l2.kernelFunc=orngSVM.RBFKernelWrapper(orange.ExamplesDistanceConstructor_Hamming(data), gamma=0.5)
l2.kernel_type=orange.SVMLearner.Custom
l2.probability=True
c2=l2(data)
l2.name="SVM - RBF(Hamming)"

l3=orngSVM.SVMLearner()
l3.kernelFunc=orngSVM.CompositeKernelWrapper(orngSVM.RBFKernelWrapper(orange.ExamplesDistanceConstructor_Euclidean(data), gamma=0.5),orngSVM.RBFKernelWrapper(orange.ExamplesDistanceConstructor_Hamming(data), gamma=0.5), l=0.5)
l3.kernel_type=orange.SVMLearner.Custom
l3.probability=True
c3=l3(data)
l3.name="SVM - Composite"


import orngTest, orngStat
tests=orngTest.crossValidation([l1, l2, l3], data, folds=5)
[ca1, ca2, ca3]=orngStat.CA(tests)
print l1.name, "CA:", ca1
print l2.name, "CA:", ca2
print l3.name, "CA:", ca3
</xmp>

<h2>LinearLearner</h2>
<p>A wrapper around <a href=../reference/LinearLearner.htm>orange.LinearLearner</a> with a default solver_type == L2Loss_SVM_Dual (the default in orange.LinearLearner is L2_LR).</p>
</body>
</html>