source: orange/orange/doc/reference/SupportVectorMachines.htm @ 6538:a5f65d7f0b2c

<html><HEAD>
<LINK REL=StyleSheet HREF="../style.css" TYPE="text/css" MEDIA=screen>
<LINK REL=StyleSheet HREF="style-print.css" TYPE="text/css" MEDIA=print>
</HEAD> <body>

<h1>Support Vector Machines</h1>
<index name="classes+support vector machines">

<p>The support vector machine (SVM) is a popular machine learning method
with variants for classification, regression and distribution
estimation; through the kernel trick it can learn a problem in a
higher-dimensional feature space. Orange integrates a popular
implementation by Chang and Lin, <a
href="http://www.csie.ntu.edu.tw/~cjlin/libsvm">libsvm</a>, and
currently embeds version 8.1 of the library, which supports:</p>

<ul>
 <li>C support vector classification (C_SVC)</li>
 <li>nu support vector classification (Nu_SVC)</li>
 <li>one-class distribution estimation (OneClass)</li>
 <li>epsilon support vector regression (Epsilon_SVR)</li>
 <li>nu support vector regression (Nu_SVR)</li>
</ul>

<p>Support for the following kernel functions is provided by the
libsvm library:</p>

<ul>
 <li>linear: u'*v</li>
 <li>polynomial: (gamma*u'*v + coef0)^degree</li>
 <li>radial basis function: exp(-gamma*|u-v|^2)</li>
 <li>sigmoid: tanh(gamma*u'*v + coef0)</li>
 <li>custom kernel (any Python-implemented function that computes a distance- or similarity-like measure between two examples)</li>
</ul>

<p>See also <a href="LinearLearner.htm">LinearLearner</a> for a fast linear SVM implementation.</p>

<h2>SVMLearner</h2>

<p><INDEX name="classes/SVMLearner">The SVMLearner class constructs an <INDEX name="classes/SVMClassifier">SVMClassifier.</p>
<p class=section>Attributes</p>
<dl class=attributes>
  <dt>svm_type</dt>
  <dd>Defines the type of SVM (can be SVMLearner.C_SVC (default), SVMLearner.Nu_SVC, SVMLearner.OneClass, SVMLearner.Epsilon_SVR, SVMLearner.Nu_SVR)</dd>
  <dt>kernel_type</dt>
  <dd>Defines the type of kernel to use for learning (can be SVMLearner.RBF (default), SVMLearner.Linear, SVMLearner.Polynomial, SVMLearner.Sigmoid, SVMLearner.Custom)</dd>
  <dt>degree</dt>
  <dd>Kernel parameter (Polynomial) (default 3)</dd>
  <dt>gamma</dt>
  <dd>Kernel parameter (Polynomial/RBF/Sigmoid) (default 1.0/number_of_examples)</dd>
  <dt>coef0</dt>
  <dd>Kernel parameter (Polynomial/Sigmoid) (default 0)</dd>
  <dt>kernelFunc</dt>
  <dd>Function that will be called if <code>kernel_type</code> is SVMLearner.Custom. It must accept two orange.Example arguments and return a float (see the sketch after this list).</dd>
  <dt>C</dt>
  <dd>C parameter for C_SVC, Epsilon_SVR and Nu_SVR</dd>
  <dt>nu</dt>
  <dd>Nu parameter for Nu_SVC, Nu_SVR and OneClass (default 0.5)</dd>
  <dt>p</dt>
  <dd>Epsilon in the loss function for Epsilon_SVR</dd>
  <dt>cache_size</dt>
  <dd>Cache memory size in MB (default 100)</dd>
  <dt>eps</dt>
  <dd>Tolerance of the termination criterion (default 0.001)</dd>
  <dt>shrinking</dt>
  <dd>Determines whether to use shrinking heuristics (default True)</dd>
  <dt>probability</dt>
  <dd>Determines whether a probability model should be built (default False)</dd>
</dl>
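<p>For illustration, a custom kernel is an ordinary Python function with the signature described above. The following is a minimal sketch that implements the RBF formula from the kernel list over the examples' continuous attribute values; the function name, the choice of <code>gamma</code> and the use of the iris data are illustrative only, not part of the API.</p>
<xmp class=code>import orange, math

def rbf_kernel(e1, e2, gamma=0.5):
    # squared Euclidean distance over the non-class attributes
    dist2 = sum((float(e1[i]) - float(e2[i]))**2
                for i in range(len(e1.domain.attributes)))
    return math.exp(-gamma * dist2)

data = orange.ExampleTable("iris.tab")
learner = orange.SVMLearner()
learner.kernel_type = orange.SVMLearner.Custom
learner.kernelFunc = rbf_kernel
classifier = learner(data)
</xmp>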
<h2>SVMLearnerSparse</h2>
<p><INDEX name="classes/SVMLearnerSparse">Same as above, except that it learns from the examples' meta attributes. Note that meta attributes don't need to be registered with the dataset domain or be present in all examples.
Use this learner for large sparse datasets.</p>
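<p>A minimal sketch of the intended use, assuming the usual Orange meta-attribute API (<code>orange.newmetaid</code>, <code>Domain.addmeta</code>); the data set and the meta attribute are purely illustrative:</p>
<xmp class=code>import orange

data = orange.ExampleTable("iris.tab")          # any base table

# store an extra, sparsely populated feature as a meta attribute
extra = orange.FloatVariable("extra_feature")
mid = orange.newmetaid()
data.domain.addmeta(mid, extra)                 # registration is optional here
for i in range(10):                             # only some examples carry a value
    data[i][extra] = 1.0

classifier = orange.SVMLearnerSparse()(data)
</xmp>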

<h2>SVMClassifier</h2>
<p>Classifier used for classification, regression or distribution estimation (OneClass). In the latter case, the return value of the __call__ function can be 1.0 (positive case) or -1.0 (negative case).</p>
<p>For a multiclass classification problem with k classes, k*(k-1)/2 internal one-vs-one binary classifiers are built. The multiclass prediction is then made by a majority vote among them.</p>
<p class=section>Attributes</p>
<dl class=attributes>
  <dt>examples</dt>
  <dd>Holds the examples used for training</dd>
  <dt>supportVectors</dt>
  <dd>Holds the support vectors. They are listed in the order of their classes (i.e. they are grouped by class, in the order in which the classes appear in the domain's <code>classVar.values</code>)</dd>
  <dt>nSV</dt>
  <dd>Number of support vectors for each class (in the same order as above)</dd>
  <dt>rho</dt>
  <dd>Constants of the decision functions, in the order 1 vs 2, 1 vs 3, ..., 1 vs k, 2 vs 3, 2 vs 4, ... (see the decision-function form below the list)</dd>
  <dt>coef</dt>
  <dd>Coefficients of the support vectors in the decision functions (coef[nClass-1][nSupportVectors]). If k is the total number of classes, then for each support vector of class j there are k-1 coefficients y*alpha, where the alpha are the dual solutions of the two-class problems 1 vs j, 2 vs j, ..., j-1 vs j, j vs j+1, j vs j+2, ..., j vs k; y=1 in the first j-1 coefficients and y=-1 in the remaining k-j coefficients</dd>
</dl>
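<p>Together, <code>supportVectors</code>, <code>coef</code> and <code>rho</code> describe libsvm's pairwise decision functions: for a pair of classes, the decision value has the form sum(coef*K(sv, x)) - rho, where the sum runs over the support vectors of the two classes involved, and its sign determines which of the two classes receives the vote.</p>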
<p class=section>Methods</p>
<dl class=attributes>
  <dt>getDecisionValues(example)</dt>
  <dd>Returns the decision values of all nClass*(nClass-1)/2 internal binary classifiers, in the order 1 vs 2, 1 vs 3, ..., 1 vs k, 2 vs 3, 2 vs 4, ... (as in the sketch below)</dd>
</dl>
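<p>A small sketch: on the three-class iris data there are 3*(3-1)/2 = 3 pairwise classifiers, so three decision values are returned per example.</p>
<xmp class=code>>>> import orange
>>> data = orange.ExampleTable("iris.tab")
>>> c = orange.SVMLearner()(data)
>>> len(c.getDecisionValues(data[0]))
3
</xmp>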
<h2>Examples</h2>
<xmp class=code>>>> import orange
>>> data=orange.ExampleTable("iris.tab")
>>> l=orange.SVMLearner()
>>> l.svm_type=orange.SVMLearner.Nu_SVC
>>> l.nu=0.3
>>> l.probability=True
>>> c=l(data)
>>> for e in data:
...  print e[-1], c(e), c(e, c.GetProbabilities)
...
Iris-setosa Iris-setosa <0.971, 0.015, 0.014>
Iris-setosa Iris-setosa <0.964, 0.019, 0.016>
Iris-setosa Iris-setosa <0.968, 0.016, 0.016>
...
</xmp>
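<p>The trained classifier's attributes described above can be inspected in the same session; a brief sketch (the actual values depend on the trained model):</p>
<xmp class=code>>>> nVectors = len(c.supportVectors)   # total number of support vectors
>>> perClass = c.nSV                   # support vector counts per class
>>> offsets = c.rho                    # constants of the pairwise decision functions
</xmp>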

<hr>

<H2>References</H2>

<p>Chih-Chung Chang and Chih-Jen Lin, LIBSVM: a library for support
vector machines, 2001. Software available at <a
href="http://www.csie.ntu.edu.tw/~cjlin/libsvm">http://www.csie.ntu.edu.tw/~cjlin/libsvm</a></p>

</body></html>