Changeset 9054:7fa8073a3557 in orange
 Timestamp: 10/04/11 18:08:56
 Branch: default
 Convert: d3c078f2b7f03adbadb1b004b23d4e284d8e5ea9
 Location: orange/Orange/classification/svm
 Files: 2 edited

orange/Orange/classification/svm/__init__.py
r9025 → r9054

 *********************************

-This module wraps the `LibSVM library
-<http://www.csie.ntu.edu.tw/~cjlin/libsvm/>`_, a library for `support vector
-machines <http://en.wikipedia.org/wiki/Support_vector_machine>`_ (SVM). SVM
-learners from LibSVM behave like ordinary Orange learners and can be
-used as Python objects for training, classification and evaluation. The
-implementation supports Python-based kernels, that can be plugged-in the
-LibSVM.
+This is a module for `Support Vector Machine`_ (SVM) classification. It
+exposes the underlying `LibSVM`_ and `LIBLINEAR`_ library in a standard
+Orange Learner/Classifier interface.
+
+Choosing the right learner
+==========================
+
+Choose an SVM learner suitable for the problem.
+:obj:`SVMLearner` is a general SVM learner. :obj:`SVMLearnerEasy` will
+help with the data normalization and parameter tuning. Learn with a fast
+:obj:`LinearSVMLearner` on data sets with a large number of features.

 .. note:: SVM can perform poorly on some data sets. Choose the parameters
    carefully. In cases of low classification accuracy, try scaling the
-   data and try with different parameters. :obj:`SVMLearnerEasy` class does this
-   automatically (it is similar to the `svm-easy.py` script in the LibSVM
-   distribution).
+   data and experiment with different parameters.
+   :obj:`SVMLearnerEasy` class does this automatically (it is similar
+   to the `svm-easy.py` script in the LibSVM distribution).

-SVM learners
-============
-
-Choose an SVM learner suitable for the problem. :obj:`SVMLearner` is a
-general SVM learner. Use :obj:`SVMLearnerSparse` to learn from the
-meta attributes. :obj:`SVMLearnerEasy` helps with
-the data normalization and parameter tuning. Learn with a fast
-:obj:`LinearLearner` on data sets with a large number of features.
29 SVM learners (from `LibSVM`_) 30 ============================= 31 32 The most basic :class:`SVMLearner` implements the standard `LibSVM`_ learner 33 (if you have used the `svmtrain` commandline tool this will be familiar). 34 It supports four builtin kernel types (Linear, Polynomial, RBF and Sigmoid). 35 Additionally kernel functions defined in Python can be used instead. 36 37 .. note:: For learning from ordinary :class:`Orange.data.Table` use the 38 :class:`SVMLearner`. For learning from sparse dataset (i.e. 39 data in `basket` format) use the :class:`SVMLearnerSparse` class. 40 41 .. autoclass:: Orange.classification.svm.SVMLearner 42 :members: 43 44 .. autoclass:: Orange.classification.svm.SVMLearnerSparse 45 :members: 46 :showinheritance: 47 48 .. autoclass:: Orange.classification.svm.SVMLearnerEasy 49 :members: 50 :showinheritance: 32 51 33 52 The next example shows how to use SVM learners and that :obj:`SVMLearnerEasy` 34 53 with automatic data preprocessing and parameter tuning 35 outperforms :obj:`SVMLearner` with the default :obj:`~SVMLearner.nu` and :obj:`~SVMLearner.gamma`: 54 outperforms :obj:`SVMLearner` with the default :obj:`~SVMLearner.nu` and :obj:`~SVMLearner.gamma`: 36 55 37 56 .. literalinclude:: code/svmeasy.py 38 57 39 .. autoclass:: Orange.classification.svm.SVMLearner 40 :members: 58 41 59 42 .. autoclass:: Orange.classification.svm.SVMLearnerSparse 43 :members: 60 Linear SVM learners (from `LIBLINEAR`_) 61 ======================================= 62 63 The :class:`LinearSVMLearner` learner is more suitable for large scale 64 problems as it is significantly faster then :class:`SVMLearner` and its 65 subclasses. A down side is it only supports a linear kernel (as the name 66 suggests) and does not support probability estimation for the 67 classifications. Furthermore a Multiclass SVM learner 68 :class:`MultiClassSVMLearner` is provided. 44 69 45 .. autoclass:: Orange.classification.svm. SVMLearnerEasy70 .. 
autoclass:: Orange.classification.svm.LinearSVMLearner 46 71 :members: 47 72 48 .. autoclass:: Orange.classification.svm. LinearLearner73 .. autoclass:: Orange.classification.svm.MultiClassSVMLearner 49 74 :members: 50 75 76 77 SVM Based feature selection and scoring 78 ======================================= 79 80 .. autoclass:: Orange.classification.svm.RFE 81 82 .. autoclass:: Orange.classification.svm.Score_SVMWeights 83 :showinheritance: 84 85 51 86 Utility functions 52  87 ================= 88 89 Some utility functions which are used in this module but might also 90 be of use to you. 53 91 54 92 .. automethod:: Orange.classification.svm.max_nu … … 58 96 .. automethod:: Orange.classification.svm.table_to_svm_format 59 97 60 The following example shows how to get linear SVM weights: 61 98 The following example shows how to get linear SVM weights: 99 62 100 .. literalinclude:: code/svmlinearweights.py 63 101 64 SVMderived feature weights65 66 67 .. autoclass:: Orange.classification.svm.Score_SVMWeights68 :members:69 102 70 103 .. _kernelwrapper: … … 73 106 =============== 74 107 75 Kernel wrappers are used to build custom kernels. All wrapper constructors take one 76 or more Python functions (`wrapped` attribute) to wrap. The function must be a 77 positive definite kernel, taking two floating point parameters of type double, and return a 78 floating point number. 108 Kernel wrappers are helper classes used to build custom kernels for use 109 with :class:`SVMLearner` and subclasses. All wrapper constructors take 110 one or more Python functions (`wrapped` attribute) to wrap. The 111 function must be a positive definite kernel, taking two arguments of 112 type :class:`Orange.data.Instance` and return a float. 79 113 80 114 .. autoclass:: Orange.classification.svm.kernels.KernelWrapper … … 102 136 :members: 103 137 104 .. autoclass:: Orange.classification.svm.kernels.BagOfWords105 :members:106 107 138 Example: 108 139 109 140 .. 
literalinclude:: code/svmcustomkernel.py 110 111 112 113 SVMbased recursive feature elimination114 =======================================115 116 .. autoclass:: Orange.classification.svm.RFE117 :members:118 141 119 142 … … 125 148 .. _vehicle.tab: code/vehicle.tab 126 149 150 .. _`Support Vector Machine`: http://en.wikipedia.org/wiki/Support_vector_machine 151 .. _`LibSVM`: http://www.csie.ntu.edu.tw/~cjlin/libsvm/ 152 .. _`LIBLINEAR`: http://www.csie.ntu.edu.tw/~cjlin/liblinear/ 153 127 154 """ 128 155 … … 145 172 SVMClassifier, \ 146 173 SVMClassifierSparse 147 148 # ORANGE Support Vector Machines 149 # This module was written by Ales Erjavec 150 # and supersedes an earlier one written by Alex Jakulin (jakulin@acm.org), 151 # based on: ChihChung Chang and ChihJen Lin's 152 # LIBSVM : a library for support vector machines 153 # (http://www.csie.ntu.edu.tw/~cjlin/papers/libsvm.ps.gz) 154 155 #from Orange.misc import _orange__new__ 156 174 175 from Orange.preprocess import Preprocessor_impute, \ 176 Preprocessor_continuize, \ 177 Preprocessor_preprocessorList, \ 178 DomainContinuizer 179 180 from Orange.misc import _orange__new__ 181 157 182 def _orange__new__(base=Orange.core.Learner): 158 183 """Return an orange 'schizofrenic' __new__ class method. … … 168 193 from functools import wraps 169 194 @wraps(base.__new__) 170 def _orange__new_wrapped(cls, data=None, **kwargs):195 def _orange__new_wrapped(cls, data=None, weight_id=None, **kwargs): 171 196 self = base.__new__(cls, **kwargs) 172 197 if data: 173 198 self.__init__(**kwargs) 174 return self.__call__(data )199 return self.__call__(data, weight_id) 175 200 else: 176 201 return self … … 178 203 179 204 def max_nu(data): 180 """Return the maximum nu parameter for Nu_SVC support vector learning 181 for the given data.182 183 :param data: data with continuous features205 """Return the maximum nu parameter for Nu_SVC support vector learning 206 for the given data table. 
207 208 :param data: Data with discrete class variable 184 209 :type data: Orange.data.Table 185 210 … … 197 222 198 223 class SVMLearner(_SVMLearner): 199 """:param svm_type: defines the SVM type (can be C_SVC, Nu_SVC 224 """ 225 :param svm_type: defines the SVM type (can be C_SVC, Nu_SVC 200 226 (default), OneClass, Epsilon_SVR, Nu_SVR) 201 227 :type svm_type: SVMLearner.SVMType … … 203 229 (can be kernels.RBF (default), kernels.Linear, kernels.Polynomial, 204 230 kernels.Sigmoid, kernels.Custom) 205 :type kernel_type: kernel function, see :ref:`kernelwrapper`231 :type kernel_type: SVMLearner.Kernel 206 232 :param degree: kernel parameter (for Polynomial) (default 3) 207 233 :type degree: int … … 212 238 :type coef0: int 213 239 :param kernel_func: function that will be called if `kernel_type` is 214 `Custom`. It must accept two :obj:`Orange.data.Instance` arguments and 215 return a distance between the instances. 240 `kernels.Custom`. It must accept two :obj:`Orange.data.Instance` 241 arguments and return a float (see :ref:`kernelwrapper` for some 242 examples). 
216 243 :type kernel_func: callable function 217 :param C: C parameter for C_SVC, Epsilon_SVR ,Nu_SVR244 :param C: C parameter for C_SVC, Epsilon_SVR and Nu_SVR 218 245 :type C: float 219 246 :param nu: Nu parameter for Nu_SVC, Nu_SVR and OneClass (default 0.5) … … 221 248 :param p: epsilon in lossfunction for Epsilon_SVR 222 249 :type p: float 223 :param cache_size: cache memory size in MB (default 100)250 :param cache_size: cache memory size in MB (default 200) 224 251 :type cache_size: int 225 252 :param eps: tolerance of termination criterion (default 0.001) … … 231 258 (default True) 232 259 :type shrinking: bool 233 :param weight s: a list of class weights234 :type weight s: list260 :param weight: a list of class weights 261 :type weight: list 235 262 236 263 Example: 237 264 238 265 >>> import Orange 266 >>> from Orange.classification import svm 267 >>> from Orange.evaluation import testing, scoring 239 268 >>> table = Orange.data.Table("vehicle.tab") 240 >>> svm = Orange.classification.svm.SVMLearner()241 >>> results = Orange.evaluation.testing.cross_validation([svm], table, folds=5)242 >>> print Orange.evaluation.scoring.CA(results)269 >>> learner = svm.SVMLearner() 270 >>> results = testing.cross_validation([learner], table, folds=5) 271 >>> print scoring.CA(results) 243 272 244 273 """ … … 281 310 282 311 def __call__(self, data, weight=0): 283 """Construct a SVM classifier .312 """Construct a SVM classifier 284 313 285 314 :param table: data with continuous features 286 315 :type table: Orange.data.Table 316 287 317 :param weight: refer to `LibSVM documentation 288 318 <http://http://www.csie.ntu.edu.tw/~cjlin/libsvm/>`_ … … 291 321 292 322 examples = Orange.core.Preprocessor_dropMissingClasses(data) 323 class_var = examples.domain.class_var 293 324 if len(examples) == 0: 294 325 raise ValueError("Example table is without any defined classes") 326 327 # Fix the svm_type parameter if we have a class_var/svm_type mismatch 295 328 if self.svm_type in [0,1] 
and \ 296 examples.domain.classVar.varType!=Orange.data.Type.Discrete:297 self.svm_type +=3329 isinstance(class_var, Orange.data.variable.Continuous): 330 self.svm_type += 3 298 331 #raise AttributeError, "Cannot learn a discrete classifier from non descrete class data. Use EPSILON_SVR or NU_SVR for regression" 299 332 if self.svm_type in [3,4] and \ 300 examples.domain.classVar.varType==Orange.data.Type.Discrete:301 self.svm_type =3333 isinstance(class_var, Orange.data.variable.Discrete): 334 self.svm_type = 3 302 335 #raise AttributeError, "Cannot do regression on descrete class data. Use C_SVC or NU_SVC for classification" 303 if self.kernel_type==4 and not self.kernel_func: 304 raise AttributeError, "Custom kernel function not supplied" 305 ################################################## 306 # if self.kernel_type==4: #There is a bug in svm. For some unknown reason only the probability model works with custom kernels 307 # self.probability=True 308 ################################################## 336 if self.kernel_type == kernels.Custom and not self.kernel_func: 337 raise ValueError("Custom kernel function not supplied") 338 309 339 nu = self.nu 310 if self.svm_type == SVMLearner.Nu_SVC: #is nu feasib ile340 if self.svm_type == SVMLearner.Nu_SVC: #is nu feasible 311 341 max_nu= self.max_nu(examples) 312 342 if self.nu > max_nu: … … 329 359 data = self._normalize(data) 330 360 svm = self.learner(data) 331 # if self.:332 # return SVMClassifierWrapper(svm)333 # else:334 361 return SVMClassifierWrapper(svm) 335 362 return self.learner(data) … … 338 365 def tune_parameters(self, data, parameters=None, folds=5, verbose=0, 339 366 progress_callback=None): 340 """Tune parameters on given datausing367 """Tune the ``parameters`` on given ``data`` using 341 368 cross validation. 
342 369 … … 357 384 >>> svm = Orange.classification.svm.SVMLearner() 358 385 >>> svm.tune_parameters(table, parameters=["gamma"], folds=3) 359 386 360 387 """ 361 388 … … 389 416 def _normalize(self, data): 390 417 dc = Orange.core.DomainContinuizer() 391 dc.class Treatment = Orange.core.DomainContinuizer.Ignore392 dc.continuous Treatment = Orange.core.DomainContinuizer.NormalizeBySpan393 dc.multinomial Treatment = Orange.core.DomainContinuizer.NValues418 dc.class_treatment = Orange.core.DomainContinuizer.Ignore 419 dc.continuous_treatment = Orange.core.DomainContinuizer.NormalizeBySpan 420 dc.multinomial_treatment = Orange.core.DomainContinuizer.NValues 394 421 newdomain = dc(data) 395 422 return data.translate(newdomain) … … 417 444 def class_distribution(self, example): 418 445 example = Orange.data.Instance(self.wrapped.domain, example) 419 return self.wrapped.class Distribution(example)446 return self.wrapped.class_distribution(example) 420 447 421 448 def get_decision_values(self, example): 422 449 example = Orange.data.Instance(self.wrapped.domain, example) 423 return self.wrapped.get DecisionValues(example)450 return self.wrapped.get_decision_values(example) 424 451 425 452 def get_model(self): 426 return self.wrapped.get Model()453 return self.wrapped.get_model() 427 454 428 455 def __reduce__(self): … … 439 466 class SVMLearnerSparse(SVMLearner): 440 467 441 """A :class:`SVMLearner` that learns from 468 """A :class:`SVMLearner` that learns from 442 469 meta attributes. 443 470 444 471 Meta attributes do not need to be registered with the data set domain, or 445 472 present in all the instances. Use this for large … … 458 485 :func:`SVMLearner.tune_parameters`. It is similar to the easy.py script in 459 486 the LibSVM package. 460 487 461 488 """ 462 489 … … 508 535 SVMLearnerSparse.__init__(self, **kwds) 509 536 510 class LinearLearner(Orange.core.LinearLearner): 511 """A fast learner (LinearLearner) with a default solver type 512 ``L2Loss_SVM_Dual``. 
513 """ 514 515 def __new__(cls, data=None, weightId=0, **kwargs): 516 self = Orange.core.LinearLearner.__new__(cls, **kwargs) 517 if data: 518 self.__init__(**kwargs) 519 return self.__call__(data, weightId) 520 else: 521 return self 522 523 def __init__(self, **kwargs): 524 if "solver_type" not in kwargs: 525 #The default in Orange.core.LinearLearner is L2_LR 526 kwargs["solver_type"] = Orange.core.LinearLearner.L2Loss_SVM_Dual 537 def default_preprocessor(): 538 # Construct and return a default preprocessor for use by 539 # Orange.core.LinearLearner learner. 540 impute = Preprocessor_impute() 541 cont = Preprocessor_continuize(multinomialTreatment= 542 DomainContinuizer.AsOrdinal) 543 preproc = Preprocessor_preprocessorList(preprocessors= 544 [impute, cont]) 545 return preproc 546 547 class LinearSVMLearner(Orange.core.LinearLearner): 548 """Train a linear SVM model.""" 549 550 L2R_L2LOSS_DUAL = Orange.core.LinearLearner.L2R_L2Loss_SVC_Dual 551 L2R_L2LOSS = Orange.core.LinearLearner.L2R_L2Loss_SVC 552 L2R_L1LOSS_DUAL = Orange.core.LinearLearner.L2R_L1Loss_SVC_Dual 553 L2R_L1LOSS_DUAL = Orange.core.LinearLearner.L2R_L2Loss_SVC_Dual 554 L1R_L2LOSS = Orange.core.LinearLearner.L1R_L2Loss_SVC 555 556 __new__ = _orange__new__(base=Orange.core.LinearLearner) 557 558 def __init__(self, solver_type=L2R_L2LOSS_DUAL, C=1.0, eps=0.01, **kwargs): 559 """ 560 :param solver_type: Can be one of class constants: 561 562  L2R_L2LOSS_DUAL 563  L2R_L2LOSS 564  L2R_L1LOSS_DUAL 565  L2R_L1LOSS 566  L1R_L2LOSS 567 568 :param C: Regularization parameter (default 1.0) 569 :type C: float 570 571 :param eps: Stopping criteria (default 0.01) 572 :type eps: float 573 574 """ 575 self.solver_type = solver_type 576 self.eps = eps 577 self.C = C 527 578 for name, val in kwargs.items(): 528 579 setattr(self, name, val) 580 if self.solver_type not in [self.L2R_L2LOSS_DUAL, self.L2R_L2LOSS, 581 self.L2R_L1LOSS_DUAL, self.L2R_L1LOSS_DUAL, self.L1R_L2LOSS]: 582 pass 583 # raise ValueError("Invalid 
solver_type parameter.") 584 585 self.preproc = default_preprocessor() 586 587 def __call__(self, instances, weight_id=None): 588 instances = self.preproc(instances) 589 classifier = super(LinearSVMLearner, self).__call__(instances, weight_id) 590 return classifier 591 592 LinearLearner = LinearSVMLearner 593 594 class MultiClassSVMLearner(Orange.core.LinearLearner): 595 """ Multiclass SVM (Crammer and Singer) from the `LIBLINEAR`_ library. 596 """ 597 __new__ = _orange__new__(base=Orange.core.LinearLearner) 598 599 def __init__(self, C=1.0, eps=0.01, **kwargs): 600 """\ 601 :param C: Regularization parameter (default 1.0) 602 :type C: float 603 604 :param eps: Stopping criteria (default 0.01) 605 :type eps: float 606 607 """ 608 self.C = C 609 self.eps = eps 610 for name, val in kwargs.items(): 611 setattr(self, name, val) 612 613 self.solver_type = self.MCSVM_CS 614 self.preproc = default_preprocessor() 615 616 def __call__(self, instances, weight_id=None): 617 instances = self.preproc(instances) 618 classifier = super(MultiClassSVMLearner, self).__call__(instances, weight_id) 619 return classifier 620 621 #TODO: Unified way to get attr weights for linear SVMs. 529 622 530 623 def get_linear_svm_weights(classifier, sum=True): 531 624 """Extract attribute weights from the linear SVM classifier. 532 625 533 For multi class classification the weights are squaresummed over all binary 534 one vs. one classifiers. If obj:`sum` is False, the reported weights are a 535 seqeunce: class1 vs class2, class1 vs class3 ... class2 vs class3 ... . 626 For multi class classification the weights are squaresummed over all 627 binary one vs. one classifiers unles obj:`sum` is False, in which case 628 the return value is a list of weights for each individual binary 629 classifier (in the order of [class1 vs class2, class1 vs class3 ... class2 630 vs class3 ...]). 
536 631 537 632 """ … … 546 641 return float(val) if not val.isSpecial() else 0.0 547 642 548 SVs=classifier.supportVectors 549 weights=[] 550 classes=classifier.supportVectors.domain.classVar.values 551 classSV=dict([(value, filter(lambda sv: sv.getclass()==value, \ 552 classifier.supportVectors)) \ 553 for value in classes]) 554 svRanges=[(0, classifier.nSV[0])] 555 for n in classifier.nSV[1:]: 556 svRanges.append((svRanges[1][1], svRanges[1][1]+n)) 557 for i in range(len(classes)1): 643 SVs=classifier.support_vectors 644 weights = [] 645 646 class_var = SVs.domain.class_var 647 if classifier.svm_type in [SVMLearner.C_SVC, SVMLearner.Nu_SVC]: 648 classes = class_var.values 649 else: 650 classes = [""] 651 if len(classes) > 1: 652 sv_ranges = [(0, classifier.nSV[0])] 653 for n in classifier.nSV[1:]: 654 sv_ranges.append((sv_ranges[1][1], sv_ranges[1][1]+n)) 655 else: 656 sv_ranges = [(0, len(SVs))] 657 658 for i in range(len(classes)  1): 558 659 for j in range(i+1, len(classes)): 559 w ={}560 coef Ind=j1561 for sv Ind in apply(range, svRanges[i]):660 w = {} 661 coef_ind = j  1 662 for sv_ind in range(*sv_ranges[i]): 562 663 attributes = SVs.domain.attributes + \ 563 SVs[svInd].getmetas(False, Orange.data.variable.Variable).keys() 664 SVs[sv_ind].getmetas(False, Orange.data.variable.Variable).keys() 665 for attr in attributes: 666 if attr.varType == Orange.data.Type.Continuous: 667 update_weights(w, attr, to_float(SVs[sv_ind][attr]), \ 668 classifier.coef[coef_ind][sv_ind]) 669 coef_ind=i 670 for sv_ind in range(*sv_ranges[j]): 671 attributes = SVs.domain.attributes + \ 672 SVs[sv_ind].getmetas(False, Orange.data.variable.Variable).keys() 564 673 for attr in attributes: 565 674 if attr.varType==Orange.data.Type.Continuous: 566 update_weights(w, attr, to_float(SVs[svInd][attr]), \ 567 classifier.coef[coefInd][svInd]) 568 coefInd=i 569 for svInd in apply(range, svRanges[j]): 570 attributes = SVs.domain.attributes + \ 571 SVs[svInd].getmetas(False, 
Orange.data.variable.Variable).keys() 572 for attr in attributes: 573 if attr.varType==Orange.data.Type.Continuous: 574 update_weights(w, attr, to_float(SVs[svInd][attr]), \ 575 classifier.coef[coefInd][svInd]) 675 update_weights(w, attr, to_float(SVs[sv_ind][attr]), \ 676 classifier.coef[coef_ind][sv_ind]) 576 677 weights.append(w) 577 678 … … 580 681 581 682 for w in weights: 582 for attr, w Attr in w.items():583 scores[attr] += w Attr**2683 for attr, w_attr in w.items(): 684 scores[attr] += w_attr**2 584 685 for key in scores: 585 686 scores[key] = math.sqrt(scores[key]) … … 593 694 sum=0 594 695 for attr, w in weights.items(): 595 sum +=float(example[attr])*w696 sum += float(example[attr]) * w 596 697 return sum 597 698 … … 599 700 600 701 class Score_SVMWeights(Orange.feature.scoring.Score): 601 602 """Base: :obj:`Orange.feature.scoring.Score` 603 604 Score feature by training a linear SVM classifier, using a squared sum of 702 """Score feature by training a linear SVM classifier, using a squared sum of 605 703 weights (of each binary classifier) as the returned score. 606 704 … … 625 723 626 724 def __init__(self, learner=None, **kwargs): 627 """:param learner: Learner used for weight estimation 725 """ 726 :param learner: Learner used for weight estimation 628 727 (default LinearLearner(solver_type=L2Loss_SVM_Dual)) 629 728 :type learner: Orange.core.Learner … … 661 760 662 761 >>> rfe = RFE(SVMLearner(kernel_type=kernels.Linear, \ 663 normalization=False)) # normalization=False doesnot change the domain762 normalization=False)) # normalization=False > do not change the domain 664 763 >>> data_with_removed_features = rfe(table, 5) # table with 5 best attributes 665 764 … … 675 774 A score is a step number at which the attribute 676 775 was removed from the recursive evaluation. 
776 677 777 """ 678 778 iter = 1 … … 753 853 for i, attr in enumerate(attrs): 754 854 if not ex[attr].isSpecial(): 755 file.write(" "+str(i+1)+":"+str( ex[attr]))855 file.write(" "+str(i+1)+":"+str(float(ex[attr]))) 756 856 file.write("\n") 757 857 
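The `table_to_svm_format` hunk above writes each instance in LibSVM's sparse text format: the class value followed by 1-based `index:value` pairs, with missing (special) values skipped. A minimal standalone sketch of that format, using a hypothetical `instance_to_svm_line` helper over plain Python values rather than Orange's actual `Orange.data` API:

```python
def instance_to_svm_line(class_value, feature_values):
    """Format one instance as a LibSVM sparse-format line:
    '<class> <i>:<value> ...' with 1-based feature indices.
    None stands in for a missing value and is skipped, mirroring
    the isSpecial() check in the patched table_to_svm_format."""
    parts = [str(class_value)]
    for i, value in enumerate(feature_values):
        if value is not None:
            parts.append("%d:%s" % (i + 1, float(value)))
    return " ".join(parts)

print(instance_to_svm_line(1, [0.5, None, 2.0]))  # -> 1 1:0.5 3:2.0
```

Note that, as in the patched code, the feature index written is the position in the attribute list plus one; LibSVM rejects 0-based indices.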
orange/Orange/classification/svm/kernels.py
r9013 r9054 15 15 16 16 :param wrapped: a function to wrap 17 :type wrapped: function( double, double)17 :type wrapped: function(:class:`Orange.data.Instance`, :class:`Orange.data.Instance`) 18 18 19 19 """ … … 30 30 31 31 :param wrapped1: a function to wrap 32 :type wrapped1: function( double, double)32 :type wrapped1: function(:class:`Orange.data.Instance`, :class:`Orange.data.Instance`) 33 33 :param wrapped2: a function to wrap 34 :type wrapped2: function( double, double)34 :type wrapped2: function(:class:`Orange.data.Instance`, :class:`Orange.data.Instance`) 35 35 36 36 """ … … 46 46 47 47 :param wrapped: a function to wrap 48 :type wrapped: function( double, double)48 :type wrapped: function(:class:`Orange.data.Instance`, :class:`Orange.data.Instance`) 49 49 :param gamma: the gamma of the RBF 50 50 :type gamma: double … … 69 69 70 70 :param wrapped: a function to wrap 71 :type wrapped: function( double, double)72 :param degree: degree of the pol inomial71 :type wrapped: function(:class:`Orange.data.Instance`, :class:`Orange.data.Instance`) 72 :param degree: degree of the polynomial 73 73 :type degree: double 74 74 … … 113 113 114 114 :param wrapped1: a function to wrap 115 :type wrapped1: function( double, double)115 :type wrapped1: function(:class:`Orange.data.Instance`, :class:`Orange.data.Instance`) 116 116 :param wrapped2: a function to wrap 117 :type wrapped2: function( double, double)117 :type wrapped2: function(:class:`Orange.data.Instance`, :class:`Orange.data.Instance`) 118 118 :param l: coefficient 119 119 :type l: double … … 138 138 139 139 """ 140 s=set(example1.getmetas().keys()+example2.getmetas().keys())141 sum=0142 getmeta=lambda e: e.hasmeta(key) and float(e[key]) or 0.0143 for key in s:144 sum+=pow(getmeta(example2)getmeta(example1), 2)145 return pow(sum, 0.5)146 147 class BagOfWords(object):148 def __call__(self, example1, example2):149 """Computes a BOW kernel function:150 151 :math:`\sum_{i=1}^n example1_i * example2_i`152 153 using the 
examples meta attributes (need to be floats).154 155 """156 140 s = set(example1.getmetas().keys()) & set(example2.getmetas().keys()) 157 141 sum = 0
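The kernel wrappers documented above take functions over pairs of `Orange.data.Instance` objects. As a plain-Python illustration of the idea behind `RBFKernelWrapper` — assuming it computes `exp(-gamma * wrapped(x, y)**2)` — here is a hypothetical `rbf_kernel_wrapper` that wraps an ordinary distance function over tuples instead of Orange instances:

```python
import math

def rbf_kernel_wrapper(dist, gamma=1.0):
    # Wrap a custom distance function d(x, y) into an RBF-style
    # kernel K(x, y) = exp(-gamma * d(x, y)**2).  Illustrative
    # sketch only; the real RBFKernelWrapper operates on
    # Orange.data.Instance arguments.
    def kernel(x, y):
        return math.exp(-gamma * dist(x, y) ** 2)
    return kernel

def euclidean(x, y):
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

k = rbf_kernel_wrapper(euclidean, gamma=0.5)
print(k((0.0, 0.0), (1.0, 0.0)))  # exp(-0.5)
```

Composed this way, any positive definite distance-based function can be plugged into `SVMLearner(kernel_type=kernels.Custom, kernel_func=...)`, which is the use case the wrapper classes exist for.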