Changeset 10077:aafe4468ce2d in orange


Timestamp: 02/08/12 15:20:45
Author: ales_erjavec
Branch: default
Message: Camel-case parameters and attributes renamed to underscore style; documentation fixes.

Files: 3 edited

Legend:

    ' ' unmodified
    '-' removed
    '+' added
  • Orange/optimization/__init__.py

    (r9671 → r10077)
    @@ -15,7 +15,7 @@
     :obj:`Orange.optimization.Tune1Parameter` for fitting a single parameter and
     :obj:`Orange.optimization.TuneMParameters` fitting multiple parameters at once,
    -trying all possible combinations. When called with examples and, optionally, id
    +trying all possible combinations. When called with data and, optionally, id
     of meta attribute with weights, they find the optimal setting of arguments
    -using the cross validation. The classes can also be used as ordinary learning
    +using cross validation. The classes can also be used as ordinary learning
     algorithms - they are in fact derived from
     :obj:`Orange.classification.Learner`.
     
    @@ -38,5 +38,5 @@
    -
     Some models may perform well in terms of AUC which measures the ability to
    -distinguish between examples of two classes, but have low classifications
    +distinguish between instances of two classes, but have low classifications
     accuracies. The reason may be in the threshold: in binary problems, classifiers
     usually classify into the more probable class, while sometimes, when class
     
    @@ -65,16 +65,12 @@
         With threshold at 0.80: 0.449
    -
    -shows that fitting threshold is good (well, although 2.5 percent increase in
    -the accuracy absolutely guarantees you a publication at ICML, the difference is
    -still unimportant), while setting it at 80% is a bad idea. Or is it?
    -
     part of :download:`optimization-thresholding2.py <code/optimization-thresholding2.py>`
    -
     .. literalinclude:: code/optimization-thresholding2.py
    -
    -The script first divides the data into training and testing examples. It trains
    +The script first divides the data into training and testing subsets. It trains
     a naive Bayesian classifier and then wraps it into
     :obj:`Orange.optimization.ThresholdClassifier` instances with thresholds of .2, .5 and
    -.8. The three models are tested on the left-out examples, and we compute the
    +.8. The three models are tested on the left-out data, and we compute the
     confusion matrices from the results. The printout::
    -
     
    @@ -97,22 +93,21 @@
     import Orange.misc
    -
    +from Orange.misc import (deprecated_class_attribute, deprecated_keywords,
    +                         deprecated_members)
    +
     class TuneParameters(Orange.classification.Learner):
    -
    -    """.. attribute:: examples
    +    """.. attribute:: data
    -
             Data table with either discrete or continuous features
    -
    -    .. attribute:: weightID
    -
    -        The ID of the weight meta attribute
    -
    -    .. attribute:: object
    +    .. attribute:: weight_id
    +
    +        The id of the weight meta attribute
    +
    +    .. attribute:: learner
    -
             The learning algorithm whose parameters are to be tuned. This can be,
    -        for instance, :obj:`Orange.classification.tree.TreeLearner`. You will
    -        usually use the wrapped learners from modules, not the built-in
    -        classifiers, such as :obj:`Orange.classification.tree.TreeLearner`
    -        directly, since the arguments to be fitted are easier to address in the
    -        wrapped versions. But in principle it doesn't matter.
    +        for instance, :obj:`Orange.classification.tree.TreeLearner`.
    -
         .. attribute:: evaluate
     
    @@ -135,21 +130,21 @@
         The function used to compare the results. The function should accept
         two arguments (e.g. two classification accuracies, AUCs or whatever the
    -        result of evaluate is) and return a positive value if the first
    +        result of ``evaluate`` is) and return a positive value if the first
         argument is better, 0 if they are equal and a negative value if the
    -        first is worse than the second. The default compare function is cmp.
    -        You don't need to change this if evaluate is such that higher values
    -        mean a better classifier.
    -
    -    .. attribute:: returnWhat
    +        first is worse than the second. The default compare function is
    +        ``cmp``. You don't need to change this if ``evaluate`` is such that
    +        higher values mean a better classifier.
    +
    +    .. attribute:: return_what
    -
         Decides what the result of tuning should be. Possible values are:
    -
    -        * TuneParameters.returnNone (or 0): tuning will return nothing,
    -        * TuneParameters.returnParameters (or 1): return the optimal value(s) of parameter(s),
    -        * TuneParameters.returnLearner (or 2): return the learner set to optimal parameters,
    -        * TuneParameters.returnClassifier (or 3): return a classifier trained with the optimal parameters on the entire data set. This is the default setting.
    -
    -        Regardless of this, the learner (given as object) is left set to the
    -        optimal parameters.
    +        * ``TuneParameters.RETURN_NONE`` (or 0): tuning will return nothing,
    +        * ``TuneParameters.RETURN_PARAMETERS`` (or 1): return the optimal value(s) of parameter(s),
    +        * ``TuneParameters.RETURN_LEARNER`` (or 2): return the learner set to the optimal parameters,
    +        * ``TuneParameters.RETURN_CLASSIFIER`` (or 3): return a classifier trained with the optimal parameters on the entire data set. This is the default setting.
    +
    +        Regardless of this, the learner (given as parameter ``learner``) is
    +        left set to the optimal parameters.
    -
         .. attribute:: verbose
     
    @@ -160,24 +155,37 @@
    -
        If the tuner returns the classifier, it behaves as a learning algorithm. As the
    -    examples below will demonstrate, it can be called, given the examples and
    +    examples below will demonstrate, it can be called, given the data, and
        the result is a "trained" classifier. It can, for instance, be used in
        cross-validation.
    -
    -    Out of these attributes, the only necessary argument is object. The real
    -    tuning classes add two additional - the attributes that tell what
    -    parameter(s) to optimize and which values to use.
    +    Out of these attributes, the only necessary argument is ``learner``. The
    +    real tuning classes (subclasses of this class) add two more:
    +    the attributes that tell what parameter(s) to optimize and which
    +    values to use.
    -
        """
    -
    -    returnNone=0
    -    returnParameters=1
    -    returnLearner=2
    -    returnClassifier=3
    -
    -    def __new__(cls, examples = None, weightID = 0, **argkw):
    +    RETURN_NONE = 0
    +    RETURN_PARAMETERS = 1
    +    RETURN_LEARNER = 2
    +    RETURN_CLASSIFIER = 3
    +
    +    returnNone = \
    +        deprecated_class_attribute("returnNone", "RETURN_NONE")
    +    returnParameters = \
    +        deprecated_class_attribute("returnParameters", "RETURN_PARAMETERS")
    +    returnLearner = \
    +        deprecated_class_attribute("returnLearner", "RETURN_LEARNER")
    +    returnClassifier = \
    +        deprecated_class_attribute("returnClassifier", "RETURN_CLASSIFIER")
    +
    +    @deprecated_keywords({"examples": "data", "weightID": "weight_id"})
    +    def __new__(cls, data = None, weight_id = 0, **argkw):
            self = Orange.classification.Learner.__new__(cls, **argkw)
    -        self.__dict__.update(argkw)
    -        if examples:
    -            return self.__call__(examples, weightID)
    +        if data:
    +            for name, value in argkw.items():
    +                setattr(self, name, value)
    +            self.__init__(**argkw)
    +            return self.__call__(data, weight_id)
            else:
                return self
     
    @@ -185,10 +193,16 @@
        def findobj(self, name):
            import string
    -        names=string.split(name, ".")
    -        lastobj=self.object
    +        names = string.split(name, ".")
    +        lastobj = self.object
            for i in names[:-1]:
    -            lastobj=getattr(lastobj, i)
    +            lastobj = getattr(lastobj, i)
            return lastobj, names[-1]
    -
    +
    +TuneParameters = deprecated_members(
    +    {"returnWhat": "return_what",
    +     "object": "learner"},
    +    )(TuneParameters)
    +
    +
     class Tune1Parameter(TuneParameters):
    -
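
    An aside on the deprecation shims: deprecated_keywords, deprecated_class_attribute
    and deprecated_members all come from Orange.misc (imported at the top of this
    changeset). The sketch below shows the rough idea behind deprecated_keywords only;
    it is an illustration, not the actual Orange.misc implementation:

        import functools
        import warnings

        def deprecated_keywords(mapping):
            """Remap old keyword-argument names to new ones, with a warning."""
            def decorator(func):
                @functools.wraps(func)
                def wrapper(*args, **kwargs):
                    for old, new in mapping.items():
                        if old in kwargs:
                            warnings.warn("%r is deprecated; use %r" % (old, new),
                                          DeprecationWarning, stacklevel=2)
                            kwargs[new] = kwargs.pop(old)
                    return func(*args, **kwargs)
                return wrapper
            return decorator

    This is why calls written against the old camel-case API keep working after the
    rename, only with a DeprecationWarning.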
     
    @@ -212,17 +226,17 @@
            :lines: 3-11
    -
    -    Set up like this, when the tuner is called, set learner.minSubset to 1, 2,
    -    3, 4, 5, 10, 15 and 20, and measure the AUC in 5-fold cross validation. It
    -    will then reset the learner.minSubset to the optimal value found and, since
    -    we left returnWhat at the default (returnClassifier), construct and return
    -    the classifier from the entire data set. So, what we get is a classifier,
    -    but if we'd also like to know what the optimal value was, we can get it
    -    from learner.minSubset.
    +    Set up like this, the tuner will, when called, set ``learner.min_subset``
    +    to 1, 2, 3, 4, 5, 10, 15 and 20, and measure the AUC in 5-fold cross
    +    validation. It will then reset ``learner.min_subset`` to the optimal value
    +    found and, since we left ``return_what`` at the default
    +    (``RETURN_CLASSIFIER``), construct and return the classifier from the
    +    entire data set. So, what we get is a classifier, but if we'd also like
    +    to know what the optimal value was, we can get it from
    +    ``learner.min_subset``.
    -
        Tuning is of course not limited to setting numeric parameters. You can, for
        instance, try to find the optimal criterion for assessing the quality of
    -    attributes by tuning parameter="measure", trying settings like
    -    values=[orange.MeasureAttribute_gainRatio(),
    -    orange.MeasureAttribute_gini()].
    +    attributes by tuning ``parameter="measure"``, trying settings like
    +    ``values=[Orange.feature.scoring.GainRatio(), Orange.feature.scoring.Gini()]``.
    -
        Since the tuner returns a classifier and thus behaves like a learner, it
     
    @@ -236,10 +250,11 @@
            :lines: 13-18
    -
    -    This will take some time: for each of 8 values for minSubset it will
    +    This can be time consuming: for each of 8 values for ``min_subset`` it will
        perform 5-fold cross validation inside a 10-fold cross validation -
        altogether 400 trees. Plus, it will learn the optimal tree afterwards for
    -    each fold. Add a tree without tuning, and you get 420 trees build.
    -
    -    Well, not that long, and the results are good::
    +    each fold. Adding a tree without tuning brings the total to 420 trees
    +    built.
    +
    +    Nevertheless, the results are good::
    -
            Untuned tree: 0.930
     
    @@ -248,10 +263,10 @@
        """
    -
    -    def __call__(self, table, weight=None, verbose=0):
    +    def __call__(self, data, weight=None, verbose=0):
            verbose = verbose or getattr(self, "verbose", 0)
            evaluate = getattr(self, "evaluate", Orange.evaluation.scoring.CA)
            folds = getattr(self, "folds", 5)
            compare = getattr(self, "compare", cmp)
    -        returnWhat = getattr(self, "returnWhat",
    +        return_what = getattr(self, "return_what",
                                 Tune1Parameter.returnClassifier)
    -
     
    @@ -261,8 +276,8 @@
                to_set = [self.findobj(self.parameter)]
    -
    -        cvind = Orange.core.MakeRandomIndicesCV(table, folds)
    -        findBest = Orange.misc.selection.BestOnTheFly(seed = table.checksum(),
    +        cvind = Orange.core.MakeRandomIndicesCV(data, folds)
    +        findBest = Orange.misc.selection.BestOnTheFly(seed = data.checksum(),
                                             callCompareOn1st = True)
    -        tableAndWeight = weight and (table, weight) or table
    +        tableAndWeight = weight and (data, weight) or data
            for par in self.values:
                for i in to_set:
     
    @@ -281,13 +296,15 @@
                print "*** Optimal parameter: %s = %s" % (self.parameter, bestpar)
    -
    -        if returnWhat==Tune1Parameter.returnNone:
    +        if return_what==Tune1Parameter.returnNone:
                return None
    -        elif returnWhat==Tune1Parameter.returnParameters:
    +        elif return_what==Tune1Parameter.returnParameters:
                return bestpar
    -        elif returnWhat==Tune1Parameter.returnLearner:
    +        elif return_what==Tune1Parameter.returnLearner:
                return self.object
            else:
    -            classifier = self.object(table)
    -            classifier.setattr("fittedParameter", bestpar)
    +            classifier = self.object(data)
    +            if not Orange.misc.environ.orange_no_deprecated_members:
    +                classifier.setattr("fittedParameter", bestpar)
    +            classifier.setattr("fitted_parameter", bestpar)
                return classifier
    -
     
    @@ -303,24 +320,20 @@
            and its possible values.
    -
    -    For exercise we can try to tune both settings mentioned above, the minimal
    -    number of examples in leaves and the splitting criteria by setting the
    -    tuner as follows:
    +    For example, we can try to tune both the minimal number of instances in
    +    leaves and the splitting criterion by setting the tuner as follows:
    -
        :download:`optimization-tuningm.py <code/optimization-tuningm.py>`
    -
        .. literalinclude:: code/optimization-tuningm.py
    -
    -    Everything else stays like above, in examples for
    -    :obj:`Orange.optimization.Tune1Parameter`.
    -
        """
    -
    -    def __call__(self, table, weight=None, verbose=0):
    +    def __call__(self, data, weight=None, verbose=0):
            evaluate = getattr(self, "evaluate", Orange.evaluation.scoring.CA)
            folds = getattr(self, "folds", 5)
            compare = getattr(self, "compare", cmp)
            verbose = verbose or getattr(self, "verbose", 0)
    -        returnWhat=getattr(self, "returnWhat", Tune1Parameter.returnClassifier)
    -        progressCallback = getattr(self, "progressCallback", lambda i: None)
    +        return_what = getattr(self, "return_what", Tune1Parameter.returnClassifier)
    +        progress_callback = getattr(self, "progress_callback", lambda i: None)
    -
            to_set = []
     
    @@ -335,8 +348,8 @@
    -
    -
    -        cvind = Orange.core.MakeRandomIndicesCV(table, folds)
    -        findBest = Orange.misc.selection.BestOnTheFly(seed = table.checksum(),
    +        cvind = Orange.core.MakeRandomIndicesCV(data, folds)
    +        findBest = Orange.misc.selection.BestOnTheFly(seed = data.checksum(),
                                             callCompareOn1st = True)
    -        tableAndWeight = weight and (table, weight) or table
    +        tableAndWeight = weight and (data, weight) or data
            numOfTests = sum([len(x[1]) for x in self.parameters])
            milestones = set(range(0, numOfTests, max(numOfTests / 100, 1)))
     
    @@ -354,5 +367,5 @@
                                            [self.object], tableAndWeight, cvind))
                if itercount in milestones:
    -                progressCallback(100.0 * itercount / numOfTests)
    +                progress_callback(100.0 * itercount / numOfTests)
    -
                findBest.candidate((res, values))
     
    @@ -371,21 +384,27 @@
                print
    -
    -        if returnWhat==Tune1Parameter.returnNone:
    +        if return_what==Tune1Parameter.returnNone:
                return None
    -        elif returnWhat==Tune1Parameter.returnParameters:
    +        elif return_what==Tune1Parameter.returnParameters:
                return bestpar
    -        elif returnWhat==Tune1Parameter.returnLearner:
    +        elif return_what==Tune1Parameter.returnLearner:
                return self.object
            else:
    -            classifier = self.object(table)
    -            classifier.fittedParameters = bestpar
    +            classifier = self.object(data)
    +            if not Orange.misc.environ.orange_no_deprecated_members:
    +                classifier.fittedParameters = bestpar
    +            classifier.fitted_parameters = bestpar
                return classifier
    +
    +TuneMParameters = deprecated_members(
    +    {"progressCallback": "progress_callback"},
    +    )(TuneMParameters)
    -
     class ThresholdLearner(Orange.classification.Learner):
    -
    -    """:obj:`Orange.optimization.ThresholdLearner` is a class that wraps around
    +    """:obj:`Orange.optimization.ThresholdLearner` is a class that wraps
        another learner. When given the data, it calls the wrapped learner to build
        a classifier, then it uses the classifier to predict the class
    -    probabilities on the training examples. Storing the probabilities, it
    +    probabilities on the training instances. Storing the probabilities, it
        computes the threshold that would give the optimal classification accuracy.
        Then it wraps the classifier and the threshold into an instance of
     
    @@ -393,10 +412,8 @@
    -
        Note that the learner doesn't perform internal cross-validation. Also, the
    -    learner doesn't work for multivalued classes. If you don't understand why,
    -    think harder. If you still don't, try to program it yourself, this should
    -    help. :)
    +    learner doesn't work for multivalued classes.
    -
        :obj:`Orange.optimization.ThresholdLearner` has the same interface as any
    -    learner: if the constructor is given examples, it returns a classifier,
    +    learner: if the constructor is given data, it returns a classifier,
        else it returns a learner. It has two attributes.
    -
     
    @@ -406,5 +423,5 @@
            :obj:`Orange.classification.bayes.NaiveLearner`.
    -
    -    .. attribute:: storeCurve
    +    .. attribute:: store_curve
    -
            If `True`, the resulting classifier will contain an attribute curve, with
     
    @@ -414,30 +431,39 @@
        """
    -
    -    def __new__(cls, examples = None, weightID = 0, **kwds):
    +    @deprecated_keywords({"examples": "data", "weightID": "weight_id"})
    +    def __new__(cls, data = None, weight_id = 0, **kwds):
            self = Orange.classification.Learner.__new__(cls, **kwds)
    -        if examples:
    +        if data:
                self.__init__(**kwds)
    -            return self.__call__(examples, weightID)
    +            return self.__call__(data, weight_id)
            else:
                return self
    -
    -    def __init__(self, learner=None, storeCurve=False, **kwds):
    +    @deprecated_keywords({"storeCurve": "store_curve"})
    +    def __init__(self, learner=None, store_curve=False, **kwds):
            self.learner = learner
    -        self.storeCurve = storeCurve
    -        self.__dict__.update(kwds)
    -
    -    def __call__(self, examples, weightID = 0):
    +        self.store_curve = store_curve
    +        for name, value in kwds.items():
    +            setattr(self, name, value)
    +
    +    @deprecated_keywords({"examples": "data", "weightID": "weight_id"})
    +    def __call__(self, data, weight_id = 0):
            if self.learner is None:
                raise AttributeError("Learner not set.")
    -
    -        classifier = self.learner(examples, weightID)
    +        classifier = self.learner(data, weight_id)
            threshold, optCA, curve = Orange.wrappers.ThresholdCA(classifier,
    -                                                          examples,
    -                                                          weightID)
    -        if self.storeCurve:
    +                                                          data,
    +                                                          weight_id)
    +        if self.store_curve:
                return ThresholdClassifier(classifier, threshold, curve = curve)
            else:
                return ThresholdClassifier(classifier, threshold)
    -
    +ThresholdLearner = deprecated_members(
    +    {"storeCurve": "store_curve"},
    +    wrap_methods=["__init__"]
    +    )(ThresholdLearner)
    +
     class ThresholdClassifier(Orange.classification.Classifier):
    -
     
    @@ -446,5 +472,5 @@
        :obj:`Orange.optimization.ThresholdClassifier` is therefore another
        wrapper class, containing a classifier and a threshold. When it needs to
    -    classify an example, it calls the wrapped classifier to predict
    +    classify an instance, it calls the wrapped classifier to predict
        probabilities. The instance will be classified into the second class only if
        the probability of that class is above the threshold.
     
    @@ -452,11 +478,11 @@
        .. attribute:: classifier
    -
    -    The wrapped classifier, normally the one related to the ThresholdLearner's
    -    learner, e.g. an instance of
    -    :obj:`Orange.classification.bayes.NaiveLearner`.
    +        The wrapped classifier, normally the one related to the ThresholdLearner's
    +        learner, e.g. an instance of
    +        :obj:`Orange.classification.bayes.NaiveLearner`.
    -
        .. attribute:: threshold
    -
    -    The threshold for classification into the second class.
    +        The threshold for classification into the second class.
    -
        The two attributes can be set as attributes or given to the
     
    @@ -468,8 +494,9 @@
            self.classifier = classifier
            self.threshold = threshold
    -        self.__dict__.update(kwds)
    -
    -    def __call__(self, example, what = Orange.classification.Classifier.GetValue):
    -        probs = self.classifier(example, self.GetProbabilities)
    +        for name, value in kwds.items():
    +            setattr(self, name, value)
    +
    +    def __call__(self, instance, what = Orange.classification.Classifier.GetValue):
    +        probs = self.classifier(instance, self.GetProbabilities)
            if what == self.GetProbabilities:
                return probs
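
    A quick usage sketch for the two wrappers above (hypothetical variable names;
    assumes a binary-class data set such as "voting" and the API as of this changeset):

        import Orange

        data = Orange.data.Table("voting")
        nb = Orange.classification.bayes.NaiveLearner()

        # Fit the threshold on the training data and keep the CA curve around.
        thresholded = Orange.optimization.ThresholdLearner(learner=nb,
                                                           store_curve=True)(data)
        print "Fitted threshold:", thresholded.threshold

        # Or wrap an already trained classifier with a hand-picked threshold.
        fixed = Orange.optimization.ThresholdClassifier(nb(data), 0.8)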
     
    @@ -483,29 +510,28 @@
    -
     class ThresholdLearner_fixed(Orange.classification.Learner):
    -    """ There's also a dumb variant of
    -    :obj:`Orange.optimization.ThresholdLearner`, a class called
    -    :obj:`Orange.optimization.ThreshholdLearner_fixed`. Instead of finding the
    -    optimal threshold it uses a prescribed one. So, it has the following two
    +    """This is a convenience variant of
    +    :obj:`Orange.optimization.ThresholdLearner`. Instead of finding the
    +    optimal threshold it uses a prescribed one. It has the following two
        attributes.
    -
        .. attribute:: learner
    -
    -    The wrapped learner, for example an instance of
    -    :obj:`Orange.classification.bayes.NaiveLearner`.
    +        The wrapped learner, for example an instance of
    +        :obj:`~Orange.classification.bayes.NaiveLearner`.
    -
        .. attribute:: threshold
    -
    -    Threshold to use in classification.
    -
    -    What this guy does is therefore simple: to learn, it calls the learner and
    -    puts the resulting classifier together with the threshold into an instance
    -    of ThresholdClassifier.
    +        Threshold to use in classification.
    +
    +    This class calls its base learner and puts the resulting classifier
    +    together with the threshold into an instance of :obj:`ThresholdClassifier`.
    -
        """
    -    def __new__(cls, examples = None, weightID = 0, **kwds):
    +    @deprecated_keywords({"examples": "data", "weightID": "weight_id"})
    +    def __new__(cls, data = None, weight_id = 0, **kwds):
            self = Orange.classification.Learner.__new__(cls, **kwds)
    -        if examples:
    +        if data:
                self.__init__(**kwds)
    -            return self.__call__(examples, weightID)
    +            return self.__call__(data, weight_id)
            else:
                return self
     
    @@ -514,15 +540,17 @@
            self.learner = learner
            self.threshold = threshold
    -        self.__dict__.update(kwds)
    -
    -    def __call__(self, examples, weightID = 0):
    +        for name, value in kwds.items():
    +            setattr(self, name, value)
    +
    +    @deprecated_keywords({"examples": "data", "weightID": "weight_id"})
    +    def __call__(self, data, weight_id = 0):
            if self.learner is None:
                raise AttributeError("Learner not set.")
            if self.threshold is None:
                raise AttributeError("Threshold not set.")
    -        if len(examples.domain.classVar.values) != 2:
    +        if len(data.domain.classVar.values) != 2:
                raise ValueError("ThresholdLearner handles binary classes only.")
    -
    -        return ThresholdClassifier(self.learner(examples, weightID),
    +        return ThresholdClassifier(self.learner(data, weight_id),
                                       self.threshold)
    -
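
    A minimal sketch of the renamed tuning API, assuming an Orange 2.x build that
    includes this changeset (everything used here appears in the diff above; "voting"
    is a data set shipped with Orange):

        import Orange

        data = Orange.data.Table("voting")
        learner = Orange.classification.tree.TreeLearner()

        # Ask the tuner for the winning value itself instead of a trained classifier.
        tuner = Orange.optimization.Tune1Parameter(
            learner=learner,
            parameter="min_subset",
            values=[1, 2, 5, 10],
            evaluate=Orange.evaluation.scoring.AUC,
            return_what=Orange.optimization.Tune1Parameter.RETURN_PARAMETERS)

        best = tuner(data)
        print "Best min_subset:", best

    The old spellings (object=..., returnWhat=..., returnParameters) still function
    through the deprecated_keywords/deprecated_members/deprecated_class_attribute
    wrappers added in this commit, but they emit DeprecationWarnings.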
  • docs/reference/rst/code/optimization-tuning1.py

    (r9823 → r10077)
    @@ -3,11 +3,11 @@
     learner = Orange.classification.tree.TreeLearner()
     voting = Orange.data.Table("voting")
    -tuner = Orange.optimization.Tune1Parameter(object=learner,
    -                           parameter="minSubset",
    +tuner = Orange.optimization.Tune1Parameter(learner=learner,
    +                           parameter="min_subset",
                                values=[1, 2, 3, 4, 5, 10, 15, 20],
                                evaluate = Orange.evaluation.scoring.AUC, verbose=2)
     classifier = tuner(voting)
    -
    -print "Optimal setting: ", learner.minSubset
    +print "Optimal setting: ", learner.min_subset
    -
     untuned = Orange.classification.tree.TreeLearner()
     
    @@ -18,9 +18,9 @@
     print "Tuned tree: %5.3f" % AUCs[1]
    -
    -learner = Orange.classification.tree.TreeLearner(minSubset=10).instance()
    +learner = Orange.classification.tree.TreeLearner(min_subset=10).instance()
     voting = Orange.data.Table("voting")
     tuner = Orange.optimization.Tune1Parameter(learner=learner,
    -                    parameter=["split.continuousSplitConstructor.minSubset",
    -                               "split.discreteSplitConstructor.minSubset"],
    +                    parameter=["split.continuous_split_constructor.min_subset",
    +                               "split.discrete_split_constructor.min_subset"],
                         values=[1, 2, 3, 4, 5, 10, 15, 20],
                         evaluate = Orange.evaluation.scoring.AUC, verbose=2)
     
    @@ -28,3 +28,3 @@
     classifier = tuner(voting)
    -
    -print "Optimal setting: ", learner.split.continuousSplitConstructor.minSubset
    +print "Optimal setting: ", learner.split.continuous_split_constructor.min_subset
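
    The second example above tunes a nested parameter through a dotted path.
    Internally, TuneParameters.findobj() (see the diff to Orange/optimization/__init__.py)
    walks the path with getattr. A standalone sketch of the same idea, with
    hypothetical names:

        def resolve(obj, dotted_name):
            """Return (owner, attribute_name) for a dotted path such as
            "split.continuous_split_constructor.min_subset"."""
            parts = dotted_name.split(".")
            for part in parts[:-1]:
                obj = getattr(obj, part)
            return obj, parts[-1]

        # For each candidate value the tuner effectively does:
        #   owner, name = resolve(learner, "split.continuous_split_constructor.min_subset")
        #   setattr(owner, name, value)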
  • docs/reference/rst/code/optimization-tuningm.py

    (r9946 → r10077)
    @@ -3,6 +3,6 @@
     learner = Orange.classification.tree.TreeLearner()
     voting = Orange.data.Table("voting")
    -tuner = Orange.optimization.TuneMParameters(object=learner,
    -             parameters=[("minSubset", [2, 5, 10, 20]),
    +tuner = Orange.optimization.TuneMParameters(learner=learner,
    +            parameters=[("min_subset", [2, 5, 10, 20]),
                              ("measure", [Orange.feature.scoring.GainRatio(),
                                           Orange.feature.scoring.Gini()])],
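
    TuneMParameters also gained a renamed progress hook in this changeset:
    progress_callback, which the tuner calls with a percentage (100.0 * tests_done /
    total_tests) while it scans the parameter grid. A sketch of wiring it up, assuming
    the tuner object from the snippet above:

        def report(percent):
            print "tuning... %3.0f%%" % percent

        tuner.progress_callback = report
        classifier = tuner(voting)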