Timestamp:
04/03/12 10:12:26 (2 years ago)
Author:
anze <anze.staric@…>
Branch:
default
Message:

Removed optimization to tuning.

File:
1 moved

  • Orange/tuning/__init__.py

r10654 → r10711

"""
.. index:: optimization

Wrappers for Tuning Parameters and Thresholds

Classes for two very useful purposes: tuning a learning algorithm's parameters
using internal validation, and tuning the threshold for classification into
the positive class.

*****************
Tuning parameters
*****************

Two classes support tuning parameters:
:obj:`Orange.optimization.Tune1Parameter` for fitting a single parameter and
:obj:`Orange.optimization.TuneMParameters` for fitting multiple parameters at
once, trying all possible combinations. When called with data and, optionally,
the id of a meta attribute with weights, they find the optimal setting of
arguments using cross validation. The classes can also be used as ordinary
learning algorithms - they are in fact derived from
:obj:`Orange.classification.Learner`.

Both classes have a common parent, :obj:`Orange.optimization.TuneParameters`,
and a few common attributes.
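
A minimal sketch of how such a wrapper might be used, assuming a classification
tree learner whose ``min_subset`` parameter is tuned by AUC; the data set, the
tuned parameter and the argument names shown (``object``, ``parameter``,
``values``, ``evaluate``) reflect typical usage of these wrappers rather than
an excerpt from this module's examples::

    import Orange

    voting = Orange.data.Table("voting")
    learner = Orange.classification.tree.TreeLearner()

    # Tune one parameter of the wrapped learner with internal cross validation.
    tuner = Orange.optimization.Tune1Parameter(object=learner,
        parameter="min_subset",
        values=[1, 2, 3, 5, 10, 20],
        evaluate=Orange.evaluation.scoring.AUC)

    # The tuner is itself a learner: calling it with data returns a classifier
    # built with the best-scoring value of the parameter.
    classifier = tuner(voting)

Because the tuners are learners themselves, they can be passed directly to
evaluation functions such as cross validation.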

.. autoclass:: Orange.optimization.TuneParameters
   :members:

.. autoclass:: Orange.optimization.Tune1Parameter
   :members:

.. autoclass:: Orange.optimization.TuneMParameters
   :members:

**************************
Setting Optimal Thresholds
**************************

Some models may perform well in terms of AUC, which measures the ability to
distinguish between instances of two classes, yet achieve low classification
accuracy. The reason may lie in the threshold: in binary problems, classifiers
usually classify into the more probable class, but when class distributions
are highly skewed, a modified threshold can give better accuracy. Here are two
classes that can help.

.. autoclass:: Orange.optimization.ThresholdLearner
   :members:

.. autoclass:: Orange.optimization.ThresholdClassifier
   :members:

Examples
========

This is how you use the learner.

part of :download:`optimization-thresholding1.py <code/optimization-thresholding1.py>`

.. literalinclude:: code/optimization-thresholding1.py

The output::

    W/out threshold adjustment: 0.633
    With adjusted threshold: 0.659
    With threshold at 0.80: 0.449
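
For orientation, a minimal sketch of such a comparison, wrapping a naive
Bayesian learner and comparing cross-validated classification accuracy with
and without threshold adjustment, might look like this (the data set, the
evaluation setup and the ``learner`` keyword are assumptions, not taken from
the downloadable script)::

    import Orange

    bupa = Orange.data.Table("bupa")
    bayes = Orange.classification.bayes.NaiveLearner()

    # The wrapper picks the probability threshold that maximizes
    # classification accuracy on the training data.
    thresholded = Orange.optimization.ThresholdLearner(learner=bayes)

    res = Orange.evaluation.testing.cross_validation([bayes, thresholded], bupa)
    cas = Orange.evaluation.scoring.CA(res)
    print "W/out threshold adjustment: %5.3f" % cas[0]
    print "With adjusted threshold: %5.3f" % cas[1]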

part of :download:`optimization-thresholding2.py <code/optimization-thresholding2.py>`

.. literalinclude:: code/optimization-thresholding2.py

The script first divides the data into training and testing subsets. It trains
a naive Bayesian classifier and then wraps it into three
:obj:`Orange.optimization.ThresholdClassifier` instances with thresholds of
.2, .5 and .8. The three models are tested on the left-out data, and we
compute the confusion matrices from the results. The printout::

    0.20: TP 60.000, TN 1.000
    0.50: TP 42.000, TN 24.000
    0.80: TP 2.000, TN 43.000

shows how the varying threshold changes the balance between the number of true
positives and true negatives.
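
A similar effect can be obtained by wrapping a trained classifier manually. A
rough sketch, assuming the bupa data set, a 70:30 split, the classifier and
threshold passed positionally, and the second class value taken as the
positive class (all of these are assumptions for illustration, not details of
the downloadable script)::

    import Orange

    bupa = Orange.data.Table("bupa")
    indices = Orange.data.sample.SubsetIndices2(p0=0.7)(bupa)
    train, test = bupa.select(indices, 0), bupa.select(indices, 1)

    bayes = Orange.classification.bayes.NaiveLearner()(train)
    # Assumption: treat the second class value as the positive class.
    positive = bupa.domain.class_var.values[1]

    for threshold in [0.2, 0.5, 0.8]:
        # Classify as positive only when the predicted probability of the
        # positive class exceeds the given threshold.
        clf = Orange.optimization.ThresholdClassifier(bayes, threshold)
        tp = sum(1 for ex in test
                 if clf(ex) == positive and ex.get_class() == positive)
        tn = sum(1 for ex in test
                 if clf(ex) != positive and ex.get_class() != positive)
        print "%.2f: TP %i, TN %i" % (threshold, tp, tn)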

.. autoclass:: Orange.optimization.PreprocessedLearner
   :members:

"""