Timestamp: 04/03/12 10:12:26
Author: anze <anze.staric@…>
Branch: default
Message: Removed optimization to tuning.
File: 1 moved
  • docs/reference/rst/Orange.tuning.rst

r9372 → r10711

.. automodule:: Orange.optimization

.. index:: tuning

Wrappers for Tuning Parameters and Thresholds

Classes for two very useful purposes: tuning a learning algorithm's parameters
using internal validation and tuning the threshold for classification into the
positive class.

*****************
Tuning parameters
*****************

Two classes support parameter tuning:
:obj:`Orange.optimization.Tune1Parameter` fits a single parameter, while
:obj:`Orange.optimization.TuneMParameters` fits multiple parameters at once,
trying all possible combinations. When called with data and, optionally, the
id of a meta attribute with weights, they find the optimal setting of the
arguments using cross validation. The classes can also be used as ordinary
learning algorithms: they are in fact derived from
:obj:`Orange.classification.Learner`.

Both classes have a common parent, :obj:`Orange.optimization.TuneParameters`,
and a few common attributes.
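
For illustration, here is a minimal sketch of how such a wrapper might be
used; the tree learner, the ``minSubset`` parameter and the candidate values
are assumptions chosen for this sketch, not taken from this page::

    import Orange

    data = Orange.data.Table("voting")
    learner = Orange.classification.tree.TreeLearner()

    # Wrap the learner so that minSubset is chosen by internal cross
    # validation (the parameter name and value range are assumptions).
    tuner = Orange.optimization.Tune1Parameter(
        object=learner,
        parameter="minSubset",
        values=[2, 5, 10, 15, 20])

    # The tuner acts like an ordinary learner: calling it with data
    # returns a classifier built with the best-scoring parameter value.
    classifier = tuner(data)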

.. autoclass:: TuneParameters
   :members:

.. autoclass:: Tune1Parameter
   :members:

.. autoclass:: TuneMParameters
   :members:

**************************
Setting Optimal Thresholds
**************************

Some models may perform well in terms of AUC, which measures the ability to
distinguish between instances of two classes, yet have low classification
accuracy. The reason may lie in the threshold: in binary problems, classifiers
usually classify into the more probable class, but when class distributions
are highly skewed, a modified threshold can give better accuracy. The two
classes below can help.
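
As a rough sketch of the intended use (the naive Bayes learner and the bupa
data set are assumptions for this example, not mandated here),
:obj:`ThresholdLearner` wraps another learner and fits the threshold on the
training data::

    import Orange

    data = Orange.data.Table("bupa")

    # Wrap a probabilistic learner; the threshold is fitted on the
    # training data when the wrapper is called.
    learner = Orange.optimization.ThresholdLearner(
        learner=Orange.classification.bayes.NaiveLearner())
    classifier = learner(data)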

.. autoclass:: ThresholdLearner
   :members:

.. autoclass:: ThresholdClassifier
   :members:

Examples
========

This is how you use the learner.

part of :download:`optimization-thresholding1.py <code/optimization-thresholding1.py>`

.. literalinclude:: code/optimization-thresholding1.py

The output::

    W/out threshold adjustment: 0.633
    With adjusted threshold: 0.659
    With threshold at 0.80: 0.449

part of :download:`optimization-thresholding2.py <code/optimization-thresholding2.py>`

.. literalinclude:: code/optimization-thresholding2.py

The script first divides the data into training and testing subsets. It trains
a naive Bayesian classifier and then wraps it into three
:obj:`Orange.optimization.ThresholdClassifier` instances with thresholds of
0.2, 0.5 and 0.8. The three models are tested on the left-out data, and we
compute the confusion matrices from the results. The printout::

    0.20: TP 60.000, TN 1.000
    0.50: TP 42.000, TN 24.000
    0.80: TP 2.000, TN 43.000

shows how varying the threshold changes the balance between the numbers of
true positives and true negatives.
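
The wrapping step itself is simple. A minimal sketch, separate from the full
script above (the naive Bayes learner and the data set are assumptions for
this example)::

    import Orange

    data = Orange.data.Table("bupa")
    bayes = Orange.classification.bayes.NaiveLearner(data)

    # Wrap the trained classifier with a fixed threshold: the positive
    # class is predicted only when its probability exceeds 0.8.
    clf = Orange.optimization.ThresholdClassifier(bayes, 0.8)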

.. autoclass:: PreprocessedLearner
   :members: