Timestamp: 02/27/13 15:02:50
Author: Ales Erjavec <ales.erjavec@…>
Branch: default
Message: Cleanup of 'Widget catalog' documentation. Fixed rst text formatting,
replaced dead hardcoded reference links (now using :ref:), etc.
File edited: docs/widgets/rst/classify/classificationtree.rst
- Learner
    The classification tree learning algorithm with settings as specified in
    the dialog.

- Classification Tree

Signal :code:`Classification Tree` sends data only if the learning data
(signal :code:`Classified Examples`) is present.

Description
-----------

This widget provides a graphical interface to the classification tree learning
algorithm.

As with all widgets for classification, this widget provides a learner and
a classifier on the output. The learner is a learning algorithm with settings
as specified by the user. It can be fed into widgets for testing learners,
for instance :ref:`Test Learners`. The classifier is a Classification Tree
Classifier (a subtype of a general classifier), built from the training
examples on the input. If no examples are given, there is no classifier on
the output.

.. image:: images/ClassificationTree.png
   :alt: Classification Tree Widget

The learner can be given a name under which it will appear in, say,
:ref:`Test Learners`. The default name is "Classification Tree".

The first block of options deals with the :obj:`Attribute selection criterion`,
where you can choose between the information gain, gain ratio, gini index and
ReliefF. For the latter, it is possible to :obj:`Limit the number of reference
examples` (more examples give more accuracy and less speed) and the
:obj:`Number of neighbours` considered in the estimation.

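As a minimal sketch of how two of these measures behave on a toy split, in plain Python (the helpers `entropy`, `gini`, `information_gain` and `gain_ratio` are illustrative, not the widget's internal code):

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def gini(labels):
    """Gini impurity of a list of class labels."""
    n = len(labels)
    return 1.0 - sum((c / n) ** 2 for c in Counter(labels).values())

def information_gain(labels, branches):
    """Parent entropy minus the size-weighted entropy of the branches."""
    n = len(labels)
    return entropy(labels) - sum(len(b) / n * entropy(b) for b in branches)

def gain_ratio(labels, branches):
    """Information gain normalised by the entropy of the branch sizes."""
    n = len(labels)
    split_info = -sum(len(b) / n * log2(len(b) / n) for b in branches)
    return information_gain(labels, branches) / split_info

# A toy split: 8 examples, the tested attribute sends 4 to each branch.
parent = ["yes"] * 4 + ["no"] * 4
left = ["yes", "yes", "yes", "no"]
right = ["yes", "no", "no", "no"]

print(round(information_gain(parent, [left, right]), 3))  # 0.189
print(round(gain_ratio(parent, [left, right]), 3))        # 0.189 (split info is 1 bit)
print(round(gini(parent), 3))                             # 0.5
```

Because the branches here are equally sized, the split information is exactly one bit, so gain and gain ratio coincide; on an attribute with many unevenly populated values the gain ratio would be noticeably smaller.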
If :code:`Binarization` is checked, the values of multivalued attributes
are split into two groups (based on the statistics in the particular node)
to yield a binary tree. Binarization gets rid of the usual measures'
bias towards attributes with more values and is generally recommended.

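The effect of such a grouping can be sketched by exhaustively trying every two-group partition of an attribute's values and keeping the most informative one (an illustrative brute-force helper, not Orange's actual implementation):

```python
from collections import Counter
from itertools import combinations
from math import log2

def entropy(labels):
    """Shannon entropy (in bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def best_binary_split(values, labels):
    """Enumerate every way of putting the attribute's values into two
    groups and return the grouping with the highest information gain."""
    vals = sorted(set(values))
    n = len(labels)
    best_gain, best_group = -1.0, None
    for r in range(1, len(vals)):
        for group in map(set, combinations(vals, r)):
            left = [l for v, l in zip(values, labels) if v in group]
            right = [l for v, l in zip(values, labels) if v not in group]
            gain = entropy(labels) - (len(left) / n * entropy(left)
                                      + len(right) / n * entropy(right))
            if gain > best_gain:
                best_gain, best_group = gain, group
    return best_group, best_gain

colors = ["red", "red", "green", "green", "blue", "blue"]
labels = ["yes", "yes", "no", "no", "yes", "yes"]
print(best_binary_split(colors, labels))  # ({'green'}, ~0.918)
```

Here the three-valued attribute is binarized into {green} versus {red, blue}, which separates the classes perfectly; a plain multiway split would score lower per branch and carry the many-values bias mentioned above.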
Pruning during induction can be based on the :obj:`Minimal number of
instance in leaves`; if checked, the algorithm will never construct a split
which would put fewer than the specified number of training examples into any
of the branches. You can also forbid the algorithm to split the nodes with
fewer than the given number of instances (:obj:`Stop splitting nodes with
less instances than`) or the nodes with a large enough majority class
(:obj:`Stop splitting nodes with a majority class of (%)`).

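A minimal sketch of how these two stopping criteria might combine (the function and parameter names are illustrative, not the widget's internal ones):

```python
from collections import Counter

def should_stop(node_labels, min_instances=2, majority_pct=95.0):
    """Return True when a node must become a leaf: it holds fewer than
    min_instances examples, or its majority class reaches majority_pct
    percent of the node."""
    n = len(node_labels)
    if n < min_instances:
        return True   # too few examples to split further
    majority = Counter(node_labels).most_common(1)[0][1]
    return 100.0 * majority / n >= majority_pct

print(should_stop(["yes"] * 19 + ["no"]))     # True: 95% majority
print(should_stop(["yes"] * 3 + ["no"] * 2))  # False: only 60% majority
print(should_stop(["yes"], min_instances=2))  # True: too few examples
```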
During induction, the algorithm can produce a tree in which entire subtrees
predict the same class, but with different probabilities. This can increase
probability-based measures of classifier quality, like the Brier score
or AUC, but the trees tend to be much larger and more difficult to grasp.
To avoid it, tell it to :obj:`Recursively merge the leaves with same
majority class`. The widget also supports :obj:`pruning with m-estimate`.

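For reference, the m-estimate shrinks a leaf's raw class frequency towards the class's prior probability; a small illustrative helper (not Orange's internal code) shows the effect:

```python
def m_estimate(class_count, leaf_size, prior, m=2.0):
    """m-estimate of a class probability in a leaf: the raw relative
    frequency class_count/leaf_size is shrunk towards the class prior;
    a larger m shrinks harder, which in pruning terms means more
    aggressive pruning."""
    return (class_count + m * prior) / (leaf_size + m)

# A leaf with 3 of 4 examples in a class whose overall prior is 0.5:
print(m_estimate(3, 4, prior=0.5, m=2.0))   # 0.666..., pulled down from 0.75
print(m_estimate(3, 4, prior=0.5, m=10.0))  # 0.571..., shrunk harder
```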
After changing one or more settings, you need to push :obj:`Apply`, which
will put the new learner on the output and, if the training examples are
given, construct a new classifier and output it as well.

The tree can deal with missing data. Orange's tree learner actually
supports quite a few methods for that, but when used from the canvas,
it effectively splits the example into multiple examples with different
weights. If you had data with 25% males and 75% females, then when the
gender is unknown, the example splits into two, a male and a female
with weights .25 and .75, respectively. This goes for both learning
and classification.

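The weighted splitting described above can be sketched as follows (the dict-based example layout and the helper name are illustrative only, not Orange's data structures):

```python
def split_example(example, attr, value_dist):
    """Replace an example whose value of `attr` is unknown with one weighted
    copy per possible value, weights proportional to the node's statistics
    for that attribute."""
    if example.get(attr) is not None:
        return [(example, 1.0)]        # known value: keep with full weight
    total = sum(value_dist.values())
    return [({**example, attr: value}, count / total)
            for value, count in value_dist.items()]

person = {"gender": None, "age": 34}
# Node statistics: 25% male, 75% female.
for copy, weight in split_example(person, "gender", {"male": 25, "female": 75}):
    print(copy, weight)   # weights 0.25 and 0.75
```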
Examples
--------

There are two typical uses of this widget. First, you may want to induce
the model and check what it looks like. You do it with the schema below;
to learn more about it, see the documentation on :ref:`Classification Tree
Graph`.

.. image:: images/ClassificationTreeGraph-SimpleSchema-S.gif