Changeset 11404:1a7b773d7c7b in orange


Timestamp:
03/15/13 17:46:47 (13 months ago)
Author:
Ales Erjavec <ales.erjavec@…>
Branch:
default
Message:

Replaced the use of :code: role with :obj:
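
For reference, a minimal sketch of the substitution patterns applied in the per-file hunks below (illustrative reST only; the examples are taken from the diffs, the grouping into three cases is an editorial summary):

    :code:`Classification Tree`   -->   :obj:`Classification Tree`   (widget signals and controls)
    :code:`-g`                    -->   ``-g``                       (literal command-line switches)
    :code:`aquatic`               -->   *aquatic*                    (attribute names in running text)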

Location:
docs/widgets/rst
Files:
13 edited

  • docs/widgets/rst/associate/associationrulesexplorer.rst

    r11359 r11404  
      left-hand side conditions in a particular rule. Turned around, this
      means that each rule appears in many places in the tree. As the completely
    - open tree below shows, the rule :code:`age=adult & sex=male -> status=crew`
    + open tree below shows, the rule "age=adult & sex=male -> status=crew"
      appears in two places (the seventh and the eleventh row).
  • docs/widgets/rst/classify/c45.rst

    r11359 r11404  
    - :code:`Classifier`, :code:`C45 Tree` and :code:`Classification Tree` are
    + :obj:`Classifier`, :obj:`C45 Tree` and :obj:`Classification Tree` are
      available only if examples are present on the input. Which of the latter two
      output signals is active is determined by setting

    …

      The next block of options deals with splitting. C4.5 uses gain ratio by
      default; to override this, check :obj:`Use information gain instead of ratio`,
    - which is equivalent to C4.5's command line option :code:`-g`. If you enable
    - :obj:`subsetting` (equivalent to :code:`-s`), C4.5 will merge values of
    + which is equivalent to C4.5's command line option ``-g``. If you enable
    + :obj:`Subsetting` (equivalent to ``-s``), C4.5 will merge values of
      multivalued discrete attributes instead of creating one branch for each node.
    - :obj:`Probabilistic threshold for continuous attributes` (:code:`-p`) makes
    + :obj:`Probabilistic threshold for continuous attributes` (``-p``) makes
      C4.5 compute the lower and upper boundaries for values of continuous attributes
      for which the number of misclassified examples would be within one standard

    …

      The resulting classifier can be left in the original Quinlan's structure, as
    - returned by his underlying code, or :obj:`converted to orange the structure`
    + returned by his underlying code, or :obj:`Converted to orange the structure`
      that is used by Orange's tree induction algorithm. This setting decides which
    - of the two signals that output the tree - :code:`C45 Classifier` or
    - :code:`Tree Classifier` will be active. As Orange's structure is more general
    + of the two signals that output the tree - :obj:`C45 Classifier` or
    + :obj:`Tree Classifier` will be active. As Orange's structure is more general
      and can easily accommodate all the data that C4.5 tree needs for
      classification, we believe that the converted tree behave exactly the same as
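
    The hunk above maps three widget options onto C4.5's command-line
    switches. For reference, a hedged sketch of the equivalent invocation of
    Quinlan's original c4.5 program (assuming its usual ``-f`` file-stem
    argument and a hypothetical data set with stem ``zoo``)::

        c4.5 -f zoo -g -s -p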
  • docs/widgets/rst/classify/classificationtree.rst

    r11359 r11404  
    - Signal :code:`Classification Tree` sends data only if the learning data
    - (signal :code:`Classified Examples` is present.
    + Signal :obj:`Classification Tree` sends data only if the learning data
    + (signal :obj:`Classified Examples`) is present.

      Description

    …

      :obj:`Number of neighbours` considered in the estimation.

    - If :code:`Binarization` is checked, the values of multivalued attributes
    + If :obj:`Binarization` is checked, the values of multivalued attributes
      are split into two groups (based on the statistics in the particular node)
      to yield a binary tree. Binarization gets rid of the usual measures'
  • docs/widgets/rst/classify/classificationtreeviewer.rst

    r11359 r11404  
    - Signal :code:`Classified Examples` sends data only if some tree node is
    + Signal :obj:`Classified Examples` sends data only if some tree node is
      selected and contains some examples.
  • docs/widgets/rst/classify/interactivetreebuilder.rst

    r11359 r11404  
    - Signal :code:`Examples` sends data only if some tree node is selected and
    + Signal :obj:`Examples` sends data only if some tree node is selected and
      contains some examples.

    …

      the animals that don't give milk and have no feathers (the pictures shows
      a tree for the zoo data set) would be split according to whether they are
    - :code:`aquatic` or not. In case of continuous attributes, a cut off point
    + *aquatic* or not. In case of continuous attributes, a cut off point
      needs to be specified as well.

      If Split is used on a node which is not a leaf, the criterion at that node
    - is replaced. If we, for instance, selected the <root> node and pushed
    - Split, the criterion :code:`milk` would be replaced with :code:`aquatic`
    - and the nodes below (:code:`feathers`) are removed.
    + is replaced. If we, for instance, selected the :obj:`<root>` node and pushed
    + :obj:`Split`, the criterion *milk* would be replaced with *aquatic*
    + and the nodes below (*feathers*) are removed.

      Button :obj:`Cut` cuts the tree at the selected node. If we pushed Cut
      in the situation in the picture, nothing would happen since the selected
    - node (:code:`feathers=0`) is already a leaf. If we selected :code:`<root>`
    + node (:obj:`feathers=0`) is already a leaf. If we selected :obj:`<root>`
      and pushed Cut, the entire tree would be cut off.

    - Cut is especially useful in combination with :code:`Build` which builds
    + Cut is especially useful in combination with :obj:`Build` which builds
      a subtree at the current node. So, if we push Build in the situation
      depicted above, a subtree would be built for the milkless featherless
  • docs/widgets/rst/classify/knearestneighbours.rst

    r11359 r11404  
    - Signal :code:`KNN Classifier` sends data only if the learning data (signal
    - :code:`Examples` is present.
    + Signal :obj:`KNN Classifier` sends data only if the learning data (signal
    + :obj:`Examples` is present.

      Description
  • docs/widgets/rst/classify/logisticregression.rst

    r11359 r11404  
    - Signal :code:`Logistic Regression Classifier` sends data only if the learning
    - data (signal :code:`Examples` is present.
    + Signal :obj:`Logistic Regression Classifier` sends data only if the learning
    + data (signal :obj:`Examples` is present.

      Description
  • docs/widgets/rst/classify/majority.rst

    r11359 r11404  
    - Signal :code:`Classifier` sends data only if the learning data (signal
    - :code:`Examples`) is present.
    + Signal :obj:`Classifier` sends data only if the learning data (signal
    + :obj:`Examples`) is present.

      Description
  • docs/widgets/rst/classify/naivebayes.rst

    r11359 r11404  
    - Signal :code:`Naive Bayesian Classifier` sends data only if the learning
    - data (signal :code:`Examples` is present.
    + Signal :obj:`Naive Bayesian Classifier` sends data only if the learning
    + data (signal :obj:`Examples` is present.

      Description
  • docs/widgets/rst/data/discretize.rst

    r11050 r11404  
      right-hand side of the graph. In case of discrete classes, the target class can be any
      of the original classes, while for discretized attributes, it is one of the intervals
    - (<18545.33 in our case). :obj:`Show rug` adds small lines at the bottom
    + (*< 18545.33* in our case). :obj:`Show rug` adds small lines at the bottom
      and the top of the graph, which represents histograms showing the number of examples in the
      target class (top) and the other classes (bottom). On the snapshot, the examples of the
    - target class (<18545.33) are concentrated at between 50 and 120, while the rarer examples
    + target class (*< 18545.33*) are concentrated at between 50 and 120, while the rarer examples
      of other classes are spread between 100 and 200, with an outlier at 250. Plotting the rug
      can be slow if the number of examples is huge.
  • docs/widgets/rst/data/rank.rst

    r11359 r11404  
      The widget outputs two example tables. The one, whose corresponding signal
    - is named :code:`ExampleTable Attributes` looks pretty much like the one
    + is named :obj:`ExampleTable Attributes` looks pretty much like the one
      shown in the Rank widget, except that the second column is split into two
      columns, one giving the attribute type (D for discrete and C for continuous),

    …

      The examples in the file are put through ref:`Data Sampler` which split the
      data set into two subsets: one, containing 70% of examples (signal
    - :code:`Classified Examples`) will be used for training a
    + :obj:`Classified Examples`) will be used for training a
      :ref:`Naive Bayes <Naive Bayes>` classifier, and the other 30% (signal
    - :code:`Remaining Classified Examples`) for testing. Attribute subset selection
    + :obj:`Remaining Classified Examples`) for testing. Attribute subset selection
      based on information gain was performed on the training set only, and five most
      informative attributes were selected for learning. A data set with all other
    - attributes removed (signal :code:`Reduced Example Table`) is fed into
    + attributes removed (signal :obj:`Reduced Example Table`) is fed into
      :ref:`Test Learners`. Test Learners widgets also gets the
    - :code:`Remaining Classified Examples` to use them as test examples (don't
    - forget to set :code:`Test on Test Data` in that widget!).
    + :obj:`Remaining Classified Examples` to use them as test examples (don't
    + forget to set :obj:`Test on Test Data` in that widget!).

      To verify how the subset selection affects the classifier's performance, we
      added another :ref:`Test Learners`, but connected it to the
    - :code:`Data Sampler` so that the two subsets emitted by the latter are used
    + :ref:`Data Sampler` so that the two subsets emitted by the latter are used
      for training and testing without any feature subset selection.
  • docs/widgets/rst/regression/pade.rst

    r11359 r11404  
      The widget is implemented to cache some data. After, for instance, computing
    - the derivatives by :code:`x` and :code:`y` separately, the widget has already
    + the derivatives by ``x`` and ``y`` separately, the widget has already
      stored all the data to produce the derivatives by both in a moment.
  • docs/widgets/rst/regression/regressiontree.rst

    r11359 r11404  
    - Signal :code:`Regression Tree` sends data only if the learning data (signal
    - :code:`Examples`) is present.
    + Signal :obj:`Regression Tree` sends data only if the learning data (signal
    + :obj:`Examples`) is present.

      Description

    …

      :ref:`Test Learners`. The default name is "Regression Tree".

    - If :code:`Binarization` is checked, the values of multivalued attributes
    + If :obj:`Binarization` is checked, the values of multivalued attributes
      are split into two groups (based on the statistics in the particular node)
      to yield a binary tree. Binarization gets rid of the usual measures' bias