Changeset 9349:fa13a2c52fcd in orange


Timestamp: 12/13/11 21:04:13
Author: mitar
Branch: default
Convert: 0cfa79eff84be2b4f488afa2b960342f5d1ce71a
Message: Changed way of linking to code in documentation.
Location: orange/Orange
Files: 32 edited

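The edits below change how the module docstrings link to the example scripts and data files shipped with the documentation: explicit reStructuredText hyperlink targets (a `name`_ reference plus a separate ``.. _name: code/name`` target line) are replaced with Sphinx's :download: role, which links to the file and copies it into the built documentation, so the blocks of target definitions at the end of each docstring can be dropped. (A few hunks, such as those in ensemble/forest.py, only adjust docstring whitespace.) A minimal before/after sketch of the pattern, using a file name taken from the first hunk below; the surrounding prose is shortened for illustration:

    Before:

        Data example (`inquisition.basket`_):

        .. _inquisition.basket: code/inquisition.basket

    After:

        Data example (:download:`inquisition.basket <code/inquisition.basket>`):
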
  • orange/Orange/associate/__init__.py

    r8042 r9349  
    101101The text needs to be cleaned of punctuation marks and capital letters at the beginnings of sentences, each sentence needs to be put on a new line, and commas need to be inserted between the words. 
    102102 
    103 .. _inquisition.basket: code/inquisition.basket 
    104 .. _lenses.tab: code/lenses.tab 
    105  
    106 Data example (`inquisition.basket`_): 
     103Data example (:download:`inquisition.basket <code/inquisition.basket>`): 
    107104 
    108105.. literalinclude:: code/inquisition.basket 
    109106    
    110 Inducing the rules is trivial (uses `inquisition.basket`_): :: 
     107Inducing the rules is trivial (uses :download:`inquisition.basket <code/inquisition.basket>`):: 
    111108 
    112109    import Orange 
     
    207204Meaning of all attributes (except the new one, classificationRules) is the 
    208205same as for AssociationRulesSparseInducer. See the description of 
    209 :ref:`maxItemSets <maxItemSets>` there. The example uses `lenses.tab`_: :: 
     206:ref:`maxItemSets <maxItemSets>` there. The example uses :download:`lenses.tab <code/lenses.tab>`:: 
    210207 
    211208    import Orange 
     
    353350discarded afterwards). Let us write a function that finds the examples that 
    354351confirm the rule (fit both sides of it) and those that contradict it (fit the 
    355 left-hand side but not the right). The example uses the `lenses.tab`_: :: 
     352left-hand side but not the right). The example uses the :download:`lenses.tab <code/lenses.tab>`:: 
    356353 
    357354    import Orange 
  • orange/Orange/classification/knn.py

    r8042 r9349  
    154154into training (80%) and testing (20%) instances. We will use the former  
    155155for "training" the classifier and test it on five testing instances  
    156 randomly selected from a part of (`knnlearner.py`_, uses `iris.tab`_): 
     156randomly selected from a part of (:download:`knnlearner.py <code/knnlearner.py>`, uses :download:`iris.tab <code/iris.tab>`): 
    157157 
    158158.. literalinclude:: code/knnExample1.py 
     
    175175decide to do so, the distance_constructor must be set to an instance 
    176176of one of the classes for distance measuring. This can be seen in the following 
    177 part of (`knnlearner.py`_, uses `iris.tab`_): 
     177part of (:download:`knnlearner.py <code/knnlearner.py>`, uses :download:`iris.tab <code/iris.tab>`): 
    178178 
    179179.. literalinclude:: code/knnExample2.py 
     
    188188 
    189189The result is still perfect. 
    190  
    191 .. _iris.tab: code/iris.tab 
    192 .. _knnlearner.py: code/knnlearner.py 
    193190 
    194191.. index: fnn 
     
    292289-------- 
    293290 
    294 The following script (`knnInstanceDistance.py`_, uses `lenses.tab`_)  
     291The following script (:download:`knnInstanceDistance.py <code/knnInstanceDistance.py>`, uses :download:`lenses.tab <code/lenses.tab>`) 
    295292shows how to find the five nearest neighbors of the first instance 
    296293in the lenses dataset. 
    297294 
    298295.. literalinclude:: code/knnInstanceDistance.py 
    299  
    300 .. _lenses.tab: code/lenses.tab 
    301 .. _knnInstanceDistance.py: code/knnInstanceDistance.py 
    302296 
    303297""" 
  • orange/Orange/classification/logreg.py

    r8042 r9349  
    154154 
    155155The first example shows a very simple induction of a logistic regression 
    156 classifier (`logreg-run.py`_, uses `titanic.tab`_). 
     156classifier (:download:`logreg-run.py <code/logreg-run.py>`, uses :download:`titanic.tab <code/titanic.tab>`). 
    157157 
    158158.. literalinclude:: code/logreg-run.py 
     
    175175 
    176176The next example shows how to handle singularities in data sets 
    177 (`logreg-singularities.py`_, uses `adult_sample.tab`_). 
     177(:download:`logreg-singularities.py <code/logreg-singularities.py>`, uses :download:`adult_sample.tab <code/adult_sample.tab>`). 
    178178 
    179179.. literalinclude:: code/logreg-singularities.py 
     
    219219 
    220220The example below shows how the use of stepwise logistic regression can help to 
    221 gain in classification performance (`logreg-stepwise.py`_, uses `ionosphere.tab`_): 
     221gain in classification performance (:download:`logreg-stepwise.py <code/logreg-stepwise.py>`, uses :download:`ionosphere.tab <code/ionosphere.tab>`): 
    222222 
    223223.. literalinclude:: code/logreg-stepwise.py 
     
    260260    10 x a9 
    261261    10 x a8 
    262  
    263 .. _logreg-run.py: code/logreg-run.py 
    264 .. _logreg-singularities.py: code/logreg-singularities.py 
    265 .. _logreg-stepwise.py: code/logreg-stepwise.py 
    266  
    267 .. _ionosphere.tab: code/ionosphere.tab 
    268 .. _adult_sample.tab: code/adult_sample.tab 
    269 .. _titanic.tab: code/titanic.tab 
    270262 
    271263""" 
  • orange/Orange/classification/lookup.py

    r8925 r9349  
    2121they usually reside in :obj:`~Orange.data.variable.Variable.get_value_from` fields of constructed 
    2222features to facilitate their automatic computation. For instance, 
    23 the following script shows how to translate the `monks-1.tab`_ data set 
     23the following script shows how to translate the :download:`monks-1.tab <code/monks-1.tab>` data set 
    2424features into a more useful subset that will only include the features 
    2525``a``, ``b``, ``e``, and features that will tell whether ``a`` and ``b`` are equal and 
    2626whether ``e`` is 1 (don't bother about the details, they follow later;  
    27 `lookup-lookup.py`_, uses: `monks-1.tab`_): 
     27:download:`lookup-lookup.py <code/lookup-lookup.py>`, uses: :download:`monks-1.tab <code/monks-1.tab>`): 
    2828 
    2929.. literalinclude:: code/lookup-lookup.py 
     
    158158        Let's see some indices for randomly chosen examples from the original table. 
    159159         
    160         part of `lookup-lookup.py`_ (uses: `monks-1.tab`_): 
     160        part of :download:`lookup-lookup.py <code/lookup-lookup.py>` (uses: :download:`monks-1.tab <code/monks-1.tab>`): 
    161161 
    162162        .. literalinclude:: code/lookup-lookup.py 
     
    254254    is called and the resulting classifier is returned instead of the learner. 
    255255 
    256 part of `lookup-table.py`_ (uses: `monks-1.tab`_): 
     256part of :download:`lookup-table.py <code/lookup-table.py>` (uses: :download:`monks-1.tab <code/monks-1.tab>`): 
    257257 
    258258.. literalinclude:: code/lookup-table.py 
     
    323323the class_var. It doesn't set the :obj:`Orange.data.variable.Variable.get_value_from`, though. 
    324324 
    325 part of `lookup-table.py`_ (uses: `monks-1.tab`_):: 
     325part of :download:`lookup-table.py <code/lookup-table.py>` (uses: :download:`monks-1.tab <code/monks-1.tab>`):: 
    326326 
    327327    import Orange 
     
    336336alternative call arguments, it offers an easy way to observe feature 
    337337interactions. For this purpose, we shall omit e, and construct a 
    338 ClassifierByDataTable from a and b only (part of `lookup-table.py`_; uses: `monks-1.tab`_): 
     338ClassifierByDataTable from a and b only (part of :download:`lookup-table.py <code/lookup-table.py>`; uses: :download:`monks-1.tab <code/monks-1.tab>`): 
    339339 
    340340.. literalinclude:: code/lookup-table.py 
     
    469469        3      3      yes 
    470470 
    471  
    472 .. _lookup-lookup.py: code/lookup-lookup.py 
    473 .. _lookup-table.py: code/lookup-table.py 
    474 .. _monks-1.tab: code/monks-1.tab 
    475  
    476471""" 
    477472 
  • orange/Orange/classification/majority.py

    r8917 r9349  
    6262This "learning algorithm" will most often be used as a baseline, 
    6363that is, to determine if some other learning algorithm provides 
    64 any information about the class (`majority-classification.py`_,  
    65 uses: `monks-1.tab`_): 
     64any information about the class (:download:`majority-classification.py <code/majority-classification.py>`, 
     65uses: :download:`monks-1.tab <code/monks-1.tab>`): 
    6666 
    6767.. literalinclude:: code/majority-classification.py 
    6868    :lines: 7- 
    69  
    70 .. _majority-classification.py: code/majority-classification.py 
    71 .. _monks-1.tab: code/monks-1.tab 
    7269 
    7370""" 
  • orange/Orange/classification/rules.py

    r9075 r9349  
    3232Usage is consistent with typical learner usage in Orange: 
    3333 
    34 `rules-cn2.py`_ (uses `titanic.tab`_) 
     34:download:`rules-cn2.py <code/rules-cn2.py>` (uses :download:`titanic.tab <code/titanic.tab>`) 
    3535 
    3636.. literalinclude:: code/rules-cn2.py 
    3737    :lines: 7- 
    38  
    39 .. _rules-cn2.py: code/rules-cn2.py 
    40 .. _titanic.tab: code/titanic.tab 
    4138 
    4239The result:: 
     
    158155in description of classes that follows it: 
    159156 
    160 part of `rules-customized.py`_ (uses `titanic.tab`_) 
     157part of :download:`rules-customized.py <code/rules-customized.py>` (uses :download:`titanic.tab <code/titanic.tab>`) 
    161158 
    162159.. literalinclude:: code/rules-customized.py 
    163160    :lines: 7-17 
    164161 
    165 .. _rules-customized.py: code/rules-customized.py 
    166  
    167 In the example, the rule evaluation function was set to an m-estimate of 
    168162probability with m=50. The result is:: 
    169163 
     
    187181different beam width. This is simply written as: 
    188182 
    189 part of `rules-customized.py`_ (uses `titanic.tab`_) 
     183part of :download:`rules-customized.py <code/rules-customized.py>` (uses :download:`titanic.tab <code/titanic.tab>`) 
    190184 
    191185.. literalinclude:: code/rules-customized.py 
  • orange/Orange/classification/svm/__init__.py

    r9229 r9349  
    135135 
    136136.. literalinclude:: code/svm-custom-kernel.py 
    137  
    138  
    139 .. _svm-linear-weights.py: code/svm-linear-weights.py 
    140 .. _svm-custom-kernel.py: code/svm-custom-kernel.py 
    141 .. _svm-easy.py: code/svm-easy.py 
    142 .. _brown-selected.tab: code/brown-selected.tab 
    143 .. _iris.tab: code/iris.tab 
    144 .. _vehicle.tab: code/vehicle.tab 
    145137 
    146138.. _`Support Vector Machine`: http://en.wikipedia.org/wiki/Support_vector_machine 
  • orange/Orange/classification/tree.py

    r9299 r9349  
    2020.. literalinclude:: code/orngTree1.py 
    2121   :lines: 1-4 
    22  
    23 .. _orngTree1.py: code/orngTree1.py 
    2422 
    2523See `Decision tree learning 
     
    114112 
    115113The following function counts the number of nodes in a tree: 
    116  
    117 .. _lenses.tab: code/lenses.tab 
    118 .. _treestructure.py: code/treestructure.py 
    119114 
    120115.. literalinclude:: code/treestructure.py 
     
    263258for :obj:`~TreeLearner.stop`. 
    264259 
    265 .. _tree3.py: code/tree3.py 
    266  
    267260.. literalinclude:: code/tree3.py 
    268261   :lines: 8-23 
     
    822815.. literalinclude:: code/orngTree1.py 
    823816   :lines: 1-4 
    824  
    825 .. _orngTree1.py: code/orngTree1.py 
    826817 
    827818Printing the predicted class at each node, the number 
     
    11461137second largest class in the node: 
    11471138 
    1148 .. _orngTree2.py: code/orngTree2.py 
    1149  
    11501139.. literalinclude:: code/orngTree2.py 
    11511140   :lines: 7-31 
     
    12931282Examples 
    12941283======== 
    1295  
    1296 .. _tree_c45.py: code/tree_c45.py 
    1297 .. _iris.tab: code/iris.tab 
    12981284 
    12991285This 
  • orange/Orange/clustering/kmeans.py

    r8042 r9349  
    1616 
    1717The following code runs k-means clustering and prints out the cluster indexes 
    18 for the last 10 data instances (`kmeans-run.py`_, uses `iris.tab`_): 
     18for the last 10 data instances (:download:`kmeans-run.py <code/kmeans-run.py>`, uses :download:`iris.tab <code/iris.tab>`): 
    1919 
    2020.. literalinclude:: code/kmeans-run.py 
     
    2929o be computed at each iteration we have to set :obj:`minscorechange`, but we can 
    3030leave it at 0 or even set it to a negative value, which allows the score to deteriorate 
    31 by some amount (`kmeans-run-callback.py`_, uses `iris.tab`_): 
     31by some amount (:download:`kmeans-run-callback.py <code/kmeans-run-callback.py>`, uses :download:`iris.tab <code/iris.tab>`): 
    3232 
    3333.. literalinclude:: code/kmeans-run-callback.py 
     
    4444    Iteration: 8, changes: 0, score: 9.8624 
    4545 
    46 Call-back above is used for reporting of the progress, but may as well call a function that plots a selection data projection with corresponding centroid at a given step of the clustering. This is exactly what we did with the following script (`kmeans-trace.py`_, uses `iris.tab`_): 
     46Call-back above is used for reporting of the progress, but may as well call a function that plots a selection data projection with corresponding centroid at a given step of the clustering. This is exactly what we did with the following script (:download:`kmeans-trace.py <code/kmeans-trace.py>`, uses :download:`iris.tab <code/iris.tab>`): 
    4747 
    4848.. literalinclude:: code/kmeans-trace.py 
     
    8282and finds more optimal centroids. The following code compares three different  
    8383initialization methods (random, diversity-based and hierarchical clustering-based)  
    84 in terms of how fast they converge (`kmeans-cmp-init.py`_, uses `iris.tab`_,  
    85 `housing.tab`_, `vehicle.tab`_): 
     84in terms of how fast they converge (:download:`kmeans-cmp-init.py <code/kmeans-cmp-init.py>`, uses :download:`iris.tab <code/iris.tab>`, 
     85:download:`housing.tab <code/housing.tab>`, :download:`vehicle.tab <code/vehicle.tab>`): 
    8686 
    8787.. literalinclude:: code/kmeans-cmp-init.py 
     
    9696 
    9797The following code computes the silhouette score for k=2..7 and plots a  
    98 silhuette plot for k=3 (`kmeans-silhouette.py`_, uses `iris.tab`_): 
     98silhouette plot for k=3 (:download:`kmeans-silhouette.py <code/kmeans-silhouette.py>`, uses :download:`iris.tab <code/iris.tab>`): 
    9999 
    100100.. literalinclude:: code/kmeans-silhouette.py 
     
    114114   Silhouette plot for k=3. 
    115115 
    116 .. _iris.tab: code/iris.tab 
    117 .. _housing.tab: code/housing.tab 
    118 .. _vehicle.tab: code/vehicle.tab 
    119 .. _kmeans-run.py: code/kmeans-run.py 
    120 .. _kmeans-run-callback.py: code/kmeans-run-callback.py 
    121 .. _kmeans-trace.py: code/kmeans-trace.py 
    122 .. _kmeans-cmp-init.py: code/kmeans-cmp-init.py 
    123 .. _kmeans-silhouette.py: code/kmeans-sillhouette.py 
    124116""" 
    125117 
  • orange/Orange/data/sample.py

    r8042 r9349  
    115115Say that you have loaded the lenses domain into ``data``. We'll split 
    116116it into two datasets, the first containing only 6 examples and the other 
    117 containing the rest (from `randomindices2.py`_): 
    118   
    119 .. _randomindices2.py: code/randomindices2.py 
    120 .. _lenses.tab: code/lenses.tab 
    121  
     117containing the rest (from :download:`randomindices2.py <code/randomindices2.py>`): 
     118  
    122119.. literalinclude:: code/randomindices2.py 
    123120    :lines: 11-17 
     
    266263:obj:`stratified` to :obj:`Stratified` will yield an error. 
    267264 
    268 .. _randomindicesn.py: code/randomindicesn.py 
    269  
    270265Let us construct a list of indices that would assign half of examples 
    271266to the first set and a quarter to the second and third (part of 
    272 `randomindicesn.py`_, uses `lenses.tab`_): 
     267:download:`randomindicesn.py <code/randomindicesn.py>`, uses :download:`lenses.tab <code/lenses.tab>`): 
    273268 
    274269.. literalinclude:: code/randomindicesn.py 
     
    294289        Number of folds. Default is 10. 
    295290  
    296 .. _randomindicescv.py: code/randomindicescv.py 
    297   
    298291We shall prepare indices for an ordinary ten-fold cross validation and 
    299292indices for 10 examples for 5-fold cross validation. For the latter, 
    300293we shall only pass the number of examples, which, of course, prevents 
    301 the stratification. Part of `randomindicescv.py`_, uses `lenses.tab`_): 
     294the stratification. Part of :download:`randomindicescv.py <code/randomindicescv.py>`, uses :download:`lenses.tab <code/lenses.tab>`): 
    302295 
    303296.. literalinclude:: code/randomindicescv.py 
  • orange/Orange/data/variable.py

    r9207 r9349  
    409409    :return_type: :class:`Orange.data.variable.Variable` 
    410410     
    411 .. _`variable-reuse.py`: code/variable-reuse.py 
    412  
    413 These following examples (from `variable-reuse.py`_) give the shown results if 
     411The following examples (from :download:`variable-reuse.py <code/variable-reuse.py>`) give the shown results if 
    414412executed only once (in a Python session) and in this order. 
    415413 
  • orange/Orange/ensemble/__init__.py

    r8042 r9349  
    4646validation and observe classification accuracy. 
    4747 
    48 `ensemble.py`_ (uses `lymphography.tab`_) 
     48:download:`ensemble.py <code/ensemble.py>` (uses :download:`lymphography.tab <code/lymphography.tab>`) 
    4949 
    5050.. literalinclude:: code/ensemble.py 
    5151  :lines: 7- 
    52  
    53 .. _lymphography.tab: code/lymphography.tab 
    54 .. _ensemble.py: code/ensemble.py 
    5552 
    5653Running this script, we may get something like:: 
     
    8582to a tree learner on a liver disorder (bupa) and housing data sets. 
    8683 
    87 `ensemble-forest.py`_ (uses `buba.tab`_, `housing.tab`_) 
     84:download:`ensemble-forest.py <code/ensemble-forest.py>` (uses :download:`bupa.tab <code/bupa.tab>`, :download:`housing.tab <code/housing.tab>`) 
    8885 
    8986.. literalinclude:: code/ensemble-forest.py 
    9087  :lines: 7- 
    91  
    92 .. _buba.tab: code/buba.tab 
    93 .. _housing.tab: code/housing.tab 
    94 .. _ensemble-forest.py: code/ensemble-forest.py 
    9588 
    9689Notice that our forest contains 50 trees. Learners are compared through  
     
    113106and minExamples are both set to 5. 
    114107 
    115 `ensemble-forest2.py`_ (uses `buba.tab`_) 
     108:download:`ensemble-forest2.py <code/ensemble-forest2.py>` (uses :download:`bupa.tab <code/bupa.tab>`) 
    116109 
    117110.. literalinclude:: code/ensemble-forest2.py 
    118111  :lines: 7- 
    119  
    120 .. _ensemble-forest2.py: code/ensemble-forest2.py 
    121112 
    122113Running the above code would report on sizes (number of nodes) of the tree 
     
    153144:class:`Orange.data.Table` for details). 
    154145 
    155 `ensemble-forest-measure.py`_ (uses `iris.tab`_) 
     146:download:`ensemble-forest-measure.py <code/ensemble-forest-measure.py>` (uses :download:`iris.tab <code/iris.tab>`) 
    156147 
    157148.. literalinclude:: code/ensemble-forest-measure.py 
    158149  :lines: 7- 
    159  
    160 .. _ensemble-forest-measure.py: code/ensemble-forest-measure.py 
    161 .. _iris.tab: code/iris.tab 
    162150 
    163151Corresponding output:: 
  • orange/Orange/ensemble/forest.py

    r9318 r9349  
    6565    :param trees: number of trees in the forest. 
    6666    :type trees: int 
     67 
    6768    :param attributes: number of randomly drawn features among 
    6869            which to select the best to split the nodes in tree 
     
    7172            :obj:`learner` is specified. 
    7273    :type attributes: int 
     74 
    7375    :param base_learner: A base tree learner. The base learner will be 
    7476        randomized with Random Forest's random 
     
    7779        will not split nodes with less than 5 data instances. 
    7880    :type base_learner: None or 
    79     :class:`Orange.classification.tree.TreeLearner` or 
     81        :class:`Orange.classification.tree.TreeLearner` or 
    8082        :class:`Orange.classification.tree.SimpleTreeLearner` 
     83 
    8184    :param rand: random generator used in bootstrap sampling. If None (default),  
    8285        then ``random.Random(0)`` is used. 
     86 
    8387    :param learner: Tree induction learner. If `None` (default),  
    8488        the :obj:`base_learner` will be used (and randomized). If 
     
    8690        with no additional transformations. 
    8791    :type learner: None or :class:`Orange.core.Learner` 
     92 
    8893    :param callback: a function to be called after every iteration of 
    8994            induction of classifier. This is called with parameter  
    9095            (from 0.0 to 1.0) that gives estimates on learning progress. 
     96 
    9197    :param name: name of the learner. 
    9298    :type name: string 
     99 
    93100    :rtype: :class:`~Orange.ensemble.forest.RandomForestClassifier` or  
    94101            :class:`~Orange.ensemble.forest.RandomForestLearner` 
     102 
    95103    """ 
    96104 
     
    271279    :param trees: number of trees in the forest. 
    272280    :type trees: int 
     281 
    273282    :param attributes: number of randomly drawn features among 
    274283            which to select the best to split the nodes in tree 
     
    277286            :obj:`learner` is specified. 
    278287    :type attributes: int 
     288 
    279289    :param base_learner: A base tree learner. The base learner will be 
    280290        randomized with Random Forest's random 
     
    283293        will not split nodes with less than 5 data instances. 
    284294    :type base_learner: None or 
    285     :class:`Orange.classification.tree.TreeLearner` or 
     295        :class:`Orange.classification.tree.TreeLearner` or 
    286296        :class:`Orange.classification.tree.SimpleTreeLearner` 
     297 
    287298    :param rand: random generator used in bootstrap sampling. If None (default),  
    288299        then ``random.Random(0)`` is used. 
     300 
    289301    :param learner: Tree induction learner. If `None` (default),  
    290302        the :obj:`base_learner` will be used (and randomized). If 
     
    292304        with no additional transformations. 
    293305    :type learner: None or :class:`Orange.core.Learner` 
     306 
    294307    """ 
    295308    def __init__(self, trees=100, attributes=None, rand=None, base_learner=None, learner=None): 
  • orange/Orange/evaluation/reliability.py

    r9283 r9349  
    1919 
    2020The next example shows basic reliability estimation usage 
    21 (`reliability-basic.py`_, uses `housing.tab`_): 
     21(:download:`reliability-basic.py <code/reliability-basic.py>`, uses :download:`housing.tab <code/housing.tab>`): 
    2222 
    2323.. literalinclude:: code/reliability_basic.py 
     
    3333using default reliability estimates, and at the end output reliability 
    3434estimates for the first instance of the data table. 
    35 (`reliability-run.py`_, uses `housing.tab`_): 
     35(:download:`reliability-run.py <code/reliability-run.py>`, uses :download:`housing.tab <code/housing.tab>`): 
    3636 
    3737.. literalinclude:: code/reliability-run.py 
     
    114114 
    115115Here we will walk through a bit longer example of how to use the reliability 
    116 estimate module (`reliability-long.py`_, uses `prostate.tab`_):. 
     116estimate module (:download:`reliability-long.py <code/reliability-long.py>`, uses :download:`prostate.tab <code/prostate.tab>`): 
    117117 
    118118.. literalinclude:: code/reliability-long.py 
     
    153153method do you want to use. You might want to do this to reduce computation time  
    154154or because you think they don't perform well enough. 
    155  
    156 .. _reliability-run.py: code/reliability-run.py 
    157 .. _housing.tab: code/housing.tab 
    158  
    159 .. _reliability-long.py: code/reliability-long.py 
    160 .. _prostate.tab: code/prostate.tab 
    161155 
    162156 
  • orange/Orange/evaluation/scoring.py

    r8146 r9349  
    2626 
    2727Basic cross validation example is shown in the following part of  
    28 (`statExamples.py`_, uses `voting.tab`_ and `vehicle.tab`_): 
     28(:download:`statExamples.py <code/statExamples.py>`, uses :download:`voting.tab <code/voting.tab>` and :download:`vehicle.tab <code/vehicle.tab>`): 
    2929 
    3030.. literalinclude:: code/statExample0.py 
    31  
    32 .. _voting.tab: code/voting.tab 
    33 .. _vehicle.tab: code/vehicle.tab 
    34 .. _statExamples.py: code/statExamples.py 
    3531 
    3632If instances are weighted, weights are taken into account. This can be 
     
    5147 
    5248So, let's compute all this in part of  
    53 (`statExamples.py`_, uses `voting.tab`_ and `vehicle.tab`_) and print it out: 
     49(:download:`statExamples.py <code/statExamples.py>`, uses :download:`voting.tab <code/voting.tab>` and :download:`vehicle.tab <code/vehicle.tab>`) and print it out: 
    5450 
    5551.. literalinclude:: code/statExample1.py 
    5652   :lines: 13- 
    57  
    58 .. _voting.tab: code/voting.tab 
    59 .. _vehicle.tab: code/vehicle.tab 
    60 .. _statExamples.py: code/statExamples.py 
    6153 
    6254The output should look like this:: 
     
    6759    majrty  0.614   0.526   0.474   -0.000 
    6860 
    69 Script `statExamples.py`_ contains another example that also prints out  
     61Script :download:`statExamples.py <code/statExamples.py>` contains another example that also prints out  
    7062the standard errors. 
    71  
    72 .. _statExamples.py: code/statExamples.py 
    7363 
    7464Confusion Matrix 
     
    9686   probability of the positive class is higher than the :obj:`cutoff`. 
    9787 
    98    The example (part of `statExamples.py`_) below shows how setting the 
     88   The example (part of :download:`statExamples.py <code/statExamples.py>`) below shows how setting the 
    9989   cutoff threshold from the default 0.5 to 0.2 affects the confusion matrices 
    10090   for naive Bayesian classifier:: 
     
    10898       print "TP: %i, FP: %i, FN: %s, TN: %i" % (cm.TP, cm.FP, cm.FN, cm.TN) 
    10999 
    110    .. _statExamples.py: code/statExamples.py 
    111     
    112100   The output:: 
    113101    
     
    153141   the matrix for naive Bayesian classifier. 
    154142    
    155    Here we see another example from `statExamples.py`_:: 
     143   Here we see another example from :download:`statExamples.py <code/statExamples.py>`:: 
    156144    
    157145       cm = Orange.evaluation.scoring.confusion_matrices(resVeh)[0] 
     
    160148       for className, classConfusions in zip(classes, cm): 
    161149           print ("%s" + ("\t%i" * len(classes))) % ((className, ) + tuple(classConfusions)) 
    162     
    163    .. _statExamples.py: code/statExamples.py 
    164150    
    165151   So, here's what this nice piece of code gives:: 
     
    223209    
    224210   Let us print out sensitivities and specificities of our classifiers in 
    225    part of `statExamples.py`_:: 
     211   part of :download:`statExamples.py <code/statExamples.py>`:: 
    226212    
    227213       cm = Orange.evaluation.scoring.confusion_matrices(res) 
     
    231217           print "%s\t%5.3f\t%5.3f" % (learners[l].name, Orange.evaluation.scoring.sens(cm[l]), Orange.evaluation.scoring.spec(cm[l])) 
    232218    
    233    .. _statExamples.py: code/statExamples.py 
    234  
    235219ROC Analysis 
    236220============ 
     
    384368.. autofunction:: R2 
    385369 
    386 The following code (`statExamples.py`_) uses most of the above measures to 
     370The following code (:download:`statExamples.py <code/statExamples.py>`) uses most of the above measures to 
    387371score several regression methods. 
    388372 
    389373.. literalinclude:: code/statExamplesRegression.py 
    390  
    391 .. _statExamples.py: code/statExamples.py 
    392374 
    393375The code above produces the following output:: 
     
    405387.. autofunction:: graph_ranks 
    406388 
    407 The following script (`statExamplesGraphRanks.py`_) shows hot to plot a graph: 
     389The following script (:download:`statExamplesGraphRanks.py <code/statExamplesGraphRanks.py>`) shows how to plot a graph: 
    408390 
    409391.. literalinclude:: code/statExamplesGraphRanks.py 
    410  
    411 .. _statExamplesGraphRanks.py: code/statExamplesGraphRanks.py 
    412392 
    413393Code produces the following graph:  
  • orange/Orange/feature/discretization.py

    r8764 r9349  
    3838  user-prescribed cut-off points. 
    3939 
    40 .. _discretization.py: code/discretization.py 
    41  
    4240Instances of classes derived from :class:`Discretization` define a 
    4341single method: the call operator. The object can also be called through 
     
    5351        name of the attribute. 
    5452 
    55 Here's an example. Part of `discretization.py`_: 
     53Here's an example. Part of :download:`discretization.py <code/discretization.py>`: 
    5654 
    5755.. literalinclude:: code/discretization.py 
     
    162160Let us manually construct an interval discretizer with cut-off points at 3.0 
    163161and 5.0. We shall use the discretizer to construct a discretized sepal length  
    164 (part of `discretization.py`_): 
     162(part of :download:`discretization.py <code/discretization.py>`): 
    165163 
    166164.. literalinclude:: code/discretization.py 
     
    179177Can you use the same discretizer for more than one attribute? Yes, as long 
    180178as they have the same cut-off points, of course. Simply call construct_var for each 
    181 continuous attribute (part of `discretization.py`_): 
     179continuous attribute (part of :download:`discretization.py <code/discretization.py>`): 
    182180 
    183181.. literalinclude:: code/discretization.py 
     
    287285intervals. We shall construct an :class:`Orange.data.Table` with discretized 
    288286attributes and print description of the attributes (part 
    289 of `discretization.py`_): 
     287of :download:`discretization.py <code/discretization.py>`): 
    290288 
    291289.. literalinclude:: code/discretization.py 
     
    319317 
    320318As all discretizers, :class:`EquiDistDiscretizer` also has the method  
    321 ``construct_variable`` (part of `discretization.py`_): 
     319``construct_variable`` (part of :download:`discretization.py <code/discretization.py>`): 
    322320 
    323321.. literalinclude:: code/discretization.py 
     
    364362        its information gain is lower than MDL (default: false). 
    365363 
    366 Part of `discretization.py`_: 
     364Part of :download:`discretization.py <code/discretization.py>`: 
    367365 
    368366.. literalinclude:: code/discretization.py 
  • orange/Orange/feature/imputation.py

    r8762 r9349  
    207207minimal values. 
    208208 
    209 `imputation-minimal-imputer.py`_ (uses `voting.tab`_): 
     209:download:`imputation-minimal-imputer.py <code/imputation-minimal-imputer.py>` (uses :download:`voting.tab <code/voting.tab>`): 
    210210 
    211211.. literalinclude:: code/imputation-minimal-imputer.py 
     
    279279    in a single examples and then in the whole table. 
    280280 
    281 `imputation-complex.py`_ (uses `bridges.tab`_): 
     281:download:`imputation-complex.py <code/imputation-complex.py>` (uses :download:`bridges.tab <code/bridges.tab>`): 
    282282 
    283283.. literalinclude:: code/imputation-complex.py 
     
    299299the only attribute whose values will get imputed is "LENGTH"; the imputed value 
    300300will be 1234. 
    301  
    302 `imputation-complex.py`_ (uses `bridges.tab`_): 
    303301 
    304302.. literalinclude:: code/imputation-complex.py 
     
    371369The following imputer predicts the missing attribute values using 
    372370classification and regression trees with the minimum of 20 examples in a leaf.  
    373 Part of `imputation-complex.py`_ (uses `bridges.tab`_): 
     371Part of :download:`imputation-complex.py <code/imputation-complex.py>` (uses :download:`bridges.tab <code/bridges.tab>`): 
    374372 
    375373.. literalinclude:: code/imputation-complex.py 
     
    387385:class:`Orange.regression.mean.MeanLearner` (which 
    388386just remembers the average) for continuous attributes. Part of  
    389 `imputation-complex.py`_ (uses `bridges.tab`_): 
     387:download:`imputation-complex.py <code/imputation-complex.py>` (uses :download:`bridges.tab <code/bridges.tab>`): 
    390388 
    391389.. literalinclude:: code/imputation-complex.py 
     
    397395:class:`Imputer_model` and initialize an empty list of models.  
    398396The following code snippets are from 
    399 `imputation-complex.py`_ (uses `bridges.tab`_): 
     397:download:`imputation-complex.py <code/imputation-complex.py>` (uses :download:`bridges.tab <code/bridges.tab>`): 
    400398 
    401399.. literalinclude:: code/imputation-complex.py 
     
    496494 
    497495The following code shows what this imputer actually does to the domain. 
    498 Part of `imputation-complex.py`_ (uses `bridges.tab`_): 
     496Part of :download:`imputation-complex.py <code/imputation-complex.py>` (uses :download:`bridges.tab <code/bridges.tab>`): 
    499497 
    500498.. literalinclude:: code/imputation-complex.py 
    501499    :lines: 137-151 
    502  
    503500 
    504501The script's output looks like this:: 
     
    600597for wrappers from module orngImpute, and that way properly use them in 
    601598classifier testing procedures. 
    602  
    603 .. _imputation-minimal-imputer.py: code/imputation-minimal-imputer.py 
    604 .. _imputation-complex.py: code/imputation-complex.py 
    605 .. _voting.tab: code/voting.tab 
    606 .. _bridges.tab: code/bridges.tab 
    607599 
    608600""" 
  • orange/Orange/feature/scoring.py

    r9294 r9349  
    4040:obj:`score_all` and by scoring each feature individually, and prints out  
    4141the best three features.  
    42  
    43 .. _scoring-all.py: code/scoring-all.py 
    44 .. _voting.tab: code/voting.tab 
    4542 
    4643.. literalinclude:: code/scoring-all.py 
     
    433430 
    434431.. [Kononenko1995] I Kononenko: On biases in estimating multi-valued attributes, International Joint Conference on Artificial Intelligence, 1995. 
    435  
    436 .. _iris.tab: code/iris.tab 
    437 .. _lenses.tab: code/lenses.tab 
    438 .. _scoring-relief-gainRatio.py: code/scoring-relief-gainRatio.py 
    439 .. _voting.tab: code/voting.tab 
    440 .. _selection-best3.py: code/selection-best3.py 
    441 .. _scoring-info-lenses.py: code/scoring-info-lenses.py 
    442 .. _scoring-info-iris.py: code/scoring-info-iris.py 
    443 .. _scoring-diff-measures.py: code/scoring-diff-measures.py 
    444  
    445 .. _scoring-regression.py: code/scoring-regression.py 
    446 .. _scoring-relief-caching: code/scoring-relief-caching 
    447432 
    448433""" 
  • orange/Orange/feature/selection.py

    r8119 r9349  
    2020used to construct a predictive model. 
    2121 
    22 `selection-best3.py`_ (uses `voting.tab`_): 
     22:download:`selection-best3.py <code/selection-best3.py>` (uses :download:`voting.tab <code/voting.tab>`): 
    2323 
    2424.. literalinclude:: code/selection-best3.py 
     
    7878set of features. 
    7979 
    80 `selection-bayes.py`_ (uses `voting.tab`_): 
     80:download:`selection-bayes.py <code/selection-bayes.py>` (uses :download:`voting.tab <code/voting.tab>`): 
    8181 
    8282.. literalinclude:: code/selection-bayes.py 
     
    106106used. 
    107107 
    108 `selection-filtered-learner.py`_ (uses `voting.tab`_): 
     108:download:`selection-filtered-learner.py <code/selection-filtered-learner.py>` (uses :download:`voting.tab <code/voting.tab>`): 
    109109 
    110110.. literalinclude:: code/selection-filtered-learner.py 
     
    124124    :lines: 25- 
    125125 
    126 Running `selection-filtered-learner.py`_ with three features selected each 
     126Running :download:`selection-filtered-learner.py <code/selection-filtered-learner.py>` with three features selected each 
    127127time a learner is run gives the following result:: 
    128128 
     
    156156* R. Kohavi, G. John: Wrappers for Feature Subset Selection, Artificial 
    157157  Intelligence, 97 (1-2), pages 273-324, 1997 
    158  
    159 .. _selection-best3.py: code/selection-best3.py 
    160 .. _selection-bayes.py: code/selection-bayes.py 
    161 .. _selection-filtered-learner.py: code/selection-filtered-learner.py 
    162 .. _voting.tab: code/voting.tab 
    163158 
    164159""" 
  • orange/Orange/misc/selection.py

    r9301 r9349  
    3232feature with the highest information gain. 
    3333 
    34 part of `misc-selection-bestonthefly.py`_ (uses `lymphography.tab`_) 
     34part of :download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` (uses :download:`lymphography.tab <code/lymphography.tab>`) 
    3535 
    3636.. literalinclude:: code/misc-selection-bestonthefly.py 
     
    4242like this: 
    4343 
    44 part of `misc-selection-bestonthefly.py`_ (uses `lymphography.tab`_) 
     44part of :download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` (uses :download:`lymphography.tab <code/lymphography.tab>`) 
    4545 
    4646.. literalinclude:: code/misc-selection-bestonthefly.py 
     
    5050The other way to do it is through indices. 
    5151 
    52 `misc-selection-bestonthefly.py`_ (uses `lymphography.tab`_) 
     52:download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` (uses :download:`lymphography.tab <code/lymphography.tab>`) 
    5353 
    5454.. literalinclude:: code/misc-selection-bestonthefly.py 
    5555  :lines: 25- 
    56  
    57 .. _misc-selection-bestonthefly.py: code/misc-selection-bestonthefly.py.py 
    58 .. _lymphography.tab: code/lymphography.tab 
    5956 
    6057Here we only give gain ratios to :obj:`BestOnTheFly`, so we don't have 
  • orange/Orange/misc/serverfiles.py

    r8997 r9349  
    7474======== 
    7575 
    76 .. _serverfiles1.py: code/serverfiles1.py 
    77 .. _serverfiles2.py: code/serverfiles2.py 
    78  
    79 Listing local files, files from the repository and downloading all available files from domain "demo" (`serverfiles1.py`_). 
     76Listing local files, files from the repository and downloading all available files from domain "demo" (:download:`serverfiles1.py <code/serverfiles1.py>`). 
    8077 
    8178.. literalinclude:: code/serverfiles1.py 
     
    9592    My domains ['KEGG', 'gene_sets', 'dictybase', 'NCBI_geneinfo', 'GO', 'miRNA', 'demo', 'Taxonomy', 'GEO'] 
    9693 
    97 A domain with a simple file can be built as follows (`serverfiles2.py`_). Of course, 
     94A domain with a simple file can be built as follows (:download:`serverfiles2.py <code/serverfiles2.py>`). Of course, 
    9895the username and password should be valid. 
    9996 
  • orange/Orange/network/__init__.py

    r9200 r9349  
    2323Pajek (.net) or GML file format. 
    2424 
    25 `network-read-nx.py`_ (uses: `K5.net`_): 
     25:download:`network-read-nx.py <code/network-read-nx.py>` (uses: :download:`K5.net <code/K5.net>`): 
    2626 
    2727.. literalinclude:: code/network-read.py 
    2828    :lines: 5-6 
    2929     
    30 .. _network-read-nx.py: code/network-read-nx.py 
    31 .. _K5.net: code/K5.net 
    32  
    3330Visualize a network in NetExplorer widget 
    3431----------------------------------------- 
     
    3633This example demonstrates how to display a network in NetExplorer. 
    3734 
    38 part of `network-widget.py`_ 
     35part of :download:`network-widget.py <code/network-widget.py>` 
    3936 
    4037.. literalinclude:: code/network-widget.py 
     
    4340.. image:: files/network-explorer.png 
    4441    :width: 100% 
    45  
    46 .. _network-widget.py: code/network-widget.py 
    4742 
    4843""" 
  • orange/Orange/network/deprecated.py

    r7999 r9349  
    2929Pajek (.net) or GML file format. 
    3030 
    31 `network-read.py`_ (uses: `K5.net`_): 
     31:download:`network-read.py <code/network-read.py>` (uses: :download:`K5.net <code/K5.net>`): 
    3232 
    3333.. literalinclude:: code/network-read.py 
     
    3939This example demonstrates how to display a network in NetExplorer. 
    4040 
    41 part of `network-widget.py`_ 
     41part of :download:`network-widget.py <code/network-widget.py>` 
    4242 
    4343.. literalinclude:: code/network-widget.py 
     
    6666layout optimization method in this example. 
    6767 
    68 `network-constructor.py`_ 
     68:download:`network-constructor.py <code/network-constructor.py>` 
    6969 
    7070.. literalinclude:: code/network-constructor.py 
     
    8181algorithms. 
    8282 
    83 part of `network-optimization.py`_ 
     83part of :download:`network-optimization.py <code/network-optimization.py>` 
    8484 
    8585.. literalinclude:: code/network-optimization.py 
     
    252252    values in the dictionary should be integers, eg. 
    253253        
    254     part of `network-graph.py`_ 
     254    part of :download:`network-graph.py <code/network-graph.py>` 
    255255 
    256256    .. literalinclude:: code/network-graph.py 
     
    451451^^^^^^^^ 
    452452 
    453 How to use graphs, part of `network-graph.py`_ 
     453How to use graphs, part of :download:`network-graph.py <code/network-graph.py>` 
    454454 
    455455.. literalinclude:: code/network-graph.py 
     
    477477    (None, None, None) 
    478478 
    479 How to use graphs with objects on edges, part of `network-graph-obj.py`_ 
     479How to use graphs with objects on edges, part of :download:`network-graph-obj.py <code/network-graph-obj.py>` 
    480480 
    481481.. literalinclude:: code/network-graph-obj.py 
     
    505505    (None, None, None) 
    506506 
    507 An example of network analysis, part of `network-graph-analysis.py`_ (uses: 
    508 `combination.net`_): 
     507An example of network analysis, part of :download:`network-graph-analysis.py <code/network-graph-analysis.py>` (uses: 
     508:download:`combination.net <code/combination.net>`): 
    509509 
    510510.. literalinclude:: code/network-graph-analysis.py 
     
    551551.. autoclass:: Orange.network.NetworkClustering 
    552552   :members: 
    553  
    554 .. _network-constructor.py: code/network-constructor.py 
    555 .. _network-optimization.py: code/network-optimization.py 
    556 .. _network-read.py: code/network-read.py 
    557 .. _K5.net: code/K5.net 
    558 .. _combination.net: code/combination.net 
    559 .. _network-widget.py: code/network-widget.py 
    560 .. _network-graph-analysis.py: code/network-graph-analysis.py 
    561 .. _network-graph.py: code/network-graph.py 
    562 .. _network-graph-obj.py: code/network-graph-obj.py 
    563553 
    564554""" 
  • orange/Orange/network/network.py

    r9007 r9349  
    306306    matplotlib.  
    307307         
    308     `network-constructor-nx.py`_ 
     308    :download:`network-constructor-nx.py <code/network-constructor-nx.py>` 
    309309     
    310310    .. literalinclude:: code/network-constructor-nx.py 
     
    313313     
    314314    .. image:: files/network-K5-random.png 
    315      
    316     .. _network-constructor-nx.py: code/network-constructor-nx.py 
    317315     
    318316    *Network layout optimization* 
     
    321319    included algorithms. 
    322320     
    323     part of `network-optimization-nx.py`_ 
     321    part of :download:`network-optimization-nx.py <code/network-optimization-nx.py>` 
    324322     
    325323    .. literalinclude:: code/network-optimization-nx.py 
     
    329327     
    330328    .. image:: files/network-K5-fr.png 
    331      
    332     .. _network-optimization-nx.py: code/network-optimization-nx.py 
    333329     
    334330    """ 
  • orange/Orange/optimization/__init__.py

    r8042 r9349  
    5555This is how you use the learner. 
    5656 
    57 part of `optimization-thresholding1.py`_ 
     57part of :download:`optimization-thresholding1.py <code/optimization-thresholding1.py>` 
    5858 
    5959.. literalinclude:: code/optimization-thresholding1.py 
     
    6969still unimportant), while setting it at 80% is a bad idea. Or is it? 
    7070 
    71 part of `optimization-thresholding2.py`_ 
     71part of :download:`optimization-thresholding2.py <code/optimization-thresholding2.py>` 
    7272 
    7373.. literalinclude:: code/optimization-thresholding2.py 
     
    8989   :members:  
    9090    
    91 .. _optimization-thresholding1.py: code/optimization-thresholding1.py 
    92 .. _optimization-thresholding2.py: code/optimization-thresholding2.py 
    93  
    9491""" 
    9592 
     
    210207    for a tree classifier. 
    211208     
    212     part of `optimization-tuning1.py`_ 
     209    part of :download:`optimization-tuning1.py <code/optimization-tuning1.py>` 
    213210 
    214211    .. literalinclude:: code/optimization-tuning1.py 
     
    234231    tree learner, and test them both. 
    235232     
    236     part of `optimization-tuning1.py`_ 
     233    part of :download:`optimization-tuning1.py <code/optimization-tuning1.py>` 
    237234 
    238235    .. literalinclude:: code/optimization-tuning1.py 
     
    248245        Untuned tree: 0.930 
    249246        Tuned tree: 0.986 
    250      
    251     .. _optimization-tuning1.py: code/optimization-tuning1.py 
    252247     
    253248    """ 
     
    312307    tuner as follows: 
    313308     
    314     `optimization-tuningm.py`_ 
     309    :download:`optimization-tuningm.py <code/optimization-tuningm.py>` 
    315310 
    316311    .. literalinclude:: code/optimization-tuningm.py 
     
    319314    :obj:`Orange.optimization.Tune1Parameter`. 
    320315     
    321     .. _optimization-tuningm.py: code/optimization-tuningm.py 
    322          
    323316    """ 
    324317     
  • orange/Orange/preprocess/outliers.py

    r8059 r9349  
    1414.. rubric:: Examples 
    1515 
    16 .. _outliers1.py: code/outlier1.py 
    17 .. _outliers2.py: code/outlier2.py 
    18  
    1916The following example prints a list of Z-values of examples in bridges dataset 
    20 (`outliers1.py`_). 
     17(:download:`outlier1.py <code/outlier1.py>`). 
    2118 
    2219.. literalinclude:: code/outlier1.py 
     
    2421The following example prints 5 examples with highest Z-scores. Euclidean 
    2522distance is used as a distance measurement and average distance is calculated 
    26 over 3 nearest neighbours (`outliers2.py`_). 
     23over 3 nearest neighbours (:download:`outlier2.py <code/outlier2.py>`). 
    2724 
    2825.. literalinclude:: code/outlier2.py 
  • orange/Orange/projection/correspondence.py

    r8042 r9349  
    1717    :exclude-members: A, B, D, F, G 
    1818              
    19  
    20  
    2119Example 
    2220------- 
    2321 
    2422Data table given below represents smoking habits of different employees 
    25 in a company (computed from smokers_ct.tab). 
     23in a company (computed from :download:`smokers_ct.tab <code/smokers_ct.tab>`). 
    2624 
    2725    ================  ====  =====  ======  =====  ========== 
     
    5856 
    5957So let's load the data, compute the contingency and do the analysis 
    60 (`correspondence.py`_, uses `smokers_ct.tab`_)):: 
     58(:download:`correspondence.py <code/correspondence.py>`, uses :download:`smokers_ct.tab <code/smokers_ct.tab>`):: 
    6159     
    6260    from Orange.projection import correspondence 
     
    159157 
    160158.. autofunction:: burt_table 
    161  
    162 .. _correspondence.py: code/correspondence.py 
    163 .. _smokers_ct.tab: code/smokers_ct.tab 
    164159 
    165160""" 
  • orange/Orange/projection/mds.py

    r8042 r9349  
    5555(not included with orange, http://matplotlib.sourceforge.net/). 
    5656 
    57 Example (`mds-scatterplot.py`_, uses `iris.tab`_) 
     57Example (:download:`mds-scatterplot.py <code/mds-scatterplot.py>`, uses :download:`iris.tab <code/iris.tab>`) 
    5858 
    5959.. literalinclude:: code/mds-scatterplot.py 
    6060    :lines: 7- 
    61  
    62 .. _mds-scatterplot.py: code/mds-scatterplot.py 
    63 .. _iris.tab: code/iris.tab 
    6461 
    6562The script produces a file *mds-scatterplot.py.png*. Color denotes 
     
    7976time. 
    8077 
    81 Example (`mds-advanced.py`_, uses `iris.tab`_) 
     78Example (:download:`mds-advanced.py <code/mds-advanced.py>`, uses :download:`iris.tab <code/iris.tab>`) 
    8279 
    8380.. literalinclude:: code/mds-advanced.py 
    8481    :lines: 7- 
    85  
    86 .. _mds-advanced.py: code/mds-advanced.py 
    8782 
    8883A few representative lines of the output are:: 
  • orange/Orange/projection/som.py

    r8762 r9349  
    8282 
    8383Class :obj:`Map` stores the self-organizing map composed of :obj:`Node` objects. The code below 
    84 (`som-node.py`_, uses `iris.tab`_) shows an example how to access the information stored in the  
     84(:download:`som-node.py <code/som-node.py>`, uses :download:`iris.tab <code/iris.tab>`) shows an example of how to access the information stored in the 
    8585node of the map: 
    8686 
     
    9898======== 
    9999 
    100 .. _som-mapping.py: code/som-mapping.py 
    101 .. _som-node.py: code/som-node.py 
    102 .. _iris.tab: code/iris.tab 
    103  
    104 The following code  (`som-mapping.py`_, uses `iris.tab`_) infers self-organizing map from Iris data set. The map is rather small, and consists  
     100The following code  (:download:`som-mapping.py <code/som-mapping.py>`, uses :download:`iris.tab <code/iris.tab>`) infers self-organizing map from Iris data set. The map is rather small, and consists  
    105101of only 9 cells. We optimize the network, and then report how many data instances were mapped 
    106102into each cell. The second part of the code reports on data instances from one of the corner cells: 
  • orange/Orange/regression/mean.py

    r8042 r9349  
    2626Here's a simple example. 
    2727 
    28 `mean-regression.py`_ (uses: `housing.tab`_): 
     28:download:`mean-regression.py <code/mean-regression.py>` (uses: :download:`housing.tab <code/housing.tab>`): 
    2929 
    3030.. literalinclude:: code/mean-regression.py 
    3131    :lines: 7- 
    32  
    33 .. _mean-regression.py: code/mean-regression.py 
    34 .. _housing.tab: code/housing.tab 
    3532 
    3633""" 
  • orange/Orange/regression/tree.py

    r9164 r9349  
    1212but uses a different set of functions to evaluate node splitting and stop 
    1313criteria. Usage of regression trees is straightforward as demonstrated on the 
    14 following example (`regression-tree-run.py`_, uses `servo.tab`_): 
     14following example (:download:`regression-tree-run.py <code/regression-tree-run.py>`, uses :download:`servo.tab <code/servo.tab>`): 
    1515 
    1616.. literalinclude:: code/regression-tree-run.py 
     
    2222.. autoclass:: TreeClassifier 
    2323    :members: 
    24  
    25 .. _regression-tree-run.py: code/regression-tree-run.py 
    26 .. _servo.tab: code/servo.tab 
    27  
    2824 
    2925================= 
  • orange/Orange/statistics/basic.py

    r8042 r9349  
    9595        variables in the domain. 
    9696     
    97     part of `distributions-basic-stat.py`_ (uses monks-1.tab) 
     97    part of :download:`distributions-basic-stat.py <code/distributions-basic-stat.py>` (uses :download:`monks-1.tab <code/monks-1.tab>`) 
    9898     
    9999    .. literalinclude:: code/distributions-basic-stat.py 
     
    109109 
    110110 
    111     part of `distributions-basic-stat`_ (uses iris.tab) 
     111    part of :download:`distributions-basic-stat.py <code/distributions-basic-stat.py>` (uses :download:`iris.tab <code/iris.tab>`) 
    112112     
    113113    .. literalinclude:: code/distributions-basic-stat.py 
     
    118118        5.84333467484  
    119119 
    120 .. _distributions-basic-stat: code/distributions-basic-stat.py 
    121 .. _distributions-basic-stat.py: code/distributions-basic-stat.py 
    122120""" 
    123121 