Changeset 9994:1073e0304a87 in orange


Timestamp: 02/07/12 22:35:42 (2 years ago)
Author: Matija Polajnar <matija.polajnar@…>
Branch: default
Message: Remove links from documentation to datasets. Remove datasets reference directory.
Files: 23 deleted, 35 edited

  • Orange/classification/knn.py

    r9724 r9994  
    136136into training (80%) and testing (20%) instances. We will use the former  
    137137for "training" the classifier and test it on five testing instances  
    138 randomly selected from a part of (:download:`knnlearner.py <code/knnlearner.py>`, uses :download:`iris.tab <code/iris.tab>`): 
     138randomly selected from a part of (:download:`knnlearner.py <code/knnlearner.py>`): 
    139139 
    140140.. literalinclude:: code/knnExample1.py 
     
    157157decide to do so, the distance_constructor must be set to an instance 
    158158of one of the classes for distance measuring. This can be seen in the following 
    159 part of (:download:`knnlearner.py <code/knnlearner.py>`, uses :download:`iris.tab <code/iris.tab>`): 
     159part of (:download:`knnlearner.py <code/knnlearner.py>`): 
    160160 
    161161.. literalinclude:: code/knnExample2.py 
     
    271271-------- 
    272272 
    273 The following script (:download:`knnInstanceDistance.py <code/knnInstanceDistance.py>`, uses :download:`lenses.tab <code/lenses.tab>`) 
     273The following script (:download:`knnInstanceDistance.py <code/knnInstanceDistance.py>`) 
    274274shows how to find the five nearest neighbors of the first instance 
    275275in the lenses dataset. 
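For readers without the example scripts at hand, the pattern the knn.py documentation describes looks roughly like this (a minimal sketch, not part of the changeset; it assumes Orange 2.x with the bundled iris data set, and the 80/20 split and k=10 are illustrative choices)::

    import Orange

    iris = Orange.data.Table("iris")
    # split into 80% training and 20% testing instances
    indices = Orange.data.sample.SubsetIndices2(p0=0.80)(iris)
    train, test = iris.select(indices, 0), iris.select(indices, 1)

    # build a k-nearest-neighbours classifier and classify five test instances
    knn = Orange.classification.knn.kNNLearner(train, k=10)
    for instance in test[:5]:
        print knn(instance), "originally", instance.getclass()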
  • Orange/classification/lookup.py

    r9960 r9994  
    2121they usually reside in :obj:`~Orange.feature.Descriptor.get_value_from` fields of constructed 
    2222features to facilitate their automatic computation. For instance, 
    23 the following script shows how to translate the :download:`monks-1.tab <code/monks-1.tab>` data set 
     23the following script shows how to translate the `monks-1.tab` data set 
    2424features into a more useful subset that will only include the features 
    2525``a``, ``b``, ``e``, and features that will tell whether ``a`` and ``b`` are equal and 
    2626whether ``e`` is 1 (don't bother about the details, they follow later;  
    27 :download:`lookup-lookup.py <code/lookup-lookup.py>`, uses: :download:`monks-1.tab <code/monks-1.tab>`): 
     27:download:`lookup-lookup.py <code/lookup-lookup.py>`): 
    2828 
    2929.. literalinclude:: code/lookup-lookup.py 
     
    158158        Let's see some indices for randomly chosen examples from the original table. 
    159159         
    160         part of :download:`lookup-lookup.py <code/lookup-lookup.py>` (uses: :download:`monks-1.tab <code/monks-1.tab>`): 
     160        part of :download:`lookup-lookup.py <code/lookup-lookup.py>`: 
    161161 
    162162        .. literalinclude:: code/lookup-lookup.py 
     
    254254    is called and the resulting classifier is returned instead of the learner. 
    255255 
    256 part of :download:`lookup-table.py <code/lookup-table.py>` (uses: :download:`monks-1.tab <code/monks-1.tab>`): 
     256part of :download:`lookup-table.py <code/lookup-table.py>`: 
    257257 
    258258.. literalinclude:: code/lookup-table.py 
     
    323323the class_var. It doesn't set the :obj:`Orange.feature.Descriptor.get_value_from`, though. 
    324324 
    325 part of :download:`lookup-table.py <code/lookup-table.py>` (uses: :download:`monks-1.tab <code/monks-1.tab>`):: 
     325part of :download:`lookup-table.py <code/lookup-table.py>`:: 
    326326 
    327327    import Orange 
     
    336336alternative call arguments, it offers an easy way to observe feature 
    337337interactions. For this purpose, we shall omit e, and construct a 
    338 ClassifierByDataTable from a and b only (part of :download:`lookup-table.py <code/lookup-table.py>`; uses: :download:`monks-1.tab <code/monks-1.tab>`): 
     338ClassifierByDataTable from a and b only (part of :download:`lookup-table.py <code/lookup-table.py>`): 
    339339 
    340340.. literalinclude:: code/lookup-table.py 
  • Orange/classification/majority.py

    r9671 r9994  
    6262This "learning algorithm" will most often be used as a baseline, 
    6363that is, to determine if some other learning algorithm provides 
    64 any information about the class (:download:`majority-classification.py <code/majority-classification.py>`, 
    65 uses: :download:`monks-1.tab <code/monks-1.tab>`): 
     64any information about the class (:download:`majority-classification.py <code/majority-classification.py>`): 
    6665 
    6766.. literalinclude:: code/majority-classification.py 
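A minimal sketch of the baseline comparison the majority.py documentation describes (not part of the changeset; it assumes Orange 2.x and that the monks-1 data set is available by name)::

    import Orange

    data = Orange.data.Table("monks-1")
    learners = [Orange.classification.bayes.NaiveLearner(),
                Orange.classification.majority.MajorityLearner()]
    # cross-validate both learners; the majority classifier serves as the baseline
    results = Orange.evaluation.testing.cross_validation(learners, data, folds=10)
    print ["%.3f" % ca for ca in Orange.evaluation.scoring.CA(results)]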
  • Orange/classification/rules.py

    r9936 r9994  
    3232Usage is consistent with typical learner usage in Orange: 
    3333 
    34 :download:`rules-cn2.py <code/rules-cn2.py>` (uses :download:`titanic.tab <code/titanic.tab>`) 
     34:download:`rules-cn2.py <code/rules-cn2.py>` 
    3535 
    3636.. literalinclude:: code/rules-cn2.py 
     
    155155in the description of classes that follows it:
    156156 
    157 part of :download:`rules-customized.py <code/rules-customized.py>` (uses :download:`titanic.tab <code/titanic.tab>`) 
     157part of :download:`rules-customized.py <code/rules-customized.py>` 
    158158 
    159159.. literalinclude:: code/rules-customized.py 
     
    181181different beam width. This is simply written as:
    182182 
    183 part of :download:`rules-customized.py <code/rules-customized.py>` (uses :download:`titanic.tab <code/titanic.tab>`) 
     183part of :download:`rules-customized.py <code/rules-customized.py>` 
    184184 
    185185.. literalinclude:: code/rules-customized.py 
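A minimal CN2 sketch along the lines of rules-cn2.py (not part of the changeset; it assumes Orange 2.x and the bundled titanic data set)::

    import Orange

    data = Orange.data.Table("titanic")
    learner = Orange.classification.rules.CN2Learner()
    classifier = learner(data)
    # print the induced rules in a readable form
    for rule in classifier.rules:
        print Orange.classification.rules.rule_to_string(rule)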
  • Orange/clustering/kmeans.py

    r9977 r9994  
    1616 
    1717The following code runs k-means clustering and prints out the cluster indexes 
    18 for the last 10 data instances (:download:`kmeans-run.py <code/kmeans-run.py>`, uses :download:`iris.tab <code/iris.tab>`): 
     18for the last 10 data instances (:download:`kmeans-run.py <code/kmeans-run.py>`): 
    1919 
    2020.. literalinclude:: code/kmeans-run.py 
     
    2929to be computed at each iteration we have to set :obj:`minscorechange`, but we can
    3030leave it at 0 or even set it to a negative value, which allows the score to deteriorate 
    31 by some amount (:download:`kmeans-run-callback.py <code/kmeans-run-callback.py>`, uses :download:`iris.tab <code/iris.tab>`): 
     31by some amount (:download:`kmeans-run-callback.py <code/kmeans-run-callback.py>`): 
    3232 
    3333.. literalinclude:: code/kmeans-run-callback.py 
     
    4444    Iteration: 8, changes: 0, score: 9.8624 
    4545 
    46 The callback above is used to report progress, but it could just as well call a function that plots a projection of selected data with the corresponding centroids at a given step of the clustering. This is exactly what we did with the following script (:download:`kmeans-trace.py <code/kmeans-trace.py>`, uses :download:`iris.tab <code/iris.tab>`):
     46The callback above is used to report progress, but it could just as well call a function that plots a projection of selected data with the corresponding centroids at a given step of the clustering. This is exactly what we did with the following script (:download:`kmeans-trace.py <code/kmeans-trace.py>`):
    4747 
    4848.. literalinclude:: code/kmeans-trace.py 
     
    8282and finds more optimal centroids. The following code compares three different  
    8383initialization methods (random, diversity-based and hierarchical clustering-based)  
    84 in terms of how fast they converge (:download:`kmeans-cmp-init.py <code/kmeans-cmp-init.py>`, uses :download:`iris.tab <code/iris.tab>`, 
    85 :download:`housing.tab <code/housing.tab>`, :download:`vehicle.tab <code/vehicle.tab>`): 
     84in terms of how fast they converge (:download:`kmeans-cmp-init.py <code/kmeans-cmp-init.py>`): 
    8685 
    8786.. literalinclude:: code/kmeans-cmp-init.py 
     
    9695 
    9796The following code computes the silhouette score for k=2..7 and plots a  
    98 silhouette plot for k=3 (:download:`kmeans-silhouette.py <code/kmeans-silhouette.py>`, uses :download:`iris.tab <code/iris.tab>`):
     97silhouette plot for k=3 (:download:`kmeans-silhouette.py <code/kmeans-silhouette.py>`):
    9998 
    10099.. literalinclude:: code/kmeans-silhouette.py 
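A minimal sketch of the k-means call the kmeans.py documentation refers to (not part of the changeset; it assumes Orange 2.x and the bundled iris data set, with three clusters chosen for illustration)::

    import Orange

    iris = Orange.data.Table("iris")
    # cluster the data into three groups and print the cluster indices
    # of the last 10 data instances
    km = Orange.clustering.kmeans.Clustering(iris, centroids=3)
    print km.clusters[-10:]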
  • Orange/data/sample.py

    r9697 r9994  
    265265Let us construct a list of indices that would assign half of examples 
    266266to the first set and a quarter to the second and third (part of 
    267 :download:`randomindicesn.py <code/randomindicesn.py>`, uses :download:`lenses.tab <code/lenses.tab>`): 
     267:download:`randomindicesn.py <code/randomindicesn.py>`): 
    268268 
    269269.. literalinclude:: code/randomindicesn.py 
     
    292292indices for 10 examples for 5-fold cross validation. For the latter, 
    293293we shall only pass the number of examples, which, of course, prevents 
    294 the stratification. Part of :download:`randomindicescv.py <code/randomindicescv.py>`, uses :download:`lenses.tab <code/lenses.tab>`): 
     294the stratification. Part of :download:`randomindicescv.py <code/randomindicescv.py>`): 
    295295 
    296296.. literalinclude:: code/randomindicescv.py 
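A minimal sketch of the two index generators described above (not part of the changeset; it assumes Orange 2.x and the bundled lenses data set; attribute names follow the Orange.data.sample documentation)::

    import Orange

    lenses = Orange.data.Table("lenses")

    # half of the examples to the first set, a quarter to the second and third
    indices_n = Orange.data.sample.SubsetIndicesN(p=[0.5, 0.25])
    print indices_n(lenses)

    # 5-fold cross-validation indices; passing just the number of examples
    # (instead of the data) prevents stratification
    indices_cv = Orange.data.sample.SubsetIndicesCV(folds=5)
    print indices_cv(10)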
  • Orange/ensemble/__init__.py

    r9671 r9994  
    4646validation and observe classification accuracy. 
    4747 
    48 :download:`ensemble.py <code/ensemble.py>` (uses :download:`lymphography.tab <code/lymphography.tab>`) 
     48:download:`ensemble.py <code/ensemble.py>` 
    4949 
    5050.. literalinclude:: code/ensemble.py 
     
    8282to a tree learner on a liver disorder (bupa) and housing data sets. 
    8383 
    84 :download:`ensemble-forest.py <code/ensemble-forest.py>` (uses :download:`bupa.tab <code/bupa.tab>`, :download:`housing.tab <code/housing.tab>`) 
     84:download:`ensemble-forest.py <code/ensemble-forest.py>` 
    8585 
    8686.. literalinclude:: code/ensemble-forest.py 
     
    106106and minExamples are both set to 5. 
    107107 
    108 :download:`ensemble-forest2.py <code/ensemble-forest2.py>` (uses :download:`bupa.tab <code/bupa.tab>`) 
     108:download:`ensemble-forest2.py <code/ensemble-forest2.py>` 
    109109 
    110110.. literalinclude:: code/ensemble-forest2.py 
     
    144144:class:`Orange.data.Table` for details). 
    145145 
    146 :download:`ensemble-forest-measure.py <code/ensemble-forest-measure.py>` (uses :download:`iris.tab <code/iris.tab>`) 
     146:download:`ensemble-forest-measure.py <code/ensemble-forest-measure.py>` 
    147147 
    148148.. literalinclude:: code/ensemble-forest-measure.py 
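A minimal sketch of comparing a tree learner with its bagged and boosted variants, as the ensemble documentation describes (not part of the changeset; it assumes Orange 2.x; the data set and fold count are illustrative)::

    import Orange

    data = Orange.data.Table("lymphography")
    tree = Orange.classification.tree.TreeLearner()
    learners = [tree,
                Orange.ensemble.bagging.BaggedLearner(tree),
                Orange.ensemble.boosting.BoostedLearner(tree)]
    results = Orange.evaluation.testing.cross_validation(learners, data, folds=5)
    print ["%.3f" % ca for ca in Orange.evaluation.scoring.CA(results)]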
  • Orange/misc/selection.py

    r9775 r9994  
    3232feature with the highest information gain. 
    3333 
    34 part of :download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` (uses :download:`lymphography.tab <code/lymphography.tab>`) 
     34part of :download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` 
    3535 
    3636.. literalinclude:: code/misc-selection-bestonthefly.py 
     
    4242like this: 
    4343 
    44 part of :download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` (uses :download:`lymphography.tab <code/lymphography.tab>`) 
     44part of :download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` 
    4545 
    4646.. literalinclude:: code/misc-selection-bestonthefly.py 
     
    5050The other way to do it is through indices. 
    5151 
    52 :download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` (uses :download:`lymphography.tab <code/lymphography.tab>`) 
     52:download:`misc-selection-bestonthefly.py <code/misc-selection-bestonthefly.py>` 
    5353 
    5454.. literalinclude:: code/misc-selection-bestonthefly.py 
  • Orange/multilabel/br.py

    r9671 r9994  
    4545 
    4646The following example demonstrates a straightforward invocation of 
    47 this algorithm (:download:`mlc-classify.py <code/mlc-classify.py>`, uses 
    48 :download:`emotions.tab <code/emotions.tab>`): 
     47this algorithm (:download:`mlc-classify.py <code/mlc-classify.py>`): 
    4948 
    5049.. literalinclude:: code/mlc-classify.py 
  • Orange/multilabel/brknn.py

    r9671 r9994  
    3030 
    3131The following example demonstrates a straightforward invocation of 
    32 this algorithm (:download:`mlc-classify.py <code/mlc-classify.py>`, uses 
    33 :download:`emotions.tab <code/emotions.tab>`): 
     32this algorithm (:download:`mlc-classify.py <code/mlc-classify.py>`): 
    3433 
    3534.. literalinclude:: code/mlc-classify.py 
  • Orange/multilabel/lp.py

    r9922 r9994  
    3434 
    3535The following example demonstrates a straightforward invocation of 
    36 this algorithm (:download:`mlc-classify.py <code/mlc-classify.py>`, uses 
    37 :download:`emotions.tab <code/emotions.tab>`): 
     36this algorithm (:download:`mlc-classify.py <code/mlc-classify.py>`): 
    3837 
    3938.. literalinclude:: code/mlc-classify.py 
  • Orange/multilabel/mlknn.py

    r9671 r9994  
    3636 
    3737The following example demonstrates a straightforward invocation of 
    38 this algorithm (:download:`mlc-classify.py <code/mlc-classify.py>`, uses 
    39 :download:`emotions.tab <code/emotions.tab>`): 
     38this algorithm (:download:`mlc-classify.py <code/mlc-classify.py>`): 
    4039 
    4140.. literalinclude:: code/mlc-classify.py 
  • Orange/multilabel/mulan.py

    r9927 r9994  
    4040 
    4141if __name__=="__main__": 
    42     table = trans_mulan_data("../../doc/datasets/emotions.xml","../../doc/datasets/emotions.arff") 
     42    table = trans_mulan_data("../doc/datasets/emotions.xml","../doc/datasets/emotions.arff") 
    4343     
    4444    for i in range(10): 
    4545        print table[i] 
    4646     
    47     table.save("emotions.tab") 
     47    table.save("/tmp/emotions.tab") 
  • Orange/multitarget/__init__.py

    r9671 r9994  
    2424:download:`generate_multitarget.py <code/generate_multitarget.py>`) to show 
    2525some basic functionalities (part of 
    26 :download:`multitarget.py <code/multitarget.py>`, uses 
    27 :download:`multitarget-synthetic.tab <code/multitarget-synthetic.tab>`). 
     26:download:`multitarget.py <code/multitarget.py>`). 
    2827 
    2928.. literalinclude:: code/multitarget.py 
  • Orange/multitarget/tree.py

    r9922 r9994  
    2424The following example demonstrates how to build a prediction model with 
    2525MultitargetTreeLearner and use it to predict (multiple) class values for 
    26 a given instance (:download:`multitarget.py <code/multitarget.py>`, 
    27 uses :download:`test-pls.tab <code/test-pls.tab>`): 
     26a given instance (:download:`multitarget.py <code/multitarget.py>`): 
    2827 
    2928.. literalinclude:: code/multitarget.py 
  • Orange/network/__init__.py

    r9671 r9994  
    2323Pajek (.net) or GML file format. 
    2424 
    25 :download:`network-read-nx.py <code/network-read-nx.py>` (uses: :download:`K5.net <code/K5.net>`): 
     25:download:`network-read-nx.py <code/network-read-nx.py>`: 
    2626 
    2727.. literalinclude:: code/network-read.py 
  • Orange/network/deprecated.py

    r9922 r9994  
    2929Pajek (.net) or GML file format. 
    3030 
    31 :download:`network-read.py <code/network-read.py>` (uses: :download:`K5.net <code/K5.net>`): 
     31:download:`network-read.py <code/network-read.py>`: 
    3232 
    3333.. literalinclude:: code/network-read.py 
  • Orange/projection/correspondence.py

    r9671 r9994  
    2121 
    2222Data table given below represents smoking habits of different employees 
    23 in a company (computed from :download:`smokers_ct.tab <code/smokers_ct.tab>`). 
     23in a company (computed from `smokers_ct.tab`). 
    2424 
    2525    ================  ====  =====  ======  =====  ========== 
     
    5656 
    5757So let's load the data, compute the contingency, and do the analysis
    58 (:download:`correspondence.py <code/correspondence.py>`, uses :download:`smokers_ct.tab <code/smokers_ct.tab>`):: 
     58(:download:`correspondence.py <code/correspondence.py>`):: 
    5959     
    6060    from Orange.projection import correspondence 
  • Orange/projection/mds.py

    r9916 r9994  
    5555(not included with orange, http://matplotlib.sourceforge.net/). 
    5656 
    57 Example (:download:`mds-scatterplot.py <code/mds-scatterplot.py>`, uses :download:`iris.tab <code/iris.tab>`) 
     57Example (:download:`mds-scatterplot.py <code/mds-scatterplot.py>`) 
    5858 
    5959.. literalinclude:: code/mds-scatterplot.py 
     
    7676time. 
    7777 
    78 Example (:download:`mds-advanced.py <code/mds-advanced.py>`, uses :download:`iris.tab <code/iris.tab>`) 
     78Example (:download:`mds-advanced.py <code/mds-advanced.py>`) 
    7979 
    8080.. literalinclude:: code/mds-advanced.py 
  • Orange/projection/som.py

    r9671 r9994  
    8282 
    8383Class :obj:`Map` stores the self-organizing map composed of :obj:`Node` objects. The code below 
    84 (:download:`som-node.py <code/som-node.py>`, uses :download:`iris.tab <code/iris.tab>`) shows an example of how to access the information stored in the
     84(:download:`som-node.py <code/som-node.py>`) shows an example of how to access the information stored in the
    8585node of the map: 
    8686 
     
    9898======== 
    9999 
    100 The following code (:download:`som-mapping.py <code/som-mapping.py>`, uses :download:`iris.tab <code/iris.tab>`) infers a self-organizing map from the Iris data set. The map is rather small and consists
     100The following code (:download:`som-mapping.py <code/som-mapping.py>`) infers a self-organizing map from the Iris data set. The map is rather small and consists
    101101of only 9 cells. We optimize the network, and then report how many data instances were mapped 
    102102into each cell. The second part of the code reports on data instances from one of the corner cells: 
  • Orange/regression/mean.py

    r9671 r9994  
    2626Here's a simple example. 
    2727 
    28 :download:`mean-regression.py <code/mean-regression.py>` (uses: :download:`housing.tab <code/housing.tab>`): 
     28:download:`mean-regression.py <code/mean-regression.py>`: 
    2929 
    3030.. literalinclude:: code/mean-regression.py 
  • Orange/regression/tree.py

    r9671 r9994  
    1212but uses a different set of functions to evaluate node splitting and stop 
    1313criteria. Usage of regression trees is straightforward as demonstrated on the 
    14 following example (:download:`regression-tree-run.py <code/regression-tree-run.py>`, uses :download:`servo.tab <code/servo.tab>`): 
     14following example (:download:`regression-tree-run.py <code/regression-tree-run.py>`): 
    1515 
    1616.. literalinclude:: code/regression-tree-run.py 
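A minimal regression-tree sketch in the spirit of regression-tree-run.py (not part of the changeset; it assumes Orange 2.x and that the servo data set is available by name)::

    import Orange

    servo = Orange.data.Table("servo")
    tree = Orange.regression.tree.TreeLearner(servo)
    # predict the (continuous) class of the first five instances
    for instance in servo[:5]:
        print tree(instance), instance.getclass()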
  • Orange/statistics/basic.py

    r9671 r9994  
    9797        variables in the domain. 
    9898     
    99     part of :download:`distributions-basic-stat.py <code/distributions-basic-stat.py>` (uses :download:`monks-1.tab <code/monks-1.tab>`) 
     99    part of :download:`distributions-basic-stat.py <code/distributions-basic-stat.py>` 
    100100     
    101101    .. literalinclude:: code/distributions-basic-stat.py 
     
    111111 
    112112 
    113     part of :download:`distributions-basic-stat.py <code/distributions-basic-stat.py>` (uses :download:`iris.tab <code/iris.tab>`) 
     113    part of :download:`distributions-basic-stat.py <code/distributions-basic-stat.py>` 
    114114     
    115115    .. literalinclude:: code/distributions-basic-stat.py 
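A minimal sketch of computing basic statistics for all continuous variables, as distributions-basic-stat.py does (not part of the changeset; it assumes Orange 2.x and the bundled iris data set)::

    import Orange

    iris = Orange.data.Table("iris")
    stats = Orange.statistics.basic.Domain(iris)
    # entries for discrete variables are empty, hence the guard
    for stat in stats:
        if stat:
            print stat.variable.name, stat.min, stat.max, stat.avg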
  • docs/reference/rst/Orange.associate.rst

    r9988 r9994  
    107107.. literalinclude:: code/inquisition.basket 
    108108 
    109 Inducing the rules is trivial (uses :download:`inquisition.basket <code/inquisition.basket>`):: 
     109Inducing the rules is trivial:: 
    110110 
    111111    import Orange 
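For reference, inducing rules from a basket-format file looks roughly like this (a minimal sketch, not part of the changeset; the file name and support threshold are illustrative)::

    import Orange

    data = Orange.data.Table("inquisition.basket")
    rules = Orange.associate.AssociationRulesSparseInducer(data, support=0.5)
    for rule in rules:
        print "%5.3f  %s" % (rule.support, rule)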
  • docs/reference/rst/Orange.evaluation.scoring.rst

    r9958 r9994  
    2727 
    2828Basic cross validation example is shown in the following part of 
    29 (:download:`statExamples.py <code/statExamples.py>`, uses :download:`voting.tab <code/voting.tab>` and :download:`vehicle.tab <code/vehicle.tab>`): 
     29(:download:`statExamples.py <code/statExamples.py>`): 
    3030 
    3131.. literalinclude:: code/statExample0.py 
     
    4848 
    4949So, let's compute all this in part of 
    50 (:download:`statExamples.py <code/statExamples.py>`, uses :download:`voting.tab <code/voting.tab>` and :download:`vehicle.tab <code/vehicle.tab>`) and print it out: 
     50(:download:`statExamples.py <code/statExamples.py>`) and print it out: 
    5151 
    5252.. literalinclude:: code/statExample1.py 
     
    417417 
    418418So, let's compute all this and print it out (part of 
    419 :download:`mlc-evaluate.py <code/mlc-evaluate.py>`, uses 
    420 :download:`emotions.tab <code/emotions.tab>`): 
     419:download:`mlc-evaluate.py <code/mlc-evaluate.py>`): 
    421420 
    422421.. literalinclude:: code/mlc-evaluate.py 
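A minimal cross-validation scoring sketch matching the description above (not part of the changeset; it assumes Orange 2.x and the bundled voting data set; the learner choice is illustrative)::

    import Orange

    data = Orange.data.Table("voting")
    learners = [Orange.classification.bayes.NaiveLearner(),
                Orange.classification.tree.TreeLearner()]
    results = Orange.evaluation.testing.cross_validation(learners, data, folds=10)
    print "CA: ", Orange.evaluation.scoring.CA(results)
    print "AUC:", Orange.evaluation.scoring.AUC(results)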
  • docs/reference/rst/Orange.evaluation.testing.rst

    r9696 r9994  
    3232list of learning algorithms is prepared. 
    3333 
    34 part of :download:`testing-test.py <code/testing-test.py>` (uses :download:`voting.tab <code/voting.tab>`) 
     34part of :download:`testing-test.py <code/testing-test.py>` 
    3535 
    3636.. literalinclude:: code/testing-test.py 
  • docs/reference/rst/Orange.projection.pca.rst

    r9616 r9994  
    3131 
    3232The following example demonstrates a straightforward invocation of PCA 
    33 (:download:`pca-run.py <code/pca-run.py>`, uses :download:`iris.tab <code/iris.tab>`): 
     33(:download:`pca-run.py <code/pca-run.py>`): 
    3434 
    3535.. literalinclude:: code/pca-run.py 
     
    3939feature space. Printing the classifier displays how much variance is covered by the first few components. The classifier
    4040can also be used to access transformation vectors (eigen_vectors) and variance of the pca components (eigen_values). 
    41 A scree plot can be used when deciding how many components to keep (:download:`pca-scree.py <code/pca-scree.py>`,
    42 uses :download:`iris.tab <code/iris.tab>`):
     41A scree plot can be used when deciding how many components to keep (:download:`pca-scree.py <code/pca-scree.py>`):
    4342 
    4443.. literalinclude:: code/pca-scree.py 
  • docs/tutorial/rst/association-rules.rst

    r9386 r9994  
    8888   arguments.  
    8989 
    90 Here goes the code (part of :download:`assoc2.py <code/assoc2.py>`, uses :download:`imports-85.tab <code/imports-85.tab>`):: 
     90Here goes the code (part of :download:`assoc2.py <code/assoc2.py>`):: 
    9191 
    9292   rules = orange.AssociationRulesInducer(data, support = 0.4) 
  • docs/tutorial/rst/basic-exploration.rst

    r9386 r9994  
    246246determine whether, for a specific instance and attribute, the value is not
    247247defined. Let us use this function to compute the proportion of missing 
    248 values per each attribute (:download:`report_missing.py <code/report_missing.py>`, uses :download:`adult_sample.tab <code/adult_sample.tab>`):: 
     248values per each attribute (:download:`report_missing.py <code/report_missing.py>`):: 
    249249 
    250250   import orange 
     
    315315frequencies for discrete attributes, and for both the number of instances
    316316where a specific attribute has a missing value. The use of this object
    317 is exemplified in the following script (:download:`data_characteristics4.py <code/data_characteristics4.py>`, 
    318 uses :download:`adult_sample.tab <code/adult_sample.tab>`):: 
     317is exemplified in the following script (:download:`data_characteristics4.py <code/data_characteristics4.py>`):: 
    319318 
    320319   import orange 
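A minimal sketch of counting missing values per attribute with the old-style orange module, as the tutorial passage describes (not part of the changeset; it assumes the adult_sample data set is available)::

    import orange

    data = orange.ExampleTable("adult_sample")
    natt = len(data.domain.attributes)
    missing = [0.0] * natt
    for example in data:
        for i in range(natt):
            if example[i].isSpecial():
                missing[i] += 1
    for attribute, m in zip(data.domain.attributes, missing):
        print "%5.1f%% %s" % (100.0 * m / len(data), attribute.name)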
  • docs/tutorial/rst/classification.rst

    r9385 r9994  
    3737construct a naive Bayesian classifier from voting data set, and 
    3838will use it to classify the first five instances from this data set 
    39 (:download:`classifier.py <code/classifier.py>`, uses :download:`voting.tab <code/voting.tab>`):: 
     39(:download:`classifier.py <code/classifier.py>`):: 
    4040 
    4141   import orange 
     
    7474additional parameter ``orange.GetProbabilities``. Also, note that the 
    7575democrats have a class index 1. We find this out with print 
    76 ``data.domain.classVar.values`` (:download:`classifier2.py <code/classifier2.py>`, uses :download:`voting.tab <code/voting.tab>`):: 
     76``data.domain.classVar.values`` (:download:`classifier2.py <code/classifier2.py>`):: 
    7777 
    7878   import orange 
     
    120120a wrapper (module) called ``orngTree`` was built around it to simplify
    121121the use of classification trees and to assemble the learner with 
    122 some usual (default) components. Here is a script with it (:download:`tree.py <code/tree.py>`, 
    123 uses :download:`voting.tab <code/voting.tab>`):: 
     122some usual (default) components. Here is a script with it (:download:`tree.py <code/tree.py>`):: 
    124123 
    125124   import orange, orngTree 
     
    231230have already learned), majority and k-nearest neighbors classifiers
    232231(new ones) and prints predictions for the first 10 instances of the voting data
    233 set (:download:`handful.py <code/handful.py>`, uses :download:`voting.tab <code/voting.tab>`):: 
     232set (:download:`handful.py <code/handful.py>`):: 
    234233 
    235234   import orange, orngTree 
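A minimal old-style sketch of building and applying a naive Bayesian classifier, in the spirit of classifier.py (not part of the changeset; it assumes the voting data set is available)::

    import orange

    data = orange.ExampleTable("voting")
    classifier = orange.BayesLearner(data)
    # classify the first five instances and show the true class
    for example in data[:5]:
        print classifier(example), "originally", example.getclass()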
  • docs/tutorial/rst/discretization.rst

    r9385 r9994  
    3434 
    3535Here is a script which demonstrates the basics of discretization in
    36 Orange (:download:`disc.py <code/disc.py>`, uses :download:`iris.tab <code/iris.tab>`):: 
     36Orange (:download:`disc.py <code/disc.py>`):: 
    3737 
    3838   import orange 
     
    9696discretized using quartiles (``sl``) and using Fayyad-Irani's 
    9797algorithm (``sl_ent``). We shall also keep the original (continuous) 
    98 attribute ``sepal width`` (from :download:`disc2.py <code/disc2.py>`, uses :download:`iris.tab <code/iris.tab>`):: 
     98attribute ``sepal width`` (from :download:`disc2.py <code/disc2.py>`):: 
    9999 
    100100   def printexamples(data, inxs, msg="First %i examples"): 
     
    180180Both ``EquiNDiscretization`` and ``EntropyDiscretization`` construct
    181181transformer objects of type ``IntervalDiscretizer``. Its cut-off
    182 points are stored in a list points (:download:`disc4.py <code/disc4.py>`, uses :download:`iris.tab <code/iris.tab>`):: 
     182points are stored in a list points (:download:`disc4.py <code/disc4.py>`):: 
    183183 
    184184   import orange 
     
    211211change anything the discretization will actually do to the data. In 
    212212the following example, we have rounded the cut-off points for the 
    213 attribute ``pl`` (:download:`disc5.py <code/disc5.py>`, uses :download:`iris.tab <code/iris.tab>`):: 
     213attribute ``pl`` (:download:`disc5.py <code/disc5.py>`):: 
    214214 
    215215   import orange 
     
    256256 
    257257Let's now discretize Iris' attribute pl using three intervals with 
    258 cut-off points 2.0 and 4.0 (:download:`disc6.py <code/disc6.py>`, uses :download:`iris.tab <code/iris.tab>`):: 
     258cut-off points 2.0 and 4.0 (:download:`disc6.py <code/disc6.py>`):: 
    259259 
    260260   import orange 
     
    299299from their original continuous versions, so you need only to convert 
    300300the testing examples to a new (discretized) domain. Following code 
    301 shows how (:download:`disc7.py <code/disc7.py>`, uses :download:`iris.tab <code/iris.tab>`):: 
     301shows how (:download:`disc7.py <code/disc7.py>`):: 
    302302 
    303303   import orange 
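A minimal discretization sketch with the old-style orange module (not part of the changeset; it assumes the iris data set is available; the entropy-based method is one of the options the tutorial mentions)::

    import orange

    data = orange.ExampleTable("iris")
    # discretize all continuous attributes with Fayyad-Irani's entropy-based method
    disc_data = orange.Preprocessor_discretize(data, method=orange.EntropyDiscretization())
    for attribute in disc_data.domain.attributes:
        print attribute.name, attribute.values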
  • docs/tutorial/rst/ensembles.rst

    r9385 r9994  
    1818learner. Using this module is very easy: you have to define
    1919a learner, give it to bagger or booster, which in turn returns a new 
    20 (boosted or bagged) learner. Here goes an example (:download:`ensemble3.py <code/ensemble3.py>`, 
    21 uses :download:`promoters.tab <code/promoters.tab>`):: 
     20(boosted or bagged) learner. Here goes an example (:download:`ensemble3.py <code/ensemble3.py>`):: 
    2221 
    2322   import orange, orngTest, orngStat, orngEnsemble 
  • docs/tutorial/rst/evaluation.rst

    r9385 r9994  
    2929script reports on four different scores: classification accuracy, 
    3030information score, Brier score and area under ROC curve 
    31 (:download:`accuracy7.py <code/accuracy7.py>`, uses :download:`voting.tab <code/voting.tab>`):: 
     31(:download:`accuracy7.py <code/accuracy7.py>`):: 
    3232 
    3333   import orange, orngTest, orngStat, orngTree 
     
    135135computes the classification accuracy for each of the classifiers. By
    136136this means, let us compare naive Bayes and classification trees 
    137 (:download:`accuracy2.py <code/accuracy2.py>`, uses :download:`voting.tab <code/voting.tab>`):: 
     137(:download:`accuracy2.py <code/accuracy2.py>`):: 
    138138 
    139139   import orange, orngTree 
     
    188188first half of the data for training and the rest for testing. The 
    189189script is similar to the one above, with a part which is different 
    190 shown below (part of :download:`accuracy3.py <code/accuracy3.py>`, uses :download:`voting.tab <code/voting.tab>`):: 
     190shown below (part of :download:`accuracy3.py <code/accuracy3.py>`):: 
    191191 
    192192   # set up the classifiers 
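A minimal old-style evaluation sketch comparing naive Bayes and a classification tree, in the spirit of accuracy2.py (not part of the changeset; it assumes the voting data set is available; mForPruning=2 is an illustrative setting)::

    import orange, orngTest, orngStat, orngTree

    data = orange.ExampleTable("voting")
    bayes = orange.BayesLearner()
    bayes.name = "bayes"
    tree = orngTree.TreeLearner(mForPruning=2)
    tree.name = "tree"
    results = orngTest.crossValidation([bayes, tree], data, folds=10)
    for learner, accuracy in zip([bayes, tree], orngStat.CA(results)):
        print "%s: %.3f" % (learner.name, accuracy)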
  • docs/tutorial/rst/feature-subset-selection.rst

    r9386 r9994  
    8585particular wrapper from orngDisc). The code is quite short since we 
    8686will also use a wrapper called FilteredLearner from orngFSS module 
    87 (part of :download:`fss7.py <code/fss7.py>`, uses :download:`adult_sample.tab <code/adult_sample.tab>`):: 
     87(part of :download:`fss7.py <code/fss7.py>`):: 
    8888 
    8989   import orange, orngDisc, orngTest, orngStat, orngFSS 
  • docs/tutorial/rst/learners-in-python.rst

    r9878 r9994  
    152152For a more elaborate test that also shows the use of a learner (that 
    153153is not given the data at its initialization), here is a script that 
    154 does 10-fold cross validation (:download:`nbdisc_test.py <code/nbdisc_test.py>`, uses :download:`iris.tab <code/iris.tab>` and 
    155 :download:`nbdisc.py <code/nbdisc.py>`):: 
     154does 10-fold cross validation (:download:`nbdisc_test.py <code/nbdisc_test.py>`, 
     155uses :download:`nbdisc.py <code/nbdisc.py>`):: 
    156156 
    157157   import orange, orngEval, nbdisc 
     
    443443Here is the code that tests the bagging we have just implemented. It
    444444compares a decision tree and its bagged variant.  Run it yourself to 
    445 see which one is better (:download:`bagging_test.py <code/bagging_test.py>`, uses :download:`bagging.py <code/bagging.py>` and 
    446 :download:`adult_sample.tab <code/adult_sample.tab>`):: 
     445see which one is better (:download:`bagging_test.py <code/bagging_test.py>`):: 
    447446 
    448447   import orange, orngTree, orngEval, bagging 