Files: 13 added, 7 deleted, 24 edited

  • .hgignore

    r9881 r9917  
    1515source/orangeom/lib_vectors_auto.txt 
    1616 
    17 # Ignore build and dist dir, created by setup.py build or setup.py bdist_* . 
     17# Ignore files created by setup.py. 
    1818build 
    1919dist 
     20MANIFEST 
     21Orange.egg-info 
    2022 
    2123# Ignore dot files. 
     
    3234 
    3335# Built documentation. 
    34 docs/reference/html 
     36docs/*/html 
    3537 
    36 # Images generated by tests. 
     38# Files generated by tests. 
     39Orange/testing/regression/*/*.changed.txt 
     40Orange/testing/regression/*/*.crash.txt 
     41Orange/testing/regression/*/*.new.txt 
    3742Orange/doc/modules/*.png 
    3843docs/reference/rst/code/*.png 
     44 
     45Orange/doc/modules/tree1.dot 
     46Orange/doc/reference/del2.tab 
     47Orange/doc/reference/undefined-saved-dc-dk.tab 
     48Orange/doc/reference/undefined-saved-na.tab 
     49Orange/testing/regression/results_orange25/unusedValues.py.txt 
     50docs/reference/rst/code/iris.testsave.arff 
     51docs/tutorial/rst/code/adult_sample_sampled.tab 
     52docs/tutorial/rst/code/tree.dot 
     53 
  • Orange/clustering/hierarchical.py

    r9752 r9906  
    8181         
    8282        :param matrix: A distance matrix to perform the clustering on. 
    83         :type matrix: :class:`Orange.core.SymMatrix` 
     83        :type matrix: :class:`Orange.misc.SymMatrix` 
    8484 
    8585 
     
    157157 
    158158Let us construct a simple distance matrix and run clustering on it. 
    159 :: 
    160  
    161     import Orange 
    162     from Orange.clustering import hierarchical 
    163     m = [[], 
    164          [ 3], 
    165          [ 2, 4], 
    166          [17, 5, 4], 
    167          [ 2, 8, 3, 8], 
    168          [ 7, 5, 10, 11, 2], 
    169          [ 8, 4, 1, 5, 11, 13], 
    170          [ 4, 7, 12, 8, 10, 1, 5], 
    171          [13, 9, 14, 15, 7, 8, 4, 6], 
    172          [12, 10, 11, 15, 2, 5, 7, 3, 1]] 
    173     matrix = Orange.core.SymMatrix(m) 
    174     root = hierarchical.HierarchicalClustering(matrix, 
    175             linkage=hierarchical.HierarchicalClustering.Average) 
     159 
     160.. literalinclude:: code/hierarchical-example.py 
     161    :lines: 1-14 
    176162     
    177163Root is a root of the cluster hierarchy. We can print using a 
    178164simple recursive function. 
    179 :: 
    180  
    181     def printClustering(cluster): 
    182         if cluster.branches: 
    183             return "(%s%s)" % (printClustering(cluster.left), printClustering(cluster.right)) 
    184         else: 
    185             return str(cluster[0]) 
     165 
     166.. literalinclude:: code/hierarchical-example.py 
     167    :lines: 16-20 
    186168             
    187169The output is not exactly nice, but it will have to do. Our clustering, 
     
    211193supposedly the only) element of cluster, cluster[0], we shall print 
    212194it out as a tuple.  
    213 :: 
    214  
    215     def printClustering2(cluster): 
    216         if cluster.branches: 
    217             return "(%s%s)" % (printClustering2(cluster.left), printClustering2(cluster.right)) 
    218         else: 
    219             return str(tuple(cluster)) 
     195 
     196.. literalinclude:: code/hierarchical-example.py 
     197    :lines: 22-26 
    220198             
    221199The distance matrix could have been given a list of objects. We could, 
    222200for instance, put 
    223 :: 
    224      
    225     matrix.objects = ["Ann", "Bob", "Curt", "Danny", "Eve", 
    226                       "Fred", "Greg", "Hue", "Ivy", "Jon"] 
     201     
     202.. literalinclude:: code/hierarchical-example.py 
     203    :lines: 28-29 
    227204 
    228205above calling the HierarchicalClustering. 
     
    234211If we've forgotten to store the objects into matrix prior to clustering, 
    235212nothing is lost. We can add it into clustering later, by 
    236 :: 
    237  
    238     root.mapping.objects = ["Ann", "Bob", "Curt", "Danny", "Eve", "Fred", "Greg", "Hue", "Ivy", "Jon"] 
     213 
     214.. literalinclude:: code/hierarchical-example.py 
     215    :lines: 31 
    239216     
    240217So, what do these "objects" do? Call printClustering(root) again and you'll 
     
    269246of ``root.left`` and ``root.right``. 
    270247 
    271 Let us write function for cluster pruning. :: 
    272  
    273     def prune(cluster, togo): 
    274         if cluster.branches: 
    275             if togo<0: 
    276                 cluster.branches = None 
    277             else: 
    278                 for branch in cluster.branches: 
    279                     prune(branch, togo-cluster.height) 
     248Let us write function for cluster pruning. 
     249 
     250.. literalinclude:: code/hierarchical-example.py 
     251    :lines: 33-39 
    280252 
    281253We shall use ``printClustering2`` here, since we can have multiple elements 
     
    287259     
    288260We've ended up with four clusters. Need a list of clusters? 
    289 Here's the function. :: 
    290      
    291     def listOfClusters0(cluster, alist): 
    292         if not cluster.branches: 
    293             alist.append(list(cluster)) 
    294         else: 
    295             for branch in cluster.branches: 
    296                 listOfClusters0(branch, alist) 
    297                  
    298     def listOfClusters(root): 
    299         l = [] 
    300         listOfClusters0(root, l) 
    301         return l 
     261Here's the function. 
     262 
     263.. literalinclude:: code/hierarchical-example.py 
     264    :lines: 41-51 
    302265         
    303266The function returns a list of lists, in our case 
     
    313276and cluster it with average linkage. Since we don't need the matrix, 
    314277we shall let the clustering overwrite it (not that it's needed for 
    315 such a small data set as Iris). :: 
    316  
    317     import Orange 
    318     from Orange.clustering import hierarchical 
    319  
    320     data = Orange.data.Table("iris") 
    321     matrix = Orange.core.SymMatrix(len(data)) 
    322     matrix.setattr("objects", data) 
    323     distance = Orange.distance.Euclidean(data) 
    324     for i1, instance1 in enumerate(data): 
    325         for i2 in range(i1+1, len(data)): 
    326             matrix[i1, i2] = distance(instance1, data[i2]) 
    327              
    328     clustering = hierarchical.HierarchicalClustering() 
    329     clustering.linkage = clustering.Average 
    330     clustering.overwrite_matrix = 1 
    331     root = clustering(matrix) 
     278such a small data set as Iris). 
     279 
     280.. literalinclude:: code/hierarchical-example-2.py 
     281    :lines: 1-15 
    332282 
    333283Note that we haven't forgotten to set the ``matrix.objects``. We did it 
    334284through ``matrix.setattr`` to avoid the warning. Let us now prune the 
    335285clustering using the function we've written above, and print out the 
    336 clusters. :: 
    337      
    338     prune(root, 1.4) 
    339     for n, cluster in enumerate(listOfClusters(root)): 
    340         print "\n\n Cluster %i \n" % n 
    341         for instance in cluster: 
    342             print instance 
     286clusters. 
     287     
     288.. literalinclude:: code/hierarchical-example-2.py 
     289    :lines: 16-20 
    343290             
    344291Since the printout is pretty long, it might be more informative to just 
    345 print out the class distributions for each cluster. :: 
    346      
    347     for cluster in listOfClusters(root): 
    348         dist = Orange.core.get_class_distribution(cluster) 
    349         for e, d in enumerate(dist): 
    350             print "%s: %3.0f " % (data.domain.class_var.values[e], d), 
    351         print 
     292print out the class distributions for each cluster. 
     293     
     294.. literalinclude:: code/hierarchical-example-2.py 
     295    :lines: 22-26 
    352296         
    353297Here's what it shows. :: 
     
    365309instance, call a learning algorithms, passing a cluster as an argument. 
    366310It won't mind. If you, however, want to have a list of table, you can 
    367 easily convert the list by :: 
    368  
    369     tables = [Orange.data.Table(cluster) for cluster in listOfClusters(root)] 
     311easily convert the list by 
     312 
     313.. literalinclude:: code/hierarchical-example-2.py 
     314    :lines: 28 
    370315     
    371316Finally, if you are dealing with examples, you may want to take the function 
     
    502447    """ 
    503448    distance = distance_constructor(data) 
    504     matrix = orange.SymMatrix(len(data)) 
     449    matrix = Orange.misc.SymMatrix(len(data)) 
    505450    for i in range(len(data)): 
    506451        for j in range(i+1): 
     
    540485     
    541486    """ 
    542     matrix = orange.SymMatrix(len(data.domain.attributes)) 
     487    matrix = Orange.misc.SymMatrix(len(data.domain.attributes)) 
    543488    for a1 in range(len(data.domain.attributes)): 
    544489        for a2 in range(a1): 
     
    618563    :type tree: :class:`HierarchicalCluster` 
    619564    :param matrix: SymMatrix that was used to compute the clustering. 
    620     :type matrix: :class:`Orange.core.SymMatrix` 
     565    :type matrix: :class:`Orange.misc.SymMatrix` 
    621566    :param progress_callback: Function used to report on progress. 
    622567    :type progress_callback: function 
     
    811756    :type tree: :class:`HierarchicalCluster` 
    812757    :param matrix: SymMatrix that was used to compute the clustering. 
    813     :type matrix: :class:`Orange.core.SymMatrix` 
     758    :type matrix: :class:`Orange.misc.SymMatrix` 
    814759    :param progress_callback: Function used to report on progress. 
    815760    :type progress_callback: function 
     
    15111456 
    15121457def feature_distance_matrix(data, distance=None, progress_callback=None): 
    1513     """ A helper function that computes an :class:`Orange.core.SymMatrix` of 
     1458    """ A helper function that computes an :class:`Orange.misc.SymMatrix` of 
    15141459    all pairwise distances between features in `data`. 
    15151460     
     
    15241469    :type progress_callback: function 
    15251470     
    1526     :rtype: :class:`Orange.core.SymMatrix` 
     1471    :rtype: :class:`Orange.misc.SymMatrix` 
    15271472     
    15281473    """ 
    15291474    attributes = data.domain.attributes 
    1530     matrix = orange.SymMatrix(len(attributes)) 
     1475    matrix = Orange.misc.SymMatrix(len(attributes)) 
    15311476    iter_count = matrix.dim * (matrix.dim - 1) / 2 
    15321477    milestones = progress_bar_milestones(iter_count, 100) 
     
    15811526    :type cluster: :class:`HierarchicalCluster` 
    15821527     
    1583     :rtype: :class:`Orange.core.SymMatrix` 
     1528    :rtype: :class:`Orange.misc.SymMatrix` 
    15841529     
    15851530    """ 
    15861531 
    15871532    mapping = cluster.mapping   
    1588     matrix = Orange.core.SymMatrix(len(mapping)) 
     1533    matrix = Orange.misc.SymMatrix(len(mapping)) 
    15891534    for cluster in postorder(cluster): 
    15901535        if cluster.branches: 
     
    16241569     
    16251570     
    1626 if __name__=="__main__": 
    1627     data = orange.ExampleTable("doc//datasets//brown-selected.tab") 
    1628 #    data = orange.ExampleTable("doc//datasets//iris.tab") 
    1629     root = hierarchicalClustering(data, order=True) #, linkage=orange.HierarchicalClustering.Single) 
    1630     attr_root = hierarchicalClustering_attributes(data, order=True) 
    1631 #    print root 
    1632 #    d = DendrogramPlotPylab(root, data=data, labels=[str(ex.getclass()) for ex in data], dendrogram_width=0.4, heatmap_width=0.3,  params={}, cmap=None) 
    1633 #    d.plot(show=True, filename="graph.png") 
    1634  
    1635     dendrogram_draw("graph.eps", root, attr_tree=attr_root, data=data, labels=[str(e.getclass()) for e in data], tree_height=50, #width=500, height=500, 
    1636                           cluster_colors={root.right:(255,0,0), root.right.right:(0,255,0)},  
    1637                           color_palette=ColorPalette([(255, 0, 0), (0,0,0), (0, 255,0)], gamma=0.5,  
    1638                                                      overflow=(255, 255, 255), underflow=(255, 255, 255))) #, minv=-0.5, maxv=0.5) 
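The documentation hunks above replace inline snippets with `literalinclude` directives; the snippets themselves describe recursive printing and pruning of a cluster tree. A minimal, Orange-free sketch of that logic (the `Cluster` class here is a hypothetical stand-in for Orange's `HierarchicalCluster`, which additionally carries a mapping into the distance matrix):

```python
class Cluster:
    """Toy stand-in for Orange's HierarchicalCluster (illustrative only)."""
    def __init__(self, items, height=0.0, branches=None):
        self.items = items          # leaf elements covered by this cluster
        self.height = height        # join height, consumed by prune()
        self.branches = branches    # None for leaves, [left, right] otherwise

    @property
    def left(self):
        return self.branches[0]

    @property
    def right(self):
        return self.branches[1]

def print_clustering(cluster):
    # Mirrors printClustering2 from the docs: tuples at the leaves.
    if cluster.branches:
        return "(%s%s)" % (print_clustering(cluster.left),
                           print_clustering(cluster.right))
    return str(tuple(cluster.items))

def prune(cluster, togo):
    # Mirrors prune() from the docs: cut branches once the remaining
    # height budget is spent.
    if cluster.branches:
        if togo < 0:
            cluster.branches = None
        else:
            for branch in cluster.branches:
                prune(branch, togo - cluster.height)

leaves = [Cluster([name]) for name in ("Ann", "Bob", "Curt")]
ab = Cluster(["Ann", "Bob"], height=2.0, branches=[leaves[0], leaves[1]])
root = Cluster(["Ann", "Bob", "Curt"], height=5.0, branches=[ab, leaves[2]])
print(print_clustering(root))
```

After `prune(root, 1.4)` the Ann/Bob subtree collapses into a single leaf, so the printout shows a multi-element tuple, which is exactly why the docs switch to `printClustering2`.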
  • Orange/distance/__init__.py

    r9805 r9916  
    247247    
    248248def distance_matrix(data, distance_constructor=Euclidean, progress_callback=None): 
    249     """ A helper function that computes an :obj:`Orange.data.SymMatrix` of all 
     249    """ A helper function that computes an :obj:`Orange.misc.SymMatrix` of all 
    250250    pairwise distances between instances in `data`. 
    251251     
     
    260260    :type progress_callback: function 
    261261     
    262     :rtype: :class:`Orange.data.SymMatrix` 
    263      
    264     """ 
    265     matrix = Orange.data.SymMatrix(len(data)) 
     262    :rtype: :class:`Orange.misc.SymMatrix` 
     263     
     264    """ 
     265    matrix = Orange.misc.SymMatrix(len(data)) 
    266266    dist = distance_constructor(data) 
    267267 
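The `distance_matrix` helper above fills a `SymMatrix` of pairwise distances. A library-free sketch of the same pattern, with plain nested lists standing in for `SymMatrix` and the Euclidean constructor (both assumptions, not the real API):

```python
import math

def distance_matrix(data):
    # data: list of equal-length numeric vectors. Returns a full symmetric
    # matrix of pairwise Euclidean distances; as with SymMatrix, only the
    # lower triangle is actually computed.
    n = len(data)
    matrix = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i):
            d = math.sqrt(sum((a - b) ** 2 for a, b in zip(data[i], data[j])))
            matrix[i][j] = matrix[j][i] = d
    return matrix

m = distance_matrix([[0, 0], [3, 4], [6, 8]])
```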
  • Orange/evaluation/testing.py

    r9697 r9914  
    3838        """Appends a new result (class and probability prediction by a single classifier) to the classes and probabilities field.""" 
    3939     
    40         if type(aclass)==list: 
    41             self.classes.append(aclass) 
    42             self.probabilities.append(aprob) 
     40        if type(aclass)==int: 
     41            self.classes.append(int(aclass)) 
     42            self.probabilities.append(list(aprob)) 
    4343        elif type(aclass.value)==float: 
    4444            self.classes.append(float(aclass)) 
    4545            self.probabilities.append(aprob) 
    4646        else: 
    47             self.classes.append(int(aclass)) 
    48             self.probabilities.append(list(aprob)) 
     47            self.classes.append(aclass) 
     48            self.probabilities.append(aprob) 
    4949 
    5050    def set_result(self, i, aclass, aprob): 
  • Orange/feature/discretization.py

    r9878 r9900  
    1515    Discretization, \ 
    1616    Preprocessor_discretize 
    17  
    18  
    1917 
    2018def entropyDiscretization_wrapper(data): 
  • Orange/fixes/fix_changed_names.py

    r9918 r9920  
    4343           "orange.newmetaid": "Orange.feature.new_meta_id", 
    4444 
    45            "orange.SymMatrix": "Orange.data.SymMatrix", 
     45           "orange.SymMatrix": "Orange.misc.SymMatrix", 
     46 
    4647           "orange.GetValue": "Orange.classification:Classifier.GetValue", 
    4748           "orange.GetProbabilities": "Orange.classification:Classifier.GetProbabilities", 
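The fixer above maps old dotted names onto their new homes. A toy sketch of how such a mapping table can rewrite identifiers in source text (the regex word-boundary approach is a deliberate simplification; the real `fix_changed_names` fixer operates on the lib2to3 parse tree):

```python
import re

# Excerpt of the mapping updated in this changeset: orange.SymMatrix now
# points at Orange.misc rather than Orange.data.
MAPPING = {
    "orange.SymMatrix": "Orange.misc.SymMatrix",
    "orange.newmetaid": "Orange.feature.new_meta_id",
}

def fix_names(source):
    # Longest names first, so a prefix never shadows a fuller match.
    for old in sorted(MAPPING, key=len, reverse=True):
        source = re.sub(r"\b%s\b" % re.escape(old), MAPPING[old], source)
    return source

fixed = fix_names("m = orange.SymMatrix(10)")
```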
  • Orange/network/deprecated.py

    r9671 r9916  
    616616        :param matrix: number of objects in a matrix must match the number  
    617617            of vertices in a network. 
    618         :type matrix: Orange.core.SymMatrix 
     618        :type matrix: Orange.misc.SymMatrix 
    619619        :param lower: lower distance bound. 
    620620        :type lower: float 
     
    14541454        self.mdsStep = 0 
    14551455        self.stopMDS = 0 
    1456         self.vertexDistance.matrixType = Orange.core.SymMatrix.Symmetric 
     1456        self.vertexDistance.matrixType = Orange.misc.SymMatrix.Symmetric 
    14571457        self.diag_coors = math.sqrt((min(self.graph.coors[0]) -  \ 
    14581458                                     max(self.graph.coors[0]))**2 + \ 
  • Orange/network/network.py

    r9671 r9916  
    797797        self.mdsStep = 0 
    798798        self.stopMDS = 0 
    799         self.items_matrix.matrixType = Orange.core.SymMatrix.Symmetric 
     799        self.items_matrix.matrixType = Orange.misc.SymMatrix.Symmetric 
    800800        self.diag_coors = math.sqrt((min(self.coors[0]) - \ 
    801801                                     max(self.coors[0])) ** 2 + \ 
  • Orange/preprocess/outliers.py

    r9765 r9915  
    9494        other distance measures 
    9595        """ 
    96         self.distmatrix = Orange.core.SymMatrix(len(self.examples)) #FIXME  
     96        self.distmatrix = Orange.misc.SymMatrix(len(self.examples)) #FIXME  
    9797        for i in range(len(self.examples)): 
    9898            for j in range(i + 1): 
  • Orange/projection/linear.py

    r9880 r9916  
    397397        if distances: 
    398398            if n_valid != len(valid_data): 
    399                 classes = Orange.core.SymMatrix(n_valid) 
     399                classes = Orange.misc.SymMatrix(n_valid) 
    400400                r = 0 
    401401                for ro, vr in enumerate(valid_data): 
  • Orange/projection/mds.py

    r9725 r9916  
    177177     
    178178    :param distances: original dissimilarity - a distance matrix to operate on. 
    179     :type distances: :class:`Orange.core.SymMatrix` 
     179    :type distances: :class:`Orange.misc.SymMatrix` 
    180180     
    181181    :param dim: dimension of the projected space. 
     
    194194    .. attribute:: distances 
    195195     
    196        An :class:`Orange.core.SymMatrix` containing the distances that we 
     196       An :class:`Orange.misc.SymMatrix` containing the distances that we 
    197197       want to achieve (lsmt changes these). 
    198198        
    199199    .. attribute:: projected_distances 
    200200 
    201        An :class:`Orange.core.SymMatrix` containing the distances between 
     201       An :class:`Orange.misc.SymMatrix` containing the distances between 
    202202       projected points. 
    203203        
    204204    .. attribute:: original_distances 
    205205 
    206        An :class:`Orange.core.SymMatrix` containing the original distances 
     206       An :class:`Orange.misc.SymMatrix` containing the original distances 
    207207       between points. 
    208208        
    209209    .. attribute:: stress 
    210210        
    211        An :class:`Orange.core.SymMatrix` holding the stress. 
     211       An :class:`Orange.misc.SymMatrix` holding the stress. 
    212212     
    213213    .. attribute:: dim 
     
    232232    def __init__(self, distances=None, dim=2, **kwargs): 
    233233        self.mds=orangemds.MDS(distances, dim, **kwargs) 
    234         self.original_distances=Orange.core.SymMatrix([m for m in self.distances]) 
     234        self.original_distances=Orange.misc.SymMatrix([m for m in self.distances]) 
    235235 
    236236    def __getattr__(self, name): 
  • Orange/testing/unit/tests/test_hclustering.py

    r9724 r9915  
    7474             [13,  9, 14, 15,  7,  8,  4,  6], 
    7575             [12, 10, 11, 15,  2,  5,  7,  3,  1]] 
    76         self.matrix = Orange.core.SymMatrix(m) 
     76        self.matrix = Orange.misc.SymMatrix(m) 
    7777        self.matrix.setattr("objects", ["Ann", "Bob", "Curt", "Danny", "Eve", "Fred", "Greg", "Hue", "Ivy", "Jon"]) 
    7878        self.cluster = hier.HierarchicalClustering(self.matrix) 
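The test above constructs a `SymMatrix` from lower-triangular rows. A quick sketch of how that representation expands into a full symmetric matrix (plain lists here, not the real `SymMatrix`):

```python
def expand_triangular(rows):
    # rows[i] holds distances from item i to items 0..i-1;
    # the diagonal is implicitly zero.
    n = len(rows)
    full = [[0] * n for _ in range(n)]
    for i, row in enumerate(rows):
        for j, d in enumerate(row):
            full[i][j] = full[j][i] = d
    return full

full = expand_triangular([[], [3], [2, 4]])
```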
  • Orange/testing/unit/tests/test_refactoring.py

    r9862 r9910  
    33 
    44""" 
    5 import sys, os 
    65import unittest 
    76 
     
    2625     
    2726def rhasattr(obj, name): 
     27    """ Recursive hasattr. 
     28    """ 
    2829    while "." in name: 
    2930        first, name = name.split(".", 1) 
     
    3536 
    3637def rgetattr(obj, name): 
     38    """ Recursive getattr  
     39    """ 
    3740    while "." in name: 
    3841        first, name = name.split(".", 1) 
     
    4548 
    4649def import_package(name): 
     50    """ Import a package and return it. 
     51    """ 
    4752    mod = __import__(name) 
    4853    if "." in name: 
     
    6671             
    6772             
    68             self.assertTrue(rhasattr(old_mod, old_name), "{0} is missing".format(old)) 
    69             self.assertTrue(rhasattr(new_mod, new_name), "{0} is missing".format(new)) 
     73            self.assertTrue(rhasattr(old_mod, old_name),  
     74                            "{0} is missing".format(old)) 
     75            self.assertTrue(rhasattr(new_mod, new_name), 
     76                            "{0} is missing".format(new)) 
    7077             
    7178    def test_import_mapping(self): 
     
    7683     
    7784             
    78              
    7985if __name__ == "__main__": 
    8086    unittest.main() 
  • docs/extend-widgets/rst/conf.py

    r9402 r9917  
    1717# add these directories to sys.path here. If the directory is relative to the 
    1818# documentation root, use os.path.abspath to make it absolute, like shown here. 
    19 #sys.path.append(os.path.abspath('.')) 
     19sys.path.append(os.path.abspath('../../../orange')) 
     20import Orange 
    2021 
    2122# -- General configuration ----------------------------------------------------- 
  • docs/extend-widgets/rst/index.rst

    r9402 r9917  
    33########################## 
    44 
    5 Contents: 
     5.. toctree:: 
     6   :maxdepth: 3 
     7 
     8   OrangeWidgets.plot 
    69 
    710**************** 
     
    1013 
    1114* :ref:`genindex` 
     15* :ref:`modindex` 
    1216* :ref:`search` 
  • docs/reference/rst/Orange.data.rst

    r9896 r9901  
    1111    Orange.data.sample 
    1212    Orange.data.formats 
     13    Orange.data.discretization 
  • docs/reference/rst/Orange.evaluation.scoring.rst

    r9892 r9904  
    114114   data set, we would compute the matrix like this:: 
    115115 
    116       cm = Orange.evaluation.scoring.confusion_matrices(resVeh, \ 
    117 vehicle.domain.classVar.values.index("van")) 
     116      cm = Orange.evaluation.scoring.confusion_matrices(resVeh, vehicle.domain.classVar.values.index("van")) 
    118117 
    119118   and get the results like these:: 
     
    177176   classes, you can also compute the 
    178177   `sensitivity <http://en.wikipedia.org/wiki/Sensitivity_(tests)>`_ 
    179    [TP/(TP+FN)], `specificity \ 
    180 <http://en.wikipedia.org/wiki/Specificity_%28tests%29>`_ 
    181    [TN/(TN+FP)], `positive predictive value \ 
    182 <http://en.wikipedia.org/wiki/Positive_predictive_value>`_ 
    183    [TP/(TP+FP)] and `negative predictive value \ 
    184 <http://en.wikipedia.org/wiki/Negative_predictive_value>`_ [TN/(TN+FN)]. 
     178   [TP/(TP+FN)], `specificity <http://en.wikipedia.org/wiki/Specificity_%28tests%29>`_ 
     179   [TN/(TN+FP)], `positive predictive value <http://en.wikipedia.org/wiki/Positive_predictive_value>`_ 
     180   [TP/(TP+FP)] and `negative predictive value <http://en.wikipedia.org/wiki/Negative_predictive_value>`_ [TN/(TN+FN)]. 
    185181   In information retrieval, positive predictive value is called precision 
    186182   (the ratio of the number of relevant records retrieved to the total number 
     
    195191   as F1 [2*precision*recall/(precision+recall)] or, for a general case, 
    196192   Falpha [(1+alpha)*precision*recall / (alpha*precision + recall)]. 
    197    The `Matthews correlation coefficient \ 
    198 <http://en.wikipedia.org/wiki/Matthews_correlation_coefficient>`_ 
     193   The `Matthews correlation coefficient <http://en.wikipedia.org/wiki/Matthews_correlation_coefficient>`_ 
    199194   in essence a correlation coefficient between 
    200195   the observed and predicted binary classifications; it returns a value 
  • docs/reference/rst/Orange.feature.discretization.rst

    r9863 r9900  
    4949value according to the rule found by discretization. In this respect, the discretization behaves similar to 
    5050:class:`Orange.classification.Learner`. 
    51  
    52 Utility functions 
    53 ================= 
    54  
    55 Some functions and classes that can be used for 
    56 categorization of continuous features. Besides several general classes that 
    57 can help in this task, we also provide a function that may help in 
    58 entropy-based discretization (Fayyad & Irani), and a wrapper around classes for 
    59 categorization that can be used for learning. 
    60  
    61 .. autoclass:: Orange.feature.discretization.DiscretizedLearner_Class 
    62  
    63 .. autoclass:: DiscretizeTable 
    64  
    65 .. rubric:: Example 
    66  
    67 FIXME. A chapter on `feature subset selection <../ofb/o_fss.htm>`_ in Orange 
    68 for Beginners tutorial shows the use of DiscretizedLearner. Other 
    69 discretization classes from core Orange are listed in chapter on 
    70 `categorization <../ofb/o_categorization.htm>`_ of the same tutorial. 
    7151 
    7252Discretization Algorithms 
  • docs/reference/rst/Orange.feature.imputation.rst

    r9890 r9905  
    282282capable of handling unknown values. 
    283283 
    284 Learners with imputer as a component 
    285 ==================================== 
     284Imputer as a component 
     285====================== 
    286286 
    287287Learners that cannot handle missing values should provide a slot 
     
    292292:obj:`~Orange.classification.logreg.LogRegLearner` will pass them to 
    293293:obj:`~Orange.classification.logreg.LogRegLearner.imputer_constructor` to get 
    294 an imputer and used it to impute the missing values in the learning data. 
    295 Imputed data is then used by the actual learning algorithm. Also, when a 
     294an imputer and use it to impute the missing values in the learning data. 
     295Imputed data is then used by the actual learning algorithm. When a 
    296296classifier :obj:`~Orange.classification.logreg.LogRegClassifier` is 
    297 constructed, 
    298 the imputer is stored in its attribute 
    299 :obj:`~Orange.classification.logreg.LogRegClassifier.imputer`. At 
    300 classification, the same imputer is used for imputation of missing values 
     297constructed, the imputer is stored in its attribute 
     298:obj:`~Orange.classification.logreg.LogRegClassifier.imputer`. During 
     299classification the same imputer is used for imputation of missing values 
    301300in (testing) examples. 
    302301 
     
    305304it is recommended to use imputation according to the described procedure. 
    306305 
    307 The choice of which imputer to use depends on the problem domain. In this 
    308 example we want to impute the minimal value of each feature. 
     306The choice of the imputer depends on the problem domain. In this example the 
     307minimal value of each feature is imputed: 
    309308 
    310309.. literalinclude:: code/imputation-logreg.py 
     
    318317.. note:: 
    319318 
    320    Note that just one instance of 
     319   Just one instance of 
    321320   :obj:`~Orange.classification.logreg.LogRegLearner` is constructed and then 
    322321   used twice in each fold. Once it is given the original instances as they 
     
    329328   testing. 
    330329 
    331 Wrapper for learning algorithms 
    332 =============================== 
     330Wrappers for learning 
     331===================== 
    333332 
    334333In a learning/classification process, imputation is needed on two occasions. 
    335 Before learning, the imputer needs to process the training examples. 
     334Before learning, the imputer needs to process the training instances. 
    336335Afterwards, the imputer is called for each instance to be classified. For 
    337336example, in cross validation, imputation should be done on training folds 
     
    343342simply skips the corresponding attributes in the formula, while 
    344343classification/regression trees have components for handling the missing 
    345 values in various ways. 
    346  
    347 If for any reason you want to use these algorithms to run on imputed data, 
    348 you can use this wrapper. 
     344values in various ways. A wrapper is provided for learning algorithms that 
     345require imputed data. 
    349346 
    350347.. class:: ImputeLearner 
  • docs/reference/rst/index.rst

    r9897 r9917  
    4040   Orange.misc 
    4141 
    42    OrangeWidgets.plot 
    43  
    4442**************** 
    4543Index and search 
  • install-scripts/createSnapshot.btm

    r9730 r9909  
    1212rem # update source(s) to revision HEAD 
    1313cdd %TMPDIR 
    14  
    15 hg pull --update 
    1614 
    1715rem # build core 
  • install-scripts/updateAndCall.btm

    r9730 r9909  
    99REM hg clone https://bitbucket.org/biolab/orange snapshot 
    1010cdd snapshot 
    11 hg pull --update 
     11hg pull 
     12hg update 
     13 
    1214 
    1315REM hg clone https://bitbucket.org/biolab/orange-addon-bioinformatics Bioinformatics 
    1416cdd Bioinformatics 
    15 hg pull --update 
     17hg pull 
     18hg update 
    1619 
    1720cdd  e:\orange\scripts\snapshot 
     
    1922REM hg clone https://bitbucket.org/biolab/orange-addon-text Text 
    2023cdd Text 
    21 hg pull --update 
     24hg pull  
     25hg update 
    2226 
    2327cdd e:\orange\scripts 
    2428 
    25 copy /r snapshot\install-scripts\* . 
     29copy /q snapshot\install-scripts\* . 
    2630 
    27 #svn update -N 
    28 #svn export http://orange.biolab.si/svn/orange/trunk/orange/doc/LICENSES license.txt 
    29 #svn 
    30   
    3131call callCreateSnapshot.btm 
    3232shutdown -s 
  • source/orange/_aliases.txt

    r9752 r9908  
    6969ExamplesDistance_Normalized 
    7070feature_distances attribute_distances 
     71 
     72TransformValue 
     73sub_transformer subtransformer 
     74 
     75ImputerConstructor 
     76impute_class imputeClass 
  • source/orange/discretize.hpp

    r9863 r9899  
    199199  __REGISTER_CLASS 
    200200 
    201   int maxNumberOfIntervals; //P maximal number of intervals; default = 0 (no limits) 
     201  int maxNumberOfIntervals; //P(+n) maximal number of intervals; default = 0 (no limits) 
    202202  bool forceAttribute; //P minimal number of intervals; default = 0 (no limits) 
    203203 