Changeset r10475 (branch: default), r9372 → r10475
Timestamp: 03/08/12 19:44:26
Author: Matija Polajnar <matija.polajnar@…>
Message: Major refactoring of linear projections, fixing some bugs in the process.
File edited: docs/reference/rst/Orange.projection.linear.rst
.. automodule:: Orange.projection.linear

##############################
Linear projection (``linear``)
##############################

.. index:: linear projection

.. index::
   single: projection; linear
A linear transformation of the data can provide unique insight into the data,
either through observation of the optimized projection or through
visualization of the space with reduced dimensionality.

This module contains the FreeViz linear projection optimization algorithm
[1], PCA and FDA, and utility classes for the classification of instances
based on kNN in the linearly transformed space.

Methods in this module use the given data set to optimize a linear projection
of features into a new vector space. The transformation is returned as a
:class:`~Orange.projection.linear.Projector` instance that, when invoked,
projects any given data whose domain matches the domain that was used to
optimize the projection.
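Conceptually, such a projector stores the parameters found by the optimizer (a
centering vector and a projection matrix) and applies them to any new data of
the matching shape. A minimal NumPy sketch of this behaviour (an illustration
of the idea only, not Orange's implementation; the ``LinearProjector`` name is
hypothetical):

```python
import numpy as np

class LinearProjector:
    """Illustrative stand-in for a fitted linear projector: stores the
    centering vector and projection matrix found by an optimizer and,
    when called, projects new data from the matching domain."""

    def __init__(self, center, projection):
        self.center = np.asarray(center)
        self.projection = np.asarray(projection)  # shape (n_features, k)

    def __call__(self, X):
        X = np.asarray(X)
        if X.shape[1] != self.center.shape[0]:
            raise ValueError("data does not match the fitted domain")
        # Center the data, then apply the linear map.
        return (X - self.center) @ self.projection

# Project 3-feature data onto its first two coordinates (a hypothetical,
# hand-picked projection used here purely for illustration).
proj = LinearProjector(center=[1.0, 2.0, 3.0],
                       projection=np.eye(3)[:, :2])
projected = proj(np.array([[1.0, 2.0, 3.0], [2.0, 3.0, 4.0]]))
```

The same projector instance can then be reused on any further data with the
same three features, which is the pattern the classes below follow.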

.. autoclass:: Orange.projection.linear.Projector
   :members:

**************************************
Principal Component Analysis (``pca``)
**************************************

.. index:: Principal Component Analysis

.. index::
   single: projection, Principal Component Analysis
     34 
     35`PCA <http://en.wikipedia.org/wiki/Principal_component_analysis>`_ uses an orthogonal transformation to transform input 
     36features into a set of uncorrelated features called principal 
     37components. This transformation is defined in such a way that the first principal component has as high variance as 
     38possible and each succeeding component in turn has the highest variance possible under constraint that it is orthogonal 
     39to the preceding components. 
     40 
     41Because PCA is sensitive to the relative scaling of the original variables, the default behaviour of PCA class is to 
     42standardize the input data. 
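The standardize-then-decompose procedure described above can be sketched in
plain NumPy (an illustrative sketch of the mathematics, not the Orange
implementation):

```python
import numpy as np

def pca(X, n_components=2):
    """Illustrative PCA: standardize the data, eigendecompose its
    covariance, and keep the components with the largest variance."""
    # Standardize each feature to mean 0 and unit variance, mirroring
    # the default behaviour described above.
    Z = (X - X.mean(axis=0)) / X.std(axis=0)
    # Eigenvalues of the covariance matrix are the component variances.
    eigen_values, eigen_vectors = np.linalg.eigh(np.cov(Z, rowvar=False))
    # eigh returns eigenvalues in ascending order; reorder so the first
    # component has the highest variance.
    order = np.argsort(eigen_values)[::-1]
    eigen_values, eigen_vectors = eigen_values[order], eigen_vectors[:, order]
    # Project onto the leading components.
    return Z @ eigen_vectors[:, :n_components], eigen_values

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 4))
projected, variances = pca(X)
```

The projected components are uncorrelated by construction, and the variances
come out sorted in decreasing order, matching the definition above.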

Optimizer and Projector
=======================

.. index:: PCA
.. autoclass:: Orange.projection.linear.Pca
   :members:

.. autoclass:: Orange.projection.linear.PcaProjector
   :members:
   :show-inheritance:

Examples
========

The following example demonstrates a straightforward invocation of PCA
(:download:`pca-run.py <code/pca-run.py>`):

.. literalinclude:: code/pca-run.py
   :lines: 7-

Calling the constructed Pca optimizer on data returns a projector, which is
later used to transform data to the PCA feature space. Printing the projector
shows how much variance is covered by the first few components. The projector
can also be used to access the transformation vectors (``eigen_vectors``) and
the variances of the PCA components (``eigen_values``). A scree plot can help
when deciding how many components to keep
(:download:`pca-scree.py <code/pca-scree.py>`):

.. literalinclude:: code/pca-scree.py
   :lines: 7-

.. image:: files/pca-scree.png
   :scale: 50 %

.. index:: Fisher Discriminant Analysis

.. index::
   single: projection, Fisher Discriminant Analysis

**************************************
Fisher discriminant analysis (``fda``)
**************************************

As a variant of LDA (Linear Discriminant Analysis),
`FDA <http://en.wikipedia.org/wiki/Linear_discriminant_analysis#Fisher.27s_linear_discriminant>`_
finds the linear combination of features that best separates two or more
classes.
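For two classes, Fisher's discriminant has a closed form: the direction
``w = Sw^{-1} (m1 - m0)``, where ``Sw`` is the within-class scatter matrix and
``m0``, ``m1`` are the class means. A small NumPy sketch of this formula
(illustrative only, not Orange's implementation):

```python
import numpy as np

def fisher_direction(X, y):
    """Two-class Fisher discriminant direction: maximizes between-class
    separation relative to within-class scatter; w = Sw^{-1} (m1 - m0)."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # Within-class scatter matrix (sum of the two class scatters).
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    w = np.linalg.solve(Sw, m1 - m0)
    return w / np.linalg.norm(w)  # normalized projection direction

# Two well-separated Gaussian classes as toy data.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (50, 2)), rng.normal(3.0, 1.0, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
w = fisher_direction(X, y)
```

Projecting the data onto ``w`` (that is, computing ``X @ w``) yields a single
feature along which the two classes are maximally separated in Fisher's sense.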

Optimizer and Projector
=======================

.. index:: FDA
.. autoclass:: Orange.projection.linear.Fda
   :members:

.. autoclass:: Orange.projection.linear.FdaProjector
   :members:
   :show-inheritance:

*******
FreeViz
*******

FreeViz
`(Demsar et al., 2005) <http://www.ailab.si/idamap/idamap2005/papers/12%20Demsar%20CR.pdf>`_
is a method that finds a good two-dimensional linear projection of the given
data, where quality is defined by the separation of data from different
classes and the proximity of instances from the same class. FreeViz would
normally be used through a widget, since it is primarily a method for the
graphical exploration of data. About the only case in which one would use
this module directly is to test the classification aspects of the method,
that is, to verify the accuracy of the resulting kNN-like classifiers on a
set of benchmark data sets.

A description of the method itself is far beyond the scope of this page. See
the above paper for the original version of the method; at the time of
writing, the method has been largely extended but not yet published, though
the basic principles are the same.

[1] Janez Demsar, Gregor Leban, Blaz Zupan: FreeViz - An Intelligent
Visualization Approach for Class-Labeled Multidimensional Data Sets,
Proceedings of IDAMAP 2005, Edinburgh.

.. autoclass:: Orange.projection.linear.FreeViz
   :members:
   :show-inheritance:
   :exclude-members: attractG, autoSetParameters, cancelOptimization,
      classPermutationList, findProjection,
      forceBalancing, forceSigma, getShownAttributeList, mirrorSymmetry,
      optimizeSeparation, optimize_FAST_Separation, optimize_LDA_Separation,
      optimize_SLOW_Separation, radialAnchors, randomAnchors, repelG,
      s2nMixAnchors, s2nMixData, s2nPlaceAttributes, s2nSpread,
      setStatusBarText, showAllAttributes, stepsBeforeUpdate,
      useGeneralizedEigenvectors

:class:`~Orange.projection.linear.FreeViz` can be used in code to optimize
a linear projection to two dimensions:

.. literalinclude:: code/freeviz-projector.py
   :lines: 7-

Learner and Classifier
======================

.. autoclass:: Orange.projection.linear.FreeVizLearner
   :members:
   :show-inheritance:

.. autoclass:: Orange.projection.linear.FreeVizClassifier
   :members:
   :show-inheritance:

.. autoclass:: Orange.projection.linear.S2NHeuristicLearner
   :members:
   :show-inheritance:
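The classification idea behind these classes (first apply the optimized
linear projection, then classify by kNN in the projected space) can be
sketched in plain NumPy. This is an illustrative sketch with made-up data and
a hand-picked stand-in projection, not Orange's implementation:

```python
import numpy as np

def knn_in_projection(P, X_train, y_train, X_test, k=3):
    """Project all data with a fixed linear projection P, then classify
    each test instance by majority vote among its k nearest projected
    training neighbours."""
    T_train, T_test = X_train @ P, X_test @ P
    preds = []
    for t in T_test:
        dists = np.linalg.norm(T_train - t, axis=1)
        nearest = y_train[np.argsort(dists)[:k]]
        values, counts = np.unique(nearest, return_counts=True)
        preds.append(values[np.argmax(counts)])  # majority vote
    return np.array(preds)

# Two toy Gaussian classes in four dimensions.
rng = np.random.default_rng(2)
X_train = np.vstack([rng.normal(0.0, 1.0, (30, 4)),
                     rng.normal(4.0, 1.0, (30, 4))])
y_train = np.array([0] * 30 + [1] * 30)
# Stand-in for an optimized 2-D projection: keep the first two features.
P = np.eye(4)[:, :2]
X_test = np.array([[0.0, 0.0, 0.0, 0.0], [4.0, 4.0, 4.0, 4.0]])
preds = knn_in_projection(P, X_train, y_train, X_test)
```

In Orange the projection matrix would come from a FreeViz optimization rather
than being fixed by hand; the kNN step in the projected space is the same.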