.. _Random Forest:

Random Forest
=============

.. image:: ../../../../Orange/OrangeWidgets/Classify/icons/RandomForest.svg

Random forest learner

Signals
-------

Inputs:
   - Examples (ExampleTable)
      A table with training examples


Outputs:
   - Learner
      The random forest learning algorithm with settings as specified in the
      dialog
   - Random Forest Classifier
      Trained random forest


Description
-----------

Random forest is a classification technique, proposed by [Breiman2001]_,
that builds a set of classification trees from a set of class-labeled data.
Each tree is developed from a bootstrap sample of the training data. When
developing individual trees, an arbitrary subset of attributes is drawn
(hence the term "random"), from which the best attribute for the split is
selected. The classification is based on a majority vote of the individually
developed tree classifiers in the forest.

The Random Forest widget provides a GUI to Orange's own implementation of
random forest (:class:`~Orange.ensemble.forest.RandomForestLearner`). The
widget outputs the learner and, given training data on its input, the
random forest itself. An additional output channel is provided for a
selected classification tree (from the forest), for visualization or
further analysis.

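The same two outputs can also be obtained from a script. Below is a minimal
sketch assuming the Orange 2.x scripting API; apart from
:class:`~Orange.ensemble.forest.RandomForestLearner` itself, the names used
(the ``iris`` data set, the ``name`` argument) are illustrative assumptions
rather than part of the widget's interface::

   import Orange

   # Class-labeled training data, as fed to the widget's Examples input
   data = Orange.data.Table("iris")

   # The learner, as sent to the widget's Learner output channel
   forest_learner = Orange.ensemble.forest.RandomForestLearner(name="random forest")

   # Given training data, the learner builds the trained forest
   # (the widget's Random Forest Classifier output)
   forest = forest_learner(data)

   # Classification by majority vote of the trees in the forest
   print(forest(data[0]))
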
.. image:: images/RandomForest.png

In the widget, the first field is used to specify the name of the learner
or classifier. The next block of parameters tells the algorithm how many
classification trees will be included in the forest
(:obj:`Number of trees in forest`) and how many attributes will be
arbitrarily drawn for consideration at each node. If the latter is not
specified (option :obj:`Consider exactly ...` left unchecked), this number
is equal to the square root of the number of attributes in the data set.
Breiman's original proposal is to grow the trees without any pre-pruning,
but since pre-pruning often works quite well, the user can set the depth to
which the trees will be grown (:obj:`Maximal depth of individual trees`).
As another pre-pruning option, a stopping condition can be set in terms of
the minimal number of instances in a node before splitting. Finally, if the
training data is given to the widget, the :obj:`Index of the tree on the
output` can be specified, instructing the widget to send the requested
classifier.

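In scripting, these options roughly correspond to constructor parameters of
:class:`~Orange.ensemble.forest.RandomForestLearner`. The parameter names
below (``trees``, ``attributes``) and the ``classifiers`` attribute of the
trained forest are assumptions about the Orange 2.x API and should be
checked against the class reference::

   import Orange

   data = Orange.data.Table("voting")

   forest_learner = Orange.ensemble.forest.RandomForestLearner(
       trees=50,        # Number of trees in forest
       attributes=4,    # Consider exactly ... attributes at each node
       name="random forest",
   )
   forest = forest_learner(data)

   # A single tree from the trained forest, analogous to the widget's
   # selected-tree output (Index of the tree on the output)
   selected_tree = forest.classifiers[0]
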
Examples
--------

The snapshot below shows a standard schema for comparing a random forest
and a tree learner (in this case, C4.5) on a specific data set.

.. image:: images/RandomForest-Test.png
   :alt: Random forest evaluation


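A comparable evaluation can also be sketched in a script. The sketch below
substitutes Orange's own :class:`~Orange.classification.tree.TreeLearner`
for C4.5 (which requires a separate setup) and assumes the Orange 2.x
evaluation helpers::

   import Orange

   data = Orange.data.Table("voting")

   tree = Orange.classification.tree.TreeLearner(name="tree")
   forest = Orange.ensemble.forest.RandomForestLearner(trees=50, name="forest")

   # Cross-validate both learners and compare classification accuracies,
   # mirroring the Test Learners step in the schema above
   results = Orange.evaluation.testing.cross_validation([tree, forest], data, folds=5)
   for learner, ca in zip((tree, forest), Orange.evaluation.scoring.CA(results)):
       print("%s: %.3f" % (learner.name, ca))
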
References
----------

.. [Breiman2001] Breiman L (2001) Random Forests. Machine Learning 45 (1), 5-32.
   (`PDF <http://www.springerlink.com/content/u0p06167n6173512/fulltext.pdf>`_)