source: orange/docs/widgets/rst/evaluate/confusionmatrix.rst @ 11778:ecd4beec2099

Revision 11778:ecd4beec2099, 4.7 KB checked in by Ales Erjavec <ales.erjavec@…>, 5 months ago (diff)

Use new SVG icons in the widget documentation.

.. _Confusion Matrix:

Confusion Matrix
================

.. image:: ../../../../Orange/OrangeWidgets/Evaluate/icons/ConfusionMatrix.svg

Shows a confusion matrix.

Signals
-------

Inputs:
   - Evaluation results (orngTest.ExperimentResults)
      Results of testing the algorithms; typically from :ref:`Test Learners`


Outputs:
   - Selected Examples (ExampleTable)
      A set of examples from the selected cells in the confusion matrix.


Description
-----------

Confusion Matrix gives the number/proportion of examples from one class
classified into another (or the same) class. Besides that, selecting
elements of the matrix feeds the corresponding examples onto the output
signal. This way, one can observe which specific examples were
misclassified and in what way.

The widget usually gets the evaluation results from :ref:`Test Learners`;
an example of the schema is shown below.

.. image:: images/ConfusionMatrix.png

The widget on the snapshot shows the confusion matrix for the classification
tree and naive Bayesian classifier trained and tested on the Iris data.
The right-hand side of the widget contains the matrix for the naive Bayesian
classifier (since this classifier is selected on the left). Each row
corresponds to a correct class, and columns represent the predicted classes.
For instance, seven examples of Iris-versicolor were misclassified as
Iris-virginica. The rightmost column gives the number of examples from
each class (there are 50 irises of each of the three classes) and the bottom
row gives the number of examples classified into each class (e.g., 52
instances were classified into virginica).
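
The layout described above (rows are true classes, columns are predicted
classes, with marginal totals) can be sketched in plain Python. This is only
an illustration of how such a matrix is tallied, not the widget's actual
implementation; all names are ours:

```python
from collections import Counter

def confusion_matrix(true_labels, predicted_labels, classes):
    """Tally a confusion matrix: rows are true classes, columns predictions."""
    counts = Counter(zip(true_labels, predicted_labels))
    return [[counts[(t, p)] for p in classes] for t in classes]

classes = ["setosa", "versicolor", "virginica"]
true_ = ["versicolor", "versicolor", "virginica", "setosa"]
pred_ = ["versicolor", "virginica", "virginica", "setosa"]

matrix = confusion_matrix(true_, pred_, classes)
row_totals = [sum(row) for row in matrix]        # examples per true class
col_totals = [sum(col) for col in zip(*matrix)]  # examples per predicted class
```

The row totals correspond to the widget's rightmost column and the column
totals to its bottom row.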

When the evaluation results contain data on multiple learning algorithms,
we have to choose one in the :obj:`Learners` box.

.. image:: images/ConfusionMatrix-Schema.png

In :obj:`Show` we select what data we would like to see in the matrix.
In the above example, we are observing the :obj:`Number of examples`.
The alternatives are :obj:`Proportions of predicted` and
:obj:`Proportions of true` classes. In the iris example, "proportions of
predicted" shows, for the examples classified as, say, Iris-versicolor,
which true classes they belong to; in the table we can read that 0% of them
are actually setosae, 89.6% of those classified as versicolor are
versicolors, and 10.4% are virginicae.

.. image:: images/ConfusionMatrix-propTrue.png

Proportions of true shows the opposite relation: of all true versicolors,
86% were classified as versicolors and 14% as virginicae.
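
The two normalizations amount to dividing each cell by its column total
("proportions of predicted") or by its row total ("proportions of true").
A minimal sketch, with helper names of our own invention and counts that
mirror the iris example above:

```python
def proportions_of_true(matrix):
    """Row-wise normalization: of all examples of a true class,
    the share predicted as each class ("Proportions of true")."""
    return [[cell / sum(row) if sum(row) else 0.0 for cell in row]
            for row in matrix]

def proportions_of_predicted(matrix):
    """Column-wise normalization: of all examples predicted as a class,
    the share that actually belongs to each true class."""
    col_totals = [sum(col) for col in zip(*matrix)]
    return [[cell / col_totals[j] if col_totals[j] else 0.0
             for j, cell in enumerate(row)]
            for row in matrix]

# Rows: true setosa, versicolor, virginica; columns: predicted likewise.
m = [[50, 0, 0],
     [0, 43, 7],
     [0, 5, 45]]

proportions_of_true(m)[1]  # versicolor row: [0.0, 0.86, 0.14]
```

With these counts, the versicolor column normalizes to 0%, 89.6% and 10.4%,
matching the figures quoted in the text.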

Button :obj:`Correct` sends all correctly classified examples to the output
by selecting the diagonal of the matrix. :obj:`Misclassified` selects the
misclassified examples. :obj:`None` clears the selection. As mentioned
before, one can also select individual cells of the table to select specific
kinds of misclassified examples, e.g. the versicolors classified as virginicae.
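
The three buttons correspond to selecting different sets of matrix cells:
the diagonal, the off-diagonal cells, or none. In plain Python terms (an
illustrative sketch, not the widget's code):

```python
def select_examples(true_labels, predicted_labels, cells):
    """Return the indices of examples whose (true, predicted) pair
    falls in one of the selected matrix cells."""
    return [i for i, pair in enumerate(zip(true_labels, predicted_labels))
            if pair in cells]

classes = ["setosa", "versicolor", "virginica"]
correct = {(c, c) for c in classes}            # the diagonal
misclassified = {(t, p) for t in classes
                 for p in classes if t != p}   # everything off the diagonal

true_ = ["versicolor", "versicolor", "virginica"]
pred_ = ["versicolor", "virginica", "virginica"]
select_examples(true_, pred_, misclassified)   # → [1]
```

Selecting an individual cell, such as versicolors classified as virginicae,
is simply `cells = {("versicolor", "virginica")}`.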

When sending the selected examples, the widget can add new attributes with
the predicted classes or their probabilities, if the corresponding options
:obj:`Append class prediction` and/or
:obj:`Append predicted class probabilities` are checked.
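
The effect of these two options can be mimicked as follows. This is a sketch
over plain dictionaries; the attribute names (`predicted`, `p(...)`) are our
invention, not necessarily the widget's exact ones:

```python
def append_predictions(rows, predicted, probabilities, classes,
                       add_class=True, add_probs=True):
    """Copy each example (a dict of attributes) and extend it with the
    predicted class and/or the per-class probabilities."""
    out = []
    for row, pred, probs in zip(rows, predicted, probabilities):
        row = dict(row)  # copy, so the input examples stay untouched
        if add_class:
            row["predicted"] = pred
        if add_probs:
            for cls, p in zip(classes, probs):
                row["p(%s)" % cls] = p
        out.append(row)
    return out

rows = [{"petal length": 4.7, "iris": "versicolor"}]
result = append_predictions(rows, ["virginica"], [(0.0, 0.3, 0.7)],
                            ["setosa", "versicolor", "virginica"])
```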

The widget updates the output at every change if :obj:`Commit automatically`
is checked. If not, the user will need to press :obj:`Commit` to commit the
changes.

Example
-------

The following schema demonstrates what this widget can be used for.

.. image:: images/ConfusionMatrix-Schema.png

:ref:`Test Learners` gets data from :ref:`File` and two learning algorithms
from :ref:`Naive Bayes` and :ref:`Classification Tree`. It performs
cross-validation or some other train-and-test procedure to get
class predictions by both algorithms for all (or some, depending on the
procedure) examples from the data. The test results are fed into the confusion
matrix, where we can observe how many examples were misclassified in which way.

On the output we connected two other widgets. :ref:`Data Table` will show
the examples we select in the Confusion Matrix. If we, for instance,
click :obj:`Misclassified`, the table will contain all examples that were
misclassified by the selected method.

:ref:`Scatter Plot` gets two sets of examples. From the file widget it gets
the complete data, while the confusion matrix sends only the selected data,
for instance the misclassified examples. The scatter plot will show all the
data, with the symbols representing the selected data filled and the other
symbols hollow.

For a nice example, we can load the iris data set and observe the position
of misclassified examples in the scatter plot with attributes petal
length and petal width used for the x and y axes. As expected, the
misclassified examples lie on the boundary between the two classes.

.. image:: images/ConfusionMatrix-Example.png