Orange Forum • View topic - Confusion matrix widget

Confusion matrix widget

A place to ask questions about methods in Orange and how they are used and other general support.

Confusion matrix widget

Postby algol » Wed Aug 22, 2007 18:21

I have a datafile... say the Titanic (survival= yes / no). I link the DATA widget (primed with the Orange resident Titanic datafile), to the LOGISTIC REGRESSION widget. I link both to the TEST LEARNERS widget. From the TEST LEARNERS widget I can link successfully to the evaluation widgets (e.g. Receiver Operating Characteristic curve / Calibration ). If I link, from TEST LEARNERS, to CONFUSION MATRIX, which is important to me, all I get is an empty 2 by 2 matrix.

The TEST LEARNERS widget actually names the CONFUSION MATRIX as a legitimate output! Which is exactly what I would have expected. But all I see is a 2 by 2 table with no content. Puzzling.

The only clue I have is that, unlike the ROC and CALIBRATION output targets for TEST LEARNERS.... the CONFUSION MATRIX widget has both input, and output plugs... am I missing something here?

Oh... somebody out there... put me out of my misery. Please! This one I will light 50 candles for. I do like ORANGE, but she is a harsh mistress for those who do not devote every waking hour of the day to her.

Postby algol » Fri Aug 24, 2007 16:55

Replying to my own question here. I have just noticed a bug report (Friday 6th July) referencing a non-functioning Confusion Matrix widget.

Janez replied that this widget was not meant to be let out of the asylum, but, if such a widget really were needed, it could be shuffled up the things-to-do list.

I want to add my name to the "fix it please" list.

I find the confusion matrix useful as an adjunct to the ROC curve, particularly if it can be delivered for classification thresholds other than Orange's default P > 0.5, and even more so if it can do that on the back of a leave-one-out model validation.

I work mainly with logistic regression, devising risk-assessment rule-sets for medical classification problems. It is not unusual to be asked to deliver a RULE-IN solution, i.e. zero false negatives (think cancer screening!). That typically entails classification thresholds well below P = 0.5.
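To make the request concrete: this is not Orange's API, just a plain-Python sketch of what a threshold-aware confusion matrix would compute, with made-up class labels and predicted probabilities. Lowering the cut below 0.5 trades false positives for the false negatives a rule-in screen cannot tolerate:

```python
def confusion_matrix(y_true, probs, threshold=0.5):
    """2x2 confusion matrix [[TN, FP], [FN, TP]] for a binary
    classifier that predicts class 1 whenever P(class 1) >= threshold."""
    tn = fp = fn = tp = 0
    for y, p in zip(y_true, probs):
        pred = 1 if p >= threshold else 0
        if y == 1 and pred == 1:
            tp += 1
        elif y == 1 and pred == 0:
            fn += 1
        elif y == 0 and pred == 1:
            fp += 1
        else:
            tn += 1
    return [[tn, fp], [fn, tp]]

# toy data: true labels and some classifier's predicted probabilities
y_true = [1, 1, 1, 0, 0, 0, 0, 0]
probs  = [0.9, 0.6, 0.3, 0.4, 0.2, 0.1, 0.05, 0.35]

print(confusion_matrix(y_true, probs, 0.5))   # [[5, 0], [1, 2]] - one false negative
print(confusion_matrix(y_true, probs, 0.25))  # [[3, 2], [0, 3]] - zero FN, two FP
```

The same probabilities yield a different 2 by 2 table at each cut-point, which is exactly why a fixed P > 0.5 table is too rigid for screening work.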

If you can deliver a confusion-matrix widget with that sort of flexibility, I for one would be extremely grateful. Thanks. algol.

Postby Janez » Mon Aug 27, 2007 11:30

I swear this widget worked (almost) perfectly not more than a week ago, and now it's broken without anybody changing anything. I'm looking into it; expect it fixed soon.

Postby Janez » Mon Aug 27, 2007 11:54

OK, it was less than almost perfect even before nobody did anything to break it. ;)

I've put a fix in CVS, and the snapshot will be rebuilt in half an hour or so.

confusion matrix widget fixed

Postby algol » Thu Sep 06, 2007 22:52

Thank you, Janez. The confusion matrix now runs fine (slow response to your fix... I've been on holiday).

Forgive a postscript... but that widget would be SO much more useful if the user could specify the classification cut-point (the default is obviously 50%) and get a confusion matrix appropriate to the selected cut.

The ROC widget goes some way in that direction, but... some of us (especially in medical applications) are much more comfortable with that little 2 by 2 table....

Aside from that... my complete respects. algol.
