"Neural Networks: Nominal Class"

mksaad (New Altair Community Member)
edited November 5 in Community Q&A
Hi,

I tried to train a neural network with a nominal class label dataset (iris dataset), but I received the following error:
This learning scheme does not have sufficient capabilities for the given data set: polynominal label not supported
How can I train a neural network with a nominal class label dataset?

Greetings,
--
Motaz K. Saad

Answers

  • IngoRM (New Altair Community Member)
    Hi,

    just add the NeuralNet operator inside the "Binary2MultiClassLearner" operator, which transforms any binominal learning scheme into one that can be applied to multiple classes. Here is an example:

    <operator name="Root" class="Process" expanded="yes">
        <operator name="ExampleSetGenerator" class="ExampleSetGenerator">
            <parameter key="number_examples" value="200"/>
            <parameter key="number_of_attributes" value="2"/>
            <parameter key="target_function" value="gaussian mixture clusters"/>
        </operator>
        <operator name="Binary2MultiClassLearner" class="Binary2MultiClassLearner" expanded="yes">
            <operator name="NeuralNet" class="NeuralNet">
            </operator>
        </operator>
    </operator>
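    Conceptually, Binary2MultiClassLearner applies a one-vs-rest scheme: it trains one binary model per class value ("this class vs. the rest") and predicts the class whose model scores highest. A minimal Python sketch of that idea, using a toy centroid scorer as the binary learner (purely illustrative, not RapidMiner's NeuralNet):

```python
# Sketch of the one-vs-rest strategy behind Binary2MultiClassLearner:
# one binary model per class, prediction goes to the highest-scoring model.
# The binary "learner" here is a toy centroid-distance scorer.

def train_binary(X, y_binary):
    """Toy binary learner: remember the centroid of the positive class."""
    pos = [x for x, t in zip(X, y_binary) if t]
    centroid = [sum(col) / len(pos) for col in zip(*pos)]
    # Score = negative squared distance to the positive-class centroid.
    return lambda x: -sum((a - b) ** 2 for a, b in zip(x, centroid))

def train_one_vs_rest(X, labels):
    classes = sorted(set(labels))
    # One binary model per class value -- hence "3 NNs" for Iris.
    return {c: train_binary(X, [l == c for l in labels]) for c in classes}

def predict(models, x):
    # Choose the class whose binary model gives the highest score.
    return max(models, key=lambda c: models[c](x))

# Tiny synthetic 3-class dataset with 2 features.
X = [(1, 1), (1, 2), (5, 5), (5, 6), (9, 1), (9, 2)]
y = ["a", "a", "b", "b", "c", "c"]

models = train_one_vs_rest(X, y)
print([predict(models, x) for x in X])  # → ['a', 'a', 'b', 'b', 'c', 'c']
```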
    Cheers,
    Ingo
  • mksaad (New Altair Community Member)
    Hi,
    Thanks for replying

    Hmmm, it builds 3 NNs, one for each class value!


    Do you recommend other solutions rather than the "Binary2MultiClassLearner" operator, such as replacing each class label value with a numeric value?
    For example, for the Iris dataset, I did the following:
    replaced "Iris Setosa" class with 10
    replaced "Iris Versicolour" class with 20
    replaced "Iris Virginica" class with 30

    After training the NN for 1000 training cycles, with a learning rate of 0.3, momentum of 0.2, and error epsilon of 0.00, I got the following results:
    root_mean_squared_error: 1.802 +/- 0.000
    squared_error: 3.246 +/- 12.872


    When I tried replacing the nominal class values with smaller numbers, I got different results for the same number of training cycles (1000), like the following:
    replaced "Iris Setosa" class with 0
    replaced "Iris Versicolour" class with 1
    replaced "Iris Virginica" class with 2

    I got the following results:
    root_mean_squared_error: 0.180 +/- 0.000
    squared_error: 0.032 +/- 0.129

    What do you think?!

    Another question, please: what does the +/- 0.129 in the squared_error results stand for?

    Warm greetings,
    --
    Motaz K. Saad
  • TobiasMalbrecht (New Altair Community Member)
    Hi Motaz,
    mksaad wrote:

    What do you think?!
    well, what do you think?

    Let me try to point you toward drawing a conclusion yourself by mirroring back what you did:

    First of all, you had a classification problem. Then you turned it into a regression problem by arbitrarily mapping the three classes to real values. Afterwards you compared the (regression!) errors you obtained with two different mappings and observed that the errors were different. In fact, the errors scale roughly as your label values imply.

    But what happens if you map the three classes to the values 2, 1, 0? ... or 0.1, 38297159 and 7? Or -4, 328 trillion and pi? Well, the errors will surely be different again. But those example mappings are no less reasonable than the ones you tried. The point is: turning a classification problem into a regression problem and examining the regression errors gives you almost no information at all. If you use a regression learner on a classification problem, you at least have to map the predictions back to the classes and examine the classification errors. Even then, the result will still depend heavily on the mapping you have chosen.
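    To make this concrete, here is a small sketch: the very same predictions, with a single misclassification, scored as regression errors under two different class-to-number mappings (the data and the one mistake are invented for illustration):

```python
# The same classifier, judged as a regressor, gets a different RMSE
# for every arbitrary class-to-number mapping. Data is made up.
import math

true_classes = ["setosa"] * 5 + ["versicolour"] * 5
predicted    = ["setosa"] * 5 + ["versicolour"] * 4 + ["setosa"]  # 1 mistake

def rmse(mapping):
    errs = [(mapping[t] - mapping[p]) ** 2
            for t, p in zip(true_classes, predicted)]
    return math.sqrt(sum(errs) / len(errs))

small = {"setosa": 0, "versicolour": 1}
big   = {"setosa": 10, "versicolour": 20}

print(rmse(small))  # ~0.316
print(rmse(big))    # ~3.162 -- ten times larger, for identical predictions
```

The classification error (one mistake in ten) is identical in both cases; only the arbitrary scale of the label mapping changed.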

    Hence, to cut a long story short: the method Ingo proposed is the adequate way to use neural nets for multi-class classification problems.

    Hope I could clarify things a bit.

    Regards,
    Tobias
  • mksaad (New Altair Community Member)
    Hi Tobias,

    Thanks for your reply.

    What does the +/- 0.129 in the squared_error results stand for?


    Thanks in advance,
    --
    Motaz
  • TobiasMalbrecht (New Altair Community Member)
    Hi Motaz,

    sorry, I forgot to answer that question! ;)

    That value is the standard deviation of the error across the folds (assuming you have done a cross-validation).
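    As a tiny illustration of where such a "mean +/- deviation" figure comes from (the per-fold errors below are invented numbers, not your actual run):

```python
# A cross-validation reports the error averaged over the folds, plus the
# standard deviation of the per-fold errors. Fold errors here are invented.
import math

fold_errors = [0.030, 0.045, 0.020, 0.025, 0.040]  # e.g. squared_error per fold

mean = sum(fold_errors) / len(fold_errors)
std = math.sqrt(sum((e - mean) ** 2 for e in fold_errors) / len(fold_errors))

print(f"{mean:.3f} +/- {std:.3f}")  # → 0.032 +/- 0.009
```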

    Regards,
    Tobias