
SMOTE Upsampling Operator With Multi-Label Classification

User: "Nawaf"
New Altair Community Member
Updated by Jocelyn
Hi!
 I wanted to ask if it is possible to use the SMOTE Upsampling operator with multi-label classification? If so, how? If not, what is the alternative operator for handling imbalanced classes?

    User: "MartinLiebig"
    Altair Employee
    Hi @Nawaf,
    sure. You just apply it #classes - 1 times to bring all classes up to the same level.

    Best,
    Martin
    User: "David_A"
    New Altair Community Member
    Hi @Nawaf ,

    you could simply run SMOTE multiple times, once for each minority class. Afterwards you have an up-sampled data set with all classes balanced. Of course, this is only really feasible when the number of classes is not too high.

    Best,
    David
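    The repeated-SMOTE approach Martin and David describe can be sketched in plain Python/NumPy. This is an illustration of the technique, not the RapidMiner operator itself; the function names and the nearest-neighbour interpolation details are assumptions of this sketch:

    ```python
    import numpy as np

    def smote_class(X_min, n_new, k=5, rng=None):
        """Generate n_new synthetic points for one minority class by
        interpolating sampled points toward one of their k nearest
        same-class neighbours (the core SMOTE idea)."""
        rng = np.random.default_rng(rng)
        synthetic = []
        for _ in range(n_new):
            i = rng.integers(len(X_min))
            # distances from point i to every other minority point
            d = np.linalg.norm(X_min - X_min[i], axis=1)
            # k nearest neighbours, excluding the point itself
            nn = np.argsort(d)[1:k + 1]
            j = rng.choice(nn)
            gap = rng.random()  # random position along the segment
            synthetic.append(X_min[i] + gap * (X_min[j] - X_min[i]))
        return np.array(synthetic)

    def balance_all_classes(X, y, k=5, rng=0):
        """Apply SMOTE once per minority class (i.e. #classes - 1 passes),
        raising every class to the majority-class count."""
        classes, counts = np.unique(y, return_counts=True)
        target = counts.max()
        X_parts, y_parts = [X], [y]
        for c, n in zip(classes, counts):
            if n < target:
                X_new = smote_class(X[y == c], target - n,
                                    k=min(k, n - 1), rng=rng)
                X_parts.append(X_new)
                y_parts.append(np.full(len(X_new), c))
        return np.vstack(X_parts), np.concatenate(y_parts)
    ```

    For example, a data set with class counts 50/10/5 comes out with 50 examples per class after balancing.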
    User: "Nawaf"
    New Altair Community Member
    OP
    Thanks, folks, for your responses! The difference in counts between class 0 and class 1 (in the binary case) is almost too high, as is typical for this kind of imbalanced multi-label classification problem. So do you think finding the best decision threshold is better than applying SMOTE?
    User: "MartinLiebig"
    Altair Employee
    Accepted Answer
    Good question. Both approaches are feasible and can be successful. What I would remind you of is that if you use tree-based models like a random forest, the additional examples from upsampling allow deeper trees, since there are simply more examples. You thus get a very different tree.

    Best,
    Martin
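    The threshold-tuning alternative Martin mentions can be sketched as a simple scan over candidate cutoffs on a validation set, picking the one that maximises F1 on the minority class. This is a minimal illustration under that assumption, not a RapidMiner operator:

    ```python
    import numpy as np

    def best_threshold(y_true, scores):
        """Scan every distinct predicted score as a candidate threshold
        and return the one maximising F1 on the positive class."""
        best_t, best_f1 = 0.5, -1.0
        for t in np.unique(scores):
            pred = (scores >= t).astype(int)
            tp = np.sum((pred == 1) & (y_true == 1))
            fp = np.sum((pred == 1) & (y_true == 0))
            fn = np.sum((pred == 0) & (y_true == 1))
            if tp == 0:
                continue  # no true positives: F1 undefined/zero
            prec = tp / (tp + fp)
            rec = tp / (tp + fn)
            f1 = 2 * prec * rec / (prec + rec)
            if f1 > best_f1:
                best_t, best_f1 = t, f1
        return best_t, best_f1
    ```

    Unlike upsampling, this leaves the training data (and thus tree depth) unchanged and only moves the decision boundary on the model's confidence scores.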