Prediction Column out of Binary Machine Learning Classification Problem

SA_H (New Altair Community Member)
edited November 2024 in Community Q&A
For example, in the case of Logistic Regression, we can get coefficients that can be multiplied by the predictors to get the final output, either as an attribute in a CSV file or as an image. Please let me know if it is scientifically correct to get the weights/rules out of trained SVM, ANN, KNN, and NB models, multiply each predictor by its weight/rule, and take the sum over all predictors. I mean (predictor 1 * its weight + predictor 2 * its weight + predictor 3 * its weight + ........)
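
Since Logistic Regression is the model being described, here is a minimal Python sketch of that scoring; the feature names, coefficients, and intercept are made-up illustrative values, not output from any trained model:

    import math

    # Hypothetical coefficients of a trained logistic regression (illustrative values only)
    coefficients = {"age": 0.8, "income": -0.3, "tenure": 1.2}
    intercept = -0.5

    # One example row of predictor values
    example = {"age": 1.5, "income": 2.0, "tenure": 0.7}

    # predictor 1 * its weight + predictor 2 * its weight + ... + intercept
    score = sum(example[name] * weight for name, weight in coefficients.items()) + intercept

    # Logistic Regression turns this linear score into a probability via the sigmoid
    probability = 1.0 / (1.0 + math.exp(-score))
    print(score, probability)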

Answers

  • rfuentealba (New Altair Community Member)

    At first glance, operating on weights/rules didn't sound logical to me: Decision Trees try to make examples fit into one category or another by treating all data as categorical rather than numerical. Logistic Regression, on the other hand, is performed over numerical data, and altering the results might make more sense there.

    However, Gradient Boosted Trees work in a similar fashion: they give more weight to the examples that are difficult to classify and less weight to the easier ones. It wouldn't hurt to run a quick test and see how the predictors behave with your data (see the sketch below). The keyword to continue researching is Boosting.
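
    A minimal sketch of such a test, assuming scikit-learn, a pandas DataFrame loaded from a CSV, and a placeholder column name "label" for the binary target:

        import pandas as pd
        from sklearn.ensemble import GradientBoostingClassifier

        # Placeholder file and column names - replace with your own data set
        df = pd.read_csv("your_data.csv")
        X = df.drop(columns=["label"])
        y = df["label"]

        # Boosting: each new tree focuses on the examples the previous trees got wrong
        model = GradientBoostingClassifier(n_estimators=100, learning_rate=0.1)
        model.fit(X, y)

        # Relative importance of each predictor (sums to 1.0); these are not per-row weights
        for name, importance in zip(X.columns, model.feature_importances_):
            print(f"{name}: {importance:.3f}")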

    Hope it helps,

    Rodrigo.
  • SA_H (New Altair Community Member)
    Thank you @rfuentealba for your help. Could you please let me know which classifiers, other than Decision Trees, allow extracting weights/rules?
  • MartinLiebig (Altair Employee)
    Hi,
    your approach of coefficient * value only works for linear models. The strength of most machine learning models is that they are non-linear; that's the cool part. Breaking down non-linear, multivariate methods into single factors ranges from 'tricky' to 'impossible'.
    Nevertheless, have a look at the WEI ports of the operators and at operators like Tree to Rules (or so?). They may help.
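
    Outside of the operators mentioned above, the same idea of turning a tree into readable rules can be sketched in Python with scikit-learn; this is only an analogy to the Tree to Rules operator, not that operator itself, and the bundled breast cancer data set stands in for your data:

        from sklearn.datasets import load_breast_cancer
        from sklearn.tree import DecisionTreeClassifier, export_text

        # Small binary classification data set bundled with scikit-learn
        data = load_breast_cancer()
        tree = DecisionTreeClassifier(max_depth=3).fit(data.data, data.target)

        # Print the tree as nested if/else rules - these are rules, not per-feature weights
        print(export_text(tree, feature_names=list(data.feature_names)))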

    Cheers,
    Martin
  • SA_H (New Altair Community Member)
    So a linear ML model such as a linear SVM could work the same way, but other non-linear models could not?
  • varunm1 (New Altair Community Member)
    edited October 2019
    Hello @summer_helmi

    SVM is one of the linear models, but it can work with non-linear functions using the kernel trick. Non-linear algorithms have their own way of working; for example, a decision tree works based on split criteria and a neural network works based on hidden unit activations.

    So basically, every class of algorithms has its own way of working.

    For your initial question: yes, it is scientifically correct to get feature weights from an algorithm, as the weights are calculated based on proven methods. But it is not always correct to multiply the weight by the feature; that is only correct for the class of linear models (GLMs) that are based on linear equations.
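
    To make that concrete, here is a minimal sketch, assuming scikit-learn and its bundled breast cancer data set, showing that for a linear-kernel SVM the weight-times-feature sum reproduces the model's decision function, while a non-linear (RBF) kernel exposes no such per-feature weights:

        from sklearn.datasets import load_breast_cancer
        from sklearn.preprocessing import StandardScaler
        from sklearn.svm import SVC

        X, y = load_breast_cancer(return_X_y=True)
        X = StandardScaler().fit_transform(X)

        # Linear-kernel SVM: one weight per feature is available
        linear_svm = SVC(kernel="linear").fit(X, y)
        w, b = linear_svm.coef_[0], linear_svm.intercept_[0]

        # predictor 1 * weight 1 + predictor 2 * weight 2 + ... + bias
        manual_score = X[0] @ w + b
        print(manual_score, linear_svm.decision_function(X[:1])[0])  # equal up to floating point

        # RBF-kernel SVM: non-linear, so there is no coef_ to multiply feature by feature
        rbf_svm = SVC(kernel="rbf").fit(X, y)
        print(hasattr(rbf_svm, "coef_"))  # False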