
How to identify critical parameters in a dataset?

User: "mzn"
New Altair Community Member
Updated by Jocelyn
Hi,
Say you have a dataset with 10 different parameters (listed in 10 columns) that describe a phenomenon (listed in the 11th column). What would be a procedure to identify which of the 10 parameters are critical for predicting the phenomenon at hand? For example, maybe only 4-5 parameters are needed to properly predict the phenomenon; how would you go about figuring out which ones they are?
Thanks, 

    User: "varunm1"
    New Altair Community Member
    Accepted Answer
    Updated by varunm1
    Hi @mzn

    Auto Model will provide you with the attributes (parameters) that support the prediction. If you don't have access to Auto Model, you can also check the correlation of these attributes with the output using the Correlation Matrix operator. You can also look at feature selection operators such as Optimize Selection and Forward Selection; a sketch of the same idea outside RapidMiner is shown below.
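
    As a rough Python/scikit-learn analogue of the correlation screening and forward selection described above (not the RapidMiner operators themselves), assuming a pandas DataFrame with 10 numeric feature columns and a target column named "phenomenon" (both the file path and the column name are placeholders):

    ```python
    import pandas as pd
    from sklearn.linear_model import LinearRegression
    from sklearn.feature_selection import SequentialFeatureSelector

    df = pd.read_csv("data.csv")          # placeholder path
    X = df.drop(columns=["phenomenon"])   # the 10 candidate parameters
    y = df["phenomenon"]                  # the phenomenon to predict

    # 1) Quick screening: correlation of each parameter with the target
    correlations = X.corrwith(y).abs().sort_values(ascending=False)
    print(correlations)

    # 2) Forward selection: greedily add the parameters that most improve
    #    cross-validated performance of a simple model
    selector = SequentialFeatureSelector(
        LinearRegression(), n_features_to_select=5, direction="forward", cv=5
    )
    selector.fit(X, y)
    print("Selected parameters:", list(X.columns[selector.get_support()]))
    ```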

    If you are trying to predict using different algorithms like a decision tree, SVM, etc., you can use the Explain Predictions operator to see which attributes are helpful for the predictions of a particular algorithm.
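
    Outside RapidMiner, a comparable (but not identical) idea is permutation importance, which shows which attributes a given trained model actually relies on. A minimal sketch, reusing the hypothetical X and y from the previous snippet:

    ```python
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.svm import SVR
    from sklearn.inspection import permutation_importance
    from sklearn.model_selection import train_test_split

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    # Per-algorithm importances: fit each model, then measure how much the
    # test score drops when each attribute is shuffled
    for model in (DecisionTreeRegressor(random_state=42), SVR()):
        model.fit(X_train, y_train)
        result = permutation_importance(model, X_test, y_test,
                                        n_repeats=10, random_state=42)
        print(type(model).__name__)
        for name, score in sorted(zip(X.columns, result.importances_mean),
                                  key=lambda t: -t[1]):
            print(f"  {name}: {score:.3f}")
    ```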

     Thanks
    User: "IngoRM"
    New Altair Community Member
    Accepted Answer
    Hi,
    "Can I have access to / figure out the actual relationship/mathematical model used to predict future values?"

    The respective models are used for this.  While for some of those models, like linear regression (a formula) or a decision tree (nested if-then statements), a mathematical representation can be derived, this is not the case in general.  Especially for the more complex models (and therefore often also the more accurate ones) like GBT or Neural Nets, there is close to zero chance of transforming them into a human-readable format.  That is the reason why we have tools like the Simulator or Explain Predictions in the first place: so that you can build trust in what the models are doing even without such an explicit formula.
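
    To illustrate the difference for the interpretable model families mentioned above, a readable form can be extracted directly from a linear regression or a decision tree; no comparably compact formula exists for GBT or neural nets. A sketch assuming the same hypothetical X and y as in the earlier snippets:

    ```python
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeRegressor, export_text

    # Linear regression: an explicit formula (intercept + weighted sum)
    lin = LinearRegression().fit(X, y)
    terms = " + ".join(f"{w:.3f}*{name}" for name, w in zip(X.columns, lin.coef_))
    print(f"phenomenon ~ {lin.intercept_:.3f} + {terms}")

    # Decision tree: nested if-then statements
    tree = DecisionTreeRegressor(max_depth=3, random_state=42).fit(X, y)
    print(export_text(tree, feature_names=list(X.columns)))
    ```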

    Best,
    Ingo