Difference between normal decision tree with information gain criterion and W-J48

User: "koknayaya"
New Altair Community Member
Updated by Jocelyn
Hi. Have a good day everyone!

I want to ask a couple of questions. :)

1st question
What is the difference between
a) a normal decision tree with the information gain criterion, and
b) W-J48?

I'm quite confused about the difference.

Why don't we just use the basic decision tree and choose 'information gain' as the criterion instead of using W-J48?

2nd question
Are there any guidelines for setting suitable values for the W-J48 parameters, such as the confidence threshold for pruning and the minimum number of instances per leaf?

I don't know what values these parameters should be set to.

    User: "Telcontar120"
    New Altair Community Member
    I think this is a similar discussion to the following thread: https://community.rapidminer.com/discussion/54330/difference-between-c4-5-and-w-j48#latest
    For more details on the W-J48 implementation you should consult the Weka project documentation.
    User: "varunm1"
    New Altair Community Member
    Updated by varunm1
    Hi @koknayaya

    I don't see much conceptual difference between the two, since they rest on the same ideas: the information gain ratio criterion and J48 (Weka's implementation of C4.5) both come from Quinlan's work. Both trees are controlled by a pruning confidence, denoted 'C', and a minimal leaf size, 'M', and you can find both options in both decision trees.
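
    For illustration, here is a minimal sketch using the Weka Java API (the library that provides W-J48); the ARFF file name and the class name are placeholders, not anything from RapidMiner itself. It just shows where the two parameters live on the J48 classifier:

```java
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class J48Example {
    public static void main(String[] args) throws Exception {
        // Load a dataset; "mydata.arff" is a placeholder path.
        Instances data = DataSource.read("mydata.arff");
        data.setClassIndex(data.numAttributes() - 1); // assume the last attribute is the label

        // W-J48 exposes the two parameters mentioned above:
        // the pruning confidence (-C) and the minimum number of instances per leaf (-M).
        J48 tree = new J48();
        tree.setConfidenceFactor(0.25f); // same as -C 0.25
        tree.setMinNumObj(2);            // same as -M 2
        tree.buildClassifier(data);

        System.out.println(tree); // prints the pruned tree
    }
}
```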

    For your second question: the default values are C = 0.25 and M = 2. The lower the confidence, the more heavily the tree is pruned. You will need to try different combinations and see which works best for your data, for example as sketched below.
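
    Something along these lines (again only a rough sketch; the dataset path, the candidate value grids, and the class name are assumptions for illustration) compares a few C/M settings with 10-fold cross-validation:

```java
import java.util.Random;

import weka.classifiers.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class J48Tuning {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("mydata.arff"); // placeholder dataset
        data.setClassIndex(data.numAttributes() - 1);

        // Candidate values to try; these grids are only examples, not recommendations.
        float[] confidences = {0.1f, 0.25f, 0.5f}; // lower C -> heavier pruning
        int[] minLeafSizes = {2, 5, 10};           // -M, minimum instances per leaf

        for (float c : confidences) {
            for (int m : minLeafSizes) {
                J48 tree = new J48();
                tree.setConfidenceFactor(c);
                tree.setMinNumObj(m);

                // 10-fold cross-validation to estimate accuracy for this combination.
                Evaluation eval = new Evaluation(data);
                eval.crossValidateModel(tree, data, 10, new Random(1));
                System.out.printf("C=%.2f M=%d accuracy=%.2f%%%n", c, m, eval.pctCorrect());
            }
        }
    }
}
```
    In RapidMiner the same idea can be expressed by putting W-J48 inside a cross validation and letting a parameter optimization operator loop over the values of C and M.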

    Thanks,
    Varun