Latent Dirichlet Allocation - Iterations?
Soni
New Altair Community Member
Hi RapidMiner team,
I'm rather new to LDA and am a bit confused about what exactly the iterations parameter refers to, since the description says "Number of iterations for optimization". As far as I understand, the operator uses Gibbs sampling, where the number of sampling iterations also has to be set. Is that what the iterations parameter controls, or does it have something to do with optimizing the hyperparameters? (The input field does not disappear when I uncheck the hyperparameter optimization.)
Thanks in advance!
Best Answer
Hi @Soni,
you are absolutely right. It's the number of sampling steps in Gibbs sampling, and is effectively a hyperparameter of LDA.
Best,
Martin
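For intuition, here is a minimal sketch of collapsed Gibbs sampling for LDA in plain Python. This is not the Mallet code the operator actually runs; the function and parameter names are illustrative. The operator's iterations parameter corresponds to `n_iterations` below: each iteration is one full sweep that resamples the topic assignment of every token in the corpus.

```python
import random

def lda_gibbs(docs, n_topics, n_iterations, alpha=0.1, beta=0.01, seed=0):
    """Toy collapsed Gibbs sampler for LDA.

    docs: list of documents, each a list of integer word ids.
    Returns (z, ndk): per-token topic assignments and doc-topic counts.
    """
    rng = random.Random(seed)
    V = len({w for doc in docs for w in doc})   # vocabulary size
    ndk = [[0] * n_topics for _ in docs]        # doc-topic counts
    nkw = [dict() for _ in range(n_topics)]     # topic-word counts
    nk = [0] * n_topics                         # tokens per topic
    z = []                                      # topic of each token

    # Random initialization of topic assignments
    for d, doc in enumerate(docs):
        zd = []
        for w in doc:
            k = rng.randrange(n_topics)
            zd.append(k)
            ndk[d][k] += 1
            nkw[k][w] = nkw[k].get(w, 0) + 1
            nk[k] += 1
        z.append(zd)

    # Each "iteration" = one full sweep over every token in the corpus
    for _ in range(n_iterations):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                # Remove the token's current assignment from the counts
                ndk[d][k] -= 1
                nkw[k][w] -= 1
                nk[k] -= 1
                # Sample a new topic from the collapsed full conditional
                weights = [
                    (ndk[d][t] + alpha)
                    * (nkw[t].get(w, 0) + beta) / (nk[t] + V * beta)
                    for t in range(n_topics)
                ]
                k = rng.choices(range(n_topics), weights=weights)[0]
                z[d][i] = k
                ndk[d][k] += 1
                nkw[k][w] = nkw[k].get(w, 0) + 1
                nk[k] += 1
    return z, ndk
```

More iterations give the chain more time to mix toward the posterior, which is why it behaves like a quality/runtime trade-off rather than a hyperparameter-optimization setting.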
Answers
Sorry? I don't understand the question. The LDA we use internally is from Mallet. The operator executes similar code to what's shown here:
What would you expect?
Best,
Martin