Latent Dirichlet Allocation - Iterations?
Hi RapidMiner team,
I'm rather new to LDA and am a bit confused as to what exactly the iterations parameter refers to, since the description says "Number of iterations for optimization". As far as I understand, the operator uses Gibbs sampling, where the number of sampling iterations also has to be set. Is that what the iterations parameter controls, or does the parameter relate to optimizing the hyperparameters (although the input field does not disappear when I uncheck the hyperparameter optimization)?
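If it helps clarify what I mean: my assumption is that the operator wraps MALLET's ParallelTopicModel (I may be wrong about that). In plain MALLET, the number of Gibbs sampling sweeps and the hyperparameter optimization interval are two separate settings, roughly like the sketch below (the file name documents.txt and all numeric values are just placeholders):

```java
import java.io.File;
import java.io.FileReader;
import java.util.ArrayList;
import java.util.regex.Pattern;

import cc.mallet.pipe.CharSequence2TokenSequence;
import cc.mallet.pipe.CharSequenceLowercase;
import cc.mallet.pipe.Pipe;
import cc.mallet.pipe.SerialPipes;
import cc.mallet.pipe.TokenSequence2FeatureSequence;
import cc.mallet.pipe.iterator.CsvIterator;
import cc.mallet.topics.ParallelTopicModel;
import cc.mallet.types.InstanceList;

public class LdaIterationsSketch {
    public static void main(String[] args) throws Exception {
        // Simple import pipeline: lowercase, tokenize, map tokens to feature indices.
        ArrayList<Pipe> pipes = new ArrayList<>();
        pipes.add(new CharSequenceLowercase());
        pipes.add(new CharSequence2TokenSequence(Pattern.compile("\\p{L}+")));
        pipes.add(new TokenSequence2FeatureSequence());

        InstanceList instances = new InstanceList(new SerialPipes(pipes));
        // documents.txt is a placeholder: one document per line, "name label text".
        instances.addThruPipe(new CsvIterator(
                new FileReader(new File("documents.txt")),
                Pattern.compile("^(\\S*)\\s+(\\S*)\\s+(.*)$"),
                3, 2, 1));

        // 10 topics with symmetric priors (alphaSum = 5.0, beta = 0.01) - illustrative values.
        ParallelTopicModel lda = new ParallelTopicModel(10, 5.0, 0.01);
        lda.addInstances(instances);

        lda.setNumIterations(1000);   // total number of Gibbs sampling sweeps over the corpus
        lda.setOptimizeInterval(50);  // re-estimate alpha/beta every 50 sweeps (0 disables it)
        lda.setBurninPeriod(100);     // sweeps to run before the first hyperparameter update

        lda.estimate();               // run the sampler
    }
}
```

So my question is essentially which of these settings the operator's iterations parameter maps to.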
Thanks in advance!