Hi @sgenzer and @bhupendra_patil,
Is there any news about this? I would also be interested in trying a RapidMiner XGBoost implementation and seeing how it compares with an existing model based on H2O Gradient Boosted Trees.
Thanks
I have also noticed that the current version of H2O provides not only GBM (which I assume is the model available in RM) but also an XGBoost interface (both the original one and its light version), with GPU support. The GPU backend gives fantastic training acceleration but restricts the tree-learning options, so I found it does not produce models as accurate as those from the CPU-based backend.
You are talking about this, right? https://xgboost.readthedocs.io/en/latest/jvm/java_intro.html#
It seems fairly new to me?
Best,
Martin
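For context, here is a minimal sketch of what training through the XGBoost4J package linked above looks like (illustrative only, not from the thread; the file name, parameter values and round count are placeholders). The tree_method parameter is what selects the CPU ("hist") or GPU ("gpu_hist") histogram backend discussed earlier.

```java
// Illustrative only (not from the thread): a minimal XGBoost4J training call.
// The file name, parameter values and round count are placeholders.
import ml.dmlc.xgboost4j.java.Booster;
import ml.dmlc.xgboost4j.java.DMatrix;
import ml.dmlc.xgboost4j.java.XGBoost;

import java.util.HashMap;
import java.util.Map;

public class XGBoostSketch {
    public static void main(String[] args) throws Exception {
        // Training data in LibSVM format (placeholder path).
        DMatrix train = new DMatrix("train.libsvm.txt");

        Map<String, Object> params = new HashMap<>();
        params.put("objective", "binary:logistic");
        params.put("max_depth", 6);
        params.put("eta", 0.1);
        // "hist" = CPU histogram algorithm; "gpu_hist" = GPU backend, which
        // trains much faster but supports fewer tree-growing options.
        params.put("tree_method", "hist");

        Map<String, DMatrix> watches = new HashMap<>();
        watches.put("train", train);

        Booster booster = XGBoost.train(train, params, 100, watches, null, null);
        booster.saveModel("xgb.model");
    }
}
```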
@mschmitz Yes, this was the one, also announced on the main web site. A pity; it looks like there will be a short wait before Windows is supported?
I've done some more digging and found this French fork of XGBoost. If anybody wanted to do a DIY xgboost extension for RM, this fork (which seems to include Windows support) could be worth a try; here is a link:
It is becoming a sort of second-hand development, though, and I am not sure whether GPU would be supported in this project.
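For anyone considering such a DIY extension, a purely hypothetical skeleton of a RapidMiner operator wrapping XGBoost4J might look roughly like this (package, class and port names are invented for illustration and do not refer to any existing extension):

```java
// Hypothetical skeleton of a DIY RapidMiner operator wrapping XGBoost4J.
// All names here (package, class, ports) are invented for this sketch.
package com.example.xgboost;

import com.rapidminer.example.ExampleSet;
import com.rapidminer.operator.Operator;
import com.rapidminer.operator.OperatorDescription;
import com.rapidminer.operator.OperatorException;
import com.rapidminer.operator.ports.InputPort;
import com.rapidminer.operator.ports.OutputPort;

public class XGBoostLearner extends Operator {

    private final InputPort exampleSetInput = getInputPorts().createPort("training set");
    private final OutputPort modelOutput = getOutputPorts().createPort("model");

    public XGBoostLearner(OperatorDescription description) {
        super(description);
    }

    @Override
    public void doWork() throws OperatorException {
        ExampleSet trainingSet = exampleSetInput.getData(ExampleSet.class);
        // Here the ExampleSet would be converted to an XGBoost DMatrix,
        // XGBoost.train(...) would be called, and the resulting booster
        // wrapped in a Model object and delivered to modelOutput.
    }
}
```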
Hi @jacobcybulski ,
I am aware of this. But adding self-built jars with platform dependencies is a bit trickier than using ordinary dependencies. I doubt that I can do this as a short side project without the help of people like @jczogalla or @Marco_Boeck, and these folks are usually busy developing other cool new features. So we would need to push this through the normal product management cycles.
Cheers,
Martin
CC: @Knut-RM
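To illustrate why the platform dependency is the tricky part: xgboost4j relies on a compiled native library that must match the operating system, so an extension would have to ship and pick the right binary at runtime, roughly along these lines (a sketch only; the library file names below are assumptions, not the actual artifact names):

```java
// Illustrative sketch of the platform-dependence problem: a native binary has
// to be picked per OS before the Java wrapper can be used. The file names
// below are assumptions, not the actual artifact names.
import java.io.File;

public class NativeLibSelector {
    public static String pickLibraryName() {
        String os = System.getProperty("os.name").toLowerCase();
        if (os.contains("win")) {
            return "xgboost4j.dll";      // Windows build (the one missing at the time)
        } else if (os.contains("mac")) {
            return "libxgboost4j.dylib"; // macOS build
        } else {
            return "libxgboost4j.so";    // Linux build
        }
    }

    public static void main(String[] args) {
        // An extension would extract the matching binary from its jar to a
        // temporary directory and load it with System.load(absolutePath).
        File lib = new File(System.getProperty("java.io.tmpdir"), pickLibraryName());
        System.out.println("Would load native library from: " + lib.getAbsolutePath());
    }
}
```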
Scott