XGBoost and LightGBM implementations
Nirr3
Hello,
While the H2O GBM is definitely a well-performing model, I often find myself using the XGBoost or LightGBM models from either R or Python instead. Given that these are among the most commonly used GBM implementations, is there any chance they could be added to RapidMiner?
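For context, a minimal sketch of the kind of Python workflow referred to above, training XGBoost and LightGBM side by side. The synthetic dataset and the parameter values are illustrative assumptions, not part of the original post.

import xgboost as xgb
import lightgbm as lgb
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split

# Toy binary-classification data, assumed purely for illustration.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# XGBoost via its scikit-learn wrapper.
xgb_model = xgb.XGBClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
xgb_model.fit(X_train, y_train)

# LightGBM via its scikit-learn wrapper.
lgb_model = lgb.LGBMClassifier(n_estimators=200, max_depth=4, learning_rate=0.1)
lgb_model.fit(X_train, y_train)

print("XGBoost accuracy: ", xgb_model.score(X_test, y_test))
print("LightGBM accuracy:", lgb_model.score(X_test, y_test))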
Tagged: AI Studio, Feature Request
Comments
sgenzer
Hi @Nirr3 - I'm going to connect you with @bhupendra_patil. He has a pilot project that is working on exactly this kind of idea.
Scott
christos_karras
Hi @sgenzer and @bhupendra_patil,
Is there any news about this? I would also be interested in trying a RapidMiner XGBoost implementation to see how it compares with an existing model based on H2O Gradient Boosted Trees.
Thanks
hughesfleming68
Try the Smile extension. I have had very good results with it, especially the GBT.
jacobcybulski
XGBoost is available for Java, including its GPU support.
jacobcybulski
I have also noticed that the current version of H2O provides not only GBM (which I assume is the model available in RM) but also an XGBoost interface (yes, the original one as well as its light version), with GPU support. The GPU backend gives fantastic training acceleration, but it restricts the tree-learning options, so I found it does not produce models as accurate as those from the CPU-based backend.
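A minimal sketch of comparing the two H2O XGBoost backends mentioned above, using the h2o Python package. The file name "train.csv", the "label" column, and the parameter values are hypothetical, assumed only for illustration.

import h2o
from h2o.estimators.xgboost import H2OXGBoostEstimator

h2o.init()

# Hypothetical dataset and column names, assumed for this example only.
frame = h2o.import_file("train.csv")
frame["label"] = frame["label"].asfactor()
predictors = [c for c in frame.columns if c != "label"]

# GPU backend: fast training, but limited to the histogram-style tree builder.
gpu_model = H2OXGBoostEstimator(ntrees=200, backend="gpu", gpu_id=0)
gpu_model.train(x=predictors, y="label", training_frame=frame)

# CPU backend: slower, but the full range of tree_method options is available.
cpu_model = H2OXGBoostEstimator(ntrees=200, backend="cpu", tree_method="exact")
cpu_model.train(x=predictors, y="label", training_frame=frame)

print(gpu_model.model_performance(frame).auc())
print(cpu_model.model_performance(frame).auc())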
MartinLiebig
Hi @jacobcybulski,
you are talking about this, right?
https://xgboost.readthedocs.io/en/latest/jvm/java_intro.html#
It seems fairly new to me?
Best,
Martin
MartinLiebig
If yes, then this is the problem: "Windows not supported in the JVM package"
jacobcybulski
@mschmitz Yes, this was the one, also announced on the main website. A pity; hopefully it is only a short wait until Windows is supported?
MartinLiebig
It has been like this since 2016, if I remember correctly. They have no interest in supporting Windows.
jacobcybulski
I've done some more digging and found this French fork of XGBoost. If anybody wanted to do a DIY XGBoost extension for RM (the fork seems to include Windows builds), here is a link to try:
https://github.com/criteo-forks/xgboost-jars/
It is becoming a sort of second-hand development, though, and I am not sure whether GPU support is included in this project.
MartinLiebig
Hi @jacobcybulski,
I am aware of this. But adding self-built JARs with platform dependencies is a bit trickier than using standard dependencies. I doubt that I can do this as a short side project without the help of people like @jczogalla or @Marco_Boeck, and these folks are usually busy developing other cool new features. So we would need to push this through the normal product management cycles.
Cheers,
Martin
CC: @Knut-RM