
Do GPUs provide faster training times?

User: "Andy3"
New Altair Community Member
Updated by Jocelyn
Hello,

I have heard that RapidMiner has some similarities with H2O. When I read the article (sefiks.com/2019/11/07/why-you-should-build-xgboost-models-within-h2o/) saying that the H2O library has a kind of turbo button when the GPU is enabled, I hoped the same would apply in RapidMiner, though I haven't been able to test this out. My impression has been that only the CPU matters when trying to decrease training times with the GBT algorithm. Could someone from RM verify whether RM has the same turbo button or not? :smile:
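
For reference, the "turbo button" from the linked article looks roughly like this in the open-source H2O Python API - just a minimal sketch to show what I mean, not RapidMiner code; the file name and label column are placeholders:

    # Sketch: training an XGBoost model on the GPU with the open-source H2O library.
    # File name and column name are placeholders for illustration.
    import h2o
    from h2o.estimators import H2OXGBoostEstimator

    h2o.init()
    train = h2o.import_file("train.csv")  # any tabular dataset

    # backend="gpu" asks H2O to build the XGBoost trees on the GPU
    # (needs a CUDA-capable GPU and a GPU-enabled XGBoost build);
    # backend="auto" lets H2O pick, and "cpu" forces the CPU path.
    model = H2OXGBoostEstimator(
        ntrees=100,
        max_depth=6,
        backend="gpu",
        gpu_id=0,
    )
    model.train(y="label", training_frame=train)
    print(model.model_performance(train))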

Best regards
Andy

    Hi @Andy3,

    we have GPU support for neural networks in the Deep Learning extension, but not (yet) for GBTs.

    Best,
    Martin
    User: "Andy3"
    New Altair Community Member
    OP
    Hope to have it soon for the GBTs.
    User: "tkenez"
    New Altair Community Member
    Accepted Answer
    Hi @Andy3,

    We are actually using the open-source H2O library as a backend for some of our learners (GBT, GLM, LR, and DL - not the one in the Deep Learning Extension, but the built-in one). In our upcoming 9.7 release we bumped the H2O library we use to the latest stable version and implemented some improvements to these learners.
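
    For context, the open-source H2O GBM behind that built-in Gradient Boosted Trees learner can be driven directly from H2O's own Python API - again just a minimal sketch with placeholder file and column names, not our internal wrapper:

        # Sketch: the open-source H2O GBM (CPU-based) that the built-in GBT learner wraps.
        import h2o
        from h2o.estimators import H2OGradientBoostingEstimator

        h2o.init()
        train = h2o.import_file("train.csv")  # placeholder dataset

        gbm = H2OGradientBoostingEstimator(ntrees=50, max_depth=5, learn_rate=0.1)
        gbm.train(y="label", training_frame=train)
        print(gbm.model_performance(train))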

    I can only confirm what @mschmitz wrote: currently we don't support GPUs for the above learners. However, we are actively looking into how we can leverage the library better, which includes adding additional learners and enabling GPU support where it makes sense.

    I cannot commit to a release date yet, but this is an active topic in our engineering team.

    Regards,
    Tamas
    User: "Andy3"
    New Altair Community Member
    OP
    Glad to hear that you are doing some work to update RM to the latest stable version. I'd love to hear more about it :)