
Cannot reset network to smaller learning rate - Optimization parameter

User: "olafansau55"
New Altair Community Member
Updated by Jocelyn
Hello guys,
I'm working on a final project in RapidMiner on COVID-19 data, and I ran a parameter optimization process. I set the learning rate to the range 0.00 - 0.99 with a step count of 10, and training_cycles to the range 1 - 200, also with a step count of 10. Initially there was no problem with those values, but when I changed the number of steps for training_cycles to 200, I got the error "Cannot reset the network to a smaller learning rate". I want to ask why this happened and how to solve it. If anyone can help out, I would really appreciate it.

Thank you~
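
A minimal sketch of the parameter grid these settings describe, assuming the optimizer spaces each parameter linearly between its min and max and includes both endpoints; the helper name linear_grid and the exact spacing are illustrative assumptions, not RapidMiner internals:

```python
# Illustrative sketch (not RapidMiner code): generate the candidate values a
# linear grid search would try for the ranges described in the question.

def linear_grid(lo, hi, steps):
    """Return steps + 1 evenly spaced values from lo to hi, inclusive."""
    return [lo + (hi - lo) * i / steps for i in range(steps + 1)]

learning_rates  = linear_grid(0.00, 0.99, 10)                  # first value is 0.0
training_cycles = [round(c) for c in linear_grid(1, 200, 10)]

print([round(v, 3) for v in learning_rates[:3]])   # [0.0, 0.099, 0.198]
print(training_cycles[:3])                         # [1, 21, 41]
```

With the range starting at 0.00, the very first learning-rate value such a grid tries is exactly 0.0.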
    User: "SabaRG"
    New Altair Community Member
    Accepted Answer
    Updated by SabaRG
    Dear @olafansau55
    Unfortunately, I don't know why the "Try" operator, which I would normally suggest for ignoring this bug, cannot work with the Deep Learning operator, and the "ignore errors" option of the optimization operator cannot handle it either. It is a bug with the Deep Learning (H2O) operator.
    But for your case, I suggest starting the learning-rate range at 0.1 or 0.2 and checking at which value the error no longer appears. Your problem is with the learning rate.
    Sincerely

    #BugReport
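
Following the suggestion above, a hedged sketch of how raising the lower bound of the learning-rate range keeps 0.0 (and values very close to it) out of the grid; linear_grid is the same illustrative helper as in the sketch under the question, not a RapidMiner API:

```python
# Illustrative only: compare the original learning-rate grid with one whose
# lower bound has been raised to 0.1, as suggested in the accepted answer.

def linear_grid(lo, hi, steps):
    return [lo + (hi - lo) * i / steps for i in range(steps + 1)]

original   = linear_grid(0.00, 0.99, 10)   # contains 0.0 as its first value
restricted = linear_grid(0.10, 0.99, 10)   # smallest value is now 0.1

assert min(original) == 0.0
assert min(restricted) >= 0.1

print([round(v, 3) for v in restricted])
# [0.1, 0.189, 0.278, 0.367, 0.456, 0.545, 0.634, 0.723, 0.812, 0.901, 0.99]
```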