Example of how to use with Optimisers.jl #34
Constructing optimizers from Optimisers.jl is cheap and simple, since the state is de-coupled. Something like this would work:

```julia
struct Scheduler{T, F}
    constructor::F
    schedule::T
end

_get_opt(scheduler::Scheduler, t) = scheduler.constructor(scheduler.schedule(t))

Optimisers.init(o::Scheduler, x::AbstractArray) =
    (t = 1, opt = Optimisers.init(_get_opt(o, 1), x))

function Optimisers.apply!(o::Scheduler, state, x, dx)
    opt = _get_opt(o, state.t)
    new_state, new_dx = Optimisers.apply!(opt, state.opt, x, dx)
    return (t = state.t + 1, opt = new_state), new_dx
end

opt = Scheduler(Step(init_lr, decay)) do lr
    Momentum(lr)
end

st = Optimisers.setup(opt, model)
```
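To show how the `Scheduler` sketch above would be used end to end, here is a minimal training loop. This is a sketch under stated assumptions: `model`, `data`, and `loss` are placeholders, the positional `Step(start, decay, step_size)` constructor is assumed from ParameterSchedulers.jl, and `Momentum`, `setup`, and `update` come from Optimisers.jl.

```julia
using Optimisers, ParameterSchedulers, Zygote

# Assumed: Step decays the LR by 0.5 every 10 optimizer steps,
# starting from 1e-2 (constructor signature per ParameterSchedulers.jl).
opt = Scheduler(Step(1e-2, 0.5, 10)) do lr
    Momentum(lr)
end

st = Optimisers.setup(opt, model)  # `model` is a placeholder

for (x, y) in data  # `data` and `loss` are placeholders
    grads = Zygote.gradient(m -> loss(m(x), y), model)[1]
    st, model = Optimisers.update(st, model, grads)
end
```

Because the schedule counter `t` lives in the optimizer state returned by `setup`, the learning rate advances once per `update` call rather than per epoch.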
I tried this and my model stopped training 😬 It's stuck after one epoch
That's weird. Which rule are you using? And can you post the value of
I would also confirm that training doesn't stall with a reasonable fixed LR first.
Whoops, nevermind, figured it out. I set
Ah good to know. You should check out
Now that we have
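As a concrete version of that sanity check, one might first run the same loop with a plain fixed-LR rule and no scheduler (a sketch; `model`, `data`, and `loss` are the same placeholders as above):

```julia
using Optimisers, Zygote

# Fixed learning rate, no Scheduler: if loss still stalls here,
# the problem is the model, data, or LR choice, not the schedule.
st = Optimisers.setup(Optimisers.Momentum(1e-2), model)

for (x, y) in data
    grads = Zygote.gradient(m -> loss(m(x), y), model)[1]
    st, model = Optimisers.update(st, model, grads)
end
```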
I'm running into a problem where the scheduler cannot be set up. Could you share more details?
The details for calling
Modify the original code as
Great, thanks!
Hi, I've been trying to use this package with Optimisers.jl (specifically, I've been trying to use a `Step` schedule with a `Scheduler`), but I seem to be getting errors that suggest this setup works with the Flux optimisers and not with Optimisers.jl for now. Is there a way to write code that works with Optimisers.jl?