% adam_step.Rd -- documentation for adam_step(), defined in R/optimizers.R
\name{adam_step}
\alias{adam_step}
\title{Step for Adam with \code{lr} on \code{p}}
\usage{
adam_step(p, lr, mom, step, sqr_mom, grad_avg, sqr_avg, eps, ...)
}
\arguments{
\item{p}{parameter to update}
\item{lr}{learning rate}
\item{mom}{momentum for the moving average of the gradient}
\item{step}{current step count, used to debias the moving averages}
\item{sqr_mom}{momentum for the moving average of the squared gradient}
\item{grad_avg}{moving average of the gradient}
\item{sqr_avg}{moving average of the squared gradient}
\item{eps}{epsilon added to the denominator for numerical stability}
\item{...}{additional arguments to pass}
}
\value{
None
}
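The update this function performs follows the standard Adam rule. A minimal sketch of that math in Python (the R function wraps fastai's Python optimizer): the `grad` argument and the returned tuple are illustrative assumptions here, since the real implementation keeps the gradient and moving averages in the optimizer's per-parameter state rather than passing them explicitly.

```python
import math

def adam_step(p, grad, lr, mom, step, sqr_mom, grad_avg, sqr_avg, eps):
    """One Adam update on scalar parameter p.

    Sketch only: argument names mirror the documented signature, but
    `grad` and the return value are assumptions for illustration.
    """
    # Exponential moving averages of the gradient and its square.
    grad_avg = mom * grad_avg + (1 - mom) * grad
    sqr_avg = sqr_mom * sqr_avg + (1 - sqr_mom) * grad ** 2
    # Debias terms; `step` counts from 1.
    debias1 = 1 - mom ** step
    debias2 = 1 - sqr_mom ** step
    # Parameter update: debiased first moment over root of debiased
    # second moment, scaled by the learning rate.
    p = p - lr * (grad_avg / debias1) / (math.sqrt(sqr_avg / debias2) + eps)
    return p, grad_avg, sqr_avg
```

For example, on the very first step (`step = 1`) the debias terms exactly cancel the `(1 - mom)` factors, so with any nonzero gradient the parameter moves by roughly `lr` in the opposite direction of the gradient's sign.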