Conditional Gradient

optimizer_conditional_gradient(
  learning_rate,
  lambda_,
  epsilon = 1e-07,
  use_locking = FALSE,
  name = "ConditionalGradient",
  clipnorm = NULL,
  clipvalue = NULL,
  decay = NULL,
  lr = NULL
)

Arguments

learning_rate

A Tensor or a floating point value, or a schedule that is a tf$keras$optimizers$schedules$LearningRateSchedule. The learning rate.

lambda_

A Tensor or a floating point value. The constraint parameter (the radius of the norm ball that defines the feasible set).

epsilon

A Tensor or a floating point value. A small constant added for numerical stability when the gradient norm is zero.

use_locking

If TRUE, use locks for update operations.

name

Optional name prefix for the operations created when applying gradients. Defaults to 'ConditionalGradient'.

clipnorm

Gradients will be clipped when their L2 norm exceeds this value.

clipvalue

Gradients will be clipped when their absolute value exceeds this value.

decay

Included for backward compatibility; applies time-based inverse decay to the learning rate.

lr

Included for backward compatibility; use learning_rate instead.
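Concretely, the optimizer replaces the usual gradient step with a Frank-Wolfe style update: per the TensorFlow Addons documentation, the new variable is roughly `learning_rate * variable - (1 - learning_rate) * lambda_ * gradient / (norm(gradient) + epsilon)`, i.e. a convex combination of the current variable and the norm-ball vertex `-lambda_ * gradient / norm(gradient)`. A minimal base-R sketch of a single step (the function name is illustrative, not part of the package):

```r
# Sketch of one conditional-gradient update step, assuming the
# Frobenius-norm variant of the TensorFlow Addons optimizer.
conditional_gradient_step <- function(var, grad, learning_rate, lambda_,
                                      epsilon = 1e-07) {
  norm <- sqrt(sum(grad^2))  # Frobenius / L2 norm of the gradient
  learning_rate * var -
    (1 - learning_rate) * lambda_ * grad / (norm + epsilon)
}

# Example: one step pulls the variable toward the ball of radius lambda_
w <- c(1, 1)
g <- c(3, 4)  # norm(g) = 5
conditional_gradient_step(w, g, learning_rate = 0.9, lambda_ = 0.5)
# approximately c(0.87, 0.86)
```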

Value

Optimizer for use with `keras::compile()`
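For example, a usage sketch assuming the keras and tfaddons packages are installed (the layer sizes and hyperparameter values are illustrative):

```r
library(keras)
library(tfaddons)

# A small model compiled with the conditional gradient optimizer
model <- keras_model_sequential() %>%
  layer_dense(units = 32, activation = "relu", input_shape = c(10)) %>%
  layer_dense(units = 1)

model %>% compile(
  optimizer = optimizer_conditional_gradient(
    learning_rate = 0.1,
    lambda_ = 0.01
  ),
  loss = "mse"
)
```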