Create a WGAN (Wasserstein GAN) learner from `dls`, `generator` and `critic`.
GANLearner_wgan(
dls,
generator,
critic,
switcher = NULL,
clip = 0.01,
switch_eval = FALSE,
gen_first = FALSE,
show_img = TRUE,
cbs = NULL,
metrics = NULL,
opt_func = Adam(),
lr = 0.001,
splitter = trainable_params,
path = NULL,
model_dir = "models",
wd = NULL,
wd_bn_bias = FALSE,
train_bn = TRUE,
moms = list(0.95, 0.85, 0.95)
)
Arguments:

- `dls`: a dataloader object
- `generator`: the generator model
- `critic`: the critic model
- `switcher`: a callback deciding when to switch between training the generator and the critic
- `clip`: the value at which the critic's weights are clipped (the WGAN weight-clipping constraint)
- `switch_eval`: whether the model is put in evaluation mode when computing the loss used for switching
- `gen_first`: whether training starts with the generator
- `show_img`: whether to show a generated image while training
- `cbs`: callbacks
- `metrics`: metrics
- `opt_func`: the optimization function
- `lr`: the learning rate
- `splitter`: a function that splits the model into parameter groups
- `path`: the model path
- `model_dir`: the model directory
- `wd`: weight decay
- `wd_bn_bias`: whether weight decay is also applied to BatchNorm and bias parameters
- `train_bn`: controls whether BatchNorm layers are trained even when they are supposed to be frozen according to the splitter
- `moms`: momentums
Value:

None
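A minimal usage sketch. This assumes the fastai R package is attached, that `dls` is a DataLoaders object already built for GAN training, and that the `basic_generator()` / `basic_critic()` helpers mirror their Python fastai counterparts; names and sizes below are illustrative, not part of this function's API:

```r
library(fastai)

# Assumption: `dls` is a GAN DataLoaders (e.g. noise -> image) built elsewhere.
# basic_generator()/basic_critic() sketch simple convolutional models for 64x64 images.
generator <- basic_generator(out_size = 64, n_channels = 3)
critic    <- basic_critic(in_size = 64, n_channels = 3)

# Build the WGAN learner; clip = 0.01 keeps the critic's weights in [-0.01, 0.01],
# the weight-clipping constraint from the original WGAN training scheme.
learn <- GANLearner_wgan(dls, generator, critic, clip = 0.01)

# Train for one epoch at a small learning rate (illustrative values).
learn %>% fit(1, 2e-4)
```

Because the switcher defaults to `NULL`, the learner falls back to its default generator/critic switching policy; pass your own switcher callback to control how many critic steps are taken per generator step.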