Build a convnet-style learner from `dls` and `arch`
cnn_learner(
dls,
arch,
loss_func = NULL,
pretrained = TRUE,
cut = NULL,
splitter = NULL,
y_range = NULL,
config = NULL,
n_out = NULL,
normalize = TRUE,
opt_func = Adam(),
lr = 0.001,
cbs = NULL,
metrics = NULL,
path = NULL,
model_dir = "models",
wd = NULL,
wd_bn_bias = FALSE,
train_bn = TRUE,
moms = list(0.95, 0.85, 0.95)
)
a data loaders object containing the training and validation data
a model architecture, e.g. resnet18()
the loss function; inferred from `dls` if NULL
whether to initialize the model with pretrained weights
where to cut the pretrained model: a layer index or a function selecting the layers to keep as the body
a function that takes the model and returns a list of parameter groups (or a single parameter group if the parameters are not split)
the range of the final output; if given, predictions are passed through a sigmoid scaled to this range (useful for regression)
configuration options passed on to the model-creation function
the number of output activations; inferred from `dls` if NULL
whether to add a Normalize transform using the statistics of the pretrained model
The function used to create the optimizer
learning rate
one Callback or a list of Callbacks to pass to the Learner
an optional list of metrics; each can be a function or a Metric object
the folder where to work; used together with `model_dir` to save and load models
the subfolder of `path` where models are saved and loaded
the default weight decay used when training the model
whether weight decay is also applied to BatchNorm layers and bias
whether BatchNorm layers are trained even when they are frozen according to the splitter
the default momentums used in Learner.fit_one_cycle
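The optimizer-related defaults above (`lr`, `wd`, `moms`) can all be overridden when the learner is constructed. A minimal sketch, assuming a data loaders object `data` has already been built (as in the example on this page) and that the fastai R package is attached; the specific values are illustrative, not recommendations:

```r
library(fastai)

# Sketch only: assumes `data` is an ImageDataLoaders object built elsewhere.
learn = cnn_learner(
  data,
  resnet18(),
  lr = 0.01,                   # override the default learning rate of 0.001
  wd = 0.01,                   # apply an explicit weight decay
  moms = list(0.9, 0.8, 0.9)   # custom momentums for fit_one_cycle
)
```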
a Learner object
if (FALSE) {
# download the MNIST sample dataset
URLs_MNIST_SAMPLE()

# transformations
tfms = aug_transforms(do_flip = FALSE)

path = 'mnist_sample'
bs = 20

# build the data loaders
data = ImageDataLoaders_from_folder(path, batch_tfms = tfms, size = 26, bs = bs)

# create the learner
learn = cnn_learner(data, resnet18(), metrics = accuracy, path = getwd())
}
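As a follow-on to the example above, the returned learner can then be trained. `fit_one_cycle()` is assumed here to be the one-cycle training wrapper exposed by the same fastai R package, and the epoch count is purely illustrative:

```r
if (FALSE) {
# continues the example above: `learn` was created by cnn_learner()
learn %>% fit_one_cycle(1)  # train for one epoch with the one-cycle policy,
                            # using the `lr` and `moms` set on the learner
}
```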