Create a sequence of convolutional (`ni` to `nf`), activation (`act_cls`, ReLU by default) and normalization (`norm_type`) layers.
ConvLayer(
  ni,
  nf,
  ks = 3,
  stride = 1,
  padding = NULL,
  bias = NULL,
  ndim = 2,
  norm_type = 1,
  bn_1st = TRUE,
  act_cls = nn()$ReLU,
  transpose = FALSE,
  init = "auto",
  xtra = NULL,
  bias_std = 0.01,
  dilation = 1,
  groups = 1,
  padding_mode = "zeros"
)
`ni`: number of input features (channels)
`nf`: number of output features (channels)
`ks`: kernel size
`stride`: stride
`padding`: padding
`bias`: bias
`ndim`: number of dimensions
`norm_type`: normalization type
`bn_1st`: whether batch normalization is placed before the activation
`act_cls`: activation class
`transpose`: whether to use a transposed convolution
`init`: initializer
`xtra`: optional extra layer(s)
`bias_std`: bias standard deviation
`dilation`: dilation rate to use for dilated convolution
`groups`: number of groups for a grouped convolution
`padding_mode`: padding mode, e.g. 'zeros'
Value: None
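
A minimal usage sketch, assuming the fastai R package and its Python backend are installed and configured; the channel counts (16, 32, 64) are arbitrary values chosen only for illustration:

```r
library(fastai)

# 3x3 convolution from 16 to 32 channels; with the defaults this is
# paired with batch normalization (norm_type = 1) and ReLU (act_cls = nn()$ReLU)
block <- ConvLayer(ni = 16, nf = 32, ks = 3, stride = 1)

# Strided variant that halves the spatial resolution of its input
down <- ConvLayer(ni = 32, nf = 64, stride = 2)

block
down
```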