Random search tuner.

RandomSearch(
  hypermodel,
  objective,
  max_trials,
  seed = NULL,
  hyperparameters = NULL,
  tune_new_entries = TRUE,
  allow_new_entries = TRUE,
  executions_per_trial = NULL,
  directory = NULL,
  project_name = NULL,
  ...
)

Arguments

hypermodel

A model-building function. It takes a single argument `hp` (a HyperParameters instance) from which hyperparameters can be sampled, and returns a compiled model.

objective

The name of the objective metric to optimize, e.g. "val_precision". Whether to minimize or maximize is automatically inferred for built-in metrics.

max_trials

The maximum total number of trials (model configurations) to test. The search may stop early if the space is exhausted.

seed

Integer. Random seed, for reproducible searches.

hyperparameters

HyperParameters class instance. Can be used to override (or register in advance) hyperparameters in the search space.

tune_new_entries

Whether hyperparameter entries that are requested by the hypermodel but that were not specified in hyperparameters should be added to the search space, or not. If not, then the default value for these parameters will be used.
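As a sketch of how hyperparameters and tune_new_entries combine (assuming the package exposes a HyperParameters() constructor mirroring the underlying Keras Tuner API), you can fix one hyperparameter in advance and freeze the rest of the search space to its defaults:

```r
library(kerastuneR)

# Assumption: HyperParameters() and hp$Fixed() are available as in Keras Tuner.
hp <- HyperParameters()
hp$Fixed('learning_rate', value = 1e-3)  # pre-register: not tuned

tuner <- RandomSearch(
  hypermodel = build_model,   # a model-building function, as in Examples below
  objective = 'val_accuracy',
  max_trials = 5,
  hyperparameters = hp,
  tune_new_entries = FALSE,   # entries not in `hp` keep their default values
  directory = 'model_dir',
  project_name = 'fixed_lr')
```

With tune_new_entries = FALSE, any hyperparameter the hypermodel requests that is not listed in `hp` is held at its default rather than searched over.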

allow_new_entries

Whether the hypermodel is allowed to request hyperparameter entries not listed in hyperparameters.

executions_per_trial

The number of models that should be built and fit for each trial. Note: the purpose of having multiple executions per trial is to reduce variance in the results and therefore assess model performance more accurately. To get results faster, set executions_per_trial = 1 (a single round of training for each model configuration).

directory

The directory where search results and training logs are stored.

project_name

The name of the subfolder (inside directory) that holds detailed logs, checkpoints, etc. — i.e. results are written to directory/project_name.

...

Additional arguments passed on to the underlying tuner.

Value

A RandomSearch hyperparameter tuner object.

Examples

if (FALSE) {
x_data <- matrix(data = runif(500, 0, 1), nrow = 50, ncol = 5)
y_data <- ifelse(runif(50, 0, 1) > 0.6, 1L, 0L) %>% as.matrix()
x_data2 <- matrix(data = runif(500, 0, 1), nrow = 50, ncol = 5)
y_data2 <- ifelse(runif(50, 0, 1) > 0.6, 1L, 0L) %>% as.matrix()

build_model <- function(hp) {
  model <- keras_model_sequential()
  model %>%
    layer_dense(units = hp$Int('units', min_value = 32L, max_value = 512L, step = 32L),
                input_shape = ncol(x_data), activation = 'relu') %>%
    # sigmoid (not softmax) for a single-unit binary output
    layer_dense(units = 1L, activation = 'sigmoid') %>%
    compile(
      optimizer = tf$keras$optimizers$Adam(
        hp$Choice('learning_rate', values = c(1e-2, 1e-3, 1e-4))),
      loss = 'binary_crossentropy',
      metrics = 'accuracy')
  return(model)
}

tuner <- RandomSearch(
  hypermodel = build_model,
  objective = 'val_accuracy',
  max_trials = 2,
  executions_per_trial = 1,
  directory = 'model_dir',
  project_name = 'helloworld')
}
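Constructing the tuner does not start the search. As a sketch of the typical next steps (assuming the kerastuneR helpers fit_tuner() and get_best_models() with these signatures), the search is launched on the training data and the best model retrieved afterwards:

```r
# Assumption: fit_tuner() and get_best_models() are the kerastuneR helpers
# for running the search and extracting results.
tuner %>% fit_tuner(x_data, y_data,
                    epochs = 5,
                    validation_data = list(x_data2, y_data2))

best_model <- tuner %>% get_best_models(1)
```

Each of the max_trials trials samples one configuration from the search space, trains it executions_per_trial times, and scores it on the objective ('val_accuracy' here), so validation_data must be supplied.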