Issue
I'm using Optuna 2.5 to optimize a couple of hyperparameters on a tf.keras CNN model. I want to use pruning so that the optimization skips the less promising corners of the hyperparameter space. I'm using something like this:
import optuna
from optuna.samplers import TPESampler

study0 = optuna.create_study(
    study_name=study_name,
    storage=storage_name,
    direction='minimize',
    sampler=TPESampler(n_startup_trials=25, multivariate=True, seed=123),
    pruner=optuna.pruners.SuccessiveHalvingPruner(
        min_resource='auto',
        reduction_factor=4,
        min_early_stopping_rate=0),
    load_if_exists=True)
Sometimes the model stops after 2 epochs, other times after 12 epochs, 48, and so forth. What I want is to ensure that the model always trains for at least 30 epochs before being pruned. I guessed that the parameter min_early_stopping_rate might have some control over this, but when I changed it from 0 to 30 the models never got pruned at all. Can someone explain, a bit better than the Optuna documentation does, what these parameters of SuccessiveHalvingPruner() really do (especially min_early_stopping_rate)?
Thanks
Solution
The documentation's explanation of min_resource says:

    A trial is never pruned until it executes min_resource * reduction_factor ** min_early_stopping_rate steps.

So, I suppose we need to replace min_resource with a specific number chosen together with reduction_factor and min_early_stopping_rate. With the settings in the question (reduction_factor=4, min_early_stopping_rate=0), that threshold reduces to min_resource * 4**0 = min_resource, so setting min_resource=30 should guarantee at least 30 epochs before any trial can be pruned.
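For example, here is a minimal sketch of the adjusted study creation, reusing the placeholder names study_name and storage_name from the question and assuming the rest of the setup stays the same:

import optuna
from optuna.samplers import TPESampler

study0 = optuna.create_study(
    study_name=study_name,
    storage=storage_name,
    direction='minimize',
    sampler=TPESampler(n_startup_trials=25, multivariate=True, seed=123),
    # min_resource * reduction_factor ** min_early_stopping_rate = 30 * 4**0 = 30,
    # so no trial can be pruned before it has reported 30 steps.
    pruner=optuna.pruners.SuccessiveHalvingPruner(
        min_resource=30,
        reduction_factor=4,
        min_early_stopping_rate=0),
    load_if_exists=True)

Note that a "step" here is one reported intermediate value (e.g. a trial.report(value, epoch) call once per epoch, or Optuna's Keras pruning callback doing the same), so the guarantee is 30 reported epochs.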
Answered By - nzw0301