Issue
My hyperparameter-tuning results are quite different depending on whether I declare a parameter with hp.Fixed(key, value) or just pass value directly (where value is, say, an Int), even though the model should effectively be using the same parameters. I've verified that repeated runs of each test produce the same results by following the instructions for reproducibility, as well as setting the seed for all applicable layers/initializers/etc., even though the instructions stated that wasn't necessary.
Results using hp.Fixed(key, value): [screenshot]
Results using value: [screenshot]
Looking at the table of all hyperparameters, it appears that hp.Fixed isn't doing anything at all: every hyperparameter is still being tuned.
EDIT: My custom hypermodel's hyperparameters are being ignored by the Hyperband tuner regardless of their state.
Here is the offending code:
class MyModel(kt.HyperModel):
    def __init__(self, **config):
        self.config = config
        self.seed = config.get('seed')

    def build_model(self):
        model = Sequential(name=self.name)
        model.add(LSTM(self.units, name='LSTM'))
        model.add(Dense(1, name='Output', kernel_initializer=GlorotUniform(seed=self.seed)))
        model.compile(loss='mean_squared_error', metrics='mean_squared_error', sample_weight_mode='temporal')
        return model

    # If the user has supplied the parameter manually, use hp.Fixed()
    # Otherwise, use the provided hyperparameter (default)
    def _param(self, key, default=None):
        value = self.config.get(key)
        if value is not None:
            return self.hp.Fixed(key, value)
        else:
            return default

    def build(self, hp):
        self.hp = hp
        self.units = self._param('units', hp.Int('units', 1, 200, step=5))
        return self.build_model()
Solution
Ok, so after doing some more digging I discovered (specifically through how this tutorial declared its hyperparameters) that when you write hp.[Fixed|Choice|etc.], you immediately declare those hyperparameters in the search space, regardless of where that code appears.
Think of the hyperparameter declaration as something like an eigenclass method: it registers with the tuner itself, rather than being a regular Python object that the tuner picks up from within the model.
Essentially, each of the Fixed/Choice/etc. hyperparameter methods sets a global hyperparameter in the background of the tuner while also returning a plain value (Int/Float/String/Range/List/etc.), so that you can still build your model without error before the tuner eventually overwrites that value during the search phase.
I was confused by this because typically hp shows up as an argument to the build_model() function or kt.HyperModel class, where the calls are assigned to local variables that are then passed into the model declaration.
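The registration semantics described above can be sketched in plain Python. This is a simplified model of the behavior, not the real keras_tuner implementation, but it mirrors why the offending code fails: the hp.Int passed as a default argument is evaluated first, so it wins the registration.

```python
class HyperParameters:
    """A simplified sketch of keras_tuner's registration semantics
    (an illustration, not the real implementation)."""

    def __init__(self):
        self.values = {}

    def _register(self, name, default):
        # First registration wins: later calls with the same name
        # simply return the value already in the search space.
        if name not in self.values:
            self.values[name] = default
        return self.values[name]

    def Int(self, name, min_value, max_value, step=1):
        # Tunable integer; before the search runs it returns min_value.
        return self._register(name, min_value)

    def Fixed(self, name, value):
        return self._register(name, value)


hp = HyperParameters()
# Mirrors the offending call: the hp.Int default argument is evaluated
# *before* _param runs, so 'units' is registered as a tunable Int first...
default = hp.Int('units', 1, 200, step=5)
# ...and this Fixed call comes too late: 'units' already exists.
fixed = hp.Fixed('units', 50)
print(fixed)  # 1 -- the fixed value 50 was silently ignored
```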
Here's the fix for the offending code:
class MyModel(kt.HyperModel):
    def __init__(self, **config):
        self.config = config
        self.seed = config.get('seed')

    def build_model(self):
        model = Sequential(name=self.name)
        model.add(LSTM(self.units, name='LSTM'))
        model.add(Dense(1, name='Output', kernel_initializer=GlorotUniform(seed=self.seed)))
        model.compile(loss='mean_squared_error', metrics='mean_squared_error', sample_weight_mode='temporal')
        return model

    # If the user has supplied the parameter manually, use hp.Fixed()
    # Otherwise, use the provided hyperparameter (default)
    def _param(self, key, default=None):
        value = self.config.get(key)
        if value is not None:
            return self.hp.Fixed(key, value)
        else:
            return default()

    def build(self, hp):
        self.hp = hp
        self.units = self._param('units', lambda: hp.Int('units', 1, 200, step=5))
        return self.build_model()
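The lambda works because Python evaluates function arguments eagerly; wrapping the hp call in a lambda defers it until _param decides it is actually needed. A minimal illustration of that difference, using a hypothetical declare() stand-in for an hp call with a registration side effect:

```python
registered = []

def declare(name):
    # Stand-in for an hp.Int/hp.Choice call: registering is a side effect.
    registered.append(name)
    return 1

# Eager: Python evaluates arguments before the call, so this registers
# 'units' immediately, even if the caller never uses the result.
eager_default = declare('units')

# Deferred: wrapping the call in a lambda means nothing happens until
# someone actually invokes it.
deferred_default = lambda: declare('other_units')

print(registered)  # only 'units' has been registered so far
```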
Answered By - SnakeWasTheNameTheyGaveMe