Issue
I am studying the example in this link, but I am not sure how the scheduler function receives both the epoch and the learning rate (lr). How are they passed, and how can I pass additional arguments?
When I followed the example, I got an error saying the scheduler received an extra argument "lr", and I am not sure how to fix that.
Solution
You usually use the schedules in tf.keras.optimizers.schedules and pass them directly to the optimizer. The link you are referring to is actually the tf.keras.callbacks.LearningRateScheduler callback, which expects a scheduler function: Keras itself calls that function at the start of each epoch with the current epoch index and the current learning rate, which is how those two arguments are passed. Here is an example of a tf.keras.optimizers.schedules.ExponentialDecay schedule according to the docs:
import tensorflow as tf

initial_learning_rate = 0.1
lr_schedule = tf.keras.optimizers.schedules.ExponentialDecay(
    initial_learning_rate,
    decay_steps=100000,
    decay_rate=0.96,
    staircase=True)

model.compile(optimizer=tf.keras.optimizers.Adam(learning_rate=lr_schedule),
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
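To address the error in your question directly: the LearningRateScheduler callback calls your function as schedule(epoch, lr), so your function must accept both parameters, and you never pass them yourself. Extra arguments can be bound with functools.partial or a closure before handing the function to the callback. A minimal sketch (the function name, decay_rate parameter, and cutoff epoch are just illustrative assumptions):

```python
import functools
import tensorflow as tf

# Keras calls this as scheduler(epoch, lr); decay_rate is an extra
# argument that we bind ourselves below.
def scheduler(epoch, lr, decay_rate=0.9):
    if epoch < 10:
        return lr              # keep the current rate for the first 10 epochs
    return lr * decay_rate     # then shrink it every epoch

# Bind the extra argument so the callback still sees an (epoch, lr) signature.
callback = tf.keras.callbacks.LearningRateScheduler(
    functools.partial(scheduler, decay_rate=0.95))

# model.fit(x_train, y_train, epochs=20, callbacks=[callback])
```

The same binding works with a lambda, e.g. lambda epoch, lr: scheduler(epoch, lr, decay_rate=0.95).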
And this link shows a good example of using a custom callable as the schedule function, one that reads the initial learning rate defined in the optimizer, which in that case is 0.01.
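If you want a parameterized schedule on the optimizer side instead of the callback side, you can also subclass tf.keras.optimizers.schedules.LearningRateSchedule. The class name and decay formula below are hypothetical, a sketch rather than anything from the linked example:

```python
import tensorflow as tf

class MyInverseTimeDecay(tf.keras.optimizers.schedules.LearningRateSchedule):
    """Hypothetical schedule: lr = initial_lr / (1 + decay * step)."""

    def __init__(self, initial_lr, decay):
        self.initial_lr = initial_lr
        self.decay = decay

    def __call__(self, step):
        # The optimizer calls this with the current training step.
        return self.initial_lr / (1.0 + self.decay * step)

    def get_config(self):
        # Lets the schedule be serialized along with the model.
        return {"initial_lr": self.initial_lr, "decay": self.decay}

# optimizer = tf.keras.optimizers.Adam(learning_rate=MyInverseTimeDecay(0.01, 0.1))
```

Because the schedule object carries its own parameters, there is nothing extra to pass at call time, which sidesteps the extra-argument problem entirely.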
Answered By - AloneTogether