Issue
I have a TensorFlow 2.x functional model whose first layers come from another pretrained model. I want those layers to remain frozen, so I have used tf.stop_gradient on the pretrained head to stop them from learning. Below is a minimal example of my network:
import tensorflow as tf

head = load_my_cool_pretrained_representation_model()
# stop_gradient keeps backprop out of the pretrained head
x = tf.keras.layers.Dense(10000)(tf.stop_gradient(head.output))
x = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs=head.inputs, outputs=x)
model.compile(loss='mse', optimizer=tf.keras.optimizers.Adam())
When I call model.fit(), I get the following warning:
WARNING:tensorflow:Gradients do not exist for variables ['cool_rep_layer/embeddings:0', ...] when minimizing the loss.
The warning is expected, since I want those layers to have no gradients. How can I suppress it? I have looked at an existing answer, but I do not want to get into tf.GradientTape for this model.
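For a fully self-contained reproduction, here is a sketch with a toy embedding head standing in for my real pretrained model (the head definition, its shapes, and the dummy data are hypothetical):

import numpy as np
import tensorflow as tf

# Hypothetical stand-in for load_my_cool_pretrained_representation_model()
def load_my_cool_pretrained_representation_model():
    inp = tf.keras.Input(shape=(5,), dtype=tf.int32)
    emb = tf.keras.layers.Embedding(100, 8, name='cool_rep_layer')(inp)
    out = tf.keras.layers.Flatten()(emb)
    return tf.keras.Model(inputs=inp, outputs=out)

head = load_my_cool_pretrained_representation_model()
# stop_gradient blocks backprop into the head, so its variables
# receive no gradients during training
x = tf.keras.layers.Dense(10000)(tf.stop_gradient(head.output))
x = tf.keras.layers.Dense(1)(x)
model = tf.keras.Model(inputs=head.inputs, outputs=x)
model.compile(loss='mse', optimizer=tf.keras.optimizers.Adam())

# Dummy data; fitting triggers the missing-gradient warning
X = np.random.randint(0, 100, size=(32, 5)).astype('int32')
y = np.random.rand(32, 1).astype('float32')
model.fit(X, y, epochs=1, verbose=0)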
Solution
As per noober's comment, I just added
import logging
logging.getLogger('tensorflow').setLevel(logging.ERROR)
to get rid of the warnings. This worked for me.
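An equivalent approach, assuming TensorFlow 2.x, is tf.get_logger(), which returns the same underlying 'tensorflow' logger:

import logging
import tensorflow as tf

# tf.get_logger() returns the same 'tensorflow' logger instance,
# so this is equivalent to the snippet above
tf.get_logger().setLevel(logging.ERROR)

Note that either variant silences all TensorFlow warnings, not just the missing-gradient one, so it is best used when the remaining warnings are known to be benign.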
Answered By - Ottpocket