Issue
I want to monitor multiple loss values for my model, but when I pass them to the loss parameter they don't show up individually during training (the logs only show loss, not each separate loss value):
import tensorflow as tf
import segmentation_models as sm

# Combine Dice loss and Focal loss into a single training objective
dice_loss = sm.losses.DiceLoss()
focal_loss = sm.losses.BinaryFocalLoss()
total_loss = dice_loss + (1 * focal_loss)
optim = tf.keras.optimizers.Adam(LR)  # LR is the learning rate, defined elsewhere
loss = [total_loss, sm.losses.BinaryFocalLoss(), sm.losses.DiceLoss(), sm.losses.JaccardLoss()]
metrics = [sm.metrics.IOUScore(threshold=0.5), sm.metrics.FScore()]
model.compile(optimizer=optim, loss=loss, metrics=metrics)
So to display each loss, I added the loss functions to the metrics list instead, like this:
dice_loss = sm.losses.DiceLoss()
focal_loss = sm.losses.BinaryFocalLoss()
total_loss = dice_loss + (1 * focal_loss)
optim = tf.keras.optimizers.Adam(LR)
loss = [total_loss]
# Individual loss functions added as metrics so each value is reported during training
metrics = [sm.metrics.IOUScore(threshold=0.5), sm.metrics.FScore(), sm.losses.BinaryFocalLoss(), sm.losses.DiceLoss(), sm.losses.JaccardLoss()]
model.compile(optimizer=optim, loss=loss, metrics=metrics)
Is it okay for me to add loss functions to the metrics list, or will it affect the training process? If it does affect training, is there any way to show each loss value without affecting the training process?
Solution
Yes, this is fine and will not change the training process. Metrics are only used for monitoring; they are not used to compute the gradients that train the model.
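For reference, a minimal sketch of this pattern (the backbone, input shape, learning rate, and encoder_weights=None below are illustrative assumptions, not part of the original code): only the loss argument drives backpropagation, while the losses listed under metrics are simply evaluated and logged each epoch.

import tensorflow as tf
import segmentation_models as sm

# Illustrative model; any single-output segmentation model compiles the same way
model = sm.Unet('resnet34', input_shape=(128, 128, 3), classes=1,
                activation='sigmoid', encoder_weights=None)

dice_loss = sm.losses.DiceLoss()
focal_loss = sm.losses.BinaryFocalLoss()
total_loss = dice_loss + (1 * focal_loss)  # only this is used for gradient updates

model.compile(
    optimizer=tf.keras.optimizers.Adam(1e-4),   # assumed learning rate
    loss=total_loss,                            # trains the model
    metrics=[                                   # monitored and logged only
        sm.metrics.IOUScore(threshold=0.5),
        sm.metrics.FScore(),
        dice_loss,
        focal_loss,
        sm.losses.JaccardLoss(),
    ],
)

# model.fit(...) will then report entries such as 'loss', 'iou_score', 'f1-score',
# 'dice_loss', 'binary_focal_loss', and 'jaccard_loss' in its logs/history; the
# per-loss entries are informational and do not influence the computed gradients.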
Answered By - Dr. Snoopy