Issue
Is it possible to symbolically multiply a built-in Keras loss function by a constant? For example, if I want a linear combination of the losses of two outputs.
I could write a custom loss function, but won't it fail to be compiled, since it's Python code rather than symbolic Keras? I'm looking for a way to do it in pure Keras (or TF).
Solution
If your model yields multiple outputs, you can assign a loss function to each output by supplying a list of Keras losses to the loss argument of the model's compile method. For example, if your model is of the form
model = Model(inputs=[input_a, input_b], outputs=[output_a, output_b])
You can compile it like so:
model.compile(
    optimizer='rmsprop',
    loss=['binary_crossentropy', 'mean_squared_error'],
    loss_weights=[1., 0.2]
)
This will assign a binary cross-entropy loss to output_a and a mean squared error loss to output_b. The loss that ends up being minimized will be a weighted sum of these losses, with the weights specified in loss_weights.
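For context, here is a minimal self-contained sketch of such a two-output setup. The layer sizes, layer names, and dummy data are made up purely for illustration, and the imports assume the tf.keras API (standalone Keras imports differ slightly):

import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, Model

# Two inputs, two outputs (shapes and names are arbitrary, for illustration only).
input_a = keras.Input(shape=(16,), name='input_a')
input_b = keras.Input(shape=(8,), name='input_b')
x = layers.concatenate([input_a, input_b])
x = layers.Dense(32, activation='relu')(x)
output_a = layers.Dense(1, activation='sigmoid', name='output_a')(x)
output_b = layers.Dense(1, name='output_b')(x)

model = Model(inputs=[input_a, input_b], outputs=[output_a, output_b])

# Total loss = 1.0 * binary_crossentropy(output_a) + 0.2 * mean_squared_error(output_b)
model.compile(
    optimizer='rmsprop',
    loss=['binary_crossentropy', 'mean_squared_error'],
    loss_weights=[1., 0.2]
)

# Dummy data, just to show the fit call for a multi-input, multi-output model.
a = np.random.rand(64, 16)
b = np.random.rand(64, 8)
y_a = np.random.randint(0, 2, size=(64, 1))
y_b = np.random.rand(64, 1)
model.fit([a, b], [y_a, y_b], epochs=1, batch_size=16)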
Alternatively, if the output layers are named, you can specify loss and loss_weights as dicts keyed by output layer name. This helps remove any ambiguity about which loss and weight are assigned to which output, as shown below.
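Assuming the two output layers were created with name='output_a' and name='output_b' (as in the sketch above), the dict form would look like this:

model.compile(
    optimizer='rmsprop',
    loss={'output_a': 'binary_crossentropy', 'output_b': 'mean_squared_error'},
    loss_weights={'output_a': 1., 'output_b': 0.2}
)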
See https://keras.io/getting-started/functional-api-guide/#multi-input-and-multi-output-models for further information.
Answered By - tiao