Compilation options of a multi-output model: multiple losses & loss weighting
As described in the Keras handbook, *Deep Learning with Python*, for a multi-output model we need to specify different loss functions for different heads of the network. But because gradient descent requires you to minimize a scalar, you must combine these losses into a single value in order to train the model.
Very imbalanced loss contributions will cause the model representations to be optimized preferentially for the task with the largest individual loss, at the expense of the other tasks. To remedy this, you can assign different levels of importance to the loss values in their contribution to the final loss. This is useful in particular when the losses' values are on different scales.
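To make the weighted combination concrete, here is a minimal sketch of how per-output losses are folded into the single scalar that gradient descent minimizes. The loss magnitudes and weights below are invented for illustration, not taken from the issue:

```python
# Sketch: combining per-output losses into one scalar via loss weights.
# Values are illustrative; a large-scale MSE is down-weighted so it
# does not dominate the small cross-entropy loss.
per_output_loss = {'EMOTIONS': 0.69, 'VALENCE': 12.5, 'AROUSAL': 9.8}
loss_weights = {'EMOTIONS': 1.0, 'VALENCE': 0.05, 'AROUSAL': 0.05}

total_loss = sum(loss_weights[name] * value
                 for name, value in per_output_loss.items())
print(total_loss)  # the single scalar the optimizer minimizes
```

Without the weights, the two regression losses (12.5 and 9.8) would swamp the 0.69 classification loss; with them, each task contributes on a comparable scale.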
Can anyone help with the following:
I’ve got a five-output model as described in #10120. The outputs of the model are the following:
- emotion (multiclass, multilabel classification)
- valence (regression)
- arousal (regression)
- dominance (regression)
- age (multiclass classification)
I am using the following:

```python
losses_list = {'EMOTIONS': 'binary_crossentropy',
               'VALENCE': 'mse',
               'AROUSAL': 'mse',
               'DOMINANCE': 'mse',
               'AGE': 'categorical_crossentropy'}

losses_weights = {'EMOTIONS': 1.0,
                  'VALENCE': 0.025,
                  'AROUSAL': 0.025,
                  'DOMINANCE': 0.025,
                  'AGE': 0.45}

metrics = {'EMOTIONS': 'crossentropy',
           'VALENCE': 'mse',
           'AROUSAL': 'mse',
           'DOMINANCE': 'mse',
           'AGE': 'categorical_accuracy'}
```
Can anyone comment on this? Are these the right loss functions? Are these the right weights, and are the metrics set properly?
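For reference, dicts like these are passed to `model.compile` keyed by the output layers' names. The sketch below assumes a hypothetical backbone (input shape, hidden layer, and output sizes are made up); only the layer names and the compile arguments mirror the configuration in the question:

```python
from tensorflow import keras
from tensorflow.keras import layers

# Hypothetical five-output model; layer names must match the dict keys.
inputs = keras.Input(shape=(128,))
x = layers.Dense(64, activation='relu')(inputs)

emotions = layers.Dense(8, activation='sigmoid', name='EMOTIONS')(x)  # multilabel
valence = layers.Dense(1, name='VALENCE')(x)                          # regression
arousal = layers.Dense(1, name='AROUSAL')(x)                          # regression
dominance = layers.Dense(1, name='DOMINANCE')(x)                      # regression
age = layers.Dense(10, activation='softmax', name='AGE')(x)           # multiclass

model = keras.Model(inputs, [emotions, valence, arousal, dominance, age])

# Per-output losses, weights, and metrics keyed by output-layer name.
model.compile(optimizer='adam',
              loss={'EMOTIONS': 'binary_crossentropy', 'VALENCE': 'mse',
                    'AROUSAL': 'mse', 'DOMINANCE': 'mse',
                    'AGE': 'categorical_crossentropy'},
              loss_weights={'EMOTIONS': 1.0, 'VALENCE': 0.025,
                            'AROUSAL': 0.025, 'DOMINANCE': 0.025,
                            'AGE': 0.45},
              metrics={'AGE': 'categorical_accuracy'})
```

Note the `sigmoid` activation on the multilabel head (paired with `binary_crossentropy`) versus `softmax` on the multiclass head (paired with `categorical_crossentropy`).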
Here is a fully functioning example that may help you out. It is mnist as an autoencoder and classification at the same time.

The above quote is taken from the Deep Learning with Python book. This is actually the only bit that I’ve found online that gives an actual example. I assume you need to figure out the values that each loss typically takes and then assign weights accordingly.