
Way to utilize `loss_weights` in custom `fit` method!

See original GitHub issue

I’m going through this document (customizing_what_happens_in_fit by @fchollet), and I eventually got stuck on how to weight the loss value. For example, consider the following code:

import numpy as np
import tensorflow as tf
from tensorflow import keras

class CustomModel(keras.Model):
    def __init__(self, *args, **kwargs):
        super().__init__(*args, **kwargs)
        self.loss_tracker = keras.metrics.Mean(name="loss")
        self.mae_metric = keras.metrics.MeanAbsoluteError(name="mae")

    def train_step(self, data):
        x, y = data

        with tf.GradientTape() as tape:
            y_pred = self(x, training=True)  # Forward pass
            # Compute our own loss
            loss = keras.losses.mean_squared_error(y, y_pred)

        # Compute gradients
        trainable_vars = self.trainable_variables
        gradients = tape.gradient(loss, trainable_vars)

        # Update weights
        self.optimizer.apply_gradients(zip(gradients, trainable_vars))

        # Compute our own metrics
        self.loss_tracker.update_state(loss)
        self.mae_metric.update_state(y, y_pred)
        return {"loss": self.loss_tracker.result(), "mae": self.mae_metric.result()}

    @property
    def metrics(self):
        # We list our `Metric` objects here so that `reset_states()` can be
        # called automatically at the start of each epoch
        # or at the start of `evaluate()`.
        # If you don't implement this property, you have to call
        # `reset_states()` yourself at the time of your choosing.
        return [self.loss_tracker, self.mae_metric]

# Construct an instance of CustomModel
inputs = keras.Input(shape=(32,))
outputs = keras.layers.Dense(1)(inputs)
model = CustomModel(inputs, outputs)

# We don't pass a loss or metrics here.
model.compile(optimizer="adam")

# Just use `fit` as usual -- you can use callbacks, etc.
x = np.random.random((1000, 32))
y = np.random.random((1000, 1))
model.fit(x, y, epochs=5)

In the above, no loss is passed to model.compile, because we compute the loss value manually inside the train_step method. That’s great. Now, let’s assume my model has more than one output and I want to weight each output’s contribution to the final loss. In such a case, what would be the approach?

model.compile has a loss_weights argument that would normally be used for this, but in the case above, how can the same effect be achieved?

Model.compile(
    optimizer="rmsprop",
    loss=None,
    metrics=None,
    loss_weights=None,
    weighted_metrics=None,
    run_eagerly=None,
    steps_per_execution=None,
    jit_compile=None,
    **kwargs
)

The above-mentioned document does discuss class_weight and sample_weight to some extent, but not loss_weights.
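One way to approach this, sketched below for an assumed two-output model: compute each output’s loss separately inside train_step and combine them with weights stored on the model. The class name, output names, and the `loss_weights` constructor argument are illustrative choices, not a Keras API; the overall shape follows the CustomModel pattern from the document above.

```python
import numpy as np
import tensorflow as tf
from tensorflow import keras

# Sketch of a two-output model with weighted losses. The attribute name
# `loss_weights` is an illustrative choice (it mirrors the compile() argument
# name); storing it on the model keeps train_step self-contained.
class WeightedLossModel(keras.Model):
    def __init__(self, *args, loss_weights=(1.0, 1.0), **kwargs):
        super().__init__(*args, **kwargs)
        self.loss_weights = loss_weights
        self.loss_tracker = keras.metrics.Mean(name="loss")

    def train_step(self, data):
        x, (y1, y2) = data
        with tf.GradientTape() as tape:
            pred1, pred2 = self(x, training=True)  # Forward pass, two outputs
            # Per-output losses, combined with the stored weights.
            loss1 = tf.reduce_mean(keras.losses.mean_squared_error(y1, pred1))
            loss2 = tf.reduce_mean(keras.losses.mean_squared_error(y2, pred2))
            loss = self.loss_weights[0] * loss1 + self.loss_weights[1] * loss2
        grads = tape.gradient(loss, self.trainable_variables)
        self.optimizer.apply_gradients(zip(grads, self.trainable_variables))
        self.loss_tracker.update_state(loss)
        return {"loss": self.loss_tracker.result()}

    @property
    def metrics(self):
        return [self.loss_tracker]

inputs = keras.Input(shape=(32,))
out1 = keras.layers.Dense(1, name="out1")(inputs)
out2 = keras.layers.Dense(1, name="out2")(inputs)
model = WeightedLossModel(inputs, [out1, out2], loss_weights=(0.7, 0.3))
model.compile(optimizer="adam")  # still no loss passed here

x = np.random.random((64, 32))
y1 = np.random.random((64, 1))
y2 = np.random.random((64, 1))
model.fit(x, (y1, y2), epochs=1, verbose=0)
```

Since the weights are plain attributes rather than compile() arguments, they can be anything train_step can read, and compile() stays as minimal as in the original example.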

Issue Analytics

  • State: closed
  • Created: 10 months ago
  • Comments: 9 (4 by maintainers)

Top GitHub Comments

1 reaction
rchao commented, Nov 28, 2022

Eventually, I’ve just realized that…

Yes, this approach looks reasonable to me. My only suggestion is to name the attribute .loss_weights, or something else that incorporates “loss”, so it isn’t confused with the layers’ weights. Thanks!
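If the weights should change over the course of training, storing them as tf.Variable objects lets a callback update them between epochs without recompiling. A minimal sketch, assuming the model exposes a .loss_weights attribute (the name endorsed above) holding tf.Variables; LossWeightScheduler and the schedule signature are hypothetical names, not Keras API.

```python
import tensorflow as tf
from tensorflow import keras

# Hypothetical callback: assumes the model stores its per-output loss
# weights in a `loss_weights` attribute of tf.Variable objects, so they
# can be reassigned in place without recompiling the model.
class LossWeightScheduler(keras.callbacks.Callback):
    def __init__(self, schedule):
        super().__init__()
        self.schedule = schedule  # maps epoch index -> tuple of floats

    def on_epoch_begin(self, epoch, logs=None):
        # Assign the scheduled values into the model's weight variables.
        for var, value in zip(self.model.loss_weights, self.schedule(epoch)):
            var.assign(value)

# Example schedule: shift emphasis from the first loss to the second.
# model.loss_weights = (tf.Variable(1.0), tf.Variable(0.0))
# model.fit(x, (y1, y2), epochs=10,
#           callbacks=[LossWeightScheduler(lambda e: (1.0 - 0.1 * e, 0.1 * e))])
```

Because the variables are mutated in place, train_step picks up the new values on the next batch with no recompile.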

0 reactions
rchao commented, Nov 28, 2022

Thanks Mohammed!

Read more comments on GitHub >

Top Results From Across the Web

Keras Loss Functions: Everything You Need to Know
In this piece we'll look at: loss functions available in Keras and how to use them; how you can define your own custom...
Read more >
Adaptive weighing of loss functions for multiple output keras ...
fit() method on total number of epochs (total_epochs), we can recompile the model with the adjusted loss weights after every epoch and...
Read more >
Customizing what happens in `fit()` - Keras
When you need to customize what fit() does, you should override the training step function of the Model class. This is the function...
Read more >
changeable loss weights for multiple output #2595 - GitHub
Hi all, what's an easy way to set changeable loss weights for multiple output ... for my application to pass alpha as a...
Read more >
Change multi-output loss weights based on epoch
... try to change it via custom callback just like tf.keras.callbacks. ... call model.compile() and then model.fit() for only one epoch:
Read more >
