
Should have option to use output weights when validating

See original GitHub issue

We discussed this in passing before, but I think we need to discuss it directly. There are at least some applications where you really do need to use sample_weight in validation as well as in training. This should at least be an option (I’ll be building this for my uses either way, but would like to do so in such a way that I can merge it into the master branch).

One use of sample_weight is to mask output values when you’re doing sequence to sequence learning with different length sequences. If you don’t also use sample_weight in validation, your loss will be hugely increased by all the meaningless masked output values.

See, for instance, the discussion happening over in #451
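
For concreteness, here is a minimal sketch of that masking use case, assuming TensorFlow’s Keras and reusing the same toy arrays for validation just to keep it short. The data, sequence lengths, and model are invented for the example; depending on your Keras version you may also need to pass sample_weight_mode="temporal" to compile() for per-timestep weights to be accepted.

```python
import numpy as np
from tensorflow import keras

# Toy padded batch: 4 sequences padded to length 10, vocabulary of 20 tokens.
max_len, vocab = 10, 20
x = np.random.randint(1, vocab, size=(4, max_len))
y = keras.utils.to_categorical(np.random.randint(0, vocab, size=(4, max_len)), vocab)

# One weight per timestep: 1 for real tokens, 0 for padding, so the padded
# positions contribute nothing to the loss.
lengths = np.array([10, 7, 5, 3])
w = (np.arange(max_len)[None, :] < lengths[:, None]).astype("float32")
x[w == 0] = 0  # zero out the padded positions of the input as well

model = keras.Sequential([
    keras.layers.Embedding(vocab, 16),
    keras.layers.LSTM(32, return_sequences=True),
    keras.layers.TimeDistributed(keras.layers.Dense(vocab, activation="softmax")),
])
model.compile(loss="categorical_crossentropy", optimizer="adam")

# Weighting the training loss alone is already possible via sample_weight; the
# point of this issue is that the validation loss should accept weights too.
# Recent Keras releases allow an (x, y, sample_weight) tuple for validation.
model.fit(x, y, sample_weight=w,
          validation_data=(x, y, w),
          epochs=1, verbose=0)
```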

This applies to class_weight as well, when you are classifying inputs with very different numbers of training examples per class but want to remove that bias from your training. If I have 99 examples in class A for every example in class B, I’d like to be able to pass in a class_weight of 1/99 for class A, rather than throwing away 98% of my training examples to balance the class counts, and get an equivalent validation loss.
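
A hedged sketch of that class_weight case, again assuming TensorFlow’s Keras: the toy data and the {0: 1/99, 1: 1.0} weighting are invented for the example, and the validation weights are simply the same class weights expanded to one weight per validation sample, passed through the (x, y, sample_weight) form of validation_data that recent Keras releases accept.

```python
import numpy as np
from tensorflow import keras

# Imbalanced toy data: class 0 is roughly 99x more common than class 1.
rng = np.random.default_rng(0)
x_train = rng.normal(size=(1000, 8)).astype("float32")
y_train = (rng.random(1000) < 0.01).astype("int32")
x_val = rng.normal(size=(200, 8)).astype("float32")
y_val = (rng.random(200) < 0.01).astype("int32")

# Down-weight the majority class during training...
class_weight = {0: 1.0 / 99.0, 1: 1.0}
# ...and expand the same weights to one weight per validation sample.
val_weights = np.where(y_val == 0, class_weight[0], class_weight[1]).astype("float32")

model = keras.Sequential([
    keras.Input(shape=(8,)),
    keras.layers.Dense(16, activation="relu"),
    keras.layers.Dense(1, activation="sigmoid"),
])
model.compile(loss="binary_crossentropy", optimizer="adam")

model.fit(
    x_train, y_train,
    class_weight=class_weight,                    # weights the training loss
    validation_data=(x_val, y_val, val_weights),  # weights the validation loss the same way
    epochs=1, verbose=0,
)
```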

Issue Analytics

  • State: closed
  • Created 8 years ago
  • Comments: 14 (11 by maintainers)

Top GitHub Comments

3 reactions
skarakulak commented, Mar 30, 2018

It would be nice to have a class weighted validation loss without the need to modify the loss function itself.
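
For what it’s worth, one way to get something like that without touching the loss function is sketched below, using only public Keras APIs. The WeightedValLoss name and the weight construction are made up for this example, and return_dict in evaluate() assumes a reasonably recent Keras/TensorFlow release.

```python
import numpy as np
from tensorflow import keras

class WeightedValLoss(keras.callbacks.Callback):
    """Logs a class-weighted validation loss without modifying the loss function."""

    def __init__(self, x_val, y_val, class_weight):
        super().__init__()
        self.x_val = x_val
        self.y_val = y_val
        # Expand the class_weight dict into one weight per validation sample.
        self.sample_weight = np.array(
            [class_weight[int(label)] for label in y_val], dtype="float32")

    def on_epoch_end(self, epoch, logs=None):
        # evaluate() accepts sample_weight, so the weighting happens here
        # rather than inside the loss function itself.
        results = self.model.evaluate(
            self.x_val, self.y_val,
            sample_weight=self.sample_weight,
            verbose=0, return_dict=True)
        if logs is not None:
            logs["weighted_val_loss"] = results["loss"]

# Usage (x_val, y_val, and class_weight as in the issue above):
# model.fit(x_train, y_train, epochs=5,
#           callbacks=[WeightedValLoss(x_val, y_val, class_weight)])
```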

3 reactions
dc-ai commented, Feb 24, 2017

It looks like class_weight still does not get factored into validation loss calculations?

I only found an option for adding in sample_weight to the fit command.
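
If you go the sample_weight route that comment mentions, the per-sample weights do not have to be assembled by hand; scikit-learn ships a small helper for deriving them from class weights. A hedged sketch with a toy label array (the weights it produces can then be passed to fit(), or to a validation_data tuple on Keras versions that accept one):

```python
import numpy as np
from sklearn.utils.class_weight import compute_sample_weight

y_train = np.array([0, 0, 0, 0, 0, 0, 1, 0, 0, 1])

# "balanced" assigns each class n_samples / (n_classes * class_count);
# an explicit mapping such as {0: 1/99, 1: 1.0} can be passed instead.
weights = compute_sample_weight(class_weight="balanced", y=y_train)
print(weights)  # one weight per sample, ready for fit(..., sample_weight=weights)
```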


Top Results From Across the Web

Using class weights with validation data in Keras
I am using validation data for hyperparameter optimization and am trying to use class weights. For model.fit(), there is an option to...

Neural Network Train-Validate-Test Stopping
This means you should stop training at epoch 30 and use the values of the weights and biases at that epoch. The test...

About Train, Validation and Test Sets in Machine Learning
The validation set is used to evaluate a given model, but this is for frequent evaluation. We, as machine learning engineers, use this...

Cross-Validation in Machine Learning: How to Do It Right
Save the result of the validation; repeat steps 3 – 6 k times. Each time use the remaining fold as the test set...

Training & evaluation with the built-in methods - Keras
If your model has multiple outputs, you can specify different losses and metrics for each output, and you can modulate the contribution of...
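
The last result above refers to the Keras guide on per-output losses; a minimal sketch of what that looks like follows, with layer names and the 0.2 weight chosen arbitrarily for the example.

```python
from tensorflow import keras

inputs = keras.Input(shape=(32,))
hidden = keras.layers.Dense(64, activation="relu")(inputs)

# Two heads with different losses.
class_out = keras.layers.Dense(10, activation="softmax", name="class_out")(hidden)
value_out = keras.layers.Dense(1, name="value_out")(hidden)

model = keras.Model(inputs, [class_out, value_out])

# loss_weights modulates how much each output contributes to the total loss
# reported for both training and validation.
model.compile(
    optimizer="adam",
    loss={"class_out": "sparse_categorical_crossentropy", "value_out": "mse"},
    loss_weights={"class_out": 1.0, "value_out": 0.2},
)
```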
