Stuck on an issue?

Lightrun Answers was designed to reduce the constant googling that comes with debugging third-party libraries. It collects links to all the places you might look while hunting down a tough bug.

And, if you’re still stuck at the end, we’re happy to hop on a call to see how we can help out.

Cannot get cross-entropy to work

See original GitHub issue

Hi,

I’m playing around with this code for a research project, and everything works fine with mean squared error. However, as soon as I switch to cross-entropy (which I really want), it does not converge and the loss grows over time… I tried numerous parameter settings, but nothing seems to work. I’m using MNIST with the following model.

from deepautoencoder import StackedAutoEncoder  # assumed import path, per the library's README

model = StackedAutoEncoder(
    dims=[100],
    activations=['softmax'], 
    noise='gaussian', 
    epoch=[1000],
    loss='cross-entropy',
    lr=0.005,
    batch_size=100,
    print_step=100
)

Do you know why this isn’t working?

Thanks!
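
A likely culprit, going by the rest of this thread: cross-entropy assumes targets in [0, 1], while raw MNIST pixels run 0–255, and a softmax output squashes the whole reconstruction into a vector that sums to 1, so the log terms blow up instead of converging. Below is a minimal NumPy sketch of a numerically stable sigmoid cross-entropy computed from logits; the function name, shapes, and random data are illustrative, not part of the library above.

import numpy as np

def sigmoid_cross_entropy_from_logits(logits, targets):
    # Stable form of -t*log(sigmoid(z)) - (1-t)*log(1 - sigmoid(z)):
    # max(z, 0) - z*t + log(1 + exp(-|z|)) avoids overflow in exp and log.
    return np.maximum(logits, 0) - logits * targets + np.log1p(np.exp(-np.abs(logits)))

# Targets must lie in [0, 1]; raw MNIST pixels (0..255) violate that.
x = np.random.randint(0, 256, size=(100, 784)).astype(np.float32)
targets = x / 255.0                     # scale pixels into [0, 1]
logits = np.random.randn(100, 784)      # stand-in decoder pre-activations
print(sigmoid_cross_entropy_from_logits(logits, targets).mean())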

Issue Analytics

  • State: closed
  • Created: 7 years ago
  • Comments: 14 (1 by maintainers)

Top GitHub Comments

2 reactions
elggem commented, Nov 9, 2016

I can confirm that with a linear layer and scaled data, the global loss decreases over time as expected! When I visualize the filters I still get essentially random noise (RMSE shows nice detectors), and the loss gets stuck at around 5.0, but that must be a problem with either the visualization or the parameters. I am aiming for filters like the ones in the paper you cite in the README (Vincent et al. 2010), page 3390.

For now I consider this issue closed. Thank you for the pointer in the right direction, it is much appreciated! 👍
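
For readers landing here: the fix described above (a linear layer plus data scaled into [0, 1]) might look like the sketch below. The import path and fit() call are taken from the repo's README, 'linear' is assumed to be an accepted activation string, and the random array stands in for your MNIST matrix.

import numpy as np
from deepautoencoder import StackedAutoEncoder  # assumed import path, per the repo README

def scale01(x):
    # Min-max scale pixels into [0, 1] so cross-entropy's log terms stay finite.
    x = x.astype(np.float32)
    return (x - x.min()) / (x.max() - x.min() + 1e-8)

x_train = scale01(np.random.randint(0, 256, size=(1000, 784)))  # stand-in for MNIST

model = StackedAutoEncoder(
    dims=[100],
    activations=['linear'],  # linear layer + scaled data, as described above
    noise='gaussian',
    epoch=[1000],
    loss='cross-entropy',
    lr=0.005,
    batch_size=100,
    print_step=100,
)
model.fit(x_train)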

1 reaction
rajarsheem commented, Nov 9, 2016

@elggem: @Nilabhra has pushed a commit. Hope it solves your issue. Get back to us if you need any further help.

Read more comments on GitHub >

Top Results From Across the Web

Why can't I use Cross Entropy Loss for multilabel?
Basically I'm splitting the logits (just not concatenating them) and the labels. I then do Cross Entropy loss on both of them and...
Cross entropy loss error - PyTorch Forums
When I use cross entropy loss in my code, with nn.NLLLoss() or code implemented by myself, the loss is very strange, like the...
Why Isn't Cross Entropy Used in SVM?
SVMs don't do this, so cross entropy won't work. SVMs don't use MSE, they use the hinge loss, which gives them their maximum...
What Is Cross Entropy Loss? A Tutorial With Code - Wandb
Cross entropy loss is a metric used to measure how well a classification model in machine learning performs. The loss (or error) is...
Cross-Entropy Loss Function - Towards Data Science
When working on a Machine Learning or a Deep Learning problem, loss/cost ... The purpose of the Cross-Entropy is to take the output ...


