
After training with the default args, the results aren't good

See original GitHub issue

Excuse me, after training with the default arguments I get recall and NDCG scores, but the results aren't as good as those reported in the paper. Here are my results after 500 epochs:

recall@50:  0.0912251655629139   ndcg@50:  0.04411650302936786
recall@100: 0.28807947019867547  ndcg@100: 0.08591003366756622
recall@200: 0.43559602649006623  ndcg@200: 0.10980291110041669

Why are my results so much lower than those reported in the paper?
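For reference, a minimal sketch of how recall@K and NDCG@K are typically computed for a single user, assuming binary relevance and a ranked list of item IDs (this is an illustrative implementation, not the repository's exact evaluation code):

```python
import math

def recall_at_k(ranked_items, relevant_items, k):
    """Fraction of the user's relevant items that appear in the top-k ranking."""
    hits = len(set(ranked_items[:k]) & set(relevant_items))
    return hits / len(relevant_items)

def ndcg_at_k(ranked_items, relevant_items, k):
    """Binary-relevance NDCG: DCG of the top-k list divided by the ideal DCG."""
    relevant = set(relevant_items)
    # Rank positions are 0-indexed here, hence the log2(i + 2) discount.
    dcg = sum(1.0 / math.log2(i + 2)
              for i, item in enumerate(ranked_items[:k]) if item in relevant)
    ideal_hits = min(len(relevant), k)
    idcg = sum(1.0 / math.log2(i + 2) for i in range(ideal_hits))
    return dcg / idcg if idcg > 0 else 0.0

# Toy example: model ranks items [3, 1, 7, 5, 9]; items 1 and 9 are relevant.
ranked = [3, 1, 7, 5, 9]
relevant = [1, 9]
print(recall_at_k(ranked, relevant, 2))  # 0.5 (only item 1 is in the top-2)
print(ndcg_at_k(ranked, relevant, 2))
```

In full evaluation these per-user scores are averaged over all test users, so results also depend on how the test split and the candidate item set are constructed.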

Issue Analytics

  • State: open
  • Created 3 years ago
  • Comments:5 (3 by maintainers)

Top GitHub Comments

2 reactions
chenchongthu commented, Apr 23, 2020

No, that's not it. When optimizing, our loss drops a constant term (positive, but with no gradient), so the loss we report can be negative — you can check the paper for details. It still converges once it drops to a certain level, and it converges quickly (compared with sampling-based methods).
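The point about dropping a constant can be illustrated with a toy loss (a hypothetical example, not the model's actual objective): for a squared error (x − t)², the term t² is positive and has no gradient with respect to x, so minimizing x² − 2tx instead gives identical gradients and the same minimizer, while the reported value can go negative:

```python
# Toy illustration: dropping a positive, gradient-free constant from a loss.
# full_loss(x)    = (x - t)^2   = x^2 - 2*t*x + t^2
# dropped_loss(x) = x^2 - 2*t*x   (t^2 removed: it is constant w.r.t. x)
# Both have gradient 2*(x - t), so optimization is unaffected.
t = 3.0

def full_loss(x):
    return (x - t) ** 2

def dropped_loss(x):
    return x ** 2 - 2 * t * x

def grad(x):
    # Shared gradient of both losses w.r.t. x.
    return 2 * (x - t)

x = 2.5
print(full_loss(x))     # 0.25  (always >= 0)
print(dropped_loss(x))  # -8.75 (negative, yet same minimizer x = t)
print(grad(x))          # -1.0  (identical for both losses)
```

So a negative training loss here only means a constant offset was removed for efficiency; it says nothing about divergence.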

1 reaction
chenchongthu commented, Apr 23, 2020

With the above settings you should get results similar to ours (dropout is an important parameter; our code has been updated to the latest version).

recall@50:  0.31026490066225165  ndcg@50:  0.09599689344574573
recall@100: 0.45397350993377483  ndcg@100: 0.11925239149771916
recall@200: 0.6041390728476821   ndcg@200: 0.14029391165062766
