
Logging inconsistency when using `SplitMNIST` with task labels.

See original GitHub issue

Each strategy increments its own training_step_counter after each step, but it does not distinguish between a new step within the same task and a new task. Therefore, when using SplitMNIST(n_steps=5, return_task_id=True), it logs Step 1 (Task 1), Step 2 (Task 2), and so on.

The expected behavior should be Step 1 (Task 1), Step 1 (Task 2), and so on.
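The two counter schemes under discussion can be sketched in a few lines. This is a minimal illustration, not the actual Avalanche strategy code; the class names and the `train_on` method are hypothetical, with only `training_step_counter` taken from the issue:

```python
# Sketch of the two logging schemes discussed in this issue.
# GlobalCounterStrategy mimics the reported behavior; PerTaskCounterStrategy
# mimics what the reporter expected. Both classes are hypothetical.

class GlobalCounterStrategy:
    """One counter incremented on every step, regardless of task."""
    def __init__(self):
        self.training_step_counter = 0
        self.logs = []

    def train_on(self, task_id):
        self.training_step_counter += 1
        self.logs.append(f"Step {self.training_step_counter} (Task {task_id})")


class PerTaskCounterStrategy:
    """A separate counter per task, so step numbers restart for each task."""
    def __init__(self):
        self.counters = {}
        self.logs = []

    def train_on(self, task_id):
        self.counters[task_id] = self.counters.get(task_id, 0) + 1
        self.logs.append(f"Step {self.counters[task_id]} (Task {task_id})")


# One training step per task, tasks 1..5 (as in SplitMNIST with n_steps=5)
g, p = GlobalCounterStrategy(), PerTaskCounterStrategy()
for t in range(1, 6):
    g.train_on(t)
    p.train_on(t)

# g.logs starts: Step 1 (Task 1), Step 2 (Task 2), ...  (reported behavior)
# p.logs starts: Step 1 (Task 1), Step 1 (Task 2), ...  (expected behavior)
```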

Issue Analytics

  • State: closed
  • Created: 3 years ago
  • Comments: 5 (2 by maintainers)

Top GitHub Comments

AntonioCarta commented, Feb 1, 2021

The expected behavior should be Step 1 (Task 1), Step 1 (Task 2) and so on

I don’t think this is obvious or expected in general.

Personally, I prefer to have a step_id that increases even between different tasks (Step 1 (Task 1), Step 2 (Task 1), Step 3 (Task 2), ...). This way, the only difference between MT, MIT, and SIT is that task_id=None in SIT scenarios during training.
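The convention described above can be sketched as a small formatting helper. This is only an illustration of the proposed logging scheme, assuming a hypothetical `format_step` function; it is not part of the Avalanche API:

```python
# Sketch of the scheme Antonio prefers: a single step_id that keeps
# increasing across tasks, with task_id=None marking SIT scenarios
# (where no task label is available during training).

def format_step(step_id, task_id):
    """Render one log line for a training step (hypothetical helper)."""
    if task_id is None:  # single-incremental-task: no task label
        return f"Step {step_id}"
    return f"Step {step_id} (Task {task_id})"


# Multi-task stream: the same task may span several steps, and step_id
# still grows monotonically across task boundaries.
mt = [format_step(s, t) for s, t in [(1, 1), (2, 1), (3, 2)]]
# mt == ['Step 1 (Task 1)', 'Step 2 (Task 1)', 'Step 3 (Task 2)']

# SIT stream: identical step numbering, just without task labels.
sit = [format_step(s, None) for s in (1, 2, 3)]
# sit == ['Step 1', 'Step 2', 'Step 3']
```

With this convention, switching between MT and SIT only changes whether a task_id is attached, never how steps are counted.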

AntonioCarta commented, Feb 1, 2021

Yes, I thought we already agreed on this 😃

We really need to write down these kinds of high-level design choices in the documentation, so that we don’t end up having the same discussions over and over.

I’m closing this since we all agree.


