
Increase flexibility of setting seed

See original GitHub issue

I was running some code that has randomness during evaluation. However, I found that my supposedly random preprocessing was always done in exactly the same way.

This is related to the random seed being set to 12 by default: https://github.com/pytorch/ignite/blob/master/ignite/engine/engine.py#L583

An alternative mechanism that respects the user’s choice of random seed is to change seed = 12 to:

seed = int(torch.randint(0, 2**31 - 1, (1,)).item())

which reads from the manual seed if it was set.

On Slack it was suggested to use int(time.time() * 100 % 1000); however, this still surprises a user who wants to run the same script with two different random seeds and have both runs reproducible.
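To make the difference concrete, here is a minimal sketch comparing a time-derived default seed with one drawn from torch’s global RNG; the variable names are illustrative, not from ignite. Only the RNG-derived seed respects torch.manual_seed:

```python
import time
import torch

# A time-based default seed ignores any manual seed the user set.
torch.manual_seed(7)
time_based = int(time.time() * 100 % 1000)
rng_based = int(torch.randint(0, 1000, (1,)).item())

# Re-seeding and drawing again reproduces the RNG-based value exactly.
torch.manual_seed(7)
rng_based_again = int(torch.randint(0, 1000, (1,)).item())

print(rng_based == rng_based_again)  # True: reproducible under a manual seed
# time_based is unrelated to the manual seed, so two "seeded" runs still differ.
```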

I think it’s okay if torch.randint changes the random state, because many things in the dataloader already do as well.

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 23 (8 by maintainers)

Top GitHub Comments

1 reaction
y0ast commented, Feb 18, 2020

Aside from using torch.randint there are a bunch of other options: https://pytorch.org/docs/stable/torch.html#random-sampling

Spelling out the two cases:

  1. user sets no random seed: ignite needs a seed for proper restarting
  2. user sets a random seed: ignite shouldn’t overwrite it with a default seed, as that is unexpected behaviour for a training framework. Instead it should support restarting and respect the user’s seed (so runs can be reproduced).

If we use torch.randint then 1) is safeguarded, but not reproducible between runs. 2) is also safeguarded, but we do advance the random state by sampling from torch.randint.

I think ignite shouldn’t care about reproducibility between runs if the user doesn’t set a seed (model parameters are different anyway).
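The state-advance caveat in 2) can be demonstrated directly. A minimal sketch, independent of ignite:

```python
import torch

# Drawing a default seed via torch.randint advances the global RNG,
# so subsequent random ops see a shifted state.
torch.manual_seed(0)
without_draw = torch.rand(3)

torch.manual_seed(0)
_ = torch.randint(0, 100, (1,))  # the extra draw consumes RNG state
with_draw = torch.rand(3)

print(torch.equal(without_draw, with_draw))  # False: the draw shifted the state
```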

0 reactions
y0ast commented, Mar 2, 2020

Actually, you don’t need to. As long as you set a manual seed in your script then it’ll work as expected.

Because the new mechanism reads from global random state: https://github.com/pytorch/ignite/pull/799/files
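A sketch of the behaviour described above, with a hypothetical helper (setup_epoch_seed is illustrative, not ignite’s actual API): the engine derives its base seed from the global RNG instead of a hardcoded constant, then re-seeds deterministically per epoch so a run can be restarted, and a single torch.manual_seed call in the user’s script makes everything reproducible.

```python
import torch

def setup_epoch_seed(epoch, base_seed=None):
    # Hypothetical engine-side logic: draw the base seed from the global
    # RNG (as in the approach of PR #799), then re-seed per epoch so
    # training can be restarted mid-run deterministically.
    if base_seed is None:
        base_seed = int(torch.randint(0, 2**31 - 1, (1,)).item())
    torch.manual_seed(base_seed + epoch)
    return base_seed

# User script: one manual_seed call at the top makes the internally
# drawn seed, and hence the whole run, reproducible.
torch.manual_seed(123)
s1 = setup_epoch_seed(epoch=0)
torch.manual_seed(123)
s2 = setup_epoch_seed(epoch=0)
print(s1 == s2)  # True
```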


