Increase flexibility of setting seed
I was running some code that has randomness during evaluation. However, I found that my supposedly random preprocessing was always done in exactly the same way.
This is related to the random seed being set to 12 by default: https://github.com/pytorch/ignite/blob/master/ignite/engine/engine.py#L583
An alternative mechanism that respects the user's choice of random seed is to change
seed = 12
to:
seed = torch.randint(0, 100, (1,)).item()
which draws from the global RNG and therefore reflects the manual seed if one was set.
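The suggestion above can be sketched as follows. `derive_seed` is a hypothetical helper (not part of ignite), and the seed range is an arbitrary choice; the point is only that the derived seed tracks the global RNG state:

```python
import torch

def derive_seed() -> int:
    # Hypothetical helper: sample the engine's default seed from the
    # global torch RNG instead of hard-coding it, so that it reflects
    # any torch.manual_seed(...) call the user made earlier.
    return int(torch.randint(0, 2**31 - 1, (1,)).item())

torch.manual_seed(0)
a = derive_seed()

torch.manual_seed(0)
b = derive_seed()  # same manual seed, so the same derived seed

torch.manual_seed(1)
c = derive_seed()  # different manual seed, so a different draw
```

If no manual seed is set, `derive_seed()` still returns a fresh value per process, so two unseeded runs are not accidentally identical.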
On Slack it was suggested to use int(time.time() * 100 % 1000); however, this still surprises a user who wants to run the same script with two different random seeds and have both runs reproducible.
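To see the surprise concretely: a time-based default ignores the user's manual seed entirely, so it changes from run to run no matter what the user sets. A small stdlib-only illustration:

```python
import time

def time_seed() -> int:
    # The time-based default suggested on Slack: the value changes
    # every ~10 ms, regardless of any manual seed the user has set.
    return int(time.time() * 100 % 1000)

s1 = time_seed()
time.sleep(0.05)   # a moment later, e.g. a second run of the script
s2 = time_seed()   # a different "seed", even with the same manual seed
```

So even if the user calls torch.manual_seed(...) at the top of both runs, the engine's seed would still differ between them.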
I think it's okay if torch.randint changes the random state, because many things in the dataloader already do as well.
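The state advancement is easy to demonstrate: sampling one value with torch.randint consumes RNG state, so every subsequent draw shifts relative to a run that did not sample a seed:

```python
import torch

torch.manual_seed(0)
x = torch.rand(3)                  # draws from a pristine RNG state

torch.manual_seed(0)
_ = torch.randint(0, 100, (1,))    # seed sampling consumes RNG state...
y = torch.rand(3)                  # ...so the same call now yields other values
```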
Issue Analytics
- State:
- Created 4 years ago
- Comments: 23 (8 by maintainers)
Top GitHub Comments
Aside from using torch.randint, there are a bunch of other options: https://pytorch.org/docs/stable/torch.html#random-sampling
Spelling out the two cases: if we do torch.randint, then 1) is safeguarded, but not reproducible between runs. Also 2) is safeguarded, but we do advance the random state by sampling from torch.randint. I think ignite shouldn't care about reproducibility between runs if the user doesn't set a seed (model parameters are different anyway).
Actually, you don't need to. As long as you set a manual seed in your script, it'll work as expected, because the new mechanism reads from the global random state: https://github.com/pytorch/ignite/pull/799/files