can't guarantee the reproducibility with seeding everything
See original GitHub issue

In the training code, I set all the random seeds:
random.seed(cfg.RNG_SEED)
np.random.seed(cfg.RNG_SEED)
torch.manual_seed(cfg.RNG_SEED)
if args.cuda:
    torch.cuda.manual_seed(cfg.RNG_SEED)
    torch.cuda.manual_seed_all(cfg.RNG_SEED)
torch.backends.cudnn.deterministic = True
torch.backends.cudnn.benchmark = False
os.environ['PYTHONHASHSEED'] = str(cfg.RNG_SEED)
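One caveat worth noting about the snippet above: setting `PYTHONHASHSEED` from inside a running process does not change hash randomization for that process, which is fixed at interpreter startup; it only affects subprocesses that inherit the environment. Apart from that, the same setup can be collected into a single helper. This is a sketch (the `seed_everything` name and function boundary are my own, not from the issue):

```python
import os
import random

import numpy as np
import torch


def seed_everything(seed: int) -> None:
    """Seed every RNG the snippet above touches, in one place."""
    # No-op for the current process's hash randomization; kept for subprocesses.
    os.environ['PYTHONHASHSEED'] = str(seed)
    random.seed(seed)
    np.random.seed(seed)
    torch.manual_seed(seed)          # seeds CPU and, in recent PyTorch, all GPUs
    if torch.cuda.is_available():
        torch.cuda.manual_seed_all(seed)
    # Force cuDNN to pick deterministic kernels and disable autotuning.
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False
```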
But when I run the same code (with the same settings and hyperparameters) multiple times, I get different loss curves and evaluation results. Do you have any idea why? Thanks very much. @jwyang
Issue Analytics
- State:
- Created: 5 years ago
- Comments: 10
Top Results From Across the Web
Reproducibility: fixing random seeds, and why that's not enough
"Reproducible research is easy. Just log your parameters and metrics somewhere, fix seeds, and you are good to go."
Read more >

Why can't I get reproducible results in Keras even though I set ...
EDIT: I changed my code by moving the setting of all seeds before importing Keras. The results are still not deterministic, however the...
Read more >

[Solved] Reproducibility: Where is the randomness coming in?
So what's going on? It seems that if all the seeds are initialized, the results should be equal. A 1% variation over a...
Read more >

How to Solve Reproducibility in ML - neptune.ai
Reproducibility in machine learning means being able to replicate the ML ... Neptune allows you to log anything that happens during ML runs, ...
Read more >

Ensuring Training Reproducibility in PyTorch | LearnOpenCV
Note: PyTorch does not guarantee reproducibility of results across its ... #seed = 3 #torch.manual_seed(seed) # set device to CUDA if ...
Read more >
Read more >Top Related Medium Post
No results found
Top Related StackOverflow Question
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
@jwyang @squirrel16 Finally I found that with RoIPooling the model is reproducible once every rng_seed is set and cuDNN is deterministic. I'm not clear what makes RoIAlign non-deterministic; I'd be grateful if you could dig out the reason.
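A side note beyond the thread: newer PyTorch releases (1.8+) can point at the offending op directly rather than leaving you to guess. With the switch below enabled, any operation that lacks a deterministic implementation (some CUDA pooling/interpolation kernels fall in this category) raises a RuntimeError when it runs, which is a practical way to hunt down non-determinism like the RoIAlign case above. A minimal sketch:

```python
import torch

# Ask PyTorch to error out on any op without a deterministic implementation,
# instead of silently producing results that vary between runs.
torch.use_deterministic_algorithms(True)

# The flag can be queried back, e.g. in a sanity check at training start.
assert torch.are_deterministic_algorithms_enabled()
```

Note that some deterministic CUDA kernels additionally require the `CUBLAS_WORKSPACE_CONFIG` environment variable to be set before launch; the error message names it when that applies.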
Usually the seed in worker_init_fn is set to SEED + worker_id. Maybe you can give that a try.
------------------ Original ------------------
From: Mohandass Muthuraja <notifications@github.com>
Date: Thu, Oct 17, 2019, 7:07 PM
To: jwyang/faster-rcnn.pytorch <faster-rcnn.pytorch@noreply.github.com>
Cc: Jokoe66 <531211903@qq.com>, Mention <mention@noreply.github.com>
Subject: Re: [jwyang/faster-rcnn.pytorch] can't guarantee the reproducibility with seeding everything (#394)
Yes, I did:
np.random.seed(3)
dataloader = torch.utils.data.DataLoader(
    dataset,
    batch_size=args.batch_size,
    sampler=sampler_batch,
    num_workers=args.num_workers,
    worker_init_fn=_init_fn,
)