Is there data leakage in the maml-omniglot example?
See original GitHub issue

In the maml-omniglot.py example code, net.train() is used for the meta-test phases (link). Does this not cause data leakage of meta-test data via the running statistics of nn.BatchNorm2d (net contains several nn.BatchNorm2d layers)?
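A minimal sketch of the concern (standalone PyTorch, not code from the linked example): in train mode, a forward pass through nn.BatchNorm2d updates its running statistics in place, so meta-test batches shift the stored statistics; in eval mode they stay frozen.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

# A single BatchNorm layer standing in for the ones inside `net`.
bn = nn.BatchNorm2d(3)
before = bn.running_mean.clone()

# A meta-test batch with deliberately shifted statistics.
x = torch.randn(8, 3, 28, 28) + 5.0

bn.train()          # what the example does during meta-test
_ = bn(x)           # running_mean / running_var are updated in place
print(torch.allclose(bn.running_mean, before))   # False: test stats leaked in

bn2 = nn.BatchNorm2d(3)
before2 = bn2.running_mean.clone()
bn2.eval()          # eval mode: running statistics are frozen
_ = bn2(x)
print(torch.allclose(bn2.running_mean, before2)) # True: no update
```

So any forward pass on meta-test data in train mode mutates state that persists into later evaluations, which is the leakage the question points at.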
Issue Analytics
- State:
- Created 2 years ago
- Reactions: 1
- Comments: 6
Top Results From Across the Web

Why does the MAML split the omniglot data set randomly on ...
It's essentially analogous to, rather than having fixed train/test splits that everyone uses, having everyone choose a different random 80% of ...

MAML for Omniglot with dataloader - Kaggle
Explore and run machine learning code with Kaggle Notebooks | Using data from No attached data sources.

Intro to Meta-Learning – Weights & Biases - Wandb
This repo extends few-shot classification to learning from unlabeled examples and more challenging tasks with distractor (previously unseen) ...

Sharpness-Aware Model-Agnostic Meta Learning - arXiv
improves the known O(ϵ−3) sample complexity of MAML. (C3) Generalization analysis of ... and the generic population loss over a data distribution P ...

Few-Shot Learning - Vevesta
Model Agnostic Meta-Learning or MAML, a few-shot algorithm, ... the tfrecord files containing the omniglot data and convert to a # dataset.
Read more >Top Related Medium Post
No results found
Troubleshoot Live Code
Lightrun enables developers to add logs, metrics and snapshots to live code - no restarts or redeploys required.
Start FreeTop Related Reddit Thread
No results found
Top Related Hackernoon Post
No results found
Top Related Tweet
No results found
Top Related Dev.to Post
No results found
Top Related Hashnode Post
No results found
Top GitHub Comments
cross posted:
I think there is. However, in my experience, without it the model diverges and explodes after the adaptation phase (i.e. after, e.g., 5 steps of the inner optimizer). My experiments are on mini-imagenet, but I am 100% sure that this is the issue causing it. When I do mdl.train(), it goes away…
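A common way to reconcile the two observations above (an assumption about the usual fix, not something stated in this thread) is transductive BatchNorm: keep train-mode normalization, which uses the current batch's own mean and variance, but construct the layers with track_running_stats=False so no statistics are ever accumulated and nothing can leak between episodes.

```python
import torch
import torch.nn as nn

# Transductive BatchNorm: always normalize with the current batch's own
# statistics and never store running estimates across episodes.
bn = nn.BatchNorm2d(3, track_running_stats=False)
print(bn.running_mean)        # None: there is no persistent state to leak into

x = torch.randn(8, 3, 28, 28)
bn.train()                    # train mode still avoids the divergence issue
out = bn(x)                   # normalized with this batch's mean/var only
print(out.shape)              # torch.Size([8, 3, 28, 28])
```

With this setup each meta-test episode is normalized independently, so train-mode behavior no longer implies cross-episode leakage; the trade-off is that normalization then depends on the whole evaluation batch (a transductive assumption).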