Fine-tuning to other datasets using the same self-supervised paradigm
Thanks for providing such an awesome repository.
I am trying to transfer DINO's learning ability to other datasets so that an excellent k-NN classifier can be learnt without heavy human annotations. Currently, I find that the full ViT-S/16 checkpoint does not include the DINO head weights, which are important for fine-tuning in my experiments.
Is it possible to open-source the DINO head? Could you offer some suggestions for fine-tuning?
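As a point of reference for the k-NN evaluation mentioned above, here is a minimal sketch of the idea: extract features with a frozen pretrained DINO backbone and fit a nearest-neighbour classifier on them. The dataset paths and the n_neighbors value are illustrative placeholders, and the plain scikit-learn classifier is a simplified stand-in for the weighted k-NN protocol implemented by the repository's eval_knn.py.

# Sketch: frozen DINO features + a simple k-NN classifier (paths are placeholders).
import torch
from torchvision import datasets, transforms
from sklearn.neighbors import KNeighborsClassifier

# Backbone only (no DINO head), loaded from the official torch.hub entry point.
model = torch.hub.load('facebookresearch/dino:main', 'dino_vits16')
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize((0.485, 0.456, 0.406), (0.229, 0.224, 0.225)),
])

def extract_features(root):
    # Run the frozen backbone over an ImageFolder and return (features, labels).
    loader = torch.utils.data.DataLoader(
        datasets.ImageFolder(root, transform=preprocess), batch_size=64)
    feats, labels = [], []
    with torch.no_grad():
        for images, targets in loader:
            feats.append(model(images))
            labels.append(targets)
    return torch.cat(feats).numpy(), torch.cat(labels).numpy()

train_x, train_y = extract_features('/path/to/my_dataset/train')
val_x, val_y = extract_features('/path/to/my_dataset/val')

knn = KNeighborsClassifier(n_neighbors=20)
knn.fit(train_x, train_y)
print('k-NN accuracy:', knn.score(val_x, val_y))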
Issue Analytics
- Created: 2 years ago
- Comments: 6 (1 by maintainers)
Top GitHub Comments
Hi @kaleidoscopical
The head is present in the full ckpt downloads (links: https://github.com/facebookresearch/dino#pretrained-models). You can start from these weights by downloading the full checkpoint and renaming it checkpoint.pth inside your experiment repository. For example, for ViT-S/16:

# step 1: create the experiment repo
mkdir ssl_finetuning

# step 2: download the full pretrained checkpoint into the experiment repo
wget https://dl.fbaipublicfiles.com/dino/dino_deitsmall16_pretrain/dino_deitsmall16_pretrain_full_checkpoint.pth -O ssl_finetuning/checkpoint.pth

# step 3: launch DINO training; this will start from the checkpoint.pth located in output_dir
python -m torch.distributed.launch --nproc_per_node=8 main_dino.py --arch vit_small --data_path /path/to/imagenet/train --output_dir ssl_finetuning
Hope that helps.
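To double-check that the head weights are indeed inside the full checkpoint before launching training, it can be inspected along the following lines. This is a minimal sketch: the top-level keys ('student', 'teacher', ...) and the presence of 'head' entries are assumptions based on how main_dino.py saves its state, so adapt to what your checkpoint actually contains.

import torch

# Load the full training checkpoint on CPU and look at its top-level keys.
ckpt = torch.load('ssl_finetuning/checkpoint.pth', map_location='cpu')
print(list(ckpt.keys()))  # expected to include 'student' and 'teacher' state dicts

# List a few of the DINO-head parameters stored under the student network.
head_keys = [k for k in ckpt['student'] if 'head' in k]
print(head_keys[:5])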
I wish to fine-tune on my own data in a self-supervised manner, so the head is necessary.
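If only the backbone and projection-head weights are needed for self-supervised fine-tuning elsewhere, the student state dict could be split as sketched below. The 'module.backbone.' and 'module.head.' prefixes are assumptions about the DistributedDataParallel wrapping used in main_dino.py; adjust them to whatever keys the inspection above actually prints.

import torch

ckpt = torch.load('ssl_finetuning/checkpoint.pth', map_location='cpu')
student = ckpt['student']

# Separate backbone and DINO-head tensors by their (assumed) key prefixes.
backbone_sd = {k.replace('module.backbone.', '', 1): v
               for k, v in student.items() if k.startswith('module.backbone.')}
head_sd = {k.replace('module.head.', '', 1): v
           for k, v in student.items() if k.startswith('module.head.')}

torch.save({'backbone': backbone_sd, 'head': head_sd}, 'dino_vits16_student_split.pth')
print(len(backbone_sd), 'backbone tensors,', len(head_sd), 'head tensors')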