
How to finetune from pretrained models trained on coco data with different number of classes?

See original GitHub issue

Is there a config option to load pretrained COCO models for finetuning? The last layers depend on the number of classes, which may be different, so those weights should not be loaded. If I just use faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth, then I get:

While copying the parameter named bbox_head.fc_cls.weight, whose dimensions in the model are torch.Size([21, 1024]) and whose dimensions in the checkpoint are torch.Size([81, 1024]).

Or, how should I modify the pretrained weights for my model?

Thank you!
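One common alternative, sketched below under stated assumptions (the checkpoint keeps its weights under a 'state_dict' key and the head layers are named as in the error above), is to delete the class-dependent head parameters so the framework re-initializes them for the new number of classes instead of copying them:

import torch

# sketch (not from this issue): remove the class-dependent head parameters
# so they are freshly initialized when fine-tuning starts
checkpoint = torch.load('faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth',
                        map_location='cpu')
state_dict = checkpoint['state_dict']

for key in ['bbox_head.fc_cls.weight', 'bbox_head.fc_cls.bias',
            'bbox_head.fc_reg.weight', 'bbox_head.fc_reg.bias']:
    state_dict.pop(key, None)  # ignore keys that are already absent

torch.save(checkpoint, 'faster_rcnn_r50_fpn_1x_no_head.pth')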

Issue Analytics

  • State: closed
  • Created: 4 years ago
  • Comments: 11 (2 by maintainers)

Top GitHub Comments

8 reactions
cowry5 commented, Jun 1, 2019

Thank you so much @spytensor, I’ve solved the problem with this:

import torch

# load the COCO-pretrained checkpoint (81 classes: 80 object classes + background)
pretrained_weights = torch.load('faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth')

# target number of classes, including background (e.g. 20 + 1 for Pascal VOC)
num_class = 21

# shrink the class-dependent head tensors in place; resize_ keeps the first
# num_class rows, so the remaining weights only serve as a rough initialization
pretrained_weights['state_dict']['bbox_head.fc_cls.weight'].resize_(num_class, 1024)
pretrained_weights['state_dict']['bbox_head.fc_cls.bias'].resize_(num_class)
pretrained_weights['state_dict']['bbox_head.fc_reg.weight'].resize_(num_class * 4, 1024)
pretrained_weights['state_dict']['bbox_head.fc_reg.bias'].resize_(num_class * 4)

# save the modified checkpoint under a class-count-specific name
torch.save(pretrained_weights, "faster_rcnn_r50_fpn_1x_%d.pth" % num_class)
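If you fine-tune with an MMDetection config, the resized file can then be used as the starting checkpoint. The fragment below is an assumed wiring, with the filename matching the save call above for num_class = 21:

# hypothetical MMDetection config fragment: start training from the resized checkpoint
load_from = 'faster_rcnn_r50_fpn_1x_21.pth'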

3 reactions
mosheliv commented, Aug 8, 2019


Can you please share your experience with fine tuning like this? What lr schedule did you use? How many epochs?

Read more comments on GitHub.

Top Results From Across the Web

Tutorial 7: Finetuning Models - MMDetection's documentation
Tutorial 7: Finetuning Models · Inherit base configs · Modify head · Modify dataset · Modify training schedule · Use pre-trained model. (See the config sketch after this list.)

Fine-tune a pretrained model - Hugging Face
This is known as fine-tuning, an incredibly powerful training technique. In this tutorial, you will fine-tune a pretrained model with a deep learning framework...

Finetune model with different number of classes using object ...
Yes, it is mainly the idea of the TensorFlow object detection garden to fine-tune the models! You should change ...

08. Finetune a pretrained detection model
Finetuning from pre-trained models can help reduce the risk of overfitting. A finetuned model may also generalize better if the previously used dataset is...

The practical guide for Object Detection with YOLOv5 algorithm
To further compensate for a small dataset size, we'll use the same backbone as the pretrained COCO model, and only train the model's...
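For reference, here is a minimal config sketch in the spirit of the MMDetection finetuning tutorial listed above. The base config paths, field layout, and checkpoint name are assumptions and depend on the MMDetection version (2.x-style shown; 2.x counts classes without the background slot):

# hypothetical MMDetection 2.x-style finetuning config; adjust names to your version
_base_ = [
    '../_base_/models/faster_rcnn_r50_fpn.py',
    '../_base_/datasets/voc0712.py',
    '../_base_/schedules/schedule_1x.py',
    '../_base_/default_runtime.py',
]

# modify the head so the classifier matches the new number of classes
model = dict(roi_head=dict(bbox_head=dict(num_classes=20)))

# start from COCO-pretrained weights; depending on the mmcv version,
# size-mismatched head weights are skipped with a warning or must be
# removed beforehand as shown earlier in the thread
load_from = 'checkpoints/faster_rcnn_r50_fpn_1x_coco.pth'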
