Freezing layers - EfficientDet #87
I wrote the following lines in train_detector.py in order to perform transfer learning:
However, I'm still not sure whether this is the correct way to do it.
This is how I am doing it.
By the way, I am trying to freeze half of the network, but I can't pinpoint the initial layers to freeze because of the architecture. Does anyone have an idea of how that could be done?
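One way to pinpoint the layers is to list the model's named parameters first and then freeze the first half of them. A minimal sketch, with a small `nn.Sequential` standing in for the EfficientDet model (an assumption; the real model would come from the training pipeline):

```python
import torch.nn as nn

# Toy stand-in for the detector backbone (assumption for illustration).
model = nn.Sequential(
    nn.Conv2d(3, 8, 3),
    nn.Conv2d(8, 16, 3),
    nn.Conv2d(16, 32, 3),
    nn.Conv2d(32, 64, 3),
)

# Inspect the parameter names to decide where "half" begins.
names = [n for n, _ in model.named_parameters()]

# Freeze the first half of the parameter tensors.
params = list(model.parameters())
for p in params[: len(params) // 2]:
    p.requires_grad = False

frozen = sum(1 for p in model.parameters() if not p.requires_grad)
```

Printing `names` on the real model shows which prefixes (e.g. backbone vs. head) the parameters belong to, so the slice point can be chosen by name rather than by count.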
Hi!
Is the
load_pretrained_model_from
feature performing transfer learning with frozen layers, or would this need to be implemented somewhere else? I would like to freeze all layers but the last one and train using a pretrained model. How should I do this? Maybe something like the following, but I don't really know where to write it.
efficientdet = self.system_dict["local"]["model"]
# Freeze every parameter first...
for p in efficientdet.parameters():
    p.requires_grad = False
# ...then unfreeze the last child module. A plain nn.Module is not
# indexable, so use children() instead of efficientdet[-1].
for p in list(efficientdet.children())[-1].parameters():
    p.requires_grad = True
Thank you in advance!
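The freeze-all-but-last idea above can be checked end to end on a toy model. This is a minimal sketch, assuming a standard PyTorch `nn.Module` (the small `nn.Sequential` here stands in for EfficientDet); it also passes only the still-trainable parameters to the optimizer so the frozen ones are never updated:

```python
import torch
import torch.nn as nn

# Toy stand-in for the pretrained detector (assumption for illustration).
model = nn.Sequential(nn.Linear(8, 8), nn.Linear(8, 8), nn.Linear(8, 2))

# Freeze everything, then unfreeze the last child module.
for p in model.parameters():
    p.requires_grad = False
for p in list(model.children())[-1].parameters():
    p.requires_grad = True

# Hand only the trainable parameters to the optimizer.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=1e-3)
```

Filtering the parameter list before building the optimizer is a common safeguard: gradients are simply not computed for frozen tensors, and the optimizer never sees them.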