Unet not passed and unused parameters error sometimes #19

Open
LambdaGuard opened this issue Jan 19, 2024 · 0 comments
```python
if tensors[0].features.shape[0] == 0:
    # No RoIs survived: return empty tensors of the proper shapes
    # without ever touching self.unet.
    return (targets.new_zeros((0, 1)),
            targets.new_zeros((0, 1)),
            targets.new_zeros(0),
            targets.new_zeros(0),
            [targets.new_zeros((0, 7)) for i in range(len(bbox_list))],
            [targets.new_zeros(0) for i in range(len(bbox_list))],
            [targets.new_zeros(0) for i in range(len(bbox_list))])
feats = ME.SparseTensor(tensors[0].features[:, :-2], tensors[0].coordinates)
targets = tensors[0].features[:, -2:]
preds = self.unet(feats).features
return preds, targets, feats.coordinates[:, 0].long(), ids[0], rois[0], scores[0], labels[0]
```

In this `if` branch, when the detections are so poor that no ground truth is assigned to any RoI, the method returns an empty result with the proper shapes.

However, if we hit this branch and return without ever passing through the UNet, PyTorch finds the UNet's parameters unused and raises an error. This can happen in the first few iterations after training starts, while detections are still poor.
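For context, here is a minimal sketch (not from this repo, the toy names are mine) of how distributed training surfaces the problem: a submodule that is skipped in some forward passes produces no gradients, and DDP's reducer raises an error along the lines of "Expected to have finished reduction in the prior iteration before starting a new one."

```python
import torch
import torch.nn as nn

class Toy(nn.Module):
    """Stand-in for the detector: `unet` is skipped on empty inputs."""
    def __init__(self):
        super().__init__()
        self.unet = nn.Linear(4, 4)

    def forward(self, x):
        if x.shape[0] == 0:            # the early-return branch above
            return x.new_zeros((0, 4))
        return self.unet(x)

# After wrapping in DDP, any iteration where a rank takes the empty
# branch leaves self.unet's parameters without gradients and the
# reducer errors out (unless find_unused_parameters=True):
# model = nn.parallel.DistributedDataParallel(Toy().cuda())
```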

Adding `find_unused_parameters=True` to the config file solves this problem. However, that fix is rather brutal and costs extra time, because PyTorch then checks all parameters on every iteration, even though `find_unused_parameters` would only really be needed early on and could be turned off after some training, e.g. one epoch.
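In an mmcv/mmdetection-style config (a sketch; I'm assuming this repo follows that convention), the flag is a single top-level line that the runner forwards to `DistributedDataParallel`:

```python
# Top level of the training config:
find_unused_parameters = True

# which the training script effectively turns into:
# model = torch.nn.parallel.DistributedDataParallel(
#     model, device_ids=[local_rank], find_unused_parameters=True)
```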

Another solution is to set the UNet to not require grad when entering this branch, and to restore it otherwise. Unfortunately, my attempts to set this up failed.
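For what it's worth, toggling `requires_grad` after the model has been wrapped in DDP is documented not to work, since DDP registers its gradient-reduction hooks at construction time, which may be why that attempt failed. A third option that avoids both problems is a zero-weighted dummy term that keeps the UNet in the autograd graph on every iteration. A minimal sketch, assuming the total loss is assembled somewhere downstream (`loss` here is hypothetical):

```python
# Zero-weighted dummy term: touches every UNet parameter, so autograd
# produces a (zero) gradient for each one and DDP's reducer is satisfied
# even on iterations where the empty-batch branch fires. It changes
# neither the loss value nor the real gradients.
loss = loss + 0.0 * sum(p.sum() for p in self.unet.parameters())
```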
