Lower depth yields same performance #95

Open
Youyoun opened this issue Jun 21, 2022 · 0 comments

Youyoun commented Jun 21, 2022

Hi!

Thanks for publishing your training repository!

I trained the depth-17 model on the BSD (train) dataset using the pytorch_training scripts (which I fixed because of some compatibility issues), and it yielded the same result as a model of depth 4 (I didn't try to go lower).

It just seems odd that I get the same mean training loss / PSNR with a much shallower model. I only tried grayscale images.
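For reference, here is roughly how the two depths compare structurally (a minimal sketch of a standard DnCNN, not necessarily this repo's exact class; the constructor, argument names, and the residual formulation below are assumptions based on the original paper):

```python
import torch.nn as nn


class DnCNN(nn.Module):
    """Minimal DnCNN-style residual denoiser, used only to illustrate the depth comparison.

    Assumed layout: Conv+ReLU head, (depth - 2) Conv+BN+ReLU blocks, Conv tail
    that predicts the noise residual.
    """

    def __init__(self, depth=17, channels=64, image_channels=1):
        super().__init__()
        layers = [nn.Conv2d(image_channels, channels, 3, padding=1), nn.ReLU(inplace=True)]
        for _ in range(depth - 2):
            layers += [
                nn.Conv2d(channels, channels, 3, padding=1, bias=False),
                nn.BatchNorm2d(channels),
                nn.ReLU(inplace=True),
            ]
        layers.append(nn.Conv2d(channels, image_channels, 3, padding=1))
        self.body = nn.Sequential(*layers)

    def forward(self, x):
        # The stack predicts the noise; the clean estimate is the input minus that residual.
        return x - self.body(x)


model_17 = DnCNN(depth=17)  # the paper's default depth for grayscale denoising
model_4 = DnCNN(depth=4)    # the shallow variant I compared against
print(sum(p.numel() for p in model_17.parameters()),
      sum(p.numel() for p in model_4.parameters()))
```

The parameter counts differ by roughly a factor of five, which is why matching PSNR between the two surprised me.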

Is it possible that something is wrong with the code, or is this result normal? Is there a thorough study of DnCNN's performance as a function of its depth?

Thank you in advance for your response.
