
Question about the params and FLOPs of ENet #44

Open
mrzhouxixi opened this issue Oct 12, 2020 · 2 comments

@mrzhouxixi

Has anyone reproduced ENet? Why are the params and GFLOPs of my reproduced network about 10 and 4 times larger, respectively, than the values reported in the original paper (Table 3)?
My calculated values: params: 3.5 million, GFLOPs: 16.9
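
For reference, here is a minimal way to double-check the raw parameter count without any profiling library (a sketch, assuming the ENet class from this repo's models.enet with 12 output classes):

from models.enet import ENet

model = ENet(12)
# Count trainable parameters directly
num_parameters = sum(p.numel() for p in model.parameters() if p.requires_grad)
print(f"{num_parameters / 1e6:.2f} M trainable parameters")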

@davidtvs (Owner)

davidtvs commented Oct 18, 2020

I just used THOP to confirm the number of parameters and GFLOPs for the same input size that's given in Table 3 of the paper (3 x 640 x 360):

import torch

from models.enet import ENet
from thop import profile

# Profile ENet (12 classes) on CPU with the 3 x 640 x 360 input size from Table 3
model = ENet(12).to('cpu')
input = torch.randn(1, 3, 640, 360)
flops, num_parameters = profile(model, (input,), verbose=False)

I got 2.2 GFLOPs and 0.35 million parameters. There's very little difference in the number of parameters, but there's a significant difference in the number of FLOPs that I could look into.

How did you find your calculated values?
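
One thing worth noting (an assumption on my part, not verified against the paper): profilers like THOP typically count multiply-accumulate operations for convolutions, and a common convention is FLOPs ≈ 2 × MACs, so the numbers can differ by roughly a factor of two depending on which convention Table 3 uses. Continuing from the snippet above, a rough conversion would be:

macs, num_parameters = profile(model, (input,), verbose=False)
# If the profiler counts MACs, a common convention is FLOPs ~ 2 * MACs
gflops = 2 * macs / 1e9
print(f"~{gflops:.2f} GFLOPs, {num_parameters / 1e6:.2f} M parameters")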

@rashedkoutayni

For 3x224x224 input:
Computational complexity: 0.48 GMac
Number of parameters: 350.65 k

For 3x640x360 input:
Computational complexity: 2.2 GMac
Number of parameters: 350.65 k

This profiling was done using ptflops.
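
For completeness, a minimal sketch of the ptflops call behind the numbers above (assuming ptflops' get_model_complexity_info and the ENet class from this repo):

from ptflops import get_model_complexity_info
from models.enet import ENet

model = ENet(12)
# ptflops builds the input internally from the given (C, H, W) resolution
macs, params = get_model_complexity_info(
    model, (3, 640, 360), as_strings=True, print_per_layer_stat=False
)
print(f"Computational complexity: {macs}")
print(f"Number of parameters: {params}")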
