I have come across the following main deviation between the paper (https://arxiv.org/abs/1710.09829) and this repo's implementation:
In the paper: the PrimaryCaps layer has 32 channels, each a 6x6 grid of capsules, and each capsule is 8-dimensional. Hence, we need 32 independent convolution layers with 8 output channels each.
In the repo: it is implemented as 8 independent convolution layers with 32 output channels each.
I have noticed this issue in another repo as well (gram-ai/capsule-networks#23), so I'm not sure whether there is a misunderstanding in my interpretation of the paper.
Could you please check and comment on this?
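To make the two readings concrete, here is a minimal sketch (PyTorch; the 9x9 kernels, stride 2, and the 256-channel 20x20 input come from the paper, while the variable names and the use of separate `nn.Conv2d` modules are my own illustration, not the repo's code):

```python
import torch
import torch.nn as nn

# Output of Conv1 in the paper: (batch, 256, 20, 20).
x = torch.randn(1, 256, 20, 20)

# Reading A (paper, as I read it): 32 independent convolutions,
# each producing one 6x6 grid of 8-dimensional capsules.
convs_a = nn.ModuleList(
    nn.Conv2d(256, 8, kernel_size=9, stride=2) for _ in range(32)
)
out_a = torch.stack([conv(x) for conv in convs_a], dim=1)  # (1, 32, 8, 6, 6)

# Reading B (this repo): 8 independent convolutions with
# 32 output channels each.
convs_b = nn.ModuleList(
    nn.Conv2d(256, 32, kernel_size=9, stride=2) for _ in range(8)
)
out_b = torch.stack([conv(x) for conv in convs_b], dim=1)  # (1, 8, 32, 6, 6)

# Both produce 32 * 8 * 6 * 6 = 9216 scalars, i.e. 1152 capsules of
# dimension 8; they differ in which axis is treated as the capsule
# dimension, and in how the weights are grouped.
assert out_a.numel() == out_b.numel() == 32 * 8 * 6 * 6
```

In other words, both variants yield the same total activation volume; the question is whether the 8 capsule dimensions should come from one convolution's channels (reading A) or from 8 separate convolutions (reading B).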
This is a good question! I wonder if it has to do with weight sharing? The paper says that "each capsule in the [6 × 6] grid is sharing their weights with each other", so I'm wondering if it's only per capsule dimension. Any input is useful! This is just my conjecture.
I believe the sharing of weights within the [6 x 6] grid is achieved by default through the use of a convolution layer with 8 output channels. Don't you think so?
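To illustrate that point: a convolution applies the same kernel at every spatial position, so every capsule in the 6x6 grid is produced by the same weights regardless of grid size. A minimal sketch (PyTorch, using the paper's 9x9 kernel and stride 2; the variable name is mine):

```python
import torch.nn as nn

# One convolution producing 8-dimensional capsule components.
conv = nn.Conv2d(256, 8, kernel_size=9, stride=2)

# The weight tensor is (out_channels, in_channels, kH, kW); its size is
# independent of the 6x6 output grid, so all 36 grid positions share the
# same 8 * 256 * 9 * 9 weights -- weight sharing across the grid is
# automatic with convolution.
assert conv.weight.shape == (8, 256, 9, 9)
```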
For reference:
https://photos.app.goo.gl/FeCg4ejNdF3eVPvh6
https://github.com/cedrickchee/capsule-net-pytorch/blob/master/capsule_layer.py#L52