Hi, thank you for sharing your work. I'm really interested in what your training phase looks like. In your paper you mention a batch size of 64; what I'd like to know is whether a batch here means 64 FODs or 64 patches.
If it is 64 patches per batch, I noticed that your learning rate is 0.01, which in my view is too large and makes the network learn painfully slowly. If a batch is 64 full FODs, how did you solve the memory issues?
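For context, here is a rough back-of-envelope comparison of the two readings. All dimensions below (HCP-like volume size, 45 SH coefficients, 16³ patches, float32) are my own assumptions for illustration, not values from the paper:

```python
# Back-of-envelope memory estimate (all shapes are assumed for
# illustration, not taken from the paper): one batch of 64 full FOD
# volumes vs. one batch of 64 small 3D patches, both stored as float32
# with 45 spherical-harmonic coefficients per voxel.
import math

def batch_bytes(sample_shape, batch_size=64, dtype_bytes=4):
    """Memory in bytes for one float32 batch of the given per-sample shape."""
    return batch_size * dtype_bytes * math.prod(sample_shape)

full_fod = (45, 145, 174, 145)  # (SH coeffs, X, Y, Z) - assumed HCP-like dims
patch = (45, 16, 16, 16)        # hypothetical 16^3 patch

print(f"64 full FODs: {batch_bytes(full_fod) / 1e9:.1f} GB")
print(f"64 patches:   {batch_bytes(patch) / 1e6:.1f} MB")
```

Under these assumptions a batch of 64 full volumes would need on the order of 40 GB for the inputs alone (before activations and gradients), while 64 patches fit in tens of megabytes, which is why I suspect the batch refers to patches.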
Looking forward to your reply.
Kind regards
Jia