Adding 'batch mode' to MessagePassing #361

Answered by danielegrattarola
cclough asked this question in Q&A
A few thoughts.

  1. Batch mode is mostly there because it feels familiar to people coming to GNNs from typical ML, but in most cases you're better off using disjoint mode. So unless you need a specific pooling layer that only works in batch mode, I would suggest re-evaluating whether you really need batch mode (I don't know the specifics of your project, I'm just giving my 2 cents).
  2. The best way to go about it would be to subclass Conv. There is no advantage to using MessagePassing over Conv anyway; in the end you just need to implement your desired tensor manipulation inside the call method. MessagePassing only gives you a bit of structure in defining the tensor manipulation, and a…
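To make point 2 concrete, here is a minimal sketch of the kind of batch-mode tensor manipulation you would put inside a Conv subclass's call method. In batch mode, node features X have shape [batch, N, F] and the dense adjacency A has shape [batch, N, N], so a simple GCN-style propagation is just A @ X @ W. This uses plain NumPy to stay self-contained; the function name batch_conv is illustrative, not part of Spektral's API.

```python
import numpy as np

def batch_conv(x, a, w):
    """Batch-mode graph convolution sketch.

    x: node features, shape [batch, N, F]
    a: dense adjacency matrices, shape [batch, N, N]
    w: trainable weights, shape [F, F_out]
    returns: [batch, N, F_out]
    """
    # NumPy's matmul broadcasts over the leading batch dimension,
    # so both products are computed per graph in the batch.
    return a @ x @ w

rng = np.random.default_rng(0)
x = rng.standard_normal((2, 4, 3))                # 2 graphs, 4 nodes, 3 features
a = rng.integers(0, 2, (2, 4, 4)).astype(float)   # dense adjacency per graph
w = rng.standard_normal((3, 5))                   # project 3 -> 5 features

out = batch_conv(x, a, w)
print(out.shape)  # (2, 4, 5)
```

In an actual Conv subclass, w would be a weight created in build, and this line would be the body of call; any masking or normalization of A would happen alongside it.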

Answer selected by cclough