Error obtaining attention weights from GATConv for heterogeneous graph #8177

Closed · Answered by rusty1s
enter802 asked this question in Q&A

I am sorry to inform you that return_attention_weights=True and to_hetero are currently not compatible, and I don't see a good way to support this long-term. The only workaround is to implement manually what to_hetero does internally, i.e., specify a GATConv layer for every edge type and then call these layers in a loop:

# ModuleDict keys must be strings, so join the (src, rel, dst) tuple:
self.convs = ModuleDict({
    '__'.join(edge_type): GATConv((-1, -1), output_dim)  # ...
    for edge_type in edge_types
})

def forward(self, x_dict, edge_index_dict):
    for edge_type, edge_index in edge_index_dict.items():
        src, _, dst = edge_type
        # With return_attention_weights=True, GATConv returns (out, (edge_index, alpha)):
        x, (_, alpha) = self.convs['__'.join(edge_type)](
            (x_dict[src], x_dict[dst]), edge_index,
            return_attention_weights=True)
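
For reference, here is a minimal self-contained sketch of the same workaround. The class name HeteroGAT, the 'author'/'paper' node types, and the toy tensors are illustrative assumptions, not part of the original answer; add_self_loops=False is set because self-loops are not well-defined for edge types connecting two different node types:

import torch
from torch.nn import Module, ModuleDict
from torch_geometric.nn import GATConv

class HeteroGAT(Module):  # hypothetical name, for illustration only
    def __init__(self, edge_types, output_dim):
        super().__init__()
        # One GATConv per edge type; '__'.join satisfies ModuleDict's
        # string-key requirement:
        self.convs = ModuleDict({
            '__'.join(edge_type): GATConv((-1, -1), output_dim,
                                          add_self_loops=False)
            for edge_type in edge_types
        })

    def forward(self, x_dict, edge_index_dict):
        out_dict, alpha_dict = {}, {}
        for edge_type, edge_index in edge_index_dict.items():
            src, _, dst = edge_type
            out, (_, alpha) = self.convs['__'.join(edge_type)](
                (x_dict[src], x_dict[dst]), edge_index,
                return_attention_weights=True)
            out_dict[dst] = out            # last edge type wins; aggregate in real use
            alpha_dict[edge_type] = alpha  # shape [num_edges, heads]
        return out_dict, alpha_dict

# Toy heterogeneous graph (hypothetical node/edge types and sizes):
edge_types = [('author', 'writes', 'paper'), ('paper', 'cites', 'paper')]
model = HeteroGAT(edge_types, output_dim=8)
x_dict = {'author': torch.randn(4, 16), 'paper': torch.randn(6, 32)}
edge_index_dict = {
    ('author', 'writes', 'paper'): torch.tensor([[0, 1, 2], [0, 3, 5]]),
    ('paper', 'cites', 'paper'): torch.tensor([[0, 1, 2], [1, 2, 3]]),
}
out_dict, alpha_dict = model(x_dict, edge_index_dict)

Note that when several edge types share a destination node type, to_hetero sums their outputs per destination; the sketch above simply keeps the last one, so add a proper aggregation (e.g., summing into out_dict[dst]) in a real model.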
