Providing weights to NegativeSampling in LinkNeighborLoader #9229
Unanswered
aaronzberger
asked this question in Q&A
Replies: 1 comment 3 replies
-
Do you mind sharing a small code snippet to reproduce your error? Given that you have a node-level weight vector, you should be able to do

```python
loader = LinkNeighborLoader(
    data,
    neg_sampling=dict(mode='binary', amount=1, weight=weight_vector),
)
```
-
Hi! I'm doing a link prediction task, but my graph is highly unbalanced in node degree (some nodes have far more links than others). I'm passing the `NegativeSampling` class to the `neg_sampling` argument when constructing a `LinkNeighborLoader`. I'm hypothesizing that weighting the negative sampling will lead to better performance, since right now links between the highest-degree nodes tend to always be predicted as positive and links between the lowest-degree nodes as negative. However, I can't figure out what the `weight` parameter should actually be.

It seems it needs to be provided to the `sample` method at sample time, since it sometimes needs to be of shape `torch.Size([NUMBER OF SOURCE NODES IN THE GRAPH])` and sometimes of shape `torch.Size([NUMBER OF TARGET NODES IN THE GRAPH])`.

Can I provide this `weight` parameter at `LinkNeighborLoader` construction, or would I need to call the functional method at each training iteration, or something else? Specifically, I imagined I'd be able to simply provide the node degrees for all the source and/or target nodes when constructing the `LinkNeighborLoader` (perhaps a 1-d tensor of size `torch.Size([num_source_nodes])` or `torch.Size([num_target_nodes])`, or perhaps a 2-d tensor of size `torch.Size([num_source_nodes, num_target_nodes])`).

Wondering if @rusty1s or someone else has a minute to help out. Thanks!
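To make the degree-based weighting idea concrete, here is a minimal sketch in plain Python (no PyG dependency; the function name `inverse_degree_weights` is hypothetical, not part of any library). It computes each node's degree from an edge list and assigns an inverse-degree weight, so that low-degree nodes would be drawn more often as negatives:

```python
from collections import Counter

def inverse_degree_weights(edge_index, num_nodes):
    """Weight each node by 1 / (degree + 1) so that low-degree nodes
    receive higher negative-sampling probability than hubs."""
    deg = Counter()
    for src, dst in edge_index:
        deg[src] += 1
        deg[dst] += 1
    # Nodes absent from the edge list get degree 0, hence weight 1.0.
    return [1.0 / (deg[n] + 1) for n in range(num_nodes)]

# Tiny example graph: node 0 is a hub with three edges.
edges = [(0, 1), (0, 2), (0, 3)]
weights = inverse_degree_weights(edges, num_nodes=4)
# → [0.25, 0.5, 0.5, 0.5]: the hub (node 0) is sampled least often.
```

A 1-d vector like this, built over the source or target node set, is the shape discussed above; how to hand it to the loader is exactly the open question.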