Providing a custom collate_fn to DataLoader has no effect #9263
Comments
Yes, PyG
Thank you.
If we would allow overriding ... Note that you can also customize concatenation by overriding ...
Thanks. Nonetheless, the standard DataLoader fails to add a batch dimension to the edge index, since the edge sets have different sizes across graphs. So suppose I am not interested in batching the edge indices into one huge graph; I just want to wrap multiple graphs together, i.e., stack the keys of the graphs in the batch, where the tensors for each key can have different shapes (as with edge indices). The gradient computation would then be done on the loss over the whole batch, while the forward pass is done on each graph in the batch separately (so GPU-wise it's not the most efficient it could be, but that's ok).
Do you mean you simply want to "batch" tensors together by stacking them in a list? I am not yet sure I understand, sorry.
Yes. I have some custom keys in my Data object that have different dimensions and cannot be stacked; I just want to put them in a list.
I see, that's indeed currently not possible. What we could do is to provide an option in
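As a stopgap outside of PyG, the "list of graphs" batching described above can be sketched with a collate function that performs no concatenation at all. Everything below is a hypothetical stand-in: the dict keys mirror PyG `Data` fields, and the tiny `batches` helper takes the place of `torch.utils.data.DataLoader` so the sketch runs without torch installed; in real code you would pass the same `list_collate` as `collate_fn` to the plain torch `DataLoader`.

```python
# Two "graphs" with ragged fields: different node counts and edge counts,
# so their edge_index tensors cannot be stacked along a new batch dimension.
graphs = [
    {"x": [[0.1] * 4] * 3, "edge_index": [[0, 1], [1, 2]]},              # 3 nodes, 2 edges
    {"x": [[0.2] * 4] * 5, "edge_index": [[0, 1, 2, 3], [1, 2, 3, 4]]},  # 5 nodes, 4 edges
]

def list_collate(batch):
    # No concatenation: the collated "batch" is just the list of samples,
    # so ragged keys like edge_index keep their per-graph shapes.
    return batch

def batches(dataset, batch_size, collate_fn):
    # Minimal stand-in for a data loader's batching loop.
    for i in range(0, len(dataset), batch_size):
        yield collate_fn(dataset[i:i + batch_size])

for batch in batches(graphs, batch_size=2, collate_fn=list_collate):
    print(len(batch), [len(g["edge_index"][0]) for g in batch])  # prints: 2 [2, 4]
```

The forward pass would then iterate over the list and accumulate the loss across graphs before calling backward, as described in the comment above.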
🐛 Describe the bug
The PyG DataLoader can receive a custom collate_fn since it extends the torch DataLoader, but its constructor does not use the given collate_fn; it always uses Collater instead.
I'm not sure whether this is a bug or the documentation is wrong, but the PyG documentation states that any parameter accepted by torch's DataLoader can also be used with PyG's DataLoader. Yet the collate_fn parameter has no effect.
So, to actually use a custom collate_fn, do I have to extend DataLoader so that it uses the given collate_fn?
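The reported behavior and the subclassing workaround asked about here can be sketched without importing PyG. `BaseLoader`, `PyGLoader`, and `PatchedLoader` are hypothetical stand-ins for `torch.utils.data.DataLoader`, PyG's loader, and a user subclass, respectively; the sketch only illustrates the constructor pattern (a user-supplied `collate_fn` being dropped, and a subclass that honors it instead).

```python
def default_collate(batch):
    # Stand-in for PyG's Collater: tags the batch so we can tell it apart.
    return ("collated", batch)

class BaseLoader:
    """Stand-in for torch.utils.data.DataLoader: honors collate_fn."""
    def __init__(self, dataset, collate_fn=None, **kwargs):
        self.dataset = dataset
        self.collate_fn = collate_fn

class PyGLoader(BaseLoader):
    """Mimics the reported behavior: the user's collate_fn is dropped."""
    def __init__(self, dataset, **kwargs):
        kwargs.pop("collate_fn", None)  # silently discarded
        super().__init__(dataset, collate_fn=default_collate, **kwargs)

class PatchedLoader(BaseLoader):
    """Workaround subclass: use the given collate_fn when provided."""
    def __init__(self, dataset, collate_fn=None, **kwargs):
        super().__init__(dataset,
                         collate_fn=collate_fn or default_collate, **kwargs)

def my_collate(batch):
    return list(batch)

assert PyGLoader([1, 2], collate_fn=my_collate).collate_fn is default_collate
assert PatchedLoader([1, 2], collate_fn=my_collate).collate_fn is my_collate
```

In practice, another option consistent with this sketch is to bypass the PyG loader entirely and construct `torch.utils.data.DataLoader` directly with the custom collate_fn, since the plain torch loader does honor it.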
Thanks.
Versions
2.5.3