Releases: microsoft/ptgnn

v0.10.4: Use trick shown here https://pytorch.org/tutorials/recipes/recipes/tuning_guide.html#use-parameter-grad-none-instead-of-model-zero-grad-or-optimizer-zero-grad

21 Oct 21:42
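The linked PyTorch tuning-guide recipe replaces gradient zeroing with setting gradients to `None`. A minimal sketch of the trick using the standard `zero_grad(set_to_none=True)` API (the model and optimizer here are just illustrative):

```python
import torch

# Setting .grad to None (rather than zeroing it in place) skips a memset
# and lets the next backward() allocate fresh, already-populated gradients.
model = torch.nn.Linear(4, 2)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

loss = model(torch.randn(8, 4)).sum()
loss.backward()

# Equivalent to `for p in model.parameters(): p.grad = None`
optimizer.zero_grad(set_to_none=True)
```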

v0.10.3

14 Sep 08:24
Option to catch occasional CUDA OOMs to allow for more robust training.
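The shape of such a guard can be sketched as follows; `guarded_step` is a hypothetical name for illustration, not ptgnn's actual API:

```python
import torch

def guarded_step(step_fn, batch):
    """Run one training step; skip the batch on an occasional CUDA OOM."""
    try:
        return step_fn(batch)
    except RuntimeError as err:
        if "out of memory" not in str(err):
            raise  # only swallow OOMs; propagate everything else
        if torch.cuda.is_available():
            torch.cuda.empty_cache()  # release cached blocks before moving on
        return None  # caller treats None as "batch skipped"
```

Catching only `RuntimeError`s whose message mentions "out of memory" avoids masking unrelated failures as skipped batches.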

v0.10.2: Fix loss reporting with AMP

25 Jul 19:09

Fix reported training loss when using AMP
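The pitfall being fixed can be sketched with a generic AMP step (names and loss function here are assumptions, not ptgnn's trainer API): `GradScaler` multiplies the loss before `backward()`, so the value to report must come from the unscaled loss tensor, never from `scaler.scale(loss)`:

```python
import torch
from torch.cuda.amp import GradScaler, autocast

def amp_step(model, optimizer, scaler, inputs, targets):
    optimizer.zero_grad()
    with autocast(enabled=torch.cuda.is_available()):
        loss = torch.nn.functional.mse_loss(model(inputs), targets)
    scaler.scale(loss).backward()  # backprop through the *scaled* loss
    scaler.step(optimizer)
    scaler.update()
    return loss.item()  # report the true (unscaled) training loss
```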

Autosaving of optimizer state

22 Jul 08:30
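The idea behind checkpointing optimizer state can be sketched as below (function names are hypothetical): saving the optimizer's `state_dict` alongside the weights lets a resumed run keep its momentum / Adam moment estimates instead of restarting them from zero:

```python
import torch

def save_checkpoint(path, model, optimizer):
    torch.save(
        {"model": model.state_dict(), "optimizer": optimizer.state_dict()},
        path,
    )

def load_checkpoint(path, model, optimizer):
    checkpoint = torch.load(path)
    model.load_state_dict(checkpoint["model"])
    optimizer.load_state_dict(checkpoint["optimizer"])
```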

Multi-GPU training

14 Jul 13:44
27b3102
Merge pull request #13 from microsoft/dev/miallama/distributedtrainer

A multi-GPU distributed trainer
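A single-process sketch of the `DistributedDataParallel` setup such a trainer builds on (details here are illustrative, not ptgnn's code); a real run launches one process per GPU, e.g. via torchrun, each with its own rank and a shared world size:

```python
import os
import tempfile
import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

def make_ddp_model(model, rank=0, world_size=1):
    if not dist.is_initialized():
        # File-based rendezvous keeps this sketch runnable without networking.
        init_file = os.path.join(tempfile.mkdtemp(), "ddp_init")
        dist.init_process_group(
            backend="gloo",  # use "nccl" for multi-GPU training
            init_method=f"file://{init_file}",
            rank=rank,
            world_size=world_size,
        )
    # DDP all-reduces gradients across processes during backward()
    return DDP(model)
```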

v0.9.2: Merge pull request #8 from microsoft/dev/miallama/pna-layer

12 Mar 08:00
207074d

v0.9.1

23 Feb 12:47
66cf9ea
Update setup.py

GNNs now allow edge attributes beyond edge types

23 Feb 12:28
7a23ce4
Merge pull request #9 from microsoft/dev/miallama/edge-features

Allow using arbitrary edge features in GNNs
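The general pattern can be illustrated with a minimal message-passing layer (this is a sketch, not ptgnn's actual layer): each edge carries an arbitrary real-valued feature vector that is concatenated with the endpoint states when computing the message:

```python
import torch

class EdgeFeatureMessagePassing(torch.nn.Module):
    def __init__(self, node_dim, edge_dim):
        super().__init__()
        self.message = torch.nn.Linear(2 * node_dim + edge_dim, node_dim)

    def forward(self, x, edge_index, edge_attr):
        # x: [num_nodes, node_dim], edge_index: [2, num_edges],
        # edge_attr: [num_edges, edge_dim]
        src, dst = edge_index
        messages = self.message(
            torch.cat([x[src], x[dst], edge_attr], dim=-1)
        )
        out = torch.zeros_like(x)
        out.index_add_(0, dst, messages)  # sum messages per target node
        return out
```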

v0.8.7

07 Dec 19:12
16c7ce6
Update pre-commit.yml

v0.8.6: Merge pull request #7 from microsoft/mallamanis/simplify-saving

30 Nov 09:58
2ef1159
Simplify model saving when the parent folder doesn't exist.
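The behavior being simplified can be sketched as follows (the helper name is hypothetical): create any missing parent folders before writing, so `torch.save` never fails on a fresh output directory:

```python
import os
import torch

def save_model(model, path):
    parent = os.path.dirname(path)
    if parent:
        os.makedirs(parent, exist_ok=True)  # no-op if the folder exists
    torch.save(model.state_dict(), path)
```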