
Releases: gorgonia/gorgonia

v0.9.18

03 Dec 23:10
d7a3ce2

Possibly the last release of the 0.9 branch before the big changes coming in v0.10.0.

CI on ARM64 / API change (BatchNorm1d removed)

14 Mar 07:55
19def46

CI

CI (GitHub Actions) now uses a template system that makes it easier to upgrade across Go releases.
On top of that, it has gained a custom runner for ARM64, which led to discovering and fixing a couple of issues in the tests on ARM64.

Fixes

  • Support flat weights for the BatchNorm op (#465)
  • Fix the reset method of the tape machine (#467)
  • Fix clipping in the Adam solver (#469)
  • Fix the panic message in GlorotEtAlN64 (#470)
  • Fix the concurrent example (#472)

API change

  • Functions to create primitive Value types (NewF64, NewF32, ...) have been added (#481); see the sketch after this list
  • Breaking change: the BatchNorm1d function has been removed; the BatchNorm function now supports both 1D and 2D operations (#482)
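
Below is a minimal sketch of the new constructors, assuming NewF64 and NewF32 wrap a Go float64/float32 into a scalar gorgonia.Value; check the package documentation for the exact types they return.

```go
package main

import (
	"fmt"

	"gorgonia.org/gorgonia"
)

func main() {
	// Wrap Go primitives into scalar Values without building them by hand.
	a := gorgonia.NewF64(3.14) // assumed to return *gorgonia.F64, which implements Value
	b := gorgonia.NewF32(2.71) // assumed to return *gorgonia.F32

	// The Value interface exposes the underlying data and dtype.
	fmt.Println(a.Data(), a.Dtype()) // 3.14 float64
	fmt.Println(b.Data(), b.Dtype()) // 2.71 float32
}
```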

Clarified Semantics

31 Dec 13:10
bc72cf7

This version incorporates the clarified semantics of the tensor package; the unsafe-pointer handling has been cleaned up as well.

A small bug in SoftMax was also fixed: SoftMax no longer causes a race condition.

Bugfix release: Vectors were not properly broadcast

28 Sep 01:19
45cf447

When vectors were broadcast with a repeat of 1, one of the values was accidentally zeroed. This left very strange artifacts in neural networks.

This has now been fixed.

Complex Numbers Are Now Supported

10 Sep 18:15
a16c5b5

With the release of gorgonia.org/tensor@v0.9.11, the tensor package now supports complex numbers as well.
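
For example, a complex-valued tensor can be built with the usual constructors; this is a minimal sketch using a complex128 backing slice:

```go
package main

import (
	"fmt"

	"gorgonia.org/tensor"
)

func main() {
	// A 2x2 dense tensor backed by complex128 data.
	t := tensor.New(
		tensor.WithShape(2, 2),
		tensor.WithBacking([]complex128{1 + 2i, 3 - 1i, 1i, 2}),
	)
	fmt.Println(t)
}
```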

New Experimental GoMachine

06 Aug 14:36
41f024d

This release includes the new, experimental implementation of GoMachine.

Upsample2D is added as a function

18 Jun 18:39
9f5ade0

The Upsample2D operator has been added by @cpllbstr. It is similar to the corresponding operator in PyTorch: https://pytorch.org/docs/master/generated/torch.nn.Upsample.html
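
A minimal sketch of how the operator might be used, assuming Upsample2D takes an input node in NCHW layout plus an integer scale parameter, and that the output shape is inferred at graph-construction time; the exact semantics of the scale argument should be checked against the documentation.

```go
package main

import (
	"fmt"

	"gorgonia.org/gorgonia"
	"gorgonia.org/tensor"
)

func main() {
	g := gorgonia.NewGraph()

	// A symbolic 4D input in NCHW layout (batch, channels, height, width).
	x := gorgonia.NewTensor(g, tensor.Float64, 4,
		gorgonia.WithShape(1, 3, 8, 8), gorgonia.WithName("x"))

	// Upsample2D enlarges the spatial (height, width) dimensions,
	// similar to torch.nn.Upsample with nearest-neighbour interpolation.
	y, err := gorgonia.Upsample2D(x, 2)
	if err != nil {
		panic(err)
	}
	fmt.Println(y.Shape()) // the inferred output shape, with enlarged H and W
}
```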

Fixed some shape inference issues

15 Jun 19:24
15014b3

Thanks to the great work by @wzzhu, shape inference is now a bit more robust. It goes back to the original Gorgonia understanding of shapes, in which reductions do not aggressively squeeze the dimensions.

Repeat is deoptimized

10 Apr 09:10
0640ff1

In the previous version, the repeatOp was a compound operation. It effectively had this function signature: func repeat(a, nTimes *Node, axes ...int). So you could do something like repeat(a, 300, 1, 2, 3), in which a is repeated 300 times across axes 1, 2, and 3.

This has been deoptimized so that the signature is effectively func repeat(a, repeat *Node, axis int). The reason for this deoptimization is that, upon further analysis of what the function actually does, it simply calls tensor.Repeat many times, which causes many new tensors to be allocated. But the whole point of symbolic operations is that we can preallocate ahead of time.

This deoptimization allows the repeatOp to call tensor.RepeatReuse, which lets a repeat operation reuse preallocated values, leading to fewer allocations and improved performance.
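
A minimal sketch of the difference at the tensor level, assuming tensor.RepeatReuse takes the preallocated destination tensor as its second argument:

```go
package main

import (
	"fmt"

	"gorgonia.org/tensor"
)

func main() {
	// A (2, 1) column to be repeated 3 times along axis 1.
	a := tensor.New(tensor.WithShape(2, 1), tensor.WithBacking([]float64{1, 2}))

	// tensor.Repeat allocates a fresh tensor on every call.
	r1, err := tensor.Repeat(a, 1, 3)
	if err != nil {
		panic(err)
	}

	// tensor.RepeatReuse writes into a preallocated tensor instead,
	// which is what the single-axis repeatOp can now take advantage of.
	reuse := tensor.New(tensor.WithShape(2, 3), tensor.WithBacking(make([]float64, 6)))
	r2, err := tensor.RepeatReuse(a, reuse, 1, 3)
	if err != nil {
		panic(err)
	}

	fmt.Println(r1)
	fmt.Println(r2) // same values, written into the preallocated tensor
}
```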

Bugfix: Dropout

25 Mar 21:35
3dc3784

Dropout had a long-standing bug that was fixed by @MarkKremer.