
Releases: facebookincubator/flowtorch

Fixed duplicate parameters bug

27 Apr 05:18
  • Fixed a bug in distributions.Flow.parameters() where it returned duplicate parameters (a quick verification sketch follows this list)
  • Several tutorials converted from .mdx to .ipynb format in anticipation of a new tutorial system
  • Removed yarn.lock
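
As a quick check of the `parameters()` fix, a hypothetical helper like the one below (not part of FlowTorch) counts distinct `Parameter` objects; duplicates matter because, for instance, handing them to a `torch.optim` optimizer triggers PyTorch's duplicate-parameter warning.

```python
import torch

def has_duplicate_parameters(module) -> bool:
    """Hypothetical helper: True if the same Parameter object is yielded twice."""
    params = list(module.parameters())
    return len(params) != len({id(p) for p in params})

# With this release, a flowtorch `distributions.Flow` instance should satisfy:
#   assert not has_duplicate_parameters(flow)
```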

New class `bij.Invert`, and `Bijector`s are now `nn.Module`s

25 Apr 05:10

This release adds two minor features.

A new class flowtorch.bijectors.Invert can be used to swap the forward and inverse operators of a Bijector. This is useful, for example, for turning an Inverse Autoregressive Flow (IAF) into a Masked Autoregressive Flow (MAF).
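
Below is a minimal sketch of the intended usage. The names `bij.AffineAutoregressive` (as the IAF-style bijector) and `dist.Flow` (as the transformed distribution) are assumptions based on the FlowTorch examples and may differ from your installed version.

```python
import torch
import flowtorch.bijectors as bij
import flowtorch.distributions as dist

# Assumed names; check the documentation for your installed version.
base = torch.distributions.Independent(
    torch.distributions.Normal(torch.zeros(2), torch.ones(2)), 1
)

iaf = bij.AffineAutoregressive()              # fast sampling, sequential density evaluation
maf = bij.Invert(bij.AffineAutoregressive())  # swapped: fast density evaluation, sequential sampling

maf_flow = dist.Flow(base, maf)
x = maf_flow.sample(torch.Size([8]))
log_p = maf_flow.log_prob(x)
```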

Bijector objects are now nn.Modules, which, amongst other benefits, allows easy saving and loading of state.
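
Since a `Bijector` is an `nn.Module`, the standard `state_dict` checkpointing pattern applies. The sketch below assumes that wrapping the bijector in a `Flow` initializes its (lazy) parameters; the exact initialization step may differ.

```python
import torch
import flowtorch.bijectors as bij
import flowtorch.distributions as dist

base = torch.distributions.Independent(
    torch.distributions.Normal(torch.zeros(2), torch.ones(2)), 1
)

# Assumption: constructing a Flow initializes the bijector's lazy parameters.
bijector = bij.AffineAutoregressive()
flow = dist.Flow(base, bijector)

# Standard nn.Module checkpointing now works on the bijector itself.
torch.save(bijector.state_dict(), "bijector.pt")

restored = bij.AffineAutoregressive()
dist.Flow(base, restored)  # initialize parameters with matching shapes
restored.load_state_dict(torch.load("bijector.pt"))
```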

Fixed bug in `bijectors.ops.Spline`

03 Mar 20:10

This small release fixes a bug in bijectors.ops.Spline where the sign of log(det(J)) was inverted for the .inverse method. It also fixes the unit tests so that they pick up this error in the future.
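
For context, the convention the fix restores follows directly from the inverse function theorem: the log-determinant reported by `.inverse` must be the negation of the one for the forward pass, evaluated at the corresponding point.

```latex
% For an invertible f with y = f(x):
\log\left|\det J_{f^{-1}}(y)\right| = -\log\left|\det J_{f}(x)\right|
```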

Caching of `x`, `y = f(x)`, and `log|det(J)|`

03 Feb 02:07

In this release, we add caching of intermediate values for Bijectors.

In practice, this means you can often reduce computation by calculating log|det(J)| at the same time as y = f(x). It is also useful for performing variational inference with Bijectors that don't have an explicit inverse. The mechanism by which this is achieved is a subclass of torch.Tensor called BijectiveTensor that bundles together (x, y, context, bijector, log_det_J).
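
The sketch below is not the actual BijectiveTensor implementation, but it illustrates the idea with a toy elementwise exp bijector: the tensor returned by the forward pass carries its input and log|det(J)| along with it, so a later density evaluation can skip the inverse pass entirely.

```python
import torch

class CachingTensor(torch.Tensor):
    """Illustrative stand-in for BijectiveTensor: a Tensor that remembers
    the x it came from and the log|det J| of the transformation."""

    @staticmethod
    def wrap(y: torch.Tensor, x: torch.Tensor, log_det_j: torch.Tensor) -> "CachingTensor":
        obj = y.as_subclass(CachingTensor)  # view y as the subclass, sharing storage
        obj.parent_x = x
        obj.log_det_j = log_det_j
        return obj

def exp_forward(x: torch.Tensor) -> CachingTensor:
    """Toy bijector y = exp(x); its log|det J| = sum(x) falls out of the same pass."""
    return CachingTensor.wrap(torch.exp(x), x=x, log_det_j=x.sum(-1))

def flow_log_prob(base: torch.distributions.Distribution, y: torch.Tensor) -> torch.Tensor:
    """Change-of-variables density that reuses the cached (x, log|det J|) if present."""
    if isinstance(y, CachingTensor):
        x, ldj = y.parent_x, y.log_det_j  # cache hit: no inverse pass needed
    else:
        x = torch.log(y)                  # cache miss: fall back to the explicit inverse
        ldj = x.sum(-1)
    return base.log_prob(x) - ldj

base = torch.distributions.Independent(
    torch.distributions.Normal(torch.zeros(3), torch.ones(3)), 1
)
y = exp_forward(base.sample(torch.Size([4])))
print(flow_log_prob(base, y))  # uses the cached values
```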

Special shout out to @vmoens for coming up with this neat solution and taking the implementation lead! Looking forward to your future contributions 🥳

Initial Release!

18 Nov 22:15

Implementations of Inverse Autoregressive Flow and Neural Spline Flow.

Basic content for the website.

Some unit tests for bijectors and distributions.
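
For a sense of what this first release enables, here is a minimal hedged sketch. It assumes the Neural Spline Flow bijector is exposed as `bij.Spline` (the later notes above reference `bijectors.ops.Spline`) and that `dist.Flow` wraps a base distribution; names and signatures may differ in this early version.

```python
import torch
import flowtorch.bijectors as bij
import flowtorch.distributions as dist

# Assumed names; the exact module layout may differ in this early version.
base = torch.distributions.Independent(
    torch.distributions.Normal(torch.zeros(2), torch.ones(2)), 1
)

nsf = dist.Flow(base, bij.Spline())  # Neural Spline Flow
x = nsf.sample(torch.Size([16]))     # draw samples from the flow
print(nsf.log_prob(x))               # and score them under the flow
```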