
Fixed bug in bijectors.ops.spline.Spline and unit test for log(det(J)) (#100)

Summary:
### Motivation
From other work, I discovered that training Neural Spline Flows was not working as expected: they were unable to learn even simple toy distributions. This PR fixes the underlying bug, as well as the reason the unit tests were not catching it.

### Changes proposed
I changed the sign of the `log(det(J))` returned by the `inverse` method of `bijectors.ops.spline.Spline`, and ensured the unit tests do not use cached values of `log(det(J))` (via `BijectiveTensor`) when comparing the `log(det(J))` of the forward method to that of the inverse one.
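As a minimal sketch of why the sign must flip (a toy scalar bijector, not flowtorch code): for any bijector `f`, the log-determinant of the inverse map's Jacobian at `y = f(x)` is the negative of the forward log-determinant at `x`. Using `y = exp(x)` makes the identity that the new unit test checks easy to verify by hand:

```python
import math

def forward(x):
    # Toy bijector: y = exp(x), so log|dy/dx| = log(exp(x)) = x
    y = math.exp(x)
    log_detJ = x
    return y, log_detJ

def inverse(y):
    # Inverse map: x = log(y), so log|dx/dy| = -log(y) = -x,
    # i.e. the negated forward log|det J| (the sign this PR fixes)
    x = math.log(y)
    log_detJ = -x
    return x, log_detJ

x = 0.7
y, J_fwd = forward(x)
x_back, J_inv = inverse(y)
assert abs(x_back - x) < 1e-12       # round-trip recovers x
assert abs(J_fwd + J_inv) < 1e-12    # forward and inverse log|det J| cancel
```

Returning the forward-sign value from the inverse path makes the two terms add instead of cancel, which silently corrupts the flow's log-density during training.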

Pull Request resolved: #100

Test Plan: Run `pytest tests/` and try the Neural Spline Flow example in the theory tutorials.

Reviewed By: ToddSmall

Differential Revision: D34616134

Pulled By: stefanwebb

fbshipit-source-id: d29a94fd80b2e32a920da194a3c75c05c382e461
stefanwebb authored and facebook-github-bot committed Mar 3, 2022
1 parent 45e91ca commit 4992731
Showing 2 changed files with 7 additions and 6 deletions.
5 changes: 1 addition & 4 deletions flowtorch/bijectors/ops/spline.py
@@ -53,10 +53,7 @@ def _inverse(
     self, y: torch.Tensor, params: Optional[Sequence[torch.Tensor]]
 ) -> Tuple[torch.Tensor, torch.Tensor]:
     x_new, log_detJ = self._op(y, params, inverse=True)
-
-    # TODO: Should I invert the sign of log_detJ?
-    # TODO: A unit test that compares log_detJ from _forward and _inverse
-    return x_new, _sum_rightmost(log_detJ, self.domain.event_dim)
+    return x_new, _sum_rightmost(-log_detJ, self.domain.event_dim)

 def _log_abs_det_jacobian(
     self,
8 changes: 6 additions & 2 deletions tests/test_bijector.py
@@ -108,12 +108,16 @@ def test_inverse(flow, epsilon=1e-5):
     x_true = torch.distributions.transform_to(bij.domain)(x_true)

     y = bij.forward(x_true)
-    J_1 = y.log_detJ
     y = y.detach_from_flow()

     x_calculated = bij.inverse(y)
-    J_2 = x_calculated.log_detJ
     x_calculated = x_calculated.detach_from_flow()

     assert (x_true - x_calculated).abs().max().item() < epsilon
+
+    # Test that Jacobian after inverse op is same as after forward
+    J_1 = bij.log_abs_det_jacobian(x_true, y)
+    J_2 = bij.log_abs_det_jacobian(x_calculated, y)
+    assert (J_1 - J_2).abs().max().item() < epsilon
