Releases: facebookincubator/flowtorch
Fixed duplicate parameters bug
- Fixed a bug in `distributions.Flow.parameters()` where it returned duplicate parameters
- Converted several tutorials from `.mdx` to `.ipynb` format in anticipation of the new tutorial system
- Removed `yarn.lock`
New class `bijectors.Invert`, and `Bijector`s are now `nn.Module`s
This release adds two minor new features.
A new class `flowtorch.bijectors.Invert` can be used to swap the forward and inverse operators of a `Bijector`. This is useful for turning, for example, Inverse Autoregressive Flow (IAF) into Masked Autoregressive Flow (MAF).
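The idea can be illustrated with a minimal pure-Python sketch. Note this is not the actual flowtorch API; the `Exp` toy bijector and the method names here are illustrative stand-ins:

```python
import math

class Exp:
    """Toy bijector: y = exp(x)."""
    def forward(self, x):
        return math.exp(x)
    def inverse(self, y):
        return math.log(y)

class Invert:
    """Wraps a bijector, swapping its forward and inverse operators."""
    def __init__(self, bijector):
        self.bijector = bijector
    def forward(self, x):
        return self.bijector.inverse(x)
    def inverse(self, y):
        return self.bijector.forward(y)

# Inverting Exp gives a log-bijector without writing a new class.
log_bij = Invert(Exp())
print(log_bij.forward(math.e))  # 1.0
```

In the same way, wrapping an IAF bijector in `Invert` yields MAF: the fast sampling direction and the fast density-evaluation direction simply trade places.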
`Bijector` objects are now `nn.Module`s, which among other benefits allows easy saving and loading of state.
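The benefit comes from the `state_dict()` / `load_state_dict()` pattern that `nn.Module` provides. A self-contained sketch of that pattern (using a plain dict and `pickle` in place of `nn.Module` and `torch.save`, purely for illustration):

```python
import io
import pickle

class ToyModule:
    """Stand-in for nn.Module: parameters live in a dict, like state_dict()."""
    def __init__(self, scale=1.0):
        self.params = {"scale": scale}
    def state_dict(self):
        return dict(self.params)
    def load_state_dict(self, sd):
        self.params.update(sd)

# Save the trained state...
src = ToyModule(scale=2.5)
buf = io.BytesIO()
pickle.dump(src.state_dict(), buf)

# ...then restore it into a freshly constructed instance.
buf.seek(0)
dst = ToyModule()
dst.load_state_dict(pickle.load(buf))
print(dst.params["scale"])  # 2.5
```

With `Bijector`s as `nn.Module`s, the same round trip works with `torch.save(bijector.state_dict(), path)` and `bijector.load_state_dict(torch.load(path))`.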
Fixed bug in `bijectors.ops.Spline`
This small release fixes a bug in `bijectors.ops.Spline` where the sign of log(det(J)) was inverted for the `.inverse` method. It also fixes the unit tests so that they will catch this error in the future.
Caching of `x`, `y = f(x)`, and `log|det(J)|`
In this release, we add caching of intermediate values for `Bijector`s.
This often reduces computation, since log|det(J)| can be calculated at the same time as y = f(x). It is also useful for performing variational inference with `Bijector`s that don't have an explicit inverse. The mechanism by which this is achieved is a subclass of `torch.Tensor` called `BijectiveTensor` that bundles together (x, y, context, bijector, log_det_J).
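The caching idea can be sketched in plain Python, with a `float` subclass standing in for the `torch.Tensor` subclass. This is not the actual `BijectiveTensor` implementation; the class and attribute names are illustrative:

```python
import math

class BijectiveValue(float):
    """Toy analogue of BijectiveTensor: a value that remembers how it was made."""
    def __new__(cls, y, x=None, bijector=None, log_det_j=None):
        obj = super().__new__(cls, y)
        obj.x, obj.bijector, obj.log_det_j = x, bijector, log_det_j
        return obj

class Exp:
    """Toy bijector y = exp(x), for which log|det(J)| = x."""
    def forward(self, x):
        y = math.exp(x)
        # Compute log|det(J)| alongside y and cache both on the result.
        return BijectiveValue(y, x=x, bijector=self, log_det_j=x)
    def inverse(self, y):
        if isinstance(y, BijectiveValue) and y.bijector is self:
            return y.x          # cache hit: the input is returned directly
        return math.log(y)      # cache miss: fall back to computing the inverse

bij = Exp()
y = bij.forward(2.0)
print(bij.inverse(y))   # 2.0, recovered from the cache
print(y.log_det_j)      # 2.0
```

Because the cached value travels with the tensor itself, downstream code needs no changes: anything expecting a plain value still works, and code that knows about the bundle can skip recomputing the inverse or the log-determinant.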
Special shout out to @vmoens for coming up with this neat solution and taking the implementation lead! Looking forward to your future contributions 🥳
Initial Release!
- Implementations of Inverse Autoregressive Flow and Neural Spline Flow.
- Basic content for website.
- Some unit tests for bijectors and distributions.