
awesome-normalizing-flows's People

Contributors

byeongkeunahn, cranmer, francois-rozet, hanlaoshi, hartikainen, hushon, iphysresearch, janosh, jejjohnson, jfcrenshaw, kleinicke, ksachdeva, mattskiff, maximevandegar, naagar, prbzrg, pre-commit-ci[bot], qazalbash, rafaelorozco, thoglu, vincentstimper


awesome-normalizing-flows's Issues

Back fill date_added

It would be good to go back through the git history and set the date_added field on some of the older items in this collection.

date_added should be mandatory for all newly added items.

Related #44.

Add SurVAE Flows

Hi, thanks for the comprehensive list and nice summary of materials. I would like to recommend SurVAE Flows for the list; I found it an interesting read.

They present a generalized framework (SurVAE Flows) which encompasses Flows (deterministic maps) and VAEs (stochastic maps). By seeing a deterministic map ($x = f(z)$) as a limiting case of a stochastic map ($x \sim p(x|z)$), the ELBO can be reinterpreted as a change-of-variables formula for stochastic maps. Moreover, stochastic maps are able to model surjections, which might be useful for incorporating bottleneck architectures into Flows. They also give a few examples of surjective layers, which can be composed with Flow layers.
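
To make the connection concrete (my own paraphrase, not the paper's exact notation): for a bijection $x = f(z)$ the exact change-of-variables likelihood is $\log p(x) = \log p(z) + \log \left|\det \frac{\partial z}{\partial x}\right|$, whereas for a stochastic map SurVAE uses the ELBO-style bound $\log p(x) \geq \mathbb{E}_{q(z|x)}\left[\log p(z) + \log p(x|z) - \log q(z|x)\right]$, which recovers the deterministic formula when $p(x|z)$ and $q(z|x)$ collapse to the bijection and its inverse.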

New Pytorch package - Jammy Flows

I would like to add the package jammy_flows to the collection (https://github.com/thoglu/jammy_flows). I developed it for normalizing-flow inference in the context of astro-particle physics (https://arxiv.org/abs/2008.05825), but I think it might be useful in general.

It stands for Joint Autoregressive Manifold (MY) flows and models a (conditional) PDF on a tensor product of manifolds. The sub-PDFs are connected autoregressively, similar to Inverse Autoregressive Flows (IAF, Kingma et al. 2016), but generalize IAF in two ways: 1) arbitrary (non-affine) coupling layers are allowed (every flow in the package is amortizable), and 2) because the couplings are general, flows on different manifolds (e.g. Euclidean space and a sphere) can be linked.

It is mostly designed for low-dimensional applications (maybe a few tens of dimensions, although simple flows like the affine flow should work reasonably at much higher dimensionality) and should be simple to set up.

For example, a 5-d PDF consisting of a PDF on a 3-dimensional Euclidean manifold and an autoregressively linked conditional PDF on the 2-sphere is defined like this (together they form a joint distribution on the tensor product space $\mathbb{R}^3 \times \mathcal{S}^2$):

import jammy_flows

# "e3+s2": a 3-d Euclidean manifold autoregressively linked to a 2-sphere
# "gg+n": two Gaussianization-flow layers ("gg") on the Euclidean part, flow "n" on the sphere
pdf = jammy_flows.pdf("e3+s2", "gg+n")

The first argument defines the autoregressive manifold structure, the second the exact flow layers used for each manifold.
Each flow layer is abbreviated by a letter (see below); for example, "g" stands for a Gaussianization flow layer and "n" for an autoregressive flow on the 2-sphere. The autoregressive connectivity and amortization (for conditional PDFs) are taken care of by the module; in the configuration strings, the different manifolds are joined by a "+".
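
As a purely illustrative (untested) extrapolation of this syntax, a 4-d Euclidean PDF with four Gaussianization-flow layers and no second manifold would presumably be constructed as:

pdf_euclidean = jammy_flows.pdf("e4", "gggg")  # hypothetical: "e4" = 4-d Euclidean, "gggg" = four "g" layers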

Without much tuning, you should get something that just works and behaves like a flexible PDF; however, extensive customization of flow parameters and connectivity is also possible if desired.

It implements a few state-of-the-art flows that, to my knowledge, are not yet found in other repositories (e.g. Gaussianization flows).

Currently implemented manifolds with respective flows (from the README):

Euclidean flows:

  • Generic affine flow (multivariate normal distribution) ("t")
  • Gaussianization flow, arXiv:2003.01941 ("g")
  • Hybrid of nonlinear scalings and rotations ("polynomial stretch flow") ("p")

Flows on the 1-sphere, the 2-sphere (e.g. the autoregressive flow "n" used above), on intervals and on the simplex are also implemented; see the README for the respective flow layers.

All of those can be combined in any way and the package automatically manages the connectivity.
More info can also be found in the docs.

Best,
Thorsten

LAMPE: a PyTorch package for posterior estimation which implements normalizing flows

Hello 👋,

TLDR. The lampe package implements normalizing flows with PyTorch. I believe this is relevant for this collection. I hope you like it!

I'm a researcher interested in simulation-based inference and posterior estimation. I have written a low-level library for amortized posterior estimation called lampe. Initially, LAMPE relied on nflows for its normalizing flows, but that quickly became a limitation. I was not happy with some of nflows' design choices. For instance, it is only possible to sample or evaluate batches, and most operators do not support broadcasting. It is also not possible to use networks other than the built-in ones. I considered contributing to nflows, but it seems the package is no longer actively developed.

So I decided to implement my own normalizing flows within LAMPE. The goal was to rely as much as possible on the distributions and transformations that already exist in PyTorch. Unfortunately, PyTorch distributions and transforms are not modules, meaning that they don't implement a forward method, you cannot send their parameters to the GPU with .to('cuda'), and you cannot even retrieve their parameters with .parameters(). To solve this problem, LAMPE defines two (abstract) classes: DistributionModule and TransformModule. The former is any nn.Module whose forward method returns a PyTorch Distribution; similarly, the latter is any nn.Module whose forward method returns a PyTorch Transform. Then, what is a normalizing flow? It is simply an nn.Module constructed from a base DistributionModule and a list of TransformModule.
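
To illustrate the pattern (a minimal sketch of the idea, not LAMPE's actual classes), a "distribution module" could look like this:

import torch
import torch.nn as nn

class DiagNormalModule(nn.Module):
    # An nn.Module whose forward() returns a torch Distribution, so the
    # distribution's parameters are registered, trainable, and moved by .to(device).
    def __init__(self, dim):
        super().__init__()
        self.loc = nn.Parameter(torch.zeros(dim))
        self.log_scale = nn.Parameter(torch.zeros(dim))

    def forward(self):
        return torch.distributions.Independent(
            torch.distributions.Normal(self.loc, self.log_scale.exp()), 1
        )

A "transform module" follows the same pattern, with forward() returning a Transform instead.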

This design allows for very concise implementations of quite complex flows. Currently, LAMPE implements the masked autoregressive flow (MAF), the neural spline flow (NSF), the neural autoregressive flow (NAF), and a NAF based on unconstrained monotonic neural networks (UMNN). All of these flows support coupling (2 passes for the inverse), fully autoregressive conditioning (as many passes as features), or anything in between (see Graphical Normalizing Flows). And all of that in about 800 lines of code, including whitespace and documentation. If you are interested, take a look at the transformations and flows.

Here is a small example with a neural spline flow (NSF).

>>> import lampe
>>> flow = lampe.nn.flows.NSF(7, context=16, transforms=3, hidden_features=[64] * 3, activation='ELU')
>>> flow
NSF(
  (transforms): ModuleList(
    (0): SoftclipTransform(bound=5.0)
    (1): MaskedAutoregressiveTransform(
      (base): MonotonicRQSTransform(bins=8)
      (order): [0, 1, 2, 3, 4, 5, 6]
      (params): MaskedMLP(
        (0): MaskedLinear(in_features=23, out_features=64, bias=True)
        (1): ELU(alpha=1.0)
        (2): MaskedLinear(in_features=64, out_features=64, bias=True)
        (3): ELU(alpha=1.0)
        (4): MaskedLinear(in_features=64, out_features=64, bias=True)
        (5): ELU(alpha=1.0)
        (6): MaskedLinear(in_features=64, out_features=161, bias=True)
      )
    )
    (2): MaskedAutoregressiveTransform(
      (base): MonotonicRQSTransform(bins=8)
      (order): [6, 5, 4, 3, 2, 1, 0]
      (params): MaskedMLP(
        (0): MaskedLinear(in_features=23, out_features=64, bias=True)
        (1): ELU(alpha=1.0)
        (2): MaskedLinear(in_features=64, out_features=64, bias=True)
        (3): ELU(alpha=1.0)
        (4): MaskedLinear(in_features=64, out_features=64, bias=True)
        (5): ELU(alpha=1.0)
        (6): MaskedLinear(in_features=64, out_features=161, bias=True)
      )
    )
    (3): MaskedAutoregressiveTransform(
      (base): MonotonicRQSTransform(bins=8)
      (order): [0, 1, 2, 3, 4, 5, 6]
      (params): MaskedMLP(
        (0): MaskedLinear(in_features=23, out_features=64, bias=True)
        (1): ELU(alpha=1.0)
        (2): MaskedLinear(in_features=64, out_features=64, bias=True)
        (3): ELU(alpha=1.0)
        (4): MaskedLinear(in_features=64, out_features=64, bias=True)
        (5): ELU(alpha=1.0)
        (6): MaskedLinear(in_features=64, out_features=161, bias=True)
      )
    )
    (4): Inverse(SoftclipTransform(bound=5.0))
  )
  (base): DiagNormal(loc: torch.Size([7]), scale: torch.Size([7]))
)

The flow itself is an nn.Module. To condition the flow on a context y, we call it. This returns a distribution that can be evaluated (log_prob) or sampled (sample) just like any torch distribution.

>>> y = torch.randn(16)
>>> conditioned = flow(y)
>>> conditioned.sample()
tensor([ 1.1381,  0.3619, -1.9963,  0.2681, -0.1613,  0.1885, -0.4108])
>>> conditioned.sample((5, 6)).shape
torch.Size([5, 6, 7])
>>> x = torch.randn(7)
>>> conditioned.log_prob(x)
tensor(-8.6289, grad_fn=<AddBackward0>)
>>> x = torch.randn(5, 6, 7)
>>> conditioned.log_prob(x).shape
torch.Size([5, 6])
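
For completeness, here is a minimal (hypothetical) training sketch; it only assumes the flow(y).log_prob(x) interface shown above and standard PyTorch, with a hypothetical loader of (x, y) pairs:

import torch

optimizer = torch.optim.Adam(flow.parameters(), lr=1e-3)

for x, y in loader:  # hypothetical iterable of (x, y) training pairs
    loss = -flow(y).log_prob(x).mean()  # negative conditional log-likelihood
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()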

About time series data

Has the author tried applying normalizing flows to temporal prediction? Or applying normalizing flows to time-series data?
