Comments (3)

femtomc commented on June 2, 2024

Hi @RobertAudlab! Here's one (cached) answer: https://discourse.julialang.org/t/whats-the-difference-between-gen-and-turing-for-probabilistic-programming/36624 (specialized to comparing Gen to Turing in Julia).

A non-cached answer, which will likely echo some of the points above:

Answer focused on Gen

The key conceptual contribution of Gen is the generative function interface: an abstract specification for models that describes method interfaces and associated data types. The interface is designed to expose the common operations used across many different classes of inference algorithms. This abstraction layer supports the goals you stated above, allowing a user to write custom inference algorithms. Another benefit is that users can implement this interface for new objects, enabling performance customization as required for their use case.
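
To make the idea concrete, here is a minimal Python sketch of what a generative-function-style interface can look like. The method names (`simulate`, `generate`) mirror Gen's GFI, but the signatures, return types, and the `BiasedCoin` model are simplified illustrations, not Gen's actual API:

```python
import abc
import math
import random

class GenerativeFunction(abc.ABC):
    """Sketch of a Gen-style generative function interface.

    The method names mirror Gen's GFI (simulate / generate), but the
    signatures and return types here are simplified stand-ins."""

    @abc.abstractmethod
    def simulate(self, args):
        """Sample a full trace: returns (choices, log density of choices)."""

    @abc.abstractmethod
    def generate(self, args, constraints):
        """Sample a trace consistent with `constraints`; returns
        (choices, log density, importance weight)."""

class BiasedCoin(GenerativeFunction):
    """A tiny model: one coin flip with heads-probability p."""

    def simulate(self, args):
        (p,) = args
        heads = random.random() < p
        return {"heads": heads}, math.log(p if heads else 1.0 - p)

    def generate(self, args, constraints):
        (p,) = args
        heads = constraints["heads"]  # the choice is forced by the constraint
        logp = math.log(p if heads else 1.0 - p)
        # With every choice constrained, the weight is the full log density.
        return {"heads": heads}, logp, logp

choices, logp = BiasedCoin().simulate((0.25,))
trace, score, weight = BiasedCoin().generate((0.25,), {"heads": True})
```

Because inference algorithms only call the abstract methods, a user can swap in a hand-optimized implementation of `BiasedCoin` without touching inference code; that is the performance-customization point above.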

The interface also supports composing generative functions into new generative functions. This includes combinators (generative functions that accept other generative functions as arguments and implement structured patterns of control flow over generative computations), as well as modeling languages built on generative functions, like the dynamic language that ships with Gen.jl, which supports a function-call-like abstraction for invoking generative functions as callees.
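
A combinator in this style can be sketched in a few lines of Python. The `map_combinator` below (names illustrative, not Gen's API) plays the role of Gen's Map-like combinators: it takes a kernel generative function and applies it independently to each argument, producing a composite trace and a summed score:

```python
import math
import random

def coin_kernel(p):
    """Kernel 'generative function': one weighted coin flip.
    Returns (choice, log density). Purely illustrative, not Gen's API."""
    heads = random.random() < p
    return heads, math.log(p if heads else 1.0 - p)

def map_combinator(kernel, args_list):
    """Map-style combinator: apply `kernel` independently to each
    argument tuple, collecting the choices and summing the log scores."""
    choices, total_logp = [], 0.0
    for args in args_list:
        choice, logp = kernel(*args)
        choices.append(choice)
        total_logp += logp
    return choices, total_logp

flips, score = map_combinator(coin_kernel, [(0.5,), (0.5,), (0.5,)])
```

The composite object is itself a generative function (it samples choices and reports a score), which is what makes this kind of composition close under the interface.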

But Gen.jl isn't just the concepts - it's also a (mostly) batteries-included PPL, exposing several modeling languages which implement this interface (cf. the docs and the ecosystem) and an inference standard library supporting importance sampling with custom proposals, a spectrum of customizable MCMC kernels (custom-proposal MH, involutive MH, MALA), variational inference, and several variants of sequential Monte Carlo (both in Gen.jl and in GenParticleFilters.jl).
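
As a flavor of what custom-proposal importance sampling does, here is a self-contained Python sketch on a conjugate Gaussian model (x ~ N(0,1), obs ~ N(x,1), exact posterior N(obs/2, 1/2)). The model, proposal, and function names are illustrative assumptions, not Gen's inference library:

```python
import math
import random

def normal_logpdf(x, mu, sigma):
    return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

def custom_proposal_is(obs, n=20000, seed=0):
    """Self-normalized importance sampling with a custom proposal.
    Model: x ~ N(0,1), obs ~ N(x,1). Proposal: x ~ N(obs/2, 1),
    deliberately close to the exact posterior N(obs/2, 1/2)."""
    rng = random.Random(seed)
    xs, logws = [], []
    for _ in range(n):
        x = rng.gauss(obs / 2, 1.0)                                # proposal draw
        logp = normal_logpdf(x, 0, 1) + normal_logpdf(obs, x, 1)   # model joint
        logq = normal_logpdf(x, obs / 2, 1.0)                      # proposal density
        xs.append(x)
        logws.append(logp - logq)                                  # importance weight
    m = max(logws)
    ws = [math.exp(lw - m) for lw in logws]   # stabilize before normalizing
    z = sum(ws)
    return sum(w * x for w, x in zip(ws, xs)) / z  # posterior mean estimate

est = custom_proposal_is(2.0)   # exact posterior mean is 1.0
```

The point of the interface is that the weight `logp - logq` falls out of `generate`-style operations automatically; the user only supplies the proposal.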

Short, informal version: Gen exposes a level of customization for models and inference centered on the interface described above. The interface lets users customize their generative code, supporting both optimizations and new means of composition, and it automates the tricky math that makes custom inference design hard.
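
The "tricky math" is largely bookkeeping of log densities. In Metropolis-Hastings, for instance, the acceptance ratio reduces to a difference of trace scores, which a trace-based interface can compute for the user. A hand-rolled sketch (same illustrative Gaussian model as above; none of these names are Gen's API):

```python
import math
import random

def model_logpdf(x, obs):
    """Log joint of x ~ N(0,1), obs ~ N(x,1) - the 'score' a trace-based
    interface would return automatically for a given trace."""
    def normal_logpdf(v, mu, sigma):
        return -0.5 * ((v - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
    return normal_logpdf(x, 0, 1) + normal_logpdf(obs, x, 1)

def random_walk_mh(obs, steps=20000, seed=0):
    """Symmetric-proposal Metropolis-Hastings on the latent x.
    The acceptance test is just a difference of model scores."""
    rng = random.Random(seed)
    x, samples = 0.0, []
    for _ in range(steps):
        proposal = x + rng.gauss(0, 0.5)
        # Accept with probability exp(score(proposed) - score(current)).
        if math.log(rng.random()) < model_logpdf(proposal, obs) - model_logpdf(x, obs):
            x = proposal
        samples.append(x)
    return samples

samples = random_walk_mh(2.0)   # exact posterior is N(1.0, sd ~ 0.707)
```

With asymmetric or involutive proposals the correction terms get harder to derive by hand, which is exactly where automating the math pays off.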

Answer for Pyro & PyMC

I think some of my points above should provide reference for comparison, but here are a few more direct observations:

  • PyMC seems to be focused on MCMC methods (although the docs mention SMC as well). I also suspect it may be difficult to implement performance optimizations for models (if it is even possible to define new model objects) -- but I can't prove this. Gen's interface supports a wider range of inference families.

  • Pyro supports a wider range of inference families than PyMC -- but I've had a tougher time identifying the core interface for model objects in Pyro. I think models are of the class: Python + effect handlers. Effect handlers in Pyro appear to mix model and inference code together. This is a design decision - Gen makes a different one (and I personally find Gen's more aesthetically pleasing from a programming perspective, but others may differ!). Pyro provides some nice automated guide generation functionality for VI; Gen doesn't provide this out of the box, but there's no reason (within the assumptions of the interface) that Gen couldn't accommodate it. Pyro is also in Python, so there may be ecosystem / accessibility benefits for researchers whose stack is Python-based. But we have a few variants of Gen in Python which we are currently spending many cycles on, so this point will soon apply to both systems.
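
To illustrate the "Python + effect handlers" characterization, here is a miniature effect-handler system: model code issues `sample` effects, and a handler on a stack intercepts them (here, to record a trace). This mimics Pyro's design in spirit only; Pyro's actual handler API (`pyro.poutine`) differs:

```python
import math
import random

_HANDLER_STACK = []  # active effect handlers, innermost last

def sample(name, sampler, logpdf):
    """A 'sample' effect: handlers on the stack may intercept it.
    With no handlers installed, it just draws from the sampler."""
    for handler in reversed(_HANDLER_STACK):
        result = handler(name, sampler, logpdf)
        if result is not None:
            return result
    return sampler()

class trace_handler:
    """Handler that records every sampled value and its log density."""
    def __init__(self):
        self.trace = {}
    def __enter__(self):
        _HANDLER_STACK.append(self._handle)
        return self
    def __exit__(self, *exc):
        _HANDLER_STACK.pop()
    def _handle(self, name, sampler, logpdf):
        value = sampler()
        self.trace[name] = (value, logpdf(value))
        return value

def model():
    # x ~ N(0, 1), written as ordinary Python plus a sample effect
    return sample("x", lambda: random.gauss(0, 1),
                  lambda v: -0.5 * v * v - 0.5 * math.log(2 * math.pi))

with trace_handler() as t:
    model()
```

The model and the inference-side behavior (tracing, conditioning, replaying) share the same `sample` call site, which is the sense in which model and inference code are mixed together.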

Gen in other languages

Note that the key conceptual layer of the generative function interface is language agnostic. We have successfully ported Gen (as a framework of concepts for modeling and inference) to Python and C++ (currently closed source). There will be more information about these variants within the next year or so (and sooner, for specific variants).

from gen.jl.

ztangent commented on June 2, 2024

I'll add that I think Pyro is much more specialized / optimized for (amortized) variational inference as an inference strategy compared to Gen (arguably at the expense of making it convenient to implement other kinds of inference algorithms, such as custom MCMC and SMC algorithms). This is in part because Pyro was developed around the concept of deep probabilistic programming, whereas that concept is not as central to Gen's design.

This manifests in the way a lot of Pyro models (as shown in their tutorials) are written to accept batched inputs, because this allows for faster batched training of both model parameters and variational parameters. Pyro also makes use of its underlying PyTorch / JAX backends to make training more efficient and to enable easier specification of neural networks as variational families, whereas Gen.jl (in its current form) is not as optimized for variational inference. Among other things, Gen.jl currently does not support automatic reparameterization of random variables or marginalization of discrete random variables, whereas Pyro does in some cases. This means the VI algorithms Gen.jl currently supports are limited to black-box VI (which suffers from higher-variance gradients than VI methods that make use of reparameterization, such as ADVI).
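
The variance gap is easy to see on a toy objective. The sketch below (purely illustrative; real VI engines are far more elaborate) compares the score-function (REINFORCE) estimator used by black-box VI with the reparameterization estimator, for the gradient d/dmu of E over x~N(mu,1) of x^2, whose exact value is 2*mu:

```python
import math
import random

def gradient_estimators(mu, n=5000, seed=0):
    """Compare two unbiased estimators of d/dmu E_{x~N(mu,1)}[x^2] = 2*mu."""
    rng = random.Random(seed)
    score, reparam = [], []
    for _ in range(n):
        eps = rng.gauss(0, 1)
        x = mu + eps
        # Score-function estimator: f(x) * d/dmu log q(x; mu) = x^2 * (x - mu)
        score.append(x * x * (x - mu))
        # Reparameterization estimator: differentiate through x = mu + eps
        reparam.append(2 * x)

    def mean(v):
        return sum(v) / len(v)

    def var(v):
        m = mean(v)
        return sum((u - m) ** 2 for u in v) / len(v)

    return (mean(score), var(score)), (mean(reparam), var(reparam))

(s_mean, s_var), (r_mean, r_var) = gradient_estimators(1.0)
# Both estimators target 2*mu = 2, but the reparameterized one has
# much lower per-sample variance (analytically 4 vs 30 at mu = 1).
```

This is the sense in which black-box VI pays for its generality: without reparameterization, each gradient step is much noisier for the same sample budget.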

Note that none of the above is a strict limitation of Gen.jl -- it should be possible to write custom compilers and inference algorithms that perform some of the optimizations mentioned above, and @femtomc is actually working on a JAX implementation of Gen that should enjoy some of the parallelization benefits that (Num)Pyro does! So things may change in the future :)

ztangent commented on June 2, 2024

Also, we hadn't enabled the feature previously, but I'm going to turn this issue into a discussion question, since it makes more sense to have Q&A there!
