Comments (7)
> Yes, definitely part of it. I was wondering if AeMCMC has the structure to emulate/automate common conjugate distributions as in this Wiki table.
Absolutely; that's a big part of this work (e.g. #48, #46). The current conjugation implementations are mostly framed as kanren relations, but there are some challenges with that approach that we need to address (e.g. #108); however, we really don't need to use that approach. We can implement these conjugates (in one "direction") using plain Aesara rewrites.
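To make that concrete, here's a rough sketch (not AeMCMC's actual code) of what the gamma-Poisson case could look like written directly against the Aesara graph. The helper name and the explicit observed-value argument are mine, and I'm assuming the rate parameterization for `gamma`; adjust accordingly if it takes a scale.

```python
from aesara.tensor.random.basic import GammaRV, PoissonRV


def gamma_poisson_posterior_step(srng, Y_rv, y_vv):
    """Sketch of a one-direction gamma-Poisson conjugate update.

    Given ``Y_rv ~ Poisson(lam)`` with ``lam ~ Gamma(alpha, beta)`` (rate
    parameterization assumed), return a graph that samples the conjugate
    posterior of ``lam`` given the observed value ``y_vv``.
    """
    node = Y_rv.owner
    if node is None or not isinstance(node.op, PoissonRV):
        raise ValueError("Y_rv is not Poisson-distributed")

    # `RandomVariable` node inputs are (rng, size, dtype, *dist_params)
    lam = node.inputs[3]
    if lam.owner is None or not isinstance(lam.owner.op, GammaRV):
        raise ValueError("the Poisson rate is not gamma-distributed")

    alpha, beta = lam.owner.inputs[3:]

    # A Gamma(alpha, beta) prior with a single Poisson observation y yields a
    # Gamma(alpha + y, beta + 1) posterior
    return srng.gamma(alpha + y_vv, beta + 1.0)
```

A real version would have to handle multiple/batched observations, broadcasting, etc., but that's the basic shape of a "one direction" rewrite.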
---
You're right, that one identity won't work in general. What I'm talking about is the general approach of operating in a non-(log-)density space, for which that convolution identity is just a single example.
(Sorry, been on the last part of a big project for a little while now; will provide a real response as soon as I can!)
---
Does the following illustrate the problem you're talking about—or part of it?
```python
import aemcmc
import aesara
import aesara.tensor as at

srng = at.random.RandomStream(0)

# A single gamma-distributed rate shared by two Poisson observations
lam_rv = srng.gamma(1., 1., name="lam")
Y1_rv = srng.poisson(lam=lam_rv, name="Y1")
Y2_rv = srng.poisson(lam=lam_rv, name="Y2")

# Value variables for the two observations
y1_vv = Y1_rv.clone()
y1_vv.name = "y1"
y2_vv = Y2_rv.clone()
y2_vv.name = "y2"

sampler, initial_values = aemcmc.construct_sampler({Y1_rv: y1_vv, Y2_rv: y2_vv}, srng)
p_posterior_step = sampler.sample_steps[lam_rv]

# Only one posterior sampling step is constructed for `lam_rv`...
sampler.sample_steps.keys()
# dict_keys([lam])

# ...and it only uses the value of `Y2_rv` (i.e. `y2`)
aesara.dprint(p_posterior_step)
# gamma_rv{0, (0, 0), floatX, False}.1 [id A]
#  |RandomGeneratorSharedVariable(<Generator(PCG64) at 0x7FDEAC9F9660>) [id B]
#  |TensorConstant{[]} [id C]
#  |TensorConstant{11} [id D]
#  |Elemwise{add,no_inplace} [id E]
#  | |TensorConstant{1.0} [id F]
#  | |y2 [id G]
#  |Elemwise{true_divide,no_inplace} [id H]
#  | |TensorConstant{1.0} [id I]
#  | |Elemwise{add,no_inplace} [id J]
#  |   |TensorConstant{1.0} [id F]
#  |   |TensorConstant{1} [id K]
```
This example seems like it might require a new relation (e.g. the sum of independent Poissons being a Poisson) in order to give the expected result. At the very least, it should be aware of the other observed Poisson.
---
> Yes, definitely part of it. I was wondering if AeMCMC has the structure to emulate/automate common conjugate distributions as in this Wiki table. These likelihoods assume some […]

I imagine that, eventually, instead of having `Y1_rv`, `Y2_rv` in the pseudo-code above, we could have some batched n-vector `Y_rv`?
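For what it's worth, a rough sketch of that batched variant, continuing the session from the example above (whether the current rewrites would recognize the conjugacy in this form is exactly the open question):

```python
# Hypothetical batched version of the model above: a single length-2
# Poisson observation vector in place of `Y1_rv`/`Y2_rv`
Y_rv = srng.poisson(lam=lam_rv, size=(2,), name="Y")

y_vv = Y_rv.clone()
y_vv.name = "y"

sampler, initial_values = aemcmc.construct_sampler({Y_rv: y_vv}, srng)
```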
---
> This example seems like it might require a new relation (e.g. the sum of independent Poissons being a Poisson) in order to give the expected result. At the very least, it should be aware of the other observed Poisson.

Why would we need to recognize that a sum of independent Poissons is a Poisson? The likelihood portion of the posterior kernel is the same as a sum of Poissons, but maybe that's just a coincidence here? Here are some math scribbles:

$$
\pi_n(\lambda)
\propto_\lambda \pi_0(\lambda)\, P(Y_1 = y_1 \mid \lambda)\, P(Y_2 = y_2 \mid \lambda)
= \pi_0(\lambda)\, \frac{\lambda^{y_1} \exp(-\lambda)}{y_1!}\, \frac{\lambda^{y_2} \exp(-\lambda)}{y_2!}
\propto_\lambda \pi_0(\lambda)\, \frac{\lambda^{y_1 + y_2} \exp(-2\lambda)}{y_1!\, y_2!}
\propto_\lambda \pi_0(\lambda)\, \lambda^{y_1 + y_2} \exp(-2\lambda),
$$

where the kernel of $Y_1 + Y_2$ would have been $\frac{\lambda^{y_1 + y_2} \exp(-2\lambda)}{(y_1 + y_2)!} \propto_\lambda \lambda^{y_1 + y_2} \exp(-2\lambda)$ as well (the denominators are different). Here $\pi_n(\cdot)$ and $\pi_0(\cdot)$ denote the posterior and prior distributions, respectively.
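A quick numerical check of those scribbles, assuming a Gamma(1, 1) prior (rate parameterization) and using SciPy for the densities:

```python
import numpy as np
from scipy import stats

y1, y2 = 3, 5
lam = np.linspace(0.1, 15.0, 500)

# Unnormalized posterior: Gamma(1, 1) prior (i.e. exp(-lam)) times both
# Poisson likelihoods
unnorm = (
    stats.gamma.pdf(lam, 1.0, scale=1.0)
    * stats.poisson.pmf(y1, lam)
    * stats.poisson.pmf(y2, lam)
)

# The conjugate result the scribbles imply: Gamma(1 + y1 + y2) with rate 1 + 2
conj = stats.gamma.pdf(lam, 1.0 + y1 + y2, scale=1.0 / 3.0)

# The two agree up to a constant in lambda, i.e. they share the same kernel
ratio = unnorm / conj
assert np.allclose(ratio, ratio[0])
```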
---
> Why would we need to recognize that a sum of independent Poissons is a Poisson?

That's just one fairly straightforward approach to handling this exact situation, but there are likely many others. The reason that approach is nice: we need only add one simple additional rewrite, and it should immediately work with the current rewrites to produce the desired result. In other words, we wouldn't need to change how things work.

> The likelihood portion of the posterior kernel is the same as a sum of Poissons, but maybe that's just a coincidence here? Here are some math scribbles: […]

Yeah, it looks like you're deriving the rewrite I proposed adding, but in "log-density space". Our rewrite process operates primarily in a "measurable function/random variable space": i.e. on random variables like $Z = X + Y$, where $X \sim \operatorname{Pois}(\lambda_x)$ and $Y \sim \operatorname{Pois}(\lambda_y)$, by adding/using/replacing $Z \sim \operatorname{Pois}(\lambda_x + \lambda_y)$ in our IR graphs. After that, it becomes possible/easy to formulate $(\beta \mid Z = z)$ for any $\beta$ upon which $X$ and/or $Y$ is dependent. Using this approach, we avoid all the extra machinery needed to perform the log-density derivation/proof of […]
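A minimal sketch of what such a rewrite could look like as a plain Aesara node rewriter, assuming a recent Aesara where these are registered via `node_rewriter` (this is not AeMCMC's actual code, and the independence check is deliberately omitted):

```python
import aesara.tensor as at
from aesara.graph.rewriting.basic import node_rewriter
from aesara.scalar.basic import Add
from aesara.tensor.elemwise import Elemwise
from aesara.tensor.random.basic import PoissonRV


@node_rewriter([Elemwise])
def local_sum_of_poissons(fgraph, node):
    """Replace X + Y, where X ~ Pois(lam_x) and Y ~ Pois(lam_y),
    with Z ~ Pois(lam_x + lam_y)."""
    # Match a two-argument elementwise addition
    if not isinstance(node.op.scalar_op, Add) or len(node.inputs) != 2:
        return None

    x, y = node.inputs
    if any(v.owner is None or not isinstance(v.owner.op, PoissonRV) for v in (x, y)):
        return None

    # NB: a real rewrite would also need to establish that `x` and `y` are
    # independent and unobserved; those checks are omitted here
    lam_x = x.owner.inputs[3]
    lam_y = y.owner.inputs[3]

    z = at.random.poisson(lam_x + lam_y, rng=x.owner.inputs[0])
    return [z]
```

In the example above, $(Y_1 + Y_2 \mid \lambda) \sim \operatorname{Pois}(2\lambda)$ with observed value $y_1 + y_2$, so the existing gamma-Poisson step applied to that sum should yield the $\text{Gamma}(\alpha + y_1 + y_2,\ \beta + 2)$ posterior the scribbles derive.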
---
Thanks for the clarifications above.

> Our rewrite process operates primarily in a "measurable function/random variable space": i.e. on random variables like $Z = X + Y$, where $X \sim \operatorname{Pois}(\lambda_x)$ and $Y \sim \operatorname{Pois}(\lambda_y)$ […]

I think that I'm missing something here. I was referring to […]
---
Related Issues (20)
- Add `sample_prior`, `sample` and `sample_posterior_predictive` functions
- Add automatic Laplace approximation
- Use miniKanren to walk through mathematically equivalent model representations
- GCC flag issue with macOS
- Dynamically generate lists in documentation
- Make sure `kanren` rewrites account for `SharedVariable.default_update`s
- Add AeMCMC logo to RTD?
- Demo automatic MAP estimation based on proximal operators
- Assign FFBS sampler to variables in HMM models
- Refactor NUTS builder to use the new `logprob` interface
- Set up scheduled nightly builds
- Add a utility function to sample using `scan`
- Add examples to the README
- Add a function to change the scan order of the sampling steps returned by `construct_sampler`
- Update the README with the new sampler interface
- Add standard HMC/NUTS defaults and options
- `construct_sampler` does not support transformed observables
- Logistic regression with time-varying coefficients: cannot construct sampler
- Replacements need to apply to `SharedVariable.default_update` and `OpFromGraph`s