Comments (7)
acceptance_rate
Would it be accept_prob instead? To me, accept_prob is a more meaningful term than acceptance_rate. It also aligns with the target_accept_prob arg. In addition, num_steps is a useful dynamic quantity which I usually print out to see how the model and inference are behaving. Please consider it if the progbar has some space. :)
heuristic_step_size: The name makes it appear as though it is a float arg rather than a boolean arg. I don't have a better idea at the moment, though.
I am keeping up with the progress of lax.scan in the differentiable-scan branch of JAX. So far, compile times are really fast. In addition, for the hmm potential_fn evaluation, CPU is 100x faster than GPU (CPU potential_fn speed is in the hundreds of microseconds), so I expect the performance will be comparable to Stan. However, jitting the whole trajectory still triggers an error. Let's chat a bit about this if you are interested in the progress. About the flag, I think we can remove it. If we face other slow-compiling models which cannot be fixed by using lax.scan, then we can add a decorator to disable it instead (as we discussed in some PRs).
If the model or potential_fn is not jittable, I think we'll end up throwing an exception given that sample_kernel has a jit decorator. I think the right approach would be to pass a jit_compile=True flag to hmc so that the user does not need to modify the source code in case their model is not jittable.
Hmm, we already have a decorator to disable it... I don't think that our HMC works for non-jit models (I mean it would be pretty slow). ^^!
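The decorator mentioned here is JAX's own escape hatch. A minimal sketch of how jax.disable_jit() can be used to debug a model whose potential_fn fails under tracing (the toy potential_fn is illustrative, not the actual model from this thread; assumes jax is installed):

```python
import jax
import jax.numpy as jnp

@jax.jit
def potential_fn(z):
    # toy quadratic potential; a real model's potential would be passed to HMC
    return 0.5 * jnp.sum(z ** 2)

z = jnp.ones(3)
compiled = potential_fn(z)      # runs compiled under XLA

# to debug a model that breaks under tracing, run eagerly op-by-op:
with jax.disable_jit():
    eager = potential_fn(z)     # same result, but inspectable with print/pdb
```

Inside the disable_jit() context, every jitted function falls back to eager execution, which is why the comment below notes it will "slow down a lot of other stuff too".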
This style isn't very pythonic, though. So let's change initialize_model(rng, model, model_args, model_kwargs) to initialize_model(rng, model, *args, **kwargs).
Yes, agree.
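A sketch of the calling convention being proposed (a hypothetical simplified initialize_model, not the actual numpyro implementation, which also traces the model and builds the potential_fn):

```python
def initialize_model(rng, model, *model_args, **model_kwargs):
    # hypothetical sketch: forward positional/keyword args straight through
    # to the model instead of requiring explicit model_args/model_kwargs tuples
    return model(rng, *model_args, **model_kwargs)

def toy_model(rng, loc, scale=1.0):
    # stand-in for a real numpyro model
    return loc, scale

# the call site reads naturally, with no wrapping of args in tuples/dicts:
init = initialize_model(0, toy_model, 2.0, scale=0.5)
```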
from numpyro.
@neerajprad I am intending to rename inv_transform_fn to constrain_fn and its arg constrain to invert (the old arg). The name will reflect its purpose and avoid confusion with the transform arg in fori_collect. What do you think about it?
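The purpose behind the proposed name can be sketched with a simple exp/log bijection: constrain_fn maps an unconstrained sample into the constrained space, and invert=True applies the inverse map (the names are taken from the proposal above; the exp transform is just an illustration):

```python
import math

def constrain_fn(z, invert=False):
    # illustrative sketch: map an unconstrained real to a positive-constrained
    # value via exp; with invert=True, go back via log
    if invert:
        return math.log(z)   # constrained -> unconstrained
    return math.exp(z)       # unconstrained -> constrained

x = constrain_fn(0.0)             # into the constrained (positive) space
z = constrain_fn(x, invert=True)  # back to the unconstrained space
```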
@fehiepsi - Feel free to add to this if you notice anything that appears off in our API.
Would it be accept_prob instead? To me, accept_prob is a more meaningful term than acceptance_rate. It also aligns with target_accept_prob arg.
Should we show the mean value of accept_prob then? That's what we adapt to, after all.
In addition, num_steps is also a useful dynamic term which I usually print out to see how the model and inference works. Please consider it if the progbar has some spaces.
Sure, let's see if we can put in num_steps as well. I think that is particularly useful for NUTS.
In addition, for the hmm potential_fn evaluation, CPU is 100x faster than GPU (CPU potential_fn speed is in the hundreds of microseconds), so I expect the performance will be comparable to Stan.
That's great to know! 🎉 Looking forward to a fast HMM example in numpyro, that'll be super cool! Since I think this is a temporary flag, it's fine to leave this as is and just mark it as an experimental arg for now.
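Since CPU is the fast path here, device placement can be pinned explicitly rather than relying on JAX's default backend. A sketch using the public device API (assumes a jax install with a CPU backend, which is always present):

```python
import jax
import jax.numpy as jnp

cpu = jax.devices("cpu")[0]

def potential_fn(z):
    # toy quadratic potential standing in for the hmm potential_fn
    return 0.5 * jnp.sum(z ** 2)

# place the input on CPU so the jitted computation runs there
z = jax.device_put(jnp.ones(1000), cpu)
energy = jax.jit(potential_fn)(z)
```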
Hmm, we already have a decorator to disable it... I don't think that our HMC works for non-jit models (I mean it would be pretty slow). ^^!
Haha, yes. Maybe this is moot given that HMC/NUTS would be so much slower without JIT that there would be little point in using it. I don't foresee users using the debugging context manager though, and that will slow down a lot of other stuff too by not using any primitive operations.
Should we show mean value of accept_prob then?
Yeah, good idea! The mean of accept_probs is what we target in the end! I don't remember exactly, but other frameworks suggest increasing or decreasing target_accept_prob based on that mean value (and divergence info).
I think that is particularly useful for NUTS.
Yup, step_size does not vary after warmup, but num_steps will vary in NUTS. During warmup, it may be useful for HMC as well (when adapt_step_size=True, num_steps will vary together with step_size during the warmup phase).
What do you think about it?
Yes, please. This will really help my mental model. Otherwise, I need to go through a process of mental translation, spending a second or two trying to recall what the inverse transform is. +1 on this change.
Great! All the tasks are done.