
pystan2's Introduction

PyStan: The Python Interface to Stan

Stan logo


Tip

PyStan 3 is available for Linux and macOS users. Visit the PyStan 3 documentation for details. PyStan 2 is not maintained.

PyStan provides a Python interface to Stan, a package for Bayesian inference using the No-U-Turn sampler, a variant of Hamiltonian Monte Carlo.

For more information on Stan and its modeling language, see the Stan User's Guide and Reference Manual at http://mc-stan.org/.

Projects using PyStan

Similar projects

PyStan3 / Stan3

Development of PyStan 3, with an updated API, can be found under stan-dev/pystan-next.

Detailed Installation Instructions

Detailed installation instructions can be found in the doc/installation_beginner.md file.

Windows Installation Instructions

Detailed installation instructions for Windows can be found in the docs under PyStan on Windows.

Quick Installation (Linux and macOS)

NumPy and Cython (version 0.22 or greater) are required. matplotlib is optional. ArviZ is recommended for visualization and analysis.

PyStan and the required packages may be installed from the Python Package Index using pip.

pip install pystan

Alternatively, if Cython (version 0.22 or greater) and NumPy are already available, PyStan may be installed from source with the following commands:

git clone --recursive https://github.com/stan-dev/pystan.git
cd pystan
python setup.py install

To install the latest development version, you can also use pip:

pip install git+https://github.com/stan-dev/pystan

If you encounter an ImportError after compiling from source, try changing out of the source directory before attempting import pystan. On Linux and macOS, cd /tmp will work.

make (mingw32-make on Windows) is a requirement for building from source.

Example

import pystan
import numpy as np
import matplotlib.pyplot as plt

schools_code = """
data {
    int<lower=0> J; // number of schools
    real y[J]; // estimated treatment effects
    real<lower=0> sigma[J]; // s.e. of effect estimates
}
parameters {
    real mu;
    real<lower=0> tau;
    real eta[J];
}
transformed parameters {
    real theta[J];
    for (j in 1:J)
        theta[j] = mu + tau * eta[j];
}
model {
    eta ~ normal(0, 1);
    y ~ normal(theta, sigma);
}
"""

schools_dat = {'J': 8,
               'y': [28,  8, -3,  7, -1,  1, 18, 12],
               'sigma': [15, 10, 16, 11,  9, 11, 10, 18]}

sm = pystan.StanModel(model_code=schools_code)
fit = sm.sampling(data=schools_dat, iter=1000, chains=4)

print(fit)

eta = fit.extract(permuted=True)['eta']
np.mean(eta, axis=0)

# if matplotlib is installed (optional, not required), a visual summary and
# traceplot are available
fit.plot()
plt.show()

# updated traceplot can be plotted with
import arviz as az
az.plot_trace(fit)

pystan2's People

Contributors

ahartikainen, ariddell, atc3, avehtari, braaannigan, chendaniely, dfm, djsutherland, edwinnglabs, gallamine, icoxfog417, jackielxu, jjramsey, jrings, jseabold, kcarnold, koadman, manishearth, mshron, omarfsosa, randommm, rgerkin, riddell-stan, s-black, seantalts, shoyer, smoh, suzuki-shm, syclik, terhardt


pystan2's Issues

Automatic caching/loading of pre-compiled models

dougalsutherland's very useful rstan_interface.py script automatically caches compiled Stan models to a pickle in the working directory. When the user tries to sample with the model again, it automatically loads the cached, pre-compiled model, unless the model code has changed. This is useful because the user doesn't have to carry around a reference to the compiled model between runs.

Would it be possible to implement a similar automatic caching feature in PyStan, at least for models specified by filename rather than as model_code? Perhaps this should be turned off by default, to avoid inadvertent writes to the working directory, but I would find the feature very useful.

If this behavior is not desired, what is the best way to initiate a new sampling run with an identical model, without recompiling the model?
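
Until something like that lands in PyStan, a minimal user-side sketch of the caching idea (the helper name cached_stan_model and the cache-file naming scheme are made up for illustration, not part of the PyStan API):

import pickle
from hashlib import md5

import pystan

def cached_stan_model(model_code, **kwargs):
    # Cache the compiled model in the working directory, keyed on a hash of
    # the model code, and reuse it as long as the code is unchanged.
    cache_file = 'cached-model-{}.pkl'.format(md5(model_code.encode('utf-8')).hexdigest())
    try:
        with open(cache_file, 'rb') as f:
            sm = pickle.load(f)
    except IOError:
        sm = pystan.StanModel(model_code=model_code, **kwargs)
        with open(cache_file, 'wb') as f:
            pickle.dump(sm, f)
    return sm

As for the last question: to start a new sampling run with an identical model without recompiling, keep (or unpickle) the StanModel object and call its sampling() method again.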

pystan binary releases

Inspired by stan-dev/pystan#23 (comment), I put together a pystan recipe for binary distributions with the conda Python package manager, and uploaded 64-bit Linux builds for python 2.7 and 3.3. It can be installed in Anaconda with

conda install -c http://conda.binstar.org/dougal pystan

If you're interested in releasing pystan in this format, it may be better to have them under an official stan-dev account. (To do it, you just download/check out the pystan folder from my conda-recipes repo and run conda build pystan; it'll prompt you to upload to binstar afterwards. You'd also want to update the source block in meta.yaml with each new release.)

The OSX build hangs during the test step, because of #10, so I didn't think it'd be worth uploading until that's resolved.

memory explosion when using models with different names

I don't entirely understand what's going on here, but this code starts eating up memory for me:

import pystan
import numpy as np

schools_code = """
data {
    int<lower=0> J; // number of schools
    real y[J]; // estimated treatment effects
    real<lower=0> sigma[J]; // s.e. of effect estimates
}
parameters {
    real mu;
    real<lower=0> tau;
    real eta[J];
}
transformed parameters {
    real theta[J];
    for (j in 1:J)
    theta[j] <- mu + tau * eta[j];
}
model {
    eta ~ normal(0, 1);
    y ~ normal(theta, sigma);
}
"""

schools_dat = {'J': 8,
               'y': [28,  8, -3,  7, -1,  1, 18, 12],
               'sigma': [15, 10, 16, 11,  9, 11, 10, 18]}

fit = pystan.stan(model_code=schools_code, data=schools_dat,
                  iter=1000, chains=4)
eta = fit.extract(permuted=True)['eta']
print np.mean(eta, axis=0)

fit2 = pystan.stan(model_code=schools_code, model_name='foo', data=schools_dat,
                   iter=1000, chains=4)
eta2 = fit2.extract(permuted=True)['eta']
print np.mean(eta2, axis=0)

When I run it, the first block works just fine, then memory usage blows up during the second block. During the first round of sampling, memory usage stays comfortably low (~100 MB); during the second block of sampling, it climbs and climbs, maxing out around 10 GB. If I ask for more iterations, it uses even more until it starts swapping and everything grinds to a halt.

Behavior is of course the same if I use a different model for the second block (which is where I originally found this).

Memory usage doesn't blow up if I don't explicitly pass model_name the second time, so that they're both "anon_model", or if I explicitly pass the same model_name to both.

If I interrupt the code during sampling, the traceback is

Traceback (most recent call last):
  File "test_pystan.py", line 36, in <module>
    iter=1000, chains=4)
  File "/usr/local/anaconda/lib/python2.7/site-packages/pystan/api.py", line 253, in stan
    verbose=verbose, **kwargs)
  File "/usr/local/anaconda/lib/python2.7/site-packages/pystan/model.py", line 575, in sampling
    ret, samples_i = fit._call_sampler(args_list[i])
  File "stanfit4foo_e3851f0232a737888e7f3974a747ab8e.pyx", line 322, in stanfit4foo_e3851f0232a737888e7f3974a747ab8e.StanFit4foo._call_sampler (/var/folders/vr/my5gl2b925v5j34_11007g1c000f7f/T/tmpImoWB5/pystan/stanfit4foo_e3851f0232a737888e7f3974a747ab8e.cpp:4550)
  File "stanfit4foo_e3851f0232a737888e7f3974a747ab8e.pyx", line 62, in stanfit4foo_e3851f0232a737888e7f3974a747ab8e._dict_from_pystanholder (/var/folders/vr/my5gl2b925v5j34_11007g1c000f7f/T/tmpImoWB5/pystan/stanfit4foo_e3851f0232a737888e7f3974a747ab8e.cpp:1256)
  File "/usr/local/anaconda/lib/python2.7/site-packages/numpy/core/numeric.py", line 252, in asarray
    def asarray(a, dtype=None, order=None):
KeyboardInterrupt

(but it doesn't happen right away; I guess it's in non-GIL-holding code when I hit ^C, so the signal doesn't get handled until we get back into python or GIL-holding cython).

For what it's worth, the equivalent code in RStan doesn't have the same behavior.

I have no idea what would be causing this....
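
One workaround (not a fix), consistent with the observation that the problem only appears when a second, differently named module is built: compile the model once with StanModel and reuse it for both runs. A sketch using the same schools_code and schools_dat as above:

sm = pystan.StanModel(model_code=schools_code)
fit = sm.sampling(data=schools_dat, iter=1000, chains=4)
fit2 = sm.sampling(data=schools_dat, iter=1000, chains=4)  # reuses the compiled module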

expose log_prob method

The log_prob and probably grad_log_prob methods should be exposed at the Python level, i.e. added to stanfit4model.

There should also be a better way to get a fit object without sampling than doing something like model.sampling(iter=1). (In RStan I think you can do iter=0, but pystan.misc._config_argss complains about that.)
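
A sketch of how the requested interface might be used, with method names assumed to mirror RStan's (sm is a compiled StanModel and schools_dat the data from the README example):

fit = sm.sampling(data=schools_dat, iter=1, chains=1)  # cheap fit just to obtain a fit object

# evaluate the log density and its gradient at a chosen point,
# on the unconstrained scale
upars = fit.unconstrain_pars({'mu': 0.0, 'tau': 1.0, 'eta': [0.0] * 8})
lp = fit.log_prob(upars)
grad = fit.grad_log_prob(upars)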

ZeroDivisionError when calculating rhat for fixed transformed parameters

Hello,

I've encountered an edge case that causes splitrhat() to throw an error, reproducible with the minimal example code below. The problem occurs when a transformed parameter takes a constant value. I assume the underlying problem is that the variance of the trace for that parameter is 0, leading to the ZeroDivisionError when rhat is calculated. I think it would be preferable to handle this case and return rhat = NaN with a warning rather than raising an error.

I encountered this error because I am defining an array in transformed parameters that is essentially sparse (not defined for a few particular indices). You might argue that this falls in the realm of user error rather than a bug, since perhaps there is no legitimate need to define a constant parameter in a model. But I can't think of a clean / convenient way to otherwise define my sparse transformed parameter matrix, and perhaps others will encounter this issue as well.

## Model
stanmodel="""
data {
}

parameters {
  real alpha[2];
}

transformed parameters {
  real beta[2];

  beta[1] <- alpha[1] * 2;
  beta[2] <- 1;
}

model {
    alpha[1] ~ normal(0,1);
}
"""

## Try sampling
fit = pystan.stan(model_code=stanmodel, iter=100, chains=10)

for i in range(4):
  print i,pystan.chains.splitrhat(fit.sim,i)


"""
The for loop returns:

0 1.00793999969
1 1.91711193046
2 1.00793999969
3---------------------------------------------------------------------------
ZeroDivisionError                         Traceback (most recent call last)
<ipython-input-26-acd6a23a7cd7> in <module>()
      1 for i in range(4):
----> 2       print i,pystan.chains.splitrhat(fit.sim,i)
      3 

/usr/local/lib/python2.7/dist-packages/pystan/chains.pyc in splitrhat(sim, n)
     23         Chain index starting from 0
     24     """
---> 25     return _chains.split_potential_scale_reduction(sim, n)

/usr/local/lib/python2.7/dist-packages/pystan/_chains.so in pystan._chains.split_potential_scale_reduction (pystan/_chains.cpp:2395)()

ZeroDivisionError: float division
"""

PyStan should use callbacks to python instead of writing directly to stdout

Currently, sampling progress is written directly to stdout via stan_fit.hpp. Ideally, updating sampling progress should be a callback to Python, so progress could be passed to a Python logger or progress bar object.

My use case is running pystan non-interactively in a cluster computing environment where it would be useful to be able to use a logger object so I can log progress to a file.

Another use case would be displaying progress in an IPython notebook, which currently doesn't happen. Writing to stdout from the C++ side ends up printing in the terminal window which I used to launch IPython.

To give a bit more context, I've written a script which writes output via stdout (so I can use Hadoop streaming), and it is a little annoying that the only way to turn off the status messages writing to stdout is to set refresh=0, which also disables pystan's logging. On the whole, it seems like slightly poor manners for a library to write directly to stdout.

A side benefit might be allowing for a KeyboardInterrupt to stop progress even in the midst of sampling from a chain.

I recognize that this may be a little tricky to implement (involving hacking on the C++/Cython side) but I think it would be worth doing eventually, so I thought I would mention it.
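
Until such callbacks exist, one user-side stop-gap is to redirect the process-level stdout file descriptor around the call, so that text written from C++ ends up in a file; a sketch (not part of the PyStan API, and it silences Python-level prints to stdout as well):

import os
import contextlib

@contextlib.contextmanager
def redirect_fd_stdout(target_path):
    # Redirect the OS-level stdout (file descriptor 1), which is what the
    # C++ code writes to, into the given file for the duration of the block.
    saved_fd = os.dup(1)
    with open(target_path, 'w') as target:
        os.dup2(target.fileno(), 1)
        try:
            yield
        finally:
            os.dup2(saved_fd, 1)
            os.close(saved_fd)

# usage sketch:
# with redirect_fd_stdout('sampling.log'):
#     fit = sm.sampling(data=schools_dat, iter=1000, chains=4)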

Get plotting methods working

If matplotlib is available, make the plots, otherwise raise an exception. IIRC pandas does this sort of thing.
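
The pandas-style pattern is roughly: import matplotlib lazily inside the plotting function and raise an informative error if it is missing. A minimal sketch (traceplot is a made-up standalone function, not PyStan's actual plot method):

import numpy as np

def traceplot(samples):
    # Import lazily so matplotlib remains an optional dependency.
    try:
        import matplotlib.pyplot as plt
    except ImportError:
        raise ImportError("matplotlib is required for plotting; "
                          "install it with 'pip install matplotlib'")
    fig, ax = plt.subplots()
    ax.plot(np.asarray(samples))
    ax.set_xlabel("iteration")
    return fig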

Compiler issues on OS X

I'm having trouble getting PyStan working on my Mac. I am running Cython 0.19.1 and Numpy 1.7.0.

The installation appears to complete successfully (manually or using pip). The trouble begins when I try to import pystan:

Python 2.7.2 (default, Oct 11 2012, 20:14:37)
[GCC 4.2.1 Compatible Apple Clang 4.0 (tags/Apple/clang-418.0.60)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import pystan
clang: warning: argument unused during compilation: '-mno-fused-madd'
/var/folders/_t/8qcjjcfn43s4s30k7gx6b_rsb92xk6/T/pyxbld-DayGQf/temp.macosx-10.8-intel-2.7/pyrex/pystan/_api.c:312:10: fatal error:
      'string' file not found
#include <string>
         ^
1 error generated.
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "pystan/__init__.py", line 8, in <module>
    from pystan.api import stanc, stan
  File "pystan/api.py", line 12, in <module>
    import pystan._api  # stanc wrapper
  File "/export/disk0/wb/python2.6/lib/python2.7/site-packages/pyximport/pyximport.py", line 431, in load_module
    language_level=self.language_level)
  File "/export/disk0/wb/python2.6/lib/python2.7/site-packages/pyximport/pyximport.py", line 209, in load_module
    inplace=build_inplace, language_level=language_level)
  File "/export/disk0/wb/python2.6/lib/python2.7/site-packages/pyximport/pyximport.py", line 186, in build_module
    reload_support=pyxargs.reload_support)
  File "/export/disk0/wb/python2.6/lib/python2.7/site-packages/pyximport/pyxbuild.py", line 104, in pyx_to_dll
    dist.run_commands()
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 953, in run_commands
    self.run_command(cmd)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/dist.py", line 972, in run_command
    cmd_obj.run()
  File "/export/disk0/wb/python2.6/lib/python2.7/site-packages/Cython/Distutils/build_ext.py", line 163, in run
    _build_ext.build_ext.run(self)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/build_ext.py", line 340, in run
    self.build_extensions()
  File "/export/disk0/wb/python2.6/lib/python2.7/site-packages/Cython/Distutils/build_ext.py", line 171, in build_extensions
    self.build_extension(ext)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/command/build_ext.py", line 499, in build_extension
    depends=ext.depends)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/ccompiler.py", line 624, in compile
    self._compile(obj, src, ext, cc_args, extra_postargs, pp_opts)
  File "/System/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/distutils/unixccompiler.py", line 180, in _compile
    raise CompileError, msg
ImportError: Building module pystan._api failed: ["CompileError: command 'clang' failed with exit status 1\n"]

By the looks of it, this is an issue with my compiler. Presumably, I am using whatever comes with Apple's Xcode 4.6.3.

pars keyword in pystan.stan does nothing

I have a model with 1000+ parameters. I am running this model on the compute nodes of a cluster, and running into severely limiting memory issues. I can only sample ~2000 chains on each compute node before I get a memory error crash.

Saving to a sample_file results in several GB per compute node. Even when using sample_file, however, Python still crashes with a memory error.

Most of the parameters of the model are hyperparameters and I am not particularly interested in their values. So I am trying to pass the pars keyword with a list of the parameters I am interested in, to reduce the memory footprint of this model. However, this seems to do nothing. If I then use fit.extract, all parameters are still being stored. In the chain file all parameters are also stored. The code for StanModel.sampling only seems to reference the pars keyword here, at line 655:

        if pars is not None and len(pars) > 0:
            if not all(p in m_pars for p in pars):
                pars = np.asarray(pars)
                unmatched = pars[np.invert(np.in1d(pars, m_pars))]
                msg = "No parameter(s): {}; sampling not done."
                raise ValueError(msg.format(', '.join(pars[unmatched])))

Am I misinterpreting the pars keyword, or is there a bug that is causing the list I pass to be ignored? I have tried intentionally adding an incorrect parameter to the list and the code does crash in the above quoted block.
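
As a partial workaround while pars is ignored during sampling, fit.extract() accepts a pars argument, so the parameters of interest can be pulled out and the full fit object released afterwards; a sketch, assuming a parameter named theta is the one worth keeping:

# keep only the parameters of interest, then free the full fit object
posterior = fit.extract(pars=['theta'], permuted=True)
theta = posterior['theta']
del fit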

Avoid recompilation when changing priors?

Currently, the prior information is provided in the Stan code block.

Is it possible to pass priors through the pystan.stan function call in the same way as the data, so that one can avoid recompilation when changing priors?
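
A common workaround is to declare the prior hyperparameters as data, so the same compiled model can be reused with different priors; a minimal sketch (the mu_prior_mean / mu_prior_sd names are made up for illustration):

import pystan

model_code = """
data {
    real mu_prior_mean;        // prior hyperparameters passed in as data
    real<lower=0> mu_prior_sd;
    real x;
}
parameters {
    real mu;
}
model {
    mu ~ normal(mu_prior_mean, mu_prior_sd);
    x ~ normal(mu, 1);
}
"""

sm = pystan.StanModel(model_code=model_code)  # compile once
fit_a = sm.sampling(data={'mu_prior_mean': 0, 'mu_prior_sd': 1, 'x': 2})
fit_b = sm.sampling(data={'mu_prior_mean': 0, 'mu_prior_sd': 10, 'x': 2})  # new prior, no recompile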

add get_inits() to StanFit

in R

ii <- get_inits(fit)
ii
[[1]]
[[1]]$mu
[1] 0.6838074

[[1]]$tau
[1] 0.6452438

[[1]]$eta
[1]  1.2558788  0.2486423 -1.1750325  1.3058190 -1.1714209  1.0709960  0.7367854
[8] -0.4932346

[[1]]$theta
[1]  1.49415540  0.84424232 -0.07437499  1.52637900 -0.07204463  1.37486092
[7]  1.15921362  0.36555088

[[2]]
[[2]]$mu
[1] -0.9556651

[[2]]$tau
[1] 4.332435

[[2]]$eta
[1]  1.4987792  1.5643572 -1.8967160 -0.3008269 -0.3494690  0.5654918 -0.4822732
[8] -0.1321233

[[2]]$theta
[1]  5.537699  5.821811 -9.173065 -2.258978 -2.469717  1.494291 -3.045082
[8] -1.528081

[[3]]
[[3]]$mu
[1] -1.023053

[[3]]$tau
[1] 0.1470396

[[3]]$eta
[1]  0.6825585 -1.3558435 -1.7691651 -0.7966862 -0.5692770  0.1701447 -0.9438256
[8]  1.5039454

[[3]]$theta
[1] -0.9226900 -1.2224157 -1.2831904 -1.1401975 -1.1067594 -0.9980351 -1.1618328
[8] -0.8019136

[[4]]
[[4]]$mu
[1] 0.7117374

[[4]]$tau
[1] 0.8896294

[[4]]$eta
[1] -0.05865353  0.57099054 -0.13552422  0.44198954 -1.89727981  0.03972086
[7]  0.51042112 -0.93344531

[[4]]$theta
[1]  0.6595575  1.2197074  0.5911711  1.1049443 -0.9761384  0.7470743  1.1658231
[8] -0.1186829
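
The Python equivalent being requested might look like the following sketch (get_inits is the proposed method, not something that exists yet):

inits = fit.get_inits()  # proposed API: list with one dict of initial values per chain
for i, chain_init in enumerate(inits):
    print(i, chain_init['mu'], chain_init['tau'])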

PyStan broken on IPython Notebook + qtconsole ('OutStream' error)

Running into the following error on Windows (32 bit Anaconda installation) when I attempt to run the example:

AttributeError: 'OutStream' object has no attribute 'fileno'

I read that this issue can be related to the IPython Notebook and how it uses/overwrites stderr.

See what looked to be a similar issue several months ago in SciPy:
ipython/ipython#3017

I can confirm that this issue happens in the IPython Notebook and the qtconsole. The same code is tested and working fine from the terminal.

Since the IPython notebook is so popular, this is definitely one to fix!

Thanks for implementing PyStan, though! This is going to be quite useful =)

WARNING:root:COMPILING THE C++ CODE FOR MODEL anon_model NOW.
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
<ipython-input-5-f3d4d4964bda> in <module>()
      1 fit = pystan.stan(model_code=schools_code, data=schools_dat,
----> 2                   iter=1000, chains=4)

C:\Anaconda\lib\site-packages\pystan\api.pyc in stan(file, model_name, model_code, fit, data, pars, chains, iter, warmup, thin, init, seed, sample_file, diagnostic_file, save_dso, verbose, boost_lib, eigen_lib, **kwargs)
    246         m = StanModel(file=file, model_name=model_name, model_code=model_code,
    247                       boost_lib=boost_lib, eigen_lib=eigen_lib,
--> 248                       save_dso=save_dso, verbose=verbose, **kwargs)
    249     if sample_file is not None:
    250         raise NotImplementedError

C:\Anaconda\lib\site-packages\pystan\model.pyc in __init__(self, file, charset, model_name, model_code, stanc_ret, boost_lib, eigen_lib, save_dso, verbose, **kwargs)
    258         if not verbose:
    259             # silence stderr for compilation
--> 260             orig_stderr = pystan.misc.redirect_stderr()
    261 
    262         build_extension.run()

C:\Anaconda\lib\site-packages\pystan\misc.py in redirect_stderr()
    438     """
    439     sys.stderr.flush()
--> 440     stderr_fileno = sys.stderr.fileno()
    441     orig_stderr = os.dup(stderr_fileno)
    442     devnull = os.open(os.devnull, os.O_WRONLY)

AttributeError: 'OutStream' object has no attribute 'fileno'
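
A possible stop-gap suggested by the traceback itself: the stderr redirection only happens when verbose is False (model.pyc lines 258-260 above), so passing verbose=True should avoid the fileno() call during compilation; a sketch:

fit = pystan.stan(model_code=schools_code, data=schools_dat,
                  iter=1000, chains=4, verbose=True)  # skips the stderr redirect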

Access to n_divergent log?

Hello,

The NUTS sampler in Stan keeps track of the number of leapfrog iterations which encounter a divergence error, "n_divergent." I see that this statistic is reported in the CmdStan output as of version 2.1.0:

https://github.com/stan-dev/stan/releases/tag/v2.1.0

Is it possible to access the n_divergent value from PyStan? I don't see any mention of it in the API.
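
A hedged sketch of how the count can be recovered in PyStan 2 via the per-iteration sampler diagnostics; the exact key name ('n_divergent__' in older Stan versions, 'divergent__' in newer ones) is an assumption to check against the installed version:

import numpy as np

sampler_params = fit.get_sampler_params(inc_warmup=False)  # one dict per chain
key = 'n_divergent__'  # or 'divergent__', depending on the Stan version
n_divergent = sum(int(np.sum(chain[key])) for chain in sampler_params)
print(n_divergent)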

pystan breaks on array parameters

PyStan's fit.extract() doesn't work with array parameters; scalars, vectors, and matrices work fine.

For example:

data {
  int<lower=2> K;
}
parameters {
  real beta[K,1,2];
}
model {
  for (k in 1:K)
    beta[k,1,1] ~ normal(0,1);
  for (k in 1:K)
    beta[k,1,2] ~ normal(100,1);
}

stan() ignores user-provided initial values

Thanks to Fulton Wang for this bug report:

import pystan
model = """
data{
real x;
}
parameters{
real mu;
}
model{
x ~ normal(mu,1);
}
"""

fit1 = pystan.stan(model_code=model, chains=1, seed=2, data={'x':2}, init=[{'mu':4}], warmup=0)
fit2 = pystan.stan(model_code=model, chains=1, seed=2, data={'x':2}, init=[{'mu':400}], warmup=0)

print(fit1.get_inits())
print(fit2.get_inits())

get_inits() returns the same values for both fits (neither 4 nor 400).

Read rdump into a Python dict

Having a read_rdump function would be great. We don't need to support all data types, just the ones that are relevant for working with Stan.
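
A minimal sketch of what such a function might do, covering only scalars, c(...) vectors, and structure(c(...), .Dim = c(...)) arrays; read_rdump_minimal is a hypothetical name, not the eventual PyStan API:

import re
import numpy as np

def read_rdump_minimal(path):
    with open(path) as f:
        text = f.read()
    data = {}
    # each entry looks like: name <- value (the value may span several lines)
    for name, value in re.findall(r'(\w+)\s*<-\s*(.*?)(?=\n\s*\w+\s*<-|\Z)', text, re.S):
        value = value.strip()
        m = re.match(r'structure\(\s*c\((.*?)\)\s*,\s*\.Dim\s*=\s*c\((.*?)\)\s*\)', value, re.S)
        if m:
            vals = np.array([float(v) for v in m.group(1).split(',')])
            dims = [int(float(d)) for d in m.group(2).split(',')]
            data[name] = vals.reshape(dims, order='F')  # R arrays are column-major
        elif value.startswith('c('):
            inner = value[value.index('(') + 1:value.rindex(')')]
            data[name] = np.array([float(v) for v in inner.split(',')])
        else:
            data[name] = float(value)
    return data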

Error installing PyStan with Pip

[moved here from Stan issue tracker: https://github.com/stan-dev/stan/issues/364 ]

I'm trying to install PyStan in an Ubuntu 12.04 LTS VM I have. For reference, it installs/builds perfectly on the host machine (Arch). Here's the tail end of the output of pip install:

creating build/temp.linux-i686-2.7/pystan/stan/src/stan/gm/grammars

compile options: '-DBOOST_RESULT_OF_USE_TR1 -DBOOST_NO_DECLTYPE -DBOOST_DISABLE_ASSERTS -Ipystan/stan/src -Ipystan/stan/lib/eigen_3.2.0 -Ipystan/stan/lib/boost_1.54.0 -I/usr/include/python2.7 -c'

extra options: '-O3'

gcc: pystan/stan/src/stan/gm/grammars/statement_2_grammar_inst.cpp

cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]

gcc: pystan/stan/src/stan/gm/grammars/program_grammar_inst.cpp

cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]

gcc: pystan/stan/src/stan/gm/grammars/whitespace_grammar_inst.cpp

cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]

gcc: pystan/stan/src/stan/gm/grammars/term_grammar_inst.cpp

cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]

gcc: pystan/stan/src/stan/gm/grammars/expression_grammar_inst.cpp

cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]

gcc: pystan/stan/src/stan/gm/grammars/var_decls_grammar_inst.cpp

cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]

gcc: internal compiler error: Killed (program cc1plus)

Please submit a full bug report,

with preprocessed source if appropriate.

See <file:///usr/share/doc/gcc-4.6/README.Bugs> for instructions.

cc1plus: warning: command line option '-Wstrict-prototypes' is valid for Ada/C/ObjC but not for C++ [enabled by default]

gcc: internal compiler error: Killed (program cc1plus)

Please submit a full bug report,

with preprocessed source if appropriate.

See <file:///usr/share/doc/gcc-4.6/README.Bugs> for instructions.

error: Command "gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -DBOOST_RESULT_OF_USE_TR1 -DBOOST_NO_DECLTYPE -DBOOST_DISABLE_ASSERTS -Ipystan/stan/src -Ipystan/stan/lib/eigen_3.2.0 -Ipystan/stan/lib/boost_1.54.0 -I/usr/include/python2.7 -c pystan/stan/src/stan/gm/grammars/var_decls_grammar_inst.cpp -o build/temp.linux-i686-2.7/pystan/stan/src/stan/gm/grammars/var_decls_grammar_inst.o -O3" failed with exit status 4

----------------------------------------
Command /usr/bin/python -c "import setuptools;__file__='/home/vagrant/build/pystan/setup.py';exec(compile(open(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --single-version-externally-managed --record /tmp/pip-ozWGET-record/install-record.txt failed with error code 1
Exception information:
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/pip/basecommand.py", line 126, in main
    self.run(options, args)
  File "/usr/lib/python2.7/dist-packages/pip/commands/install.py", line 228, in run
    requirement_set.install(install_options, global_options)
  File "/usr/lib/python2.7/dist-packages/pip/req.py", line 1093, in install
    requirement.install(install_options, global_options)
  File "/usr/lib/python2.7/dist-packages/pip/req.py", line 566, in install
    cwd=self.source_dir, filter_stdout=self._filter_install, show_stdout=False)
  File "/usr/lib/python2.7/dist-packages/pip/__init__.py", line 255, in call_subprocess
    % (command_desc, proc.returncode))
InstallationError: Command /usr/bin/python -c "import setuptools;__file__='/home/vagrant/build/pystan/setup.py';exec(compile(open(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --single-version-externally-managed --record /tmp/pip-ozWGET-record/install-record.txt failed with error code 1

Fix docs to reflect parallel by default

@shoyer wrote

Setting this to -1 broke the documentation... one reason why I prefer not to specify
defaults in docstrings if possible (and let the function signature specify that)

an excellent idea.

Build issue on OS X 10.9

I'm running into a build failure on OS X 10.9 related to the compilation of statement_2_grammar_inst.cpp. I am running Xcode 5.0.1, Python 2.7.5 and Cython 0.19.2. Here is the failure:

error: Command "cc -fno-strict-aliasing -fno-common -dynamic -arch x86_64 -arch i386 -g -Os -pipe -fno-common -
fno-strict-aliasing -fwrapv -mno-fused-madd -DENABLE_DTRACE -DMACOSX -DNDEBUG -Wall -Wstrict-prototypes -
Wshorten-64-to-32 -DNDEBUG -g -fwrapv -Os -Wall -Wstrict-prototypes -DENABLE_DTRACE -arch x86_64 -arch i386 -
pipe -DBOOST_RESULT_OF_USE_TR1 -DBOOST_NO_DECLTYPE -DBOOST_DISABLE_ASSERTS -    Ipystan/stan/src 
-Ipystan/stan/lib/eigen_3.2.0 -Ipystan/stan/lib/boost_1.54.0 -
I/System/Library/Frameworks/Python.framework/Versions/2.7/include/python2.7 -c 
pystan/stan/src/stan/gm/grammars/statement_2_grammar_inst.cpp -o build/temp.macosx-10.9-intel-
2.7/pystan/stan/src/stan/gm/grammars/statement_2_grammar_inst.o -O3" failed with exit status 1


----------------------------------------
Command /usr/bin/python -c "import setuptools;__file__='/tmp/pip-    build/pystan/setup.py';exec(compile(open(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record /tmp/pip-
2mqIVE-record/install-record.txt --single-version-externally-managed failed with error code 1 in /tmp/pip-build/pystan

catch exceptions during sampling and reject

Fil Krynicki reports a bug on stan-users. I verified that this isn't an issue in CmdStan, just in PyStan. The exception should be caught and the sample rejected --- the exception should not terminate sampling.

The exception is causing the code to die, which is the problem. The exact exception is:

File "/usr/lib/python3.3/site-packages/pystan/api.py", line 366, in stan
    n_jobs=n_jobs, **kwargs)
  File "/usr/lib/python3.3/site-packages/pystan/model.py", line 679, in sampling
    ret_and_samples = _map_parallel(call_sampler_star, call_sampler_args, n_jobs)
  File "/usr/lib/python3.3/site-packages/pystan/model.py", line 81, in _map_parallel
    return list(mymap(function, args))
  File "/usr/lib/python3.3/multiprocessing/pool.py", line 228, in map
    return self._map_async(func, iterable, mapstar, chunksize).get()
  File "/usr/lib/python3.3/multiprocessing/pool.py", line 564, in get
    raise self._value
OverflowError: Error in function boost::math::digamma<d>(d): numeric overflow

The model is:

data {
    int N;
    real y[N];
}

parameters {
    real<lower=0> alpha;
    real<lower=0> beta;
}

model {
    for (i in 1:N) {
        y[i] ~ beta(alpha, beta);
    }
}

and the data is

{"y": [0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9837587006960556, 0.9953596287703016, 0.9930394431554525, 0.9930394431554525, 0.9930394431554525, 0.9956910838581372, 0.9956910838581372, 0.9956910838581372, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.9439840901557839, 0.9890619821014253, 0.9721577726218098, 0.9721577726218098, 0.3606231355651309, 0.9946967185946304, 0.3606231355651309, 0.9953596287703016, 0.9930394431554525, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.998342724560822, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9439840901557839, 0.3606231355651309, 0.998342724560822, 0.998342724560822, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9973483592973152, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9996685449121644, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.9996685449121644, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.9986741796486576, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.998342724560822, 0.9439840901557839, 0.996685449121644, 0.3606231355651309, 0.9996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9980112694729865, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9993370898243288, 0.9993370898243288, 0.3606231355651309, 0.998342724560822, 0.9980112694729865, 0.9980112694729865, 0.9907192575406032, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.9956910838581372, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.9956910838581372, 0.9956910838581372, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9930394431554525, 
0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9870732515744116, 0.3606231355651309, 0.9837587006960556, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9976798143851509, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9990056347364932, 0.3606231355651309, 0.9996685449121644, 0.9953596287703016, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9993370898243288, 0.3606231355651309, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9933708982432881, 0.9993370898243288, 0.9439840901557839, 0.9996685449121644, 0.996685449121644, 0.9990056347364932, 0.998342724560822, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9990056347364932, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.996685449121644, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9837587006960556, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9963539940338084, 0.9963539940338084, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.9837587006960556, 0.9996685449121644, 0.9933708982432881, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.9956910838581372, 0.9903878024527677, 0.9903878024527677, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 
0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9980112694729865, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.9940338084189593, 0.9940338084189593, 0.9940338084189593, 0.9990056347364932, 0.9990056347364932, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9867417964865761, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.9903878024527677, 0.3606231355651309, 0.9996685449121644, 0.9990056347364932, 0.9990056347364932, 0.9940338084189593, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9990056347364932, 0.9439840901557839, 0.9439840901557839, 0.9937023533311237, 0.9937023533311237, 0.3606231355651309, 0.9976798143851509, 0.9976798143851509, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9963539940338084, 0.9963539940338084, 0.9963539940338084, 0.998342724560822, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9993370898243288, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9721577726218098, 0.3606231355651309, 0.9963539940338084, 0.9867417964865761, 0.9439840901557839, 0.3606231355651309, 0.9973483592973152, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9937023533311237, 0.9937023533311237, 0.9937023533311237, 0.9937023533311237, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.9990056347364932, 0.3606231355651309, 0.9903878024527677, 0.9903878024527677, 0.9937023533311237, 0.9993370898243288, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9930394431554525, 0.9930394431554525, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.9937023533311237, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9953596287703016, 0.998342724560822, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9993370898243288, 0.9986741796486576, 0.9986741796486576, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 
0.3606231355651309, 0.998342724560822, 0.9976798143851509, 0.9990056347364932, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9986741796486576, 0.9986741796486576, 0.9993370898243288, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9903878024527677, 0.996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9963539940338084, 0.9867417964865761, 0.3606231355651309, 0.9963539940338084, 0.9963539940338084, 0.9963539940338084, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9990056347364932, 0.3606231355651309, 0.9439840901557839, 0.9933708982432881, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9996685449121644, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9976798143851509, 0.9990056347364932, 0.9907192575406032, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9953596287703016, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9937023533311237, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.9990056347364932, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.9890619821014253, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 
0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.9986741796486576, 0.3606231355651309, 0.9986741796486576, 0.9867417964865761, 0.9867417964865761, 0.9867417964865761, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9976798143851509, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.9990056347364932, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.9990056347364932, 0.9990056347364932, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9837587006960556, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9930394431554525, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.9870732515744116, 0.9870732515744116, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9937023533311237, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.3606231355651309, 0.9439840901557839, 0.9940338084189593, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.9993370898243288, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9937023533311237, 0.3606231355651309, 0.9937023533311237, 0.998342724560822, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.9980112694729865, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.9946967185946304, 0.9946967185946304, 0.9890619821014253, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.9996685449121644, 0.9946967185946304, 0.3606231355651309, 0.9990056347364932, 0.9439840901557839, 0.9903878024527677, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.9990056347364932, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9953596287703016, 0.9933708982432881, 
0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9986741796486576, 0.9980112694729865, 0.9996685449121644, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9930394431554525, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9837587006960556, 0.9837587006960556, 0.3606231355651309, 0.9439840901557839, 0.9937023533311237, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9439840901557839, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.9990056347364932, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9903878024527677, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9930394431554525, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.3606231355651309, 0.9940338084189593, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9993370898243288, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9996685449121644, 0.996685449121644, 0.9940338084189593, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9993370898243288, 0.3606231355651309, 
0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9721577726218098, 0.9837587006960556, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.9870732515744116, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9903878024527677, 0.3606231355651309, 0.9993370898243288, 0.9940338084189593, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9990056347364932, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9870732515744116, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.9953596287703016, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9439840901557839, 0.3606231355651309, 0.9903878024527677, 0.9996685449121644, 0.3606231355651309, 0.9993370898243288, 0.9439840901557839, 0.3606231355651309, 0.9837587006960556, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.9930394431554525, 0.3606231355651309, 0.9870732515744116, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9953596287703016, 0.9953596287703016, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.9439840901557839, 0.9990056347364932, 0.9976798143851509, 0.9953596287703016, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.9996685449121644, 0.3606231355651309, 0.9996685449121644, 0.9976798143851509, 0.3606231355651309, 0.9996685449121644, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9867417964865761, 0.998342724560822, 0.9986741796486576, 0.9870732515744116, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 
0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9993370898243288, 0.9996685449121644, 0.998342724560822, 0.3606231355651309, 0.9996685449121644, 0.9973483592973152, 0.9993370898243288, 0.9993370898243288, 0.3606231355651309, 0.9933708982432881, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9993370898243288, 0.9993370898243288, 0.9933708982432881, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.9721577726218098, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9870732515744116, 0.9963539940338084, 0.9963539940338084, 0.9867417964865761, 0.9976798143851509, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9986741796486576, 0.3606231355651309, 0.9973483592973152, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.998342724560822, 0.9993370898243288, 0.3606231355651309, 0.9973483592973152, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9933708982432881, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9976798143851509, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9867417964865761, 0.9721577726218098, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.9903878024527677, 0.9933708982432881, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9837587006960556, 0.9996685449121644, 0.3606231355651309], "N": 1366}

and in Stan's dump format:

y <- c(0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9837587006960556, 0.9953596287703016, 0.9930394431554525, 0.9930394431554525, 0.9930394431554525, 0.9956910838581372, 0.9956910838581372, 0.9956910838581372, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.9439840901557839, 0.9890619821014253, 0.9721577726218098, 0.9721577726218098, 0.3606231355651309, 0.9946967185946304, 0.3606231355651309, 0.9953596287703016, 0.9930394431554525, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.998342724560822, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9439840901557839, 0.3606231355651309, 0.998342724560822, 0.998342724560822, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9973483592973152, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9996685449121644, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.9996685449121644, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.9986741796486576, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.998342724560822, 0.9439840901557839, 0.996685449121644, 0.3606231355651309, 0.9996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9980112694729865, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9993370898243288, 0.9993370898243288, 0.3606231355651309, 0.998342724560822, 0.9980112694729865, 0.9980112694729865, 0.9907192575406032, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.9956910838581372, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.9956910838581372, 0.9956910838581372, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9930394431554525, 
0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9870732515744116, 0.3606231355651309, 0.9837587006960556, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9976798143851509, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9990056347364932, 0.3606231355651309, 0.9996685449121644, 0.9953596287703016, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9993370898243288, 0.3606231355651309, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9933708982432881, 0.9993370898243288, 0.9439840901557839, 0.9996685449121644, 0.996685449121644, 0.9990056347364932, 0.998342724560822, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9990056347364932, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.996685449121644, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9837587006960556, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9963539940338084, 0.9963539940338084, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.9837587006960556, 0.9996685449121644, 0.9933708982432881, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.9956910838581372, 0.9903878024527677, 0.9903878024527677, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 
0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9980112694729865, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.9940338084189593, 0.9940338084189593, 0.9940338084189593, 0.9990056347364932, 0.9990056347364932, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9867417964865761, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.9903878024527677, 0.3606231355651309, 0.9996685449121644, 0.9990056347364932, 0.9990056347364932, 0.9940338084189593, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9990056347364932, 0.9439840901557839, 0.9439840901557839, 0.9937023533311237, 0.9937023533311237, 0.3606231355651309, 0.9976798143851509, 0.9976798143851509, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9963539940338084, 0.9963539940338084, 0.9963539940338084, 0.998342724560822, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9993370898243288, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.3606231355651309, 0.9721577726218098, 0.3606231355651309, 0.9963539940338084, 0.9867417964865761, 0.9439840901557839, 0.3606231355651309, 0.9973483592973152, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9937023533311237, 0.9937023533311237, 0.9937023533311237, 0.9937023533311237, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.9990056347364932, 0.3606231355651309, 0.9903878024527677, 0.9903878024527677, 0.9937023533311237, 0.9993370898243288, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9930394431554525, 0.9930394431554525, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.9937023533311237, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9953596287703016, 0.998342724560822, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9993370898243288, 0.9986741796486576, 0.9986741796486576, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 
0.3606231355651309, 0.998342724560822, 0.9976798143851509, 0.9990056347364932, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9986741796486576, 0.9986741796486576, 0.9993370898243288, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9903878024527677, 0.996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9963539940338084, 0.9867417964865761, 0.3606231355651309, 0.9963539940338084, 0.9963539940338084, 0.9963539940338084, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9990056347364932, 0.3606231355651309, 0.9439840901557839, 0.9933708982432881, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9996685449121644, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9976798143851509, 0.9990056347364932, 0.9907192575406032, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9953596287703016, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9937023533311237, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.9990056347364932, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.9890619821014253, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 
0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.9986741796486576, 0.3606231355651309, 0.9986741796486576, 0.9867417964865761, 0.9867417964865761, 0.9867417964865761, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9976798143851509, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.9990056347364932, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.9990056347364932, 0.9990056347364932, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9837587006960556, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9930394431554525, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.9870732515744116, 0.9870732515744116, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9867417964865761, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9937023533311237, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.9907192575406032, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.3606231355651309, 0.9439840901557839, 0.9940338084189593, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.9993370898243288, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9937023533311237, 0.3606231355651309, 0.9937023533311237, 0.998342724560822, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.9980112694729865, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.9946967185946304, 0.9946967185946304, 0.9890619821014253, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.9996685449121644, 0.9946967185946304, 0.3606231355651309, 0.9990056347364932, 0.9439840901557839, 0.9903878024527677, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.9990056347364932, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9953596287703016, 0.9933708982432881, 
0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9986741796486576, 0.9980112694729865, 0.9996685449121644, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9930394431554525, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9837587006960556, 0.9837587006960556, 0.3606231355651309, 0.9439840901557839, 0.9937023533311237, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9439840901557839, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.3606231355651309, 0.9990056347364932, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9903878024527677, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9930394431554525, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9940338084189593, 0.3606231355651309, 0.9940338084189593, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9993370898243288, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9996685449121644, 0.996685449121644, 0.9940338084189593, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9956910838581372, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9993370898243288, 0.3606231355651309, 
0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.9721577726218098, 0.9837587006960556, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.9870732515744116, 0.3606231355651309, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9439840901557839, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9903878024527677, 0.3606231355651309, 0.9993370898243288, 0.9940338084189593, 0.9439840901557839, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9867417964865761, 0.9990056347364932, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.996685449121644, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.9870732515744116, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9721577726218098, 0.9953596287703016, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9907192575406032, 0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9439840901557839, 0.3606231355651309, 0.9903878024527677, 0.9996685449121644, 0.3606231355651309, 0.9993370898243288, 0.9439840901557839, 0.3606231355651309, 0.9837587006960556, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9890619821014253, 0.9930394431554525, 0.3606231355651309, 0.9870732515744116, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9953596287703016, 0.9953596287703016, 0.9953596287703016, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9976798143851509, 0.9439840901557839, 0.9990056347364932, 0.9976798143851509, 0.9953596287703016, 0.9980112694729865, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9986741796486576, 0.9996685449121644, 0.3606231355651309, 0.9996685449121644, 0.9976798143851509, 0.3606231355651309, 0.9996685449121644, 0.9986741796486576, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.9867417964865761, 0.998342724560822, 0.9986741796486576, 0.9870732515744116, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 
0.3606231355651309, 0.9976798143851509, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.3606231355651309, 0.3606231355651309, 0.9439840901557839, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9993370898243288, 0.9993370898243288, 0.9996685449121644, 0.998342724560822, 0.3606231355651309, 0.9996685449121644, 0.9973483592973152, 0.9993370898243288, 0.9993370898243288, 0.3606231355651309, 0.9933708982432881, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9993370898243288, 0.9993370898243288, 0.9933708982432881, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.9721577726218098, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9870732515744116, 0.9963539940338084, 0.9963539940338084, 0.9867417964865761, 0.9976798143851509, 0.9963539940338084, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9986741796486576, 0.3606231355651309, 0.9973483592973152, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9990056347364932, 0.998342724560822, 0.9993370898243288, 0.3606231355651309, 0.9973483592973152, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9933708982432881, 0.3606231355651309, 0.9867417964865761, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9996685449121644, 0.9976798143851509, 0.3606231355651309, 0.9996685449121644, 0.3606231355651309, 0.9867417964865761, 0.9721577726218098, 0.9721577726218098, 0.3606231355651309, 0.3606231355651309, 0.9903878024527677, 0.3606231355651309, 0.998342724560822, 0.3606231355651309, 0.9903878024527677, 0.9933708982432881, 0.3606231355651309, 0.3606231355651309, 0.3606231355651309, 0.9837587006960556, 0.9996685449121644, 0.3606231355651309)
N <- 1366

"pip install numpy cython pystan" doesn't work

pip fails because pystan's setup.py exits when it cannot import numpy and cython, which means both must already be installed before pip can even start the build.

Other Python packages that rely on numpy appear to have resolved this issue. For an example of the fix, see scipy/scipy#453

This is worth fixing because it would make it possible to install pystan as part of a reproducible build with pip.
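
One possible fix, sketched below as an illustration rather than as pystan's actual setup.py, is to defer the numpy import into the build_ext step so that loading setup.py itself never requires numpy; the package and extension names here are placeholders:

from setuptools import setup, Extension
from setuptools.command.build_ext import build_ext


class DeferredBuildExt(build_ext):
    # Add NumPy's include directory only when the extension is actually built,
    # so importing this setup.py does not require NumPy to be installed.
    def finalize_options(self):
        build_ext.finalize_options(self)
        import numpy  # deferred import: NumPy is needed at build time only
        self.include_dirs.append(numpy.get_include())


setup(
    name="example_pkg",  # placeholder name, not pystan
    ext_modules=[Extension("example_pkg._ext", ["example_pkg/_ext.c"])],
    cmdclass={"build_ext": DeferredBuildExt},
    setup_requires=["numpy", "cython>=0.22"],
)

The key point is that nothing at module level in setup.py imports numpy or cython, so pip can at least begin the installation before those packages are present.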

Adapt logging for parallel sampling

Also report lp__, possibly using a damped rolling average.

I'd also like to see the lp__ output so you can get an informal monitor of convergence: before converging to the high posterior volume, this value tends to go down, then at convergence it starts bouncing around. The third thing on my wish list is an estimated time to completion. But as Daniel pointed out, this is tricky if the job gets paused (for instance by shutting the lid of a notebook). Also, per-iteration time is very different during warmup and during sampling. So I'm imagining something like a damped rolling-average estimate of iteration time, and then reporting iteration time * number of iterations left.
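
As a rough illustration (not code from PyStan), a damped rolling average of iteration time and the resulting ETA could be tracked with something like the following, where update() is assumed to be called once per iteration:

import time


class IterationTimer(object):
    """Damped (exponential) rolling average of per-iteration time."""

    def __init__(self, total_iter, alpha=0.1):
        self.total_iter = total_iter   # total iterations (warmup + sampling)
        self.alpha = alpha             # damping factor: higher = less smoothing
        self.avg = None                # current rolling average, in seconds
        self.done = 0
        self._last = time.time()

    def update(self):
        now = time.time()
        dt = now - self._last
        self._last = now
        self.done += 1
        # Damped rolling average: each new observation is discounted by alpha.
        if self.avg is None:
            self.avg = dt
        else:
            self.avg = self.alpha * dt + (1 - self.alpha) * self.avg

    def eta_seconds(self):
        if self.avg is None:
            return float("nan")
        return self.avg * (self.total_iter - self.done)

Resetting the average at the warmup/sampling boundary would keep the two regimes from being mixed, and a long pause (a closed laptop lid, say) decays out of the estimate after a few iterations.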

Write a stan_rdump function

There's interest in having a standard data format so PyStan users can talk to RStan users. RStan has a function stan_rdump which should be easy to adapt.
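
A minimal sketch of what an adapted function could look like, handling only scalars and one-dimensional sequences and writing the same y <- c(...) / N <- ... layout shown above (the real RStan function also covers matrices and higher-dimensional arrays, which this deliberately omits):

def stan_rdump(data, filename):
    """Write a dict of scalars and 1-d sequences in Stan's R dump format.

    Sketch only: matrices and higher-dimensional arrays are not handled.
    """
    with open(filename, "w") as f:
        for name, value in data.items():
            if hasattr(value, "__iter__"):
                values = ", ".join(repr(v) for v in value)
                f.write("%s <- c(%s)\n" % (name, values))
            else:
                f.write("%s <- %s\n" % (name, repr(value)))


# Example: reproduce the layout of the dump shown above.
stan_rdump({"y": [0.36, 0.99, 0.94], "N": 3}, "example.data.R")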

Improve API docs on plotting

Someone just asked about plotting a single parameter. The relevant docstring lives in a Cython file, so Sphinx isn't picking it up automatically. (It doesn't help that the docstring doesn't exist yet either.)
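
For reference, plotting a single parameter should already be possible by passing the parameter name; the exact calls below should be checked against the current code, and fit is assumed to be the StanFit object from the eight-schools example:

import matplotlib.pyplot as plt
import arviz as az

# Assuming `fit` is the fitted eight-schools model from the example above.
fit.plot(pars=['mu'])                  # visual summary for one parameter
plt.show()

az.plot_trace(fit, var_names=['mu'])   # ArviZ traceplot for the same parameter
plt.show()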

Add RStan-like summary() for fitted models

RStan produces a summary like this when you print a stanfit object (a sketch of how PyStan could build a similar table follows the output below):

> model_code = '
+ data {
+   int<lower=2> K;
+   int<lower=1> D;
+ }
+ parameters {
+   matrix[K,D] beta;
+ }
+ model {
+   for (k in 1:K)
+     for (d in 1:D)
+       beta[k,d] ~ normal(0,5);
+ }'
> fit2 = stan(model_code=model_code, data=list(K=3,D=3))

TRANSLATING MODEL 'model_code' FROM Stan CODE TO C++ CODE NOW.

> fit2
Inference for Stan model: model_code.
4 chains, each with iter=2000; warmup=1000; thin=1; 
post-warmup draws per chain=1000, total post-warmup draws=4000.

          mean se_mean  sd  2.5%  25%  50%  75% 97.5% n_eff Rhat
beta[1,1]  0.1     0.1 4.9  -9.7 -3.2  0.1  3.3   9.9  4000    1
beta[1,2]  0.0     0.1 4.9  -9.6 -3.3 -0.1  3.4   9.6  4000    1
beta[1,3]  0.1     0.1 5.0  -9.2 -3.4  0.0  3.4   9.5  4000    1
beta[2,1]  0.0     0.1 5.0  -9.5 -3.4  0.1  3.3   9.7  4000    1
beta[2,2]  0.0     0.1 5.0  -9.9 -3.4  0.0  3.2  10.4  4000    1
beta[2,3]  0.0     0.1 4.7  -9.7 -3.1  0.0  3.0   9.4  4000    1
beta[3,1]  0.1     0.1 5.0  -9.6 -3.5  0.1  3.7  10.0  4000    1
beta[3,2] -0.1     0.1 5.1 -10.0 -3.5 -0.1  3.5   9.6  4000    1
beta[3,3]  0.0     0.1 5.0 -10.3 -3.2  0.0  3.4   9.9  4000    1
lp__      -4.4     0.1 2.0  -9.0 -5.7 -4.1 -2.9  -1.4   342    1

Samples were drawn using NUTS2 at Tue Aug 20 18:12:24 2013.
For each parameter, n_eff is a crude measure of effective sample size,
and Rhat is the potential scale reduction factor on split chains (at 
convergence, Rhat=1).
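
As a sketch of how a comparable table could be assembled on the PyStan side from the posterior draws alone (mean, sd, and quantiles only; se_mean, n_eff, and Rhat need the per-chain draws and are omitted here):

import numpy as np


def simple_summary(fit, probs=(0.025, 0.25, 0.5, 0.75, 0.975)):
    """Build an RStan-style summary table from a fitted model.

    Sketch only: reports mean, sd, and quantiles for each scalar quantity.
    """
    draws = fit.extract(permuted=True)
    rows = []
    for name, values in draws.items():
        values = np.asarray(values)
        # Flatten array-valued parameters into one row per element
        # (C-order indexing here, not Stan's column-major order).
        flat = values.reshape(values.shape[0], -1)
        for i in range(flat.shape[1]):
            col = flat[:, i]
            label = name if flat.shape[1] == 1 else "%s[%d]" % (name, i + 1)
            quantiles = tuple(np.percentile(col, [100 * p for p in probs]))
            rows.append((label, col.mean(), col.std(ddof=1)) + quantiles)
    header = ("param", "mean", "sd") + tuple("%g%%" % (100 * p) for p in probs)
    return header, rows


header, rows = simple_summary(fit)   # `fit` as in the example above
print("%-12s" % header[0] + "".join("%8s" % h for h in header[1:]))
for row in rows:
    print("%-12s" % row[0] + "".join("%8.2f" % v for v in row[1:]))

The alignment is deliberately crude; the point is just that fit.extract() already carries everything needed for the per-quantity rows of the RStan table except the convergence diagnostics.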
