
svae's People

Contributors

duvenaud, mattjj


svae's Issues

Weird results after running gmm_svae_synth.py

After running the script gmm_svae_synth.py (around one and a half hours on a Mac desktop computer), the results do not look reasonable (see running_gmm_2000.png). I am wondering whether there is anything wrong with the script?

[screenshot omitted: running_gmm_2000.png]

Question about the computation of the natural gradient of the PGM parameters

Hey Matthew!

First of all, thanks for the awesome ideas and work.

I have a question about the way you're computing pgm_natgrad. Specifically here. I'm copying the relevant lines:

# this expression for pgm_natgrad drops a term that can be computed using
# the function autograd.misc.fixed_points.fixed_point
pgm_natgrad = -natgrad_scale / num_datapoints * \
            (flat(pgm_prior) + num_batches*flat(saved.stats) - flat(pgm_params))
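
For context, my own reading of the kept term (just my understanding, not something stated in the repo) is the usual SVI natural gradient for a conjugate exponential-family prior, and my guess is that the leading minus sign comes from minimizing the negative objective. A minimal sketch of that reading:

def svi_natural_gradient(eta_prior, eta, expected_stats, num_batches):
    # Natural gradient of the ELBO w.r.t. the global natural parameters for a
    # conjugate exponential-family prior: prior + (minibatch-rescaled) expected
    # sufficient statistics - current parameters.
    return eta_prior + num_batches * expected_stats - eta

def negative_objective_natgrad(eta_prior, eta, expected_stats, num_batches, scale):
    # If the optimizer minimizes the negative objective, the natural gradient it
    # needs is the negation of the expression above, which would explain the
    # minus sign in the quoted code (my guess, to be confirmed).
    return -scale * svi_natural_gradient(eta_prior, eta, expected_stats, num_batches)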

If I understand correctly, the dropped term is this:

[equation image omitted]

In the paper you mention that this term "is computed automatically as part of the backward pass for computing the gradients with respect to the other parameters".

Can you clarify why that term is dropped? Also, I don't understand the minus sign right at the beginning of the assignment on line 33.

Again, congrats on the awesome work!


On a side note, I think I spotted 2 errors in the paper:

  1. In section 4.2 (and then again in the appendix), where you define \eta_x to be a partial local optimizer of the surrogate objective:
    [equation image omitted]
    I believe this should be argmax, rather than argmin. Can you confirm?

  2. In the second expression of proposition 4.2:
    [equation image omitted]
    I think the gradient should be w.r.t. \theta, rather than x. Is that correct?

gmm_svae_synth assertion error

Hey Matty --- I'm seeing an assertion error during the resnet_decode step when running the gmm_svae_synth.py example as is:

/Users/acm/Dropbox/Proj/svae/svae/forward_models.pyc in resnet_decode(z, phi)
     81 def resnet_decode(z, phi):
     82     phi_linear, phi_mlp = phi
---> 83     return add(linear_decode(z, phi_linear), mlp_decode(z, phi_mlp, tanh_scale=2., sigmoid_output=False))
     84     # return linear_decode(z, phi_linear)
     85

/Users/acm/Dropbox/Proj/svae/svae/util.py in wrapped(a, b)
    134         if shape(a) != shape(b):
    135             print shape(a), shape(b)
--> 136         assert shape(a) == shape(b)
    137         return binop(a, b)
    138     return wrapped

AssertionError:

It looks like the param sizes passed to add have shapes

((150, 10, 2), (150, 10, 4)) ((150, 10, 2), (150, 10, 2))

so the second element of the parameters coming from linear_decode has a last dimension twice as big as the corresponding element coming from mlp_decode.

Is one passing back a dense covariance and the other passing back a diagonal covariance?
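
For concreteness, here is my guess about the mismatch, with hypothetical arrays just to illustrate the shapes (not the repo's code):

import numpy as np

# Hypothetical shapes matching the assertion output above: T samples, K time
# steps/components, and dimension D = 2.
T, K, D = 150, 10, 2
mu = np.zeros((T, K, D))
diag_variance = np.zeros((T, K, D))        # diagonal Gaussian: D numbers per point
dense_covariance = np.zeros((T, K, D * D)) # dense Gaussian: D*D = 4 numbers per point

print((mu.shape, dense_covariance.shape))  # ((150, 10, 2), (150, 10, 4))
print((mu.shape, diag_variance.shape))     # ((150, 10, 2), (150, 10, 2))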

Reproducing figures in paper

Hi,

It seems that there has been a great deal of rearranging of this code since publication. Which commit should I clone to reproduce the figures shown in the paper?

Brian

install dependency error

Hi, I tried to reproduce the results in the paper using this code, but I get the following confusing errors.
I ran python setup.py build_ext --inplace on Ubuntu 14.04 with Python 2.7.6, scipy, numpy, and gcc 4.7.3 correctly installed. Could anyone help me? Thanks!
Sorry for pasting such a long error log here.

andrew@ubuntu1:~/xu/svae$ python setup.py build_ext --inplace

missing cimport in module 'scipy.linalg.cython_lapack': ./svae/cython_linalg_grads.pxd
missing cimport in module 'scipy.linalg.cython_blas': ./svae/cython_linalg_grads.pxd
missing cimport in module 'scipy.linalg.cython_lapack': ./svae/cython_util.pxd
missing cimport in module 'scipy.linalg.cython_blas': ./svae/cython_util.pxd
missing cimport in module 'scipy.linalg.cython_lapack': ./svae/lds/cython_gaussian_grads.pxd
missing cimport in module 'scipy.linalg.cython_blas': ./svae/lds/cython_gaussian_grads.pxd
missing cimport in module 'scipy.linalg.cython_lapack': svae/lds/cython_lds_inference.pyx
missing cimport in module 'scipy.linalg.cython_blas': svae/lds/cython_lds_inference.pyx
missing cimport in module 'scipy.linalg.cython_blas': svae/hmm/cython_hmm_inference.pyx
Compiling tests/test_cython_linalg_grads.pyx because it changed.
Compiling tests/test_cython_gaussian_grads.pyx because it changed.
Compiling svae/lds/cython_lds_inference.pyx because it changed.
Compiling svae/hmm/cython_hmm_inference.pyx because it changed.
Cythonizing svae/hmm/cython_hmm_inference.pyx

The repeated "Error compiling Cython file" blocks all point at these cimport lines:

# cython: boundscheck=False, nonecheck=False, wraparound=False, cdivision=True
import numpy as np
cimport numpy as np
import numpy.random as npr
from scipy.linalg.cython_blas cimport ddot, dgemm, dgemv

in svae/hmm/cython_hmm_inference.pyx, and

from scipy.linalg.cython_blas cimport dsymv, ddot, dgemm, dgemv
from scipy.linalg.cython_lapack cimport dtrtrs, dpotrf, dpotrs, dpotri

in svae/cython_util.pxd, with these messages:

svae/hmm/cython_hmm_inference.pyx:7:0: 'scipy.linalg.cython_blas.pxd' not found
svae/hmm/cython_hmm_inference.pyx:7:0: 'ddot.pxd' not found
svae/hmm/cython_hmm_inference.pyx:7:38: Name 'ddot' not declared in module 'scipy.linalg.cython_blas'
svae/hmm/cython_hmm_inference.pyx:7:0: 'dgemm.pxd' not found
svae/hmm/cython_hmm_inference.pyx:7:44: Name 'dgemm' not declared in module 'scipy.linalg.cython_blas'
svae/hmm/cython_hmm_inference.pyx:7:0: 'dgemv.pxd' not found
svae/hmm/cython_hmm_inference.pyx:7:51: Name 'dgemv' not declared in module 'scipy.linalg.cython_blas'
svae/cython_util.pxd:3:0: 'dsymv.pxd' not found
svae/cython_util.pxd:3:38: Name 'dsymv' not declared in module 'scipy.linalg.cython_blas'
svae/cython_util.pxd:3:45: Name 'ddot' not declared in module 'scipy.linalg.cython_blas'
svae/cython_util.pxd:3:51: Name 'dgemm' not declared in module 'scipy.linalg.cython_blas'
svae/cython_util.pxd:3:58: Name 'dgemv' not declared in module 'scipy.linalg.cython_blas'
svae/cython_util.pxd:4:0: 'scipy.linalg.cython_lapack.pxd' not found
svae/cython_util.pxd:4:0: 'dtrtrs.pxd' not found
svae/cython_util.pxd:4:40: Name 'dtrtrs' not declared in module 'scipy.linalg.cython_lapack'
svae/cython_util.pxd:4:0: 'dpotrf.pxd' not found
svae/cython_util.pxd:4:48: Name 'dpotrf' not declared in module 'scipy.linalg.cython_lapack'
svae/cython_util.pxd:4:0: 'dpotrs.pxd' not found
svae/cython_util.pxd:4:56: Name 'dpotrs' not declared in module 'scipy.linalg.cython_lapack'
svae/cython_util.pxd:4:0: 'dpotri.pxd' not found
svae/cython_util.pxd:4:64: Name 'dpotri' not declared in module 'scipy.linalg.cython_lapack'

The remaining blocks point at the dgemv call in svae/hmm/cython_hmm_inference.pyx (lines 85-86):

for t in range(T):
    themax = max_vector(node_params[t])
    for i in range(N):
        alpha[t,i] = in_potential[i] * exp(node_params[t,i] - themax)
    lognorm += log(normalize_inplace(alpha[t])) + themax
    dgemv('T', &N, &N, &one, &pair_params[0,0], &N, &alpha[t,0], &inc,
          &zero, &in_potential[0], &inc)

with these messages:

svae/hmm/cython_hmm_inference.pyx:85:13: undeclared name not builtin: dgemv
svae/hmm/cython_hmm_inference.pyx:85:19: Cannot convert 'int *' to Python object
svae/hmm/cython_hmm_inference.pyx:85:23: Cannot convert 'int *' to Python object
svae/hmm/cython_hmm_inference.pyx:85:27: Cannot convert 'double *' to Python object
svae/hmm/cython_hmm_inference.pyx:85:33: Cannot convert 'double *' to Python object
svae/hmm/cython_hmm_inference.pyx:85:52: Cannot convert 'int *' to Python object
svae/hmm/cython_hmm_inference.pyx:85:56: Cannot convert 'double *' to Python object
svae/hmm/cython_hmm_inference.pyx:85:69: Cannot convert 'int *' to Python object
svae/hmm/cython_hmm_inference.pyx:86:14: Cannot convert 'double *' to Python object
svae/hmm/cython_hmm_inference.pyx:86:21: Cannot convert 'double *' to Python object
svae/hmm/cython_hmm_inference.pyx:86:39: Cannot convert 'int *' to Python object

Cythonizing then fails with:

Traceback (most recent call last):
  File "setup.py", line 6, in
    ext_modules=cythonize('**/*.pyx'),
  File "/usr/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 798, in cythonize
    cythonize_one(*args[1:])
  File "/usr/lib/python2.7/dist-packages/Cython/Build/Dependencies.py", line 915, in cythonize_one
    raise CompileError(None, pyx_file)
Cython.Compiler.Errors.CompileError: svae/hmm/cython_hmm_inference.pyx
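
In case it helps with diagnosis, here is the version check I plan to run. My guess (not confirmed) is that scipy.linalg.cython_blas/cython_lapack only ship with newer SciPy releases, so the .pxd files are simply not present on my machine:

# Quick environment check (my own debugging, not from the repo's docs).
import scipy
import Cython

print("scipy:", scipy.__version__)
print("cython:", Cython.__version__)

try:
    from scipy.linalg import cython_blas, cython_lapack
    print("cython_blas/cython_lapack are available")
except ImportError as e:
    print("missing:", e)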

cannot import TupleNode from autograd.container_types

I was trying to import svae.svae and got the error:

Traceback (most recent call last):
  File "", line 1, in
  File "svae/svae.py", line 5, in
    from util import split_into_batches, get_num_datapoints
  File "svae/util.py", line 12, in
    from autograd.container_types import TupleNode, ListNode
ImportError: cannot import name TupleNode

I checked autograd.container_types, and TupleNode is not available there. Did I miss something? Thanks for any help.
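
For what it's worth, here is the check I ran to see which autograd I have installed. My guess (not confirmed) is that the repo targets an older autograd release that still shipped TupleNode/ListNode in container_types:

# Check the installed autograd version and whether container_types still
# exposes the Node classes (my own debugging snippet).
import pkg_resources
print(pkg_resources.get_distribution("autograd").version)

try:
    from autograd import container_types
    print([name for name in dir(container_types) if name.endswith("Node")])
except ImportError as e:
    print("no container_types module:", e)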

NIW natural parameterization

Hi! Thanks for sharing the code and congrats on this amazing article!

I have a question about the natural parameterisation of the NIW distribution. I saw that in your code there is a function to re-parameterise it (standard_to_natural() in svae/distributions/niw.py), but I don't exactly see where the outer product in the parameter S comes from. Do you know any reference where I can check the natural parameterisation of the NIW distribution (I couldn't find any)?
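
For reference, here is my current understanding of the mapping and of where the outer product comes from; this is a sketch of the standard completion-of-squares argument, not the repo's exact code or conventions:

import numpy as np

def niw_standard_to_natural(S, m, kappa, nu):
    # Writing the NIW log-density as
    #   -1/2 tr(S Sigma^-1) - kappa/2 (mu - m)^T Sigma^-1 (mu - m) - (nu + d + 2)/2 log|Sigma| + const
    # and expanding the quadratic term folds kappa * m m^T into S, which is
    # where the outer product appears in the first natural parameter.
    d = S.shape[0]
    return (S + kappa * np.outer(m, m),  # pairs with the Sigma^{-1} statistic
            kappa * m,                   # pairs with Sigma^{-1} mu
            kappa,                       # pairs with mu^T Sigma^{-1} mu
            nu + d + 2)                  # pairs with log|Sigma| (sign/offset conventions vary)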

Many thanks in advance!

KL(q(x)||P(x|theta)) makes things worse?

Hi Matt,

I'm not sure if this is the right place to ask questions about your paper?
But anyway, I implemented a version of a latent Gaussian mixture model (normal-gamma prior for the Gaussians). I found that the gradient from the KL loss term ( E_q[ KL(q(x)||P(x|theta)) ] ) makes the results worse: it makes the latent space look like a Gaussian, and then the generator network is not able to learn to reconstruct at all. I manage to make it work sometimes, but it works all of the time without the KL loss.
What I am wondering is: is this the behaviour you saw in your experiments?
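
For reference, the per-datapoint term I implement is the usual Gaussian KL, roughly this (a simplified sketch of my own code, diagonal case):

import numpy as np

def kl_diag_gaussians(mu_q, var_q, mu_p, var_p):
    # KL( N(mu_q, diag(var_q)) || N(mu_p, diag(var_p)) ), summed over dimensions.
    return 0.5 * np.sum(np.log(var_p / var_q)
                        + (var_q + (mu_q - mu_p) ** 2) / var_p
                        - 1.0)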

Sub-question: I'm using a learning rate of around 0.1-0.2 for updating the global parameters with the natural gradient, and 0.001 for the neural network recogniser and generator. Do you optimise the two parts separately or together? It seems like the theory in the paper suggests the same learning rate, but I'm not sure. Sorry, I should have read through your code to answer these questions, but I'm a little bit clueless at reading code in general.

My code, if anyone is interested: https://github.com/Nat-D/SVAE-Torch
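
Concretely, the update scheme I am using with the two step sizes mentioned above looks roughly like this (a simplified Python sketch of my Torch code; the expected statistics and network gradients are computed elsewhere and passed in):

def training_step(global_natparams, prior_natparams, expected_stats,
                  nn_params, nn_grads, num_batches,
                  natgrad_step=0.1, nn_step=1e-3):
    # Natural-gradient ascent on the PGM's global natural parameters
    # (SVI-style update for a conjugate exponential-family prior) ...
    natgrad = prior_natparams + num_batches * expected_stats - global_natparams
    global_natparams = global_natparams + natgrad_step * natgrad

    # ... and an ordinary gradient step for the recogniser/generator weights.
    nn_params = nn_params - nn_step * nn_grads
    return global_natparams, nn_params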

Thanks a lot in advance,
Nat

Question regarding the LDS example

Hi, I've been going through this code for the last week and I'm really excited about its potential. However, I'm currently stuck with the lds_svae_dots.py example. I've tried many different hyper-parameter initializations and I cannot reproduce the corresponding figure from the paper (not even with the ones reported in the paper). What you see below is kind of the closest I can get:

https://imgur.com/a/f9mV5

For those who are trying as well: if the code is slow, go to the svae.optimizers.py file and unindent the callback so that the plotting function is called just once for a corresponding batch, as illustrated in the sketch below. You might also need to comment out the line with plt.close('all') in lds_svae_dots.py.
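
To illustrate what I mean by unindenting (a generic sketch of a minibatch loop, not the repo's optimizer code): moving the callback out by one indentation level makes it fire once per pass over the batches instead of on every inner update.

def minibatch_loop(gradfun, params, batches, num_epochs, step_size, callback=None):
    # Generic stochastic-gradient loop; the plotting callback sits in the outer
    # loop rather than inside the minibatch loop.
    for epoch in range(num_epochs):
        for batch in batches:
            params = params - step_size * gradfun(params, batch)
        if callback is not None:
            callback(params, epoch)
    return params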

What follows is just a short report of what I've found out. The VAE part seems to work as the input is correctly reconstructed whereas the inference is usually off (see fig above). I ran the tests and realized that some of them were failing for the newest version of pylds. I therefore installed an older version of pylds (73fceec2215347e0a0e35a5f116e69aa719b2efc) which made the tests pass on a commit from April 2016 (a89e886).

Unfortunately this does not fix the issue with reproducing the LDS result so I am wondering if you are aware of what might be the underlying reason for why the model doesn't converge to a good solution? Could there be a bug in the inference part of the code which causes this behavior?

Thank you for making the code publicly available!

Cheers,

Haffi

small diffs

I saw you fixed the from test_util import (changing it to svae.util), but somehow it reverted on trunk?
Perhaps flesh out the setup.py a bit. Here is what I did (version requirements are just whatever I had available on my Ubuntu 16.04 box).
I did not see the other experiments/ that wuaalb was mentioning. Perhaps they are on a branch?

diff --git a/setup.py b/setup.py
index c841313..01b893c 100644
--- a/setup.py
+++ b/setup.py
@@ -3,6 +3,16 @@ import numpy as np
from Cython.Build import cythonize

 setup(
+    name='svae',
+    version='0.0.0',
+    description='structure variational auto-encoder',
+    install_requires=['autograd>=1.1.7', 'numpy>=1.11.0', 'scipy>=0.17.0', 'Cython>=0.25.1'
+                      , 'pyhsmm>=0.1.6', 'toolz>=0.8.1'],
+    keywords=['autoencoder', 'machine learning', 'optimization'
+              , 'neural networks', 'Python', 'Numpy', 'Scipy'],
+    url='https://github.com/mattjj/svae',
+    packages=['svae', 'svae.distributions', 'svae.hmm', 'svae.lds', 'svae.models'],
+
     ext_modules=cythonize('**/*.pyx'),
     include_dirs=[np.get_include(),],
 )
diff --git a/tests/test_gaussian.py b/tests/test_gaussian.py
index 7b7e560..2fa0daa 100644
--- a/tests/test_gaussian.py
+++ b/tests/test_gaussian.py
@@ -5,8 +5,7 @@ from autograd import grad
 
 from svae.distributions.gaussian import logZ, expectedstats, \
     pack_dense, unpack_dense
-from test_util import rand_psd
-
+from svae.util import rand_psd
 
 def rand_gaussian(n):
     J = rand_psd(n) + n * np.eye(n)
