forcedotcom / distributions
Low-level primitives for collapsed Gibbs sampling in Python and C++
License: BSD 3-Clause "New" or "Revised" License
Problem Event Name: APPCRASH
Application Name: pythonw.exe
Application Version: 0.0.0.0
Application Timestamp: 5398d7c1
Fault Module Name: python34.dll
Fault Module Version: 3.4.1150.1013
Fault Module Timestamp: 5398d7c0
Exception Code: c0000005
Exception Offset: 00000000000d7881
OS Version: 6.1.7601.2.1.0.256.4
Locale ID: 1033
This happens whenever I try to install distributions-2.0.26 under Anaconda3 (Python 3.4) on 64-bit Windows 7. It happens whether pip is used or the install is attempted from source. I have also tried installing from an Administrator Command Prompt and from the Qt window, and have gone back to distributions-2.0.10 and tried that. The same thing happens when trying to install dpmix.
Anaconda3 was just installed today, and I ran updates of numpy, scipy, and anaconda itself.
Thoughts?
The generated code in git is compatible with Ubuntu 12.04 but not 14.04. This currently requires a workaround in loom on Ubuntu 14.04.
> from distributions.lp.models import nich
> shared = nich.Shared.from_dict(nich.EXAMPLES[0]['shared'])
> group = nich.Group.from_values(shared, nich.sample_group(shared, 10))
> sampler = nich.Sampler(shared, group)
> sampler.eval(shared)
0.0
and
> nosetests distributions/tests/test_models.py:test_joint
FSegmentation fault (core dumped)
The core dump is also not caught when running the tests via make.
I think it would be really useful for a newcomer to have a much simpler example, e.g. the classic one: sample from a mixture of Gaussians, then estimate it parametrically and non-parametrically.
I also found the compression example a little hard to follow. It could certainly use more comments at each step explaining what the model is actually inferring.
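For concreteness, the data-generation half of such an example might look like this (a hypothetical sketch using only the standard library, not distributions API code; the weights, means, and stds are made-up values):

```python
import random

def sample_mixture(n, weights=(0.3, 0.7), means=(-2.0, 3.0), stds=(1.0, 0.5)):
    """Draw n samples from a toy two-component gaussian mixture."""
    samples = []
    for _ in range(n):
        # pick a component according to the mixture weights
        k = 0 if random.random() < weights[0] else 1
        samples.append(random.gauss(means[k], stds[k]))
    return samples

data = sample_mixture(1000)
```

The second half of the example would then fit this data parametrically (fixed number of components) and non-parametrically (e.g. with a Dirichlet process mixture).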
sample_beta will return NaNs when alpha and beta are small (e.g. 1/100).
A more robust implementation might look like the one in numpy (https://github.com/numpy/numpy/blob/master/numpy/random/mtrand/distributions.c#L183), which uses Johnk's algorithm (see e.g. page 418 of http://www.eirene.de/Devroye.pdf).
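For reference, a log-space version of Johnk's algorithm avoids the underflow that produces the NaNs. A minimal Python sketch (illustrative only, not the numpy or distributions implementation):

```python
import math
import random

def sample_beta_johnk(alpha, beta, rng=random):
    """Johnk's algorithm in log space; stays finite even for tiny alpha, beta."""
    while True:
        # 1 - random() lies in (0, 1], so log() never sees zero
        log_x = math.log(1.0 - rng.random()) / alpha
        log_y = math.log(1.0 - rng.random()) / beta
        log_m = max(log_x, log_y)
        # log(x + y), computed stably via log-sum-exp
        log_sum = log_m + math.log(math.exp(log_x - log_m) + math.exp(log_y - log_m))
        if log_sum <= 0.0:  # accept iff x + y <= 1
            return math.exp(log_x - log_sum)  # x / (x + y)
```

A naive implementation would compute x = u**(1/alpha) directly, which underflows to 0 for alpha around 1/100 and yields 0/0 = NaN in the final ratio; working with log_x and log_y sidesteps that.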
pip install distributions
Downloading/unpacking distributions
Downloading distributions-2.0.0.tar.gz (67kB): 67kB downloaded
Running setup.py egg_info for package distributions
building cython extensions
Traceback (most recent call last):
File "<string>", line 16, in <module>
File "/home/wiecki/envs/pymc3/build/distributions/setup.py", line 48, in <module>
with open('README.md') as f:
IOError: [Errno 2] No such file or directory: 'README.md'
Complete output from command python setup.py egg_info:
building cython extensions
Traceback (most recent call last):
File "<string>", line 16, in <module>
File "/home/wiecki/envs/pymc3/build/distributions/setup.py", line 48, in <module>
with open('README.md') as f:
IOError: [Errno 2] No such file or directory: 'README.md'
I then tried to pip install directly from the github repo but got:
building 'distributions.hp.random' extension
x86_64-linux-gnu-gcc -pthread -fno-strict-aliasing -DNDEBUG -g -fwrapv -O2 -Wall -Wstrict-prototypes -fPIC -Iinclude -Idistributions -I/home/wiecki/envs/pymc3/local/lib/python2.7/site-packages/numpy/core/include -I/usr/include/python2.7 -c distributions/hp/random.cpp -o build/temp.linux-x86_64-2.7/distributions/hp/random.o -DDIST_DEBUG_LEVEL=3 -DDIST_THROW_ON_ERROR=1 -std=c++0x -Wall -Werror -Wno-unused-function -Wno-sign-compare -Wno-strict-aliasing -O3 -ffast-math -funsafe-math-optimizations -mfpmath=sse -msse4.1 -DNPY_NO_DEPRECATED_API=NPY_1_7_API_VERSION
cc1plus: warning: command line option ‘-Wstrict-prototypes’ is valid for C/ObjC but not for C++ [enabled by default]
distributions/hp/random.cpp: In function ‘int __pyx_pf_5numpy_7ndarray___getbuffer__(PyArrayObject*, Py_buffer*, int)’:
distributions/hp/random.cpp:2044:52: error: ‘NPY_C_CONTIGUOUS’ was not declared in this scope
__pyx_t_2 = ((!(PyArray_CHKFLAGS(__pyx_v_self, NPY_C_CONTIGUOUS) != 0)) != 0);
^
distributions/hp/random.cpp:2082:52: error: ‘NPY_F_CONTIGUOUS’ was not declared in this scope
__pyx_t_1 = ((!(PyArray_CHKFLAGS(__pyx_v_self, NPY_F_CONTIGUOUS) != 0)) != 0);
^
distributions/hp/random.cpp:2245:42: error: ‘PyArrayObject’ has no member named ‘descr’
__pyx_t_4 = ((PyObject *)__pyx_v_self->descr);
^
In file included from /usr/include/python2.7/Python.h:80:0,
from distributions/hp/random.cpp:16:
distributions/hp/random.cpp: In function ‘void __pyx_f_5numpy_set_array_base(PyArrayObject*, PyObject*)’:
distributions/hp/random.cpp:3797:27: error: ‘PyArrayObject’ has no member named ‘base’
Py_XDECREF(__pyx_v_arr->base);
^
/usr/include/python2.7/object.h:823:34: note: in definition of macro ‘Py_XDECREF’
#define Py_XDECREF(op) do { if ((op) == NULL) ; else Py_DECREF(op); } while (0)
^
distributions/hp/random.cpp:3797:27: error: ‘PyArrayObject’ has no member named ‘base’
Py_XDECREF(__pyx_v_arr->base);
^
/usr/include/python2.7/object.h:772:24: note: in definition of macro ‘Py_DECREF’
--((PyObject*)(op))->ob_refcnt != 0) \
^
distributions/hp/random.cpp:3797:3: note: in expansion of macro ‘Py_XDECREF’
Py_XDECREF(__pyx_v_arr->base);
^
distributions/hp/random.cpp:3797:27: error: ‘PyArrayObject’ has no member named ‘base’
Py_XDECREF(__pyx_v_arr->base);
^
/usr/include/python2.7/object.h:115:47: note: in definition of macro ‘Py_TYPE’
#define Py_TYPE(ob) (((PyObject*)(ob))->ob_type)
^
/usr/include/python2.7/object.h:775:9: note: in expansion of macro ‘_Py_Dealloc’
_Py_Dealloc((PyObject *)(op)); \
^
/usr/include/python2.7/object.h:823:54: note: in expansion of macro ‘Py_DECREF’
#define Py_XDECREF(op) do { if ((op) == NULL) ; else Py_DECREF(op); } while (0)
^
distributions/hp/random.cpp:3797:3: note: in expansion of macro ‘Py_XDECREF’
Py_XDECREF(__pyx_v_arr->base);
^
distributions/hp/random.cpp:3797:27: error: ‘PyArrayObject’ has no member named ‘base’
Py_XDECREF(__pyx_v_arr->base);
^
/usr/include/python2.7/object.h:762:45: note: in definition of macro ‘_Py_Dealloc’
(*Py_TYPE(op)->tp_dealloc)((PyObject *)(op)))
^
/usr/include/python2.7/object.h:823:54: note: in expansion of macro ‘Py_DECREF’
#define Py_XDECREF(op) do { if ((op) == NULL) ; else Py_DECREF(op); } while (0)
^
distributions/hp/random.cpp:3797:3: note: in expansion of macro ‘Py_XDECREF’
Py_XDECREF(__pyx_v_arr->base);
^
distributions/hp/random.cpp:3806:16: error: ‘PyArrayObject’ has no member named ‘base’
__pyx_v_arr->base = __pyx_v_baseptr;
^
distributions/hp/random.cpp: In function ‘PyObject* __pyx_f_5numpy_get_array_base(PyArrayObject*)’:
distributions/hp/random.cpp:3841:30: error: ‘PyArrayObject’ has no member named ‘base’
__pyx_t_1 = ((__pyx_v_arr->base == NULL) != 0);
^
In file included from /usr/include/python2.7/Python.h:80:0,
from distributions/hp/random.cpp:16:
distributions/hp/random.cpp:3864:44: error: ‘PyArrayObject’ has no member named ‘base’
__Pyx_INCREF(((PyObject *)__pyx_v_arr->base));
^
/usr/include/python2.7/object.h:767:18: note: in definition of macro ‘Py_INCREF’
((PyObject*)(op))->ob_refcnt++)
^
distributions/hp/random.cpp:3864:5: note: in expansion of macro ‘__Pyx_INCREF’
__Pyx_INCREF(((PyObject *)__pyx_v_arr->base));
^
distributions/hp/random.cpp:3865:41: error: ‘PyArrayObject’ has no member named ‘base’
__pyx_r = ((PyObject *)__pyx_v_arr->base);
^
error: command 'x86_64-linux-gnu-gcc' failed with exit status 1
This is with most recent cython and numpy 1.8.1.
A C++ implementation nich.cc is already in place, but is missing a sampler.
Distributions 1.0 python code can be found at
https://github.com/forcedotcom/distributions/blob/de5876fe/distributions/conjugate/nich.py
and cython code at
https://github.com/forcedotcom/distributions/blob/de5876fe/distributions/conjugate/cynich.pyx
Really this is an issue with std::uniform_real_distribution, but until that is fixed we should make sure this code is robust to its weird behavior. Very rarely (say one in 10^7 samples), std::uniform_real_distribution(0.0, 1.0) returns exactly 1.0, meaning sample_unif01 returns exactly 1.0. If the vector of likelihoods passed to sample_from_likelihoods ends with a zero, then sample_from_likelihoods will return the length of the likelihood array rather than the index of its last nonzero element. This can be fixed by replacing
if (DIST_UNLIKELY(t < 0)) {
on line 325 of distributions/random.hpp with
if (DIST_UNLIKELY(t <= 0)) {
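To illustrate the off-by-one, here is a hypothetical Python analogue of the routine (not the actual C++ code):

```python
def sample_from_likelihoods(likelihoods, unif01):
    """Inverse-CDF sampling; unif01 stands in for sample_unif01()."""
    t = unif01 * sum(likelihoods)
    for i, likelihood in enumerate(likelihoods):
        t -= likelihood
        if t <= 0:  # the buggy version tests 't < 0'
            return i
    return len(likelihoods) - 1  # rounding guard

# With 't < 0' instead, unif01 == 1.0 and a trailing zero likelihood
# leave t exactly at 0 after the last nonzero entry, the loop walks
# off the end, and the routine returns the array length.
assert sample_from_likelihoods([0.5, 0.5, 0.0], 1.0) == 1
```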
A C++ implementation of template<> ADD<2>
can be found at
https://github.com/priorknowledge/tardis/blob/b5ef6cf4/util/compmodels.h#L396
A python implementation of beta-bernoulli can be found at
https://github.com/forcedotcom/distributions/blob/de5876fe/distributions/conjugate/bb.py
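For orientation, the conjugate bookkeeping involved is small. A minimal sketch of a beta-bernoulli group (hypothetical illustration of the standard conjugate updates, not the linked library code):

```python
import math

class BetaBernoulliGroup:
    """Sufficient statistics and posterior predictive for beta-bernoulli."""

    def __init__(self, alpha=1.0, beta=1.0):
        self.alpha = alpha  # prior pseudo-counts
        self.beta = beta
        self.heads = 0      # observed sufficient statistics
        self.tails = 0

    def add_value(self, value):
        if value:
            self.heads += 1
        else:
            self.tails += 1

    def score_value(self, value):
        """Log posterior predictive probability of the next observation."""
        a = self.alpha + self.heads
        b = self.beta + self.tails
        return math.log(a / (a + b)) if value else math.log(b / (a + b))
```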
======================================================================
FAIL: distributions.tests.test_model_flavors.test_group('dpd',)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/ubuntu/.virtualenvs/sf/local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/home/ubuntu/sf/distributions/distributions/tests/test_model_flavors.py", line 86, in _test_group
    shared.add_value(value)
  File "/home/ubuntu/sf/distributions/distributions/dbg/models/dpd.py", line 138, in add_value
    assert value != OTHER, 'cannot add OTHER'
AssertionError: cannot add OTHER
Maintainers have moved to the repo http://github.com/posterior/distributions
Do we want this? data-microscopes is already using my unofficial version of it, and it seems to work OK. If so, I think we'll want to do two things:
(A) make an official distributions channel on binstar
(B) get a Travis CI hook to build new recipes when tags are made (or this can be manual, but it has to be done on an OS X machine and a Linux machine separately)
Right now the pip install route is (a) not kept up to date and (b) doesn't quite work out of the box (you still need to compile libdistributions yourself). I think the conda install can potentially be more portable since it, for example, packages libprotobuf, so you don't have to worry about protobuf library version discrepancies. conda also gets the rpath handling right, so you don't have to keep exporting (DY)LD_LIBRARY_PATH.
Destination: https://pypi.python.org/pypi/distributions
I'm trying to install distributions via pip install distributions
on OS X, and I get the following error (after a lot of spew):
distributions/lp/models/_bb.cpp:6342:42: error: expected the class name after '~' to name a destructor
p->scores.distributions::VectorFloat::~VectorFloat();
^
5 warnings and 1 error generated.
error: command 'gcc' failed with exit status 1
My environment:
(x)stephentu@ibanez:~$ python --version
Python 2.7.7 :: Anaconda 2.0.1 (x86_64)
(x)stephentu@ibanez:~$ uname -a
Darwin ibanez 13.3.0 Darwin Kernel Version 13.3.0: Tue Jun 3 21:27:35 PDT 2014; root:xnu-2422.110.17~1/RELEASE_X86_64 x86_64
(x)stephentu@ibanez:~$ python -c 'import Cython; print Cython.__version__'
0.20.2
I believe this error is caused by Cython generating C++ code that is invalid for clang (gcc is OK with it). A simple fix, I believe, is to require cython>=0.20.2b1 when compiling with clang.
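A build-time guard for this could look like the following sketch (the function name, the clang detection via the compiler string, and the crude version parsing are all illustrative assumptions, not existing setup.py code):

```python
import re

def check_cython_for_clang(cc, cython_version):
    """Return True if this compiler/Cython pair is expected to work.

    clang rejects some C++ emitted by older Cython; gcc accepts it.
    """
    if 'clang' not in cc:
        return True
    # crude numeric compare, good enough for '0.20.2b1'-style strings
    version = tuple(int(x) for x in re.findall(r'\d+', cython_version)[:3])
    return version >= (0, 20, 2)
```

setup.py could call this before cythonizing and raise a RuntimeError with an upgrade hint when it returns False.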
I am getting persistent failure of one of the tests:
======================================================================
FAIL: distributions.tests.test_clustering.test_sample_matches_score_counts('lp.LowEntropy',)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/stephentu/anaconda/envs/distributions_clean/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/Users/stephentu/distributions_clean/distributions/tests/test_clustering.py", line 92, in test_one_model
test_fun(Model, EXAMPLE, sample_count)
File "/Users/stephentu/distributions_clean/distributions/tests/test_clustering.py", line 164, in test_sample_matches_score_counts
assert_greater(gof, MIN_GOODNESS_OF_FIT)
AssertionError: 0.00042565398058435689 not greater than 0.001
-------------------- >> begin captured stdout << ---------------------
example 1/4
sample_size = 2
Prob Count
0.738 1428 ------------------------------------------------------------
0.262 572 ------------------------
LowEntropy gof = 0.000426
--------------------- >> end captured stdout << ----------------------
Here's how I'm building/running the tests:
(distributions_clean)stephentu@ibanez:~/distributions_clean(master)$ git log -n1
commit 8867b7823b3ab1dc0fa041f1fe25089d8393ca89
Merge: e1c80d0 3c82590
Author: jglidden-salesforce <[email protected]>
Date: Fri Aug 1 16:26:40 2014 -0700
Merge pull request #72 from forcedotcom/rng-args
Rng args
(distributions_clean)stephentu@ibanez:~/distributions_clean(master)$ git status
On branch master
Your branch is up-to-date with 'origin/master'.
nothing to commit, working directory clean
(distributions_clean)stephentu@ibanez:~/distributions_clean(master)$ mkdir build
(distributions_clean)stephentu@ibanez:~/distributions_clean(master)$ cd build
(distributions_clean)stephentu@ibanez:~/distributions_clean/build(master)$ cmake -DCMAKE_INSTALL_PREFIX=/Users/stephentu/anaconda/envs/distributions_clean -DEXTRA_INCLUDE_PATH=/Users/stephentu/anaconda/envs/distributions_clean/include -DEXTRA_LIBRARY_PATH=/Users/stephentu/anaconda/envs/distributions_clean/lib ..
-- The C compiler identification is AppleClang 5.1.0.5030040
-- The CXX compiler identification is AppleClang 5.1.0.5030040
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
CXX_FLAGS = -stdlib=libc++ -mmacosx-version-min=10.7 -Wno-deprecated-register -fPIC -g -std=c++0x -Wall -Wextra -Werror -Wno-unused-parameter -Wno-strict-aliasing -O3 -mfpmath=sse -msse4.1 -ffast-math -funsafe-math-optimizations
[... snip ...]
(distributions_clean)stephentu@ibanez:~/distributions_clean/build(master)$ make && make install && cd ..
[... snip ...]
(distributions_clean)stephentu@ibanez:~/distributions_clean(master)$ PYDISTRIBUTIONS_USE_LIB=1 LIBRARY_PATH=/Users/stephentu/anaconda/envs/distributions_clean/lib EXTRA_INCLUDE_PATH=/Users/stephentu/anaconda/envs/distributions_clean/include pip install -e .
(distributions_clean)stephentu@ibanez:~/distributions_clean(master)$ export DYLD_FALLBACK_LIBRARY_PATH=~/anaconda/envs/distributions_clean/lib
(distributions_clean)stephentu@ibanez:~/distributions_clean(master)$ nosetests -v distributions
Adapt cython implementation from
https://github.com/priorknowledge/distributions/blob/de5876fe/distributions/conjugate/dpm.py
Adapt python implementation from
https://github.com/priorknowledge/distributions/blob/de5876fe/distributions/conjugate/dpm.py
I think Fritz and I had a conversation somewhat related to this topic, but I sort of forgot the outcome. If I have a DD where the dim is very large, and I expect the non-zero entries of the suffstats (e.g. the counts) to be very sparse, what's the right way to do this in distributions?
Essentially I want a DD where counts[] is a Sparse<> instead of a float[]. Would it be worth creating a separate model, SparseDD?
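As a strawman, the sparse variant's sufficient statistics could be keyed by value so that only nonzero counts are stored (a hypothetical sketch, not related to the actual DD implementation):

```python
from collections import defaultdict

class SparseDDGroup:
    """Sufficient statistics for a high-dim Dirichlet-discrete model,
    storing only nonzero counts instead of a dense float[dim] array."""

    def __init__(self):
        self.counts = defaultdict(int)
        self.total = 0

    def add_value(self, value):
        self.counts[value] += 1
        self.total += 1

    def remove_value(self, value):
        self.counts[value] -= 1
        self.total -= 1
        if self.counts[value] == 0:
            del self.counts[value]  # keep the support sparse
```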
Am I missing some undocumented environment variable?
python -m loom.datasets init
Traceback (most recent call last):
File "/usr/lib/python2.7/runpy.py", line 162, in _run_module_as_main
"__main__", fname, loader, pkg_name)
File "/usr/lib/python2.7/runpy.py", line 72, in _run_code
exec code in run_globals
File "/home/ubuntu/sf/loom/loom/datasets.py", line 32, in <module>
import loom.format
File "loom/format.py", line 39, in <module>
import loom.cFormat
ImportError: libdistributions_shared.so: cannot open shared object file: No such file or directory
This is required to use ${DISTRIBUTIONS_SHARED_LIBS} and ${DISTRIBUTIONS_STATIC_LIBS} variables in client CMakeLists.txt scripts.
======================================================================
FAIL: distributions.tests.test_random.test_sample_prob_from_scores
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/travis/virtualenv/python2.7_with_system_site_packages/local/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
    self.test(*self.arg)
  File "/home/travis/build/forcedotcom/distributions/distributions/tests/test_random.py", line 210, in test_sample_prob_from_scores
    assert_samples_match_scores(sampler)
  File "/home/travis/build/forcedotcom/distributions/distributions/tests/util.py", line 217, in assert_samples_match_scores
    assert_counts_match_probs(counts, probs)
  File "/home/travis/build/forcedotcom/distributions/distributions/tests/util.py", line 203, in assert_counts_match_probs
    assert gof > tol, 'failed with goodness of fit {}'.format(gof)
AssertionError: failed with goodness of fit 0.000945166953359
-------------------- begin captured stdout ---------------------
 EXPECT  ACTUAL  VALUE
10000.0   10000      0
goodness of fit = 1
 EXPECT  ACTUAL  VALUE
 5693.5    5747      0
 4306.5    4253      1
goodness of fit = 0.126409842659
 EXPECT  ACTUAL  VALUE
 5779.1    5802      2
 2534.8    2458      0
 1686.1    1740      1
goodness of fit = 0.0671215589621
 EXPECT  ACTUAL  VALUE
 8314.4    8298      2
  670.7     706      0
  585.9     581      1
  429.0     415      3
goodness of fit = 0.44019109939
 EXPECT  ACTUAL  VALUE
 4839.9    4842      3
 1876.9    1901      2
 1808.5    1814      1
 1412.0    1379      0
   62.6      64      4
goodness of fit = 0.856219051563
 EXPECT  ACTUAL  VALUE
 3088.0    3109      4
 2174.6    2212      3
 1713.8    1651      2
 1430.5    1450      5
  808.0     794      1
  785.0     784      0
goodness of fit = 0.495742040274
 EXPECT  ACTUAL  VALUE
 2266.4    2226      0
 2015.0    2095      5
 1752.0    1722      2
 1525.7    1621      3
  904.2     846      4
  872.3     807      6
  664.4     683      1
goodness of fit = 0.000945166953359
--------------------- end captured stdout ----------------------
----------------------------------------------------------------------
Ran 310 tests in 28.206s
FAILED (SKIP=10, failures=1)
make: *** [test_cy] Error 1
The command "make test$FLAVOR" exited with 2. Done. Your build exited with 1.
pip fails on OSX 10.10.5 with the following error
clang++ -bundle -undefined dynamic_lookup build/temp.macosx-10.10-x86_64-2.7/distributions/lp/special.o -L/usr/local/lib -L/usr/local/opt/openssl/lib -L/usr/local/opt/sqlite/lib -ldistributions_shared -lm -o build/lib.macosx-10.10-x86_64-2.7/distributions/lp/special.so
ld: library not found for -ldistributions_shared
clang: error: linker command failed with exit code 1 (use -v to see invocation)
error: command 'clang++' failed with exit status 1
A C++ implementation gp.cc is already in place, but is missing a sampler.
Distributions 1.0 python code can be found at
https://github.com/forcedotcom/distributions/blob/de5876fe/distributions/conjugate/gp.py
and cython code at
https://github.com/forcedotcom/distributions/blob/de5876fe/distributions/conjugate/cygp.pyx
Does it make sense to add a plus_group method to group to combine suff stats? This is a common pattern in inference for HDP models and, I believe, split-merge kernels.
shared = model.Shared()
group1 = model.Group()
group1.init(shared)
group2 = model.Group()
group2.init(shared)
group3 = group1.plus_group(group2)
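For a counts-based model, plus_group would just add sufficient statistics elementwise. A hypothetical sketch of the semantics (toy class, not the library's Group):

```python
class Group:
    """Toy counts-based group illustrating plus_group semantics."""

    def __init__(self, counts=None):
        self.counts = list(counts or [])

    def plus_group(self, other):
        # merging two groups = adding their sufficient statistics,
        # padding the shorter counts vector with zeros
        n = max(len(self.counts), len(other.counts))
        a = self.counts + [0] * (n - len(self.counts))
        b = other.counts + [0] * (n - len(other.counts))
        return Group(x + y for x, y in zip(a, b))
```

Since both inputs are left untouched, a split-merge kernel could evaluate a proposed merge and discard it without rebuilding either group.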
Running CC=clang PYDISTRIBUTIONS_USE_LIB=1 make test
on OSX, I see these errors:
======================================================================
FAIL: distributions.tests.test_clustering.test_models(<type 'distributions.lp.clustering.PitmanYor'>, 5)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/cpetschulat/.virtualenvs/pk/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/Users/cpetschulat/src/distributions/distributions/tests/test_clustering.py", line 97, in _test_models
'not normalized: {}'.format(total))
AssertionError: not normalized: 0.989162792478
-------------------- >> begin captured stdout << ---------------------
Example 0
Prob Count
0.200 190 ------------------------------------------------------------
0.050 63 --------------------
0.050 53 -----------------
0.050 52 ----------------
0.050 42 -------------
0.050 40 -------------
0.017 22 -------
0.017 21 -------
PitmanYor gof = 0.214
Example 1
Prob Count
0.161 151 ------------------------------------------------------------
0.045 61 ------------------------
0.045 52 ---------------------
0.045 47 -------------------
0.045 45 ------------------
0.045 49 -------------------
0.020 20 --------
0.019 22 ---------
PitmanYor gof = 0.412
Example 2
Prob Count
0.755 763 ------------------------------------------------------------
0.016 18 -
0.016 18 -
0.016 15 -
0.016 11 -
0.016 15 -
0.016 15 -
0.016 14 -
PitmanYor gof = 0.919
Example 3
--------------------- >> end captured stdout << ----------------------
======================================================================
FAIL: distributions.tests.test_models.test_sample_group('lp.models.dpd',)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/Users/cpetschulat/.virtualenvs/pk/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/Users/cpetschulat/src/distributions/distributions/tests/test_models.py", line 95, in test_one_model
test_fun(Model, EXAMPLE)
File "/Users/cpetschulat/src/distributions/distributions/tests/test_models.py", line 292, in test_sample_group
assert_greater(gof, MIN_GOODNESS_OF_FIT)
AssertionError: 1.4720371676952502e-06 not greater than 0.001
-------------------- >> begin captured stdout << ---------------------
example 1/1
Prob Count
0.417 368 ------------------------------------------------------------
0.188 195 --------------------------------
0.188 187 ------------------------------
0.042 65 -----------
0.042 56 ---------
0.042 48 --------
0.042 30 -----
0.021 34 ------
DirichletProcessDiscrete gof = 1.47e-06
--------------------- >> end captured stdout << ----------------------
make test
now fails for me with:
...
Linking CXX executable foo
...
/home/ubuntu/sf/distributions/build/src/libdistributions_shared.so: undefined reference to `google::protobuf::internal::WireFormat::SerializeUnknownFields(google::protobuf::UnknownFieldSet const&, google::protobuf::io::CodedOutputStream*)'
Am I missing an undocumented environment variable?
$ env | grep DIST
DISTRIBUTIONS_PATH=/home/ubuntu/sf/distributions
DISTRIBUTIONS_USE_PROTOBUF=1
$ env | grep PATH
LD_LIBRARY_PATH=:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib:/home/ubuntu/johann/lib:/home/ubuntu/pomagma/lib
PATH=/home/ubuntu/.virtualenvs/sf/bin:/usr/local/heroku/bin:/home/ubuntu/.virtualenvs/sf/bin:/usr/local/heroku/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/home/ubuntu/bin:/home/ubuntu/tool_config/bin:/home/ubuntu/johann/bin:/home/ubuntu/kazoo/bin:/home/ubuntu/jtext/bin:/home/ubuntu/bin:/home/ubuntu/tool_config/bin:/home/ubuntu/johann/bin:/home/ubuntu/kazoo/bin:/home/ubuntu/jtext/bin
DISTRIBUTIONS_PATH=/home/ubuntu/sf/distributions
Where did they go?