
brancher's People

Contributors

cclauss, chocoladisco, colcarroll, erjanmx, immiora, lucaambrogioni, seelikat, svendh, umuguc


brancher's Issues

Undefined name: 'matmul' in functions.py

Missing import? Typo?

flake8 testing of https://github.com/AI-DI/Brancher on Python 3.7.1

$ flake8 . --count --select=E9,F63,F72,F82 --show-source --statistics

./brancher/functions.py:135:12: F821 undefined name 'matmul'
    return matmul(weight, input) + bias
           ^
1     F821 undefined name 'matmul'
1

E901, E999, F821, F822, and F823 are the "showstopper" flake8 issues that can halt the runtime with a SyntaxError, NameError, etc. These five are different from most other flake8 issues, which are merely "style violations": useful for readability, but they do not affect runtime safety.

  • F821: undefined name 'name'
  • F822: undefined name 'name' in __all__
  • F823: local variable 'name' referenced before assignment
  • E901: SyntaxError or IndentationError
  • E999: SyntaxError -- failed to compile a file into an Abstract Syntax Tree
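
A likely fix, assuming the call flagged above was meant to be PyTorch's matmul (an assumption; the author may have intended a local helper instead):

import torch

def apply_linear(weight, input, bias):
    # Hypothetical wrapper around the line reported by flake8; only the
    # matmul call itself is taken from functions.py:135. Qualifying the
    # call as torch.matmul (or importing matmul from torch) removes F821.
    return torch.matmul(weight, input) + bias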

for_gradient parameter not used?

The variables.py module contains some functions that have a for_gradient parameter. I assume it is for calculating gradients using the results of those functions, but I cannot find what actually happens in the code when this parameter is set to True. Only estimate_log_model_evidence seems to use for_gradient.

Boolean Operations Between Variables and PartialLink

It seems like lt, le, gt, ge, eq, and ne would all be useful operations in probabilistic models. I tried adding these to the Variable and PartialLink classes, but maybe there is more to it than that. Are these operations doable?
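
A toy sketch of the operator-overloading mechanism involved (a stand-in class, not Brancher's actual Variable; in Brancher these operators would need to return new deterministic nodes in the model graph rather than evaluating eagerly):

import torch

class ToyVariable:
    # Minimal stand-in holding a tensor value, only to show how lt/le/gt/ge
    # can be added by delegating to the corresponding torch functions.
    def __init__(self, value):
        self.value = torch.as_tensor(value)

    def _compare(self, other, fn):
        other_value = other.value if isinstance(other, ToyVariable) else torch.as_tensor(other)
        return ToyVariable(fn(self.value, other_value))

    def __lt__(self, other):
        return self._compare(other, torch.lt)

    def __le__(self, other):
        return self._compare(other, torch.le)

    def __gt__(self, other):
        return self._compare(other, torch.gt)

    def __ge__(self, other):
        return self._compare(other, torch.ge)

a, b = ToyVariable(1.0), ToyVariable(2.0)
print((a < b).value)  # tensor(True)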

Fixing random seeds

I have been trying to make multiple runs of the exact same program repeatable, with no luck so far. I have been using the following method calls:

import random

import numpy as np
import torch

def fix_seed():
    seed_value = 1337

    # Seed every source of randomness used by NumPy, PyTorch, and Python.
    np.random.seed(seed_value)
    torch.manual_seed(seed_value)
    random.seed(seed_value)
    torch.cuda.manual_seed(seed_value)
    torch.cuda.manual_seed_all(seed_value)
    # Force deterministic cuDNN kernels (at some cost in speed).
    torch.backends.cudnn.deterministic = True
    torch.backends.cudnn.benchmark = False

I have tried cfg.set_device with both gpu and cpu, and in both cases I cannot reproduce Brancher calls. My actual program just runs an inference and then samples from the posterior. I was under the impression that if I fix all of these random seeds before a run and the input is exactly the same, the runs should be reproducible. So far I have not been able to achieve that.
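
One thing worth checking under these assumptions is that the seeds are re-fixed before every run, not just once per process. A rough sketch, where run_experiment is a hypothetical placeholder for the inference and posterior sampling:

def run_experiment():
    # Hypothetical placeholder for the actual Brancher inference and
    # posterior sampling code.
    pass

fix_seed()
first = run_experiment()

fix_seed()  # re-seed so the second run starts from the same RNG state
second = run_experiment()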

Combine with other pytorch models?

In many cases, the probability model built by Brancher might be just one part of a larger model, with the rest built directly in PyTorch (e.g., a ResNet). How can they be combined in a simple/easy way?

How to use GPU to accelerate ?

Thank you for this interesting package. Can we use a GPU to accelerate the computation, like model.cuda() in PyTorch?

Bounded variables?

Hi, I am looking to implement a probabilistic model in Brancher that requires a truncated normal distribution. In PyMC3 this can be achieved with bounded variables (https://docs.pymc.io/api/bounds.html) as well as with an explicit TruncatedNormal class. Is there an equivalent or some sort of workaround in Brancher?
Thanks
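
For reference, the PyMC3 pattern the question points to looks roughly like this (PyMC3 3.x; argument names can differ slightly between versions):

import pymc3 as pm

with pm.Model():
    # A Normal bounded below at zero, built with the Bound factory.
    BoundedNormal = pm.Bound(pm.Normal, lower=0.0)
    x = BoundedNormal("x", mu=1.0, sigma=3.0)
    # The explicit truncated-normal distribution.
    y = pm.TruncatedNormal("y", mu=0.0, sigma=1.0, lower=-1.0, upper=1.0)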

"scale" parameter value modified?

The following code with scale=0.01

from brancher.variables import ProbabilisticModel
from brancher.standard_variables import NormalVariable

x = NormalVariable(loc=0., scale=0.01, name="mu")
model = ProbabilisticModel([x])
sample = model.get_sample(2)
print(sample)

returns this:

   mu_scale  mu_loc        mu
0 -4.600166     0.0 -0.030332
1 -4.600166     0.0 -0.008252

The value of mu_scale has been changed. Is this expected behavior? I declared the random variable x with scale=0.01, but got -4.600166, which looks like log(0.01).
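
For what it is worth, -4.600166 matches an inverted softplus rather than a plain log, which would be consistent with the np.log(np.exp(y - self.lower_bound) - 1) expression quoted in the next issue (assuming a lower bound of 0). A quick check:

import numpy as np

scale = 0.01
print(np.log(scale))              # -4.6052  (plain log, does not match)
print(np.log(np.exp(scale) - 1))  # -4.6002  (inverse softplus, matches mu_scale)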

Negative value of "scale" parameter doesn't throw an exception

Hi :),
setting a negative value for the scale parameter doesn't throw an exception and even allows the user to sample from that model. There is only a RuntimeWarning, "invalid value encountered in log", raised from return np.log(np.exp(y - self.lower_bound) - 1).

x = NormalVariable(loc=0., scale=-10., name="mu")
model = ProbabilisticModel([x])

If one tries to sample from this model, the output will contain NaN, but maybe it would be better to throw an exception for the invalid parameters of the model.

sample = model.get_sample(2)
print(sample)
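
A minimal sketch of the kind of check the issue asks for (a hypothetical helper, not Brancher's actual code), which would turn the silent NaNs into an immediate error:

def validate_positive(name, value):
    # Hypothetical parameter check: reject non-positive values before they
    # reach the log/softplus transform that currently produces the NaNs.
    if value <= 0:
        raise ValueError("{} must be positive, got {}".format(name, value))
    return value

validate_positive("scale", -10.)  # raises ValueError instead of sampling NaNs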

doc strings

I am currently working on the documentation. Progress so far, by module:

  • variables.py
  • distributions.py
  • utilities.py
  • standard_variables.py
  • config.py
  • functions.py
  • geometric_ranges.py
  • gradient_estimators.py
  • inference.py
  • modules.py
  • optimizers.py
  • pandas_interface.py
  • particle_inference_tools.py
  • stochastic_processes.py
  • transformations.py
  • visualizations.py
