
Comments (6)

abojchevski commented on September 1, 2024

Hi,

Thank you for your interest in our paper.

  1. That's right. I'm not sure I completely understand your question, so let me rephrase our goal. Essentially, for a node i, we want to parametrize the mean of its Gaussian distribution mu_i as a function of its attributes. That is, mu_i = f_theta(x_i), where f could be any function with some trainable parameters theta. Any choice of f_theta is possible; in this work we let f_theta be a simple feedforward neural network.

  2. It's true that in general the covariance matrix Sigma is an L x L matrix. However, for computational convenience we use a diagonal covariance matrix, i.e. we allow potentially non-zero values only on the diagonal and have zeros everywhere else. Therefore, we only need to parametrize the diagonal of Sigma, which is an L-dimensional vector. This has the added benefit of simplifying the computation of the KL divergence, since it's easy to invert a diagonal matrix.
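For concreteness, here is a minimal sketch (mine, not the authors' code) of such an encoder with a diagonal covariance, together with the closed-form KL divergence between two diagonal Gaussians. The single linear layer per head and the softplus activation are illustrative assumptions; the paper uses a deeper feedforward network.

```python
import numpy as np

def encoder(x, W_mu, b_mu, W_sigma, b_sigma):
    """Map attributes x (D,) to mu (L,) and the diagonal of Sigma (L,).

    One linear layer per head, for illustration only. Softplus keeps
    the diagonal (the variances) strictly positive.
    """
    mu = W_mu @ x + b_mu
    sigma = np.log1p(np.exp(W_sigma @ x + b_sigma))  # softplus > 0
    return mu, sigma

def kl_diag_gauss(mu_i, sig_i, mu_j, sig_j):
    """KL(N_i || N_j) for diagonal covariances.

    sig_i, sig_j are the variance vectors (diagonals of Sigma).
    Everything is elementwise, so no matrix inverse or determinant
    is ever formed.
    """
    return 0.5 * np.sum(
        sig_i / sig_j
        + (mu_j - mu_i) ** 2 / sig_j
        - 1.0
        + np.log(sig_j) - np.log(sig_i)
    )
```

With a diagonal Sigma the KL divergence collapses to an O(L) sum, which is the computational convenience mentioned above.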

Let me know if you have any more questions.

from graph2gauss.

KangyaHe commented on September 1, 2024

Good idea. So in your model, you assume that the parameters of the Gaussian distribution can be learned by the neural network. Is that right? If there is no other information, how do you know the final output is really the mu and sigma of a Gaussian distribution, rather than some other 'latent representation'? In other words, my question is: how can you prove that mu_i, sigma_i = f_theta(x_i) are the parameters of a Gaussian distribution and not something else?


abojchevski commented on September 1, 2024

Exactly. We learn the parameters of the Gaussian distribution.

We want mu and Sigma to be valid parameters. For mu there is no restriction on what it can be. Sigma on the other hand has to be a PSD matrix. For a diagonal Sigma we can easily enforce this by requiring the elements on the diagonal to be positive.
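As a hedged illustration of that last point (the released graph2gauss code, as far as I know, uses elu(x) + 1; softplus or exp would work equally well), a positivity-enforcing map on the diagonal looks like:

```python
import numpy as np

def positive_diagonal(raw):
    """Map an unconstrained network output to strictly positive values.

    elu(raw) + 1, i.e. raw + 1 for raw > 0 and exp(raw) otherwise.
    A diagonal matrix with a strictly positive diagonal is positive
    definite, hence a valid (PSD) covariance matrix.
    """
    return np.where(raw > 0, raw + 1.0, np.exp(raw))
```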


KangyaHe commented on September 1, 2024

OK, let me try to summarize. The main idea of the work is that the uncertainty of each node can be represented by a specific Gaussian model. Then, if we learn the right parameters of the model, they will satisfy the "pairwise constraints" in the "Dissimilarity" part.

A: "the parameters of the Gaussian model satisfy the pairwise constraints"
B: "if the parameters satisfy the pairwise constraints, then they are the parameters of the Gaussian model"

Since mu and sigma are learned from the constraints, the idea being used is B. As far as I know, B may not set up.


abojchevski commented on September 1, 2024

I'm not sure what you mean by "B may not set up".

Of course, it could be true that it is not even possible for all pairwise constraints to be satisfied (or that we would need a very large dimensionality to do so), e.g. a scenario where, for any setting of the parameters, some constraints are violated. This is a very interesting research question, but we don't have any results on it.

We also don't claim that parameters satisfying pairwise constraints => parameters of a Gaussian. A Gaussian distribution is simply one convenient choice that captures uncertainty better than a point estimate.
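For reference, the pairwise constraints rank node pairs by an energy, taken to be the KL divergence between the nodes' Gaussians: nodes at smaller graph distance from i should have lower energy. A rough sketch of one ranking term, modeled on the paper's square-exponential loss (details here are my assumptions, not the exact training code):

```python
import numpy as np

def kl_energy(mu_i, sig_i, mu_j, sig_j):
    """Energy E_ij = KL(N_i || N_j) for diagonal Gaussians (variances sig)."""
    return 0.5 * np.sum(
        sig_i / sig_j + (mu_j - mu_i) ** 2 / sig_j
        - 1.0 + np.log(sig_j / sig_i)
    )

def triplet_loss(E_near, E_far):
    """Square-exponential ranking term for one triplet (i, j, k).

    Here j is closer to i than k is; the term pushes E_near toward 0
    and E_far toward infinity. The full objective sums this over
    sampled triplets (a sketch, not the authors' implementation).
    """
    return E_near ** 2 + np.exp(-E_far)
```

Note that satisfying such rankings says nothing about the embedding "being" a Gaussian in any deeper sense, which matches the point above: the Gaussian is a modeling choice, not a conclusion.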


KangyaHe commented on September 1, 2024

I got it. Thank you very much.


