
Comments (2)

masaaldosey commented on September 24, 2024

Since the data in burgers_shock.mat was generated using MATLAB, the initial and boundary conditions were already used to produce this dataset, via a suitable numerical method in MATLAB.
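For example (a minimal sketch, assuming the file follows the layout used in the Burgers example, with keys 'x', 't' and 'usol'), the initial and boundary values can be read straight back out of the stored solution:
import numpy as np
import scipy.io
data = scipy.io.loadmat('burgers_shock.mat')   # precomputed MATLAB solution
x = data['x'].flatten()[:, None]               # spatial grid on [-1, 1]
t = data['t'].flatten()[:, None]               # time grid on [0, 1]
Exact = np.real(data['usol']).T                # u(t, x) on the full grid
u_initial = Exact[0, :]    # u(x, t=0), i.e. the initial condition
u_left    = Exact[:, 0]    # u at the left boundary over time
u_right   = Exact[:, -1]   # u at the right boundary over time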

from pinns.

sajed-zarrinpour commented on September 24, 2024

If you mean where they are used in the neural network:
As far as I can tell from the code, the data is used implicitly to impose constraints on the network.
Here is an example from the Schrodinger case.
In the continuous-time inference case, the main function defines the lower and upper bounds of the domain as:
lb = np.array([-5.0, 0.0])     # lower bounds of (x, t)
ub = np.array([5.0, np.pi/2])  # upper bounds of (x, t)
and we have
idx_t = np.random.choice(t.shape[0], N_b, replace=False)  # pick N_b random time indices
tb = t[idx_t,:]                                            # boundary training times
which samples the time points at which the boundary conditions will be enforced. Moreover, the initial condition can be traced in
idx_x = np.random.choice(x.shape[0], N0, replace=False)  # pick N0 random spatial indices
x0 = x[idx_x,:]                                           # initial-condition locations
u0 = Exact_u[idx_x,0:1]                                   # real part of the exact solution at t = 0
v0 = Exact_v[idx_x,0:1]                                   # imaginary part of the exact solution at t = 0
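If I recall the script correctly, these sampled points are then assembled into the actual network inputs roughly like this (a sketch, reusing lb, ub, x0 and tb from above):
X0   = np.concatenate((x0, 0*x0), 1)           # initial points (x, t=0)
X_lb = np.concatenate((0*tb + lb[0], tb), 1)   # lower-boundary points (x = lb[0], t)
X_ub = np.concatenate((0*tb + ub[0], tb), 1)   # upper-boundary points (x = ub[0], t)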
Now, if we look at the loss function, we see this:
self.loss = tf.reduce_mean(tf.square(self.u0_tf - self.u0_pred)) + \
            tf.reduce_mean(tf.square(self.v0_tf - self.v0_pred)) + \
            tf.reduce_mean(tf.square(self.u_lb_pred - self.u_ub_pred)) + \
            tf.reduce_mean(tf.square(self.v_lb_pred - self.v_ub_pred)) + \
            tf.reduce_mean(tf.square(self.u_x_lb_pred - self.u_x_ub_pred)) + \
            tf.reduce_mean(tf.square(self.v_x_lb_pred - self.v_x_ub_pred)) + \
            tf.reduce_mean(tf.square(self.f_u_pred)) + \
            tf.reduce_mean(tf.square(self.f_v_pred))
which shows that the samples from the lower and upper boundaries, as well as those from the initial condition, also take part in the training of the network.
What happens is that the network tries to fit the data you provide, and that data includes values on the boundaries. In this way the network is forced to respect those conditions in a soft sense: it treats them just like any other data points, so it is not a hard constraint, and each boundary term has the same weight as the other losses. It still has a real impact, though. I believe one could assign weights to these loss terms to adjust the relative importance of each of them.
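As a minimal sketch of that last idea (the weights w_0, w_b, w_f are hypothetical and not part of the original code), one could group and scale the terms like this:
w_0, w_b, w_f = 1.0, 1.0, 1.0  # hypothetical weights for initial, boundary and residual losses
loss_0 = tf.reduce_mean(tf.square(self.u0_tf - self.u0_pred)) + \
         tf.reduce_mean(tf.square(self.v0_tf - self.v0_pred))
loss_b = tf.reduce_mean(tf.square(self.u_lb_pred - self.u_ub_pred)) + \
         tf.reduce_mean(tf.square(self.v_lb_pred - self.v_ub_pred)) + \
         tf.reduce_mean(tf.square(self.u_x_lb_pred - self.u_x_ub_pred)) + \
         tf.reduce_mean(tf.square(self.v_x_lb_pred - self.v_x_ub_pred))
loss_f = tf.reduce_mean(tf.square(self.f_u_pred)) + \
         tf.reduce_mean(tf.square(self.f_v_pred))
self.loss = w_0 * loss_0 + w_b * loss_b + w_f * loss_f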

I would be happy for the community to correct me if I am wrong about this.

from pinns.
