
Generative Pre-Trained Physics-Informed Neural Network Implementation

GPT-PINN: Generative Pre-Trained Physics-Informed Neural Networks toward non-intrusive Meta-learning of parametric PDEs

Yanlai Chen¹, Shawn Koohy¹

Paper Links: arXiv | ResearchGate

[Figure 1: GPT-PINN Architecture]

Talk / Presentation

A recording of the talk is available on YouTube.

Abstract:

Physics-Informed Neural Network (PINN) has proven itself a powerful tool to obtain the numerical solutions of nonlinear partial differential equations (PDEs), leveraging the expressivity of deep neural networks and the computing power of modern heterogeneous hardware. However, its training is still time-consuming, especially in the multi-query and real-time simulation settings, and its parameterization is often excessive. In this paper, we propose the Generative Pre-Trained PINN (GPT-PINN) to mitigate both challenges in the setting of parametric PDEs. GPT-PINN represents a brand-new meta-learning paradigm for parametric systems. As a network of networks, its outer-/meta-network is hyper-reduced, with only one hidden layer containing a significantly reduced number of neurons. Moreover, its activation function at each hidden neuron is a (full) PINN pre-trained at a judiciously selected system configuration. The meta-network adaptively “learns” the parametric dependence of the system and “grows” this hidden layer one neuron at a time. In the end, by encompassing a very small number of networks trained at this set of adaptively-selected parameter values, the meta-network is capable of generating surrogate solutions for the parametric system across the entire parameter domain accurately and efficiently.

¹ University of Massachusetts Dartmouth, Department of Mathematics, North Dartmouth, MA
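The "network of networks" idea in the abstract can be illustrated with a minimal conceptual sketch. This is not the repository's code: the pre-trained PINNs are stood in for by simple callables, and all names and parameter values below are hypothetical.

```python
import numpy as np

def pretrained_pinn(mu):
    """Stand-in for a full PINN pre-trained at parameter value mu.
    In GPT-PINN, each such network becomes one hidden-neuron
    'activation function' of the meta-network."""
    return lambda x: np.sin(mu * x)  # placeholder solution profile

# Hidden layer grown one neuron at a time, each neuron a PINN trained at
# an adaptively selected parameter value (placeholder values here).
selected_params = [1.0, 2.0, 3.0]
neurons = [pretrained_pinn(mu) for mu in selected_params]

def gpt_pinn_surrogate(x, coeffs):
    """Surrogate solution: a learned combination of the pre-trained
    PINN outputs (the meta-network's single hidden layer)."""
    return sum(c * n(x) for c, n in zip(coeffs, neurons))

x = np.linspace(0.0, 1.0, 5)
u = gpt_pinn_surrogate(x, coeffs=[0.5, 0.3, 0.2])
```

In the actual method, the coefficients are trained against the PDE residual for each new parameter value, so evaluating the surrogate only requires optimizing this tiny outer network.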

Requirements:

Python     = 3.9.12
NumPy      = 1.23.4
PyTorch    = 1.11.0
TensorFlow = 2.10.0
Matplotlib = 3.6.2

Recent versions of these packages will most likely run the code with little to no change.

GPU and CPU Support:

The code was implemented with the intention of computation being performed primarily on the GPU. CPU computation is possible; however, it will take much longer.
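In PyTorch, falling back from GPU to CPU can be handled with a standard device check; a minimal sketch (not an excerpt from the scripts):

```python
import torch

# Use the GPU when available; otherwise fall back to the (slower) CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Tensors and models are then created on / moved to the chosen device, e.g.:
x = torch.linspace(0.0, 1.0, 100, device=device).reshape(-1, 1)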

Usage:

The Klein-Gordon, Allen-Cahn, and Burgers' equation files are currently available. Running KG_main.py, B_main.py, or AC_main.py (with the other files from the respective directory present) begins the training of the full PINNs and the GPT-PINN, growing the GPT-PINN hidden layer from 1 to 15 neurons (Klein-Gordon) or 9 neurons (Burgers' and Allen-Cahn). The final GPT-PINN is then trained on the generated test cases.

Changing the maximum number of neurons the GPT-PINN grows to is straightforward: simply modify the number_of_neurons variable (Line 72 in AC_main.py, Line 58 in KG_main.py, and Line 48 in B_main.py). By default, once the total number of neurons is reached, the GPT-PINN is trained once more in order to find the largest loss obtained (at 200 epochs) using the final number of neurons. This gives more information about the final state of the GPT-PINN. To use the GPT-PINN in its final form, there is no need to find the largest loss once the final activation function is added; this feature can be turned off by setting train_final_gpt=False (Line 71 in AC_main.py, Line 57 in KG_main.py, and Line 47 in B_main.py).
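For reference, the two settings look roughly like the following (an illustrative excerpt, not a verbatim copy of the scripts; see the line numbers above for the exact locations):

```python
# Illustrative excerpt of the user-adjustable settings in *_main.py:
train_final_gpt   = True   # set to False to skip the final largest-loss pass
number_of_neurons = 15     # 15 for Klein-Gordon; 9 for Burgers'/Allen-Cahn
```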

[Figure 2: Klein-Gordon Run Times]

[Figure 3: Burgers' Run Times]

[Figure 4: Allen-Cahn Run Times]

Citation:

Below you can find the BibTeX citation:

@article{chen2024gpt,
  title={GPT-PINN: Generative Pre-Trained Physics-Informed Neural Networks toward non-intrusive Meta-learning of parametric PDEs},
  author={Chen, Yanlai and Koohy, Shawn},
  journal={Finite Elements in Analysis and Design},
  volume={228},
  pages={104047},
  year={2024},
  publisher={Elsevier}
}
