
Learning data-driven discretizations for partial differential equations

Code associated with the paper:

Learning data-driven discretizations for partial differential equations. Yohai Bar-Sinai, Stephan Hoyer, Jason Hickey, Michael P. Brenner. Proceedings of the National Academy of Sciences Jul 2019, 116 (31) 15344-15349; DOI: 10.1073/pnas.1814058116.

Deprecation

This code for Data Driven Discretization was developed for and used in https://arxiv.org/abs/1808.04930. The code is fully functional, but is no longer maintained. It has been superseded by a new implementation that natively handles higher dimensions and is designed to generalize more easily. The new code is available here. If you want to apply our method to your favorite equation, please contact the authors.

Running the code

Local installation

If desired, you can install the code locally. You can also run it using Google's hosted Colab notebook service (see below for examples).

Clone this repository and install in-place:

git clone https://github.com/google/data-driven-discretization-1d.git
pip install -e data-driven-discretization-1d

Note that Python 3 is required. Dependencies for the core library (including TensorFlow) are specified in setup.py and should be installed automatically as required. Also note that TensorFlow 1.x is required: this code has not been updated to use TensorFlow 2.0.
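For example, a fresh environment compatible with these requirements could be set up as follows. This is a sketch, not part of the official instructions; the TensorFlow version pin is an assumption (any 1.x release the code was tested against should work):

```shell
# Create an isolated Python 3 environment.
python3 -m venv ddd-env
source ddd-env/bin/activate

# Pin TensorFlow to the 1.x series, since the code predates TensorFlow 2.
pip install "tensorflow>=1.15,<2.0"

# Install the package itself in editable mode.
pip install -e data-driven-discretization-1d
```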

From the source directory, execute each test file:

cd data-driven-discretization-1d
python ./pde_superresolution/integrate_test.py
python ./pde_superresolution/training_test.py

Training your own models

We used the scripts in pde_superresolution/scripts directly to run training. In particular, see run_training.py.

Training data was created with create_training_data.py, but can also be downloaded from Google Cloud Storage.

We have two notebooks showing how to train and run parts of our model. As written, these notebooks are intended to be run in Google Colab, which you can do by clicking the links below:

These notebooks install the code from scratch; skip those cells if running things locally. You will also need gsutil installed to download data from Google Cloud Storage.
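For reference, a gsutil download looks like the following. The bucket path shown here is hypothetical; substitute the actual path given in the notebooks:

```shell
# Hypothetical bucket path -- use the real one from the notebooks.
gsutil cp -r gs://example-bucket/training-data ./data/
```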

Citation

@article {Bar-Sinai15344,
	author = {Bar-Sinai, Yohai and Hoyer, Stephan and Hickey, Jason and Brenner, Michael P.},
	title = {Learning data-driven discretizations for partial differential equations},
	volume = {116},
	number = {31},
	pages = {15344--15349},
	year = {2019},
	doi = {10.1073/pnas.1814058116},
	publisher = {National Academy of Sciences},
	abstract = {In many physical systems, the governing equations are known with high confidence, but direct numerical solution is prohibitively expensive. Often this situation is alleviated by writing effective equations to approximate dynamics below the grid scale. This process is often impossible to perform analytically and is often ad hoc. Here we propose data-driven discretization, a method that uses machine learning to systematically derive discretizations for continuous physical systems. On a series of model problems, data-driven discretization gives accurate solutions with a dramatic drop in required resolution.The numerical solution of partial differential equations (PDEs) is challenging because of the need to resolve spatiotemporal features over wide length- and timescales. Often, it is computationally intractable to resolve the finest features in the solution. The only recourse is to use approximate coarse-grained representations, which aim to accurately represent long-wavelength dynamics while properly accounting for unresolved small-scale physics. Deriving such coarse-grained equations is notoriously difficult and often ad hoc. Here we introduce data-driven discretization, a method for learning optimized approximations to PDEs based on actual solutions to the known underlying equations. Our approach uses neural networks to estimate spatial derivatives, which are optimized end to end to best satisfy the equations on a low-resolution grid. The resulting numerical methods are remarkably accurate, allowing us to integrate in time a collection of nonlinear equations in 1 spatial dimension at resolutions 4{\texttimes} to 8{\texttimes} coarser than is possible with standard finite-difference methods.},
	issn = {0027-8424},
	URL = {https://www.pnas.org/content/116/31/15344},
	eprint = {https://www.pnas.org/content/116/31/15344.full.pdf},
	journal = {Proceedings of the National Academy of Sciences}
}

Contributors

shoyer, yohai


Issues

Is "pip install -e pde-superresolution" the right command?

Hi, I'd like to point out that the pip install instruction for this project might not be the right one. Instead, one should type

pip install -e data-driven-discretization-1d

since pip looks for the setup.py file inside this folder. Am I wrong? At least this was the only way I succeeded in installing the package...

Could you release the arguments used for generating all the datasets?

I could not reproduce the training set you provide when I ran "create_training_data.py" with the default arguments. Is it possible to release the arguments you used for generating all training data and exact data?

I also found that the shape of the training data does not match the description in the paper: "To train the network we generate a set of 8000 high-resolution solutions to each equation, sampled at regular time intervals from 800 numerical integrations." However, the provided training data for Burgers' equation has a shape of [10000, 512] instead of [8000, 512]. Could you explain the difference?

Since I want to make some modifications based on the equations used in this paper, I want to keep the same parameters for generating data.

Thanks!
