
gru_ode_bayes's Introduction

GRU-ODE-Bayes

This implementation accompanies the paper: GRU-ODE-Bayes: Continuous Modeling of Sporadically-Observed Time Series.

Modeling real-world multidimensional time series can be particularly challenging when these are sporadically observed (i.e., sampling is irregular both in time and across dimensions), such as in the case of clinical patient data. To address these challenges, we propose (1) a continuous-time version of the Gated Recurrent Unit, building upon the recent Neural Ordinary Differential Equations (Chen et al. 2018), and (2) a Bayesian update network that processes the sporadic observations. We bring these two ideas together in our GRU-ODE-Bayes method.
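The core of the method can be summarised by the GRU-ODE dynamics, in which the hidden state evolves continuously between observations according to dh/dt = (1 - z(h)) * (g(h) - h), with z and g the usual GRU update gate and candidate state. The sketch below is a simplified, input-free illustration (the layer names and sizes are our own assumptions), not the repository's full implementation:

import torch
import torch.nn as nn

class GRUODECell(nn.Module):
    # Continuous-time GRU dynamics: dh/dt = (1 - z) * (g - h).
    def __init__(self, hidden_size):
        super().__init__()
        self.lin_r = nn.Linear(hidden_size, hidden_size)
        self.lin_z = nn.Linear(hidden_size, hidden_size)
        self.lin_g = nn.Linear(hidden_size, hidden_size)

    def forward(self, t, h):
        r = torch.sigmoid(self.lin_r(h))   # reset gate
        z = torch.sigmoid(self.lin_z(h))   # update gate
        g = torch.tanh(self.lin_g(r * h))  # candidate state
        return (1 - z) * (g - h)           # dh/dt

# Between observations, h is propagated with an ODE solver, e.g.:
# from torchdiffeq import odeint
# h_traj = odeint(GRUODECell(hidden_size=32), h0, torch.linspace(0., 1., 10))

At each observation time, a separate update network (the Bayesian update step) folds the observed dimensions into h; that part is omitted from this sketch.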

This repository provides a PyTorch implementation of the GRU-ODE-Bayes paper.

Installation

Requirements

The code uses Python 3 and PyTorch as the auto-differentiation package. The following Python packages are required and will be installed automatically when installing the gru_ode_bayes package:

numpy
pandas
sklearn
torch
tensorflow (for logging)
tqdm
argparse

Procedure

Install the main package:

pip install -e . 

Then install the ODE numerical integration package:

cd torchdiffeq
pip install -e .
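
As a quick sanity check (assuming the installed packages expose top-level modules named gru_ode_bayes and torchdiffeq), you can verify the installation with:

python -c "import gru_ode_bayes, torchdiffeq; print('installation OK')"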

Run experiments

The experiments folder contains the different case studies for GRU-ODE-Bayes. Each trained model is then stored in the trained_models folder.

2-D Ornstein-Uhlenbeck SDE

Once in the double_OU folder, you can visualize predictions of a previously trained model on newly generated data by running:

cd experiments/double_OU
python double_ou_gruode.py --demo

This will print 10 new realizations of the process along with the model predictions.
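
For reference, a 2-D Ornstein-Uhlenbeck process dX_t = theta * (mu - X_t) dt + sigma dW_t can be simulated with a simple Euler-Maruyama scheme. The sketch below is only illustrative; the parameter values are assumptions and it does not reproduce the repository's exact data generator:

import numpy as np

def simulate_ou(theta, mu, sigma, x0, dt=0.01, n_steps=1000, seed=0):
    # Euler-Maruyama integration of dX = theta * (mu - X) dt + sigma dW.
    rng = np.random.default_rng(seed)
    x = np.empty((n_steps + 1, 2))
    x[0] = x0
    for t in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=2)
        x[t + 1] = x[t] + theta * (mu - x[t]) * dt + sigma * dw
    return x

# Example: mean reversion towards mu = (1.0, -1.0) (illustrative parameters).
path = simulate_ou(theta=1.0, mu=np.array([1.0, -1.0]), sigma=0.3, x0=np.zeros(2))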

For retraining the full model, run:

python double_ou_gruode.py

Brusselator SDE

Similarly to the 2-D OU process,

cd experiments/Brusselator
python run_gruode.py --demo 

will plot 10 new realizations of the process along with the model predictions. For retraining the full model:

python run_gruode.py
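
For context, the Brusselator is a two-dimensional reaction system with drift dx/dt = a + x^2*y - (b + 1)*x and dy/dt = b*x - x^2*y; the stochastic version used in the experiments adds a diffusion term. The parameter values below are illustrative assumptions, not necessarily those used in this repository:

import numpy as np

def brusselator_drift(state, a=1.0, b=3.0):
    # Drift of the Brusselator system.
    x, y = state
    dx = a + x**2 * y - (b + 1.0) * x
    dy = b * x - x**2 * y
    return np.array([dx, dy])

def simulate_brusselator(x0, dt=0.01, n_steps=2000, sigma=0.1, seed=0):
    # Euler-Maruyama integration with additive noise (sigma is an assumption).
    rng = np.random.default_rng(seed)
    traj = np.empty((n_steps + 1, 2))
    traj[0] = x0
    for t in range(n_steps):
        dw = rng.normal(scale=np.sqrt(dt), size=2)
        traj[t + 1] = traj[t] + brusselator_drift(traj[t]) * dt + sigma * dw
    return traj

traj = simulate_brusselator(x0=np.array([1.0, 1.0]))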

USHCN daily (climate) data

For retraining the model, go to the Climate folder and run:

python climate_gruode.py

Datasets

The datasets for the Brusselator, double OU, and processed USHCN experiments have been uploaded to the repository for reproducibility. The MIMIC dataset is not directly publicly available and was therefore not pushed. It can be downloaded at: https://physionet.org/physiobank/database/mimic3cdb/

The 'data_preproc' folder contains all the steps taken to preprocess the Climate and MIMIC datasets.

Acknowledgements and References

The torchdiffeq package has been extended from the original version by Chen et al. (2018).

Chen et al., Neural Ordinary Differential Equations, NeurIPS, 2018.

For the climate dataset:

Menne et al., Long-Term Daily Climate Records from Stations Across the Contiguous United States.

gru_ode_bayes's People

Contributors

anonymperson, edebrouwer


gru_ode_bayes's Issues

SAPSII referenced but not part of MIMIC-III

In the MIMIC preprocessing, the code attempts to read a saps2.csv file, but this is not part of the MIMIC-III dataset. Has this changed since the publication of this repository? Is there a workaround that still allows the results to be reproduced?

Why are the labels all zero?

Hi!
I have the following problem running your code:
May I ask why the labels in the model input of the climate experiment are all 0? What do they mean?
Thank you very much!

Add a License

Hello,

I have no legal background, but according to this discussion: https://opensource.stackexchange.com/questions/1720/what-can-i-assume-if-a-publicly-published-project-has-no-license,

If a repository has no license, then all rights are reserved and it is not Open Source or Free. You cannot modify or redistribute this code without explicit permission from the copyright holder.

It would be great to add an appropriate LICENSE to this repository so that others can re-use the code, if you want to allow that, or to clarify explicitly if you want to reserve all rights.

Final MIMIC preprocessing files missing

As mentioned in several prior issues (#1, #2, #3 and #4), the final MIMIC preprocessing files mimic_preproc.py and folds_split_mimic.py are not included in the data_preproc/MIMIC/ folder.

It'd be great to gain complete insight into how you set up the EHR data with this project. The text in Appendix K of your paper only goes so far.

Thanks!

Missing Files

The referenced files folds_split_mimic.py and mimic_preproc.py are missing from the repo.

Missing python files

Hi there!

The following files are referenced in the MIMIC preprocessing README
but are not present in the repo:

  • mimic_preproc.py
  • folds_split_mimic.py

Can you please point me to their location/can you add them to the repo?

Best, Dan

Metadata files missing

While running double_OU from the experiments/double_OU folder, I get the following error:

FileNotFoundError: [Errno 2] No such file or directory: '../../gru_ode_bayes/datasets/double_OU/double_OU_metadata.npy'

Production use for shorting stocks?

According to paperswithcode.com, this seems to be the state of the art in time series forecasting (in general).
Is it also the state of the art for financial stock forecasting?
If so, would it gain more money than it would lose on average?

If it would still not be profitable, what values of MSE and NegLL would be needed to ensure profitability (winning money) on average?

Missing trained models and other files

Is it just me or is this repo missing several files and folders? I don't see a trained_models directory for any of the datasets.

Was that a mistake or are the authors holding off on releasing the full repo until after NeurIPS?

Debugging in the model

I'm stuck at this stage while experimenting with the model; I'd appreciate it if you could help.

[screenshot omitted]
