bahmanmdd / hdlmf_gin-ga

This repository accompanies the article "A hybrid deep-learning-metaheuristic framework for bi-level network design problems" by Madadi and Correia (2023). For theory and methodology details, please refer to the article here: https://doi.org/10.48550/arXiv.2303.06024

Home Page: https://doi.org/10.48550/arXiv.2303.06024

License: MIT License

Python 100.00%
deep-learning hybrid-framework metaheuristics network-design transportation-network

hdlmf_gin-ga's Introduction

This repository accompanies the article "A hybrid deep-learning-metaheuristic framework for bi-level network design problems" by Madadi and Correia (2023).

For theory and methodology details, please refer to the article.

The reproducibility of the results has been verified by Code Ocean via the following compute capsule based on this repository: A hybrid deep-learning-metaheuristic framework for bi-level network design problems (GIN-GA23)

Setup

  1. Setting up the environment

    • Set up an environment using the requirements.txt file. This is a pip-friendly list of the high-level Python packages required for this project, including version information. Simply create an empty environment and run pip install -r requirements.txt. But before installing packages, please check the "License requirements" section below.
    • Note that for full functionality you need to acquire a CPLEX license and set up CPLEX first. See more details below under the "License requirements" heading.
  2. License requirements

    • To fully utilize this repository and reproduce the experiments, you will need a CPLEX license (academics can acquire one for free). See instructions here. However, this is only necessary for running the SORB method. For the experiments reported in Madadi and Correia (2023), we have provided the SORB results as csv files in the output directory here, and the benchmarking code allows running the other methods and benchmarking them against the saved SORB results.
  3. Datasets

    • You can reproduce the datasets used for experiments in Madadi and Correia (2023) or generate new datasets using data_due_generate (instructions below).
  4. Transport networks

    • Transport networks used for creating datasets are stored in the 'TransportationNetworks' directory and are selected from here.
  5. Config files

    • Hyperparameters (for training GNNs) are saved in config files in the "configs" directory for each network and model, but you can change them.
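After setup, a quick sanity check like the following sketch can confirm that key packages import. The package list here is an assumption, not the repository's full requirements.txt: dgl is mentioned later in this README, and torch is assumed as its backend.

```python
import importlib

def check_packages(names):
    """Return the subset of `names` that cannot be imported."""
    missing = []
    for name in names:
        try:
            importlib.import_module(name)
        except ImportError:
            missing.append(name)
    return missing

# Hypothetical subset of the packages in requirements.txt.
if __name__ == "__main__":
    missing = check_packages(["torch", "dgl", "numpy", "pandas"])
    if missing:
        print("Missing:", ", ".join(missing), "-- run: pip install -r requirements.txt")
    else:
        print("Environment looks complete.")
```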

Generating datasets

Use data_due_generate to generate new datasets with solved instances of DUE problems.

  1. In parameters (the first function), specify the following parameters (current values reproduce Madadi and Correia (2023)):
    1. dataset params (size, etc.); note that different dataset sizes are used for different networks (see Madadi and Correia (2023) for details)
    2. problem params (network, etc.)
    3. solution params (solver choice and options)
    4. variation params (perturbations for problem variation generation)
  2. Run data_due_generate. Be aware that this might take days or weeks depending on the size.
  3. Check the "DatasetsDUE/network" directory for results.
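The four parameter groups above might look like this sketch; the names and values are illustrative placeholders, not the repository's actual defaults.

```python
def parameters():
    """Illustrative stand-in for the first function of data_due_generate."""
    dataset_params = {"n_instances": 1000}            # dataset size, etc.
    problem_params = {"network": "Anaheim"}           # which transport network
    solution_params = {"solver": "aequilibrae"}       # solver choice and options
    variation_params = {"demand_perturbation": 0.2}   # perturbations for variations
    return dataset_params, problem_params, solution_params, variation_params
```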

Notes:

  • It is recommended to generate and prep datasets one at a time to avoid memory and other issues.
  • You can solve DUE instances with:
    1. ipopt (an open-source general-purpose solver), which is very reliable but slow, particularly for larger networks (e.g., Anaheim); recommended for small networks
    2. aequilibrae (a solver specialized for DUE), which is much faster (recommended for large networks) but less stable, so it may throw unpredictable errors. This means that with this solver, you may have to run the code a few times to obtain a complete dataset.
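Because the faster solver can fail unpredictably, a simple retry wrapper reduces the number of manual reruns. This is a generic sketch, not code from this repository; solve_fn stands in for whatever solves one DUE instance.

```python
def solve_with_retries(solve_fn, instance, max_retries=3):
    """Call solve_fn(instance), retrying up to max_retries times on failure."""
    last_err = None
    for _ in range(max_retries):
        try:
            return solve_fn(instance)
        except Exception as err:  # solver errors are unpredictable
            last_err = err
    raise RuntimeError(f"instance failed after {max_retries} attempts") from last_err
```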

Train-test pipeline

  • Hyperparameter tuning:
    • You can use the train_tune submodule for hyperparameter tuning, but good parameters have already been identified and saved in the config files.
  • Using the train_test submodule for training:
    1. Specify the model (e.g., GIN), the problem (e.g., DUE), and the network (e.g., Anaheim) in problem_spec (the first function).
    2. Hyperparameters are saved in config files in the "configs" directory for each network and model, but you can change them.
    3. Run train_test and check the "output/DUE" directory for training results.
    4. Best models are copied to the "models" directory for inference (from "output/DUE/network/models"); if you train better models, you can replace them.
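The directory layout described in the steps above can be sketched as follows; the paths come from this README, but the helper function itself is hypothetical.

```python
from pathlib import Path

def training_output_dirs(problem="DUE", network="Anaheim"):
    """Where the train-test pipeline reads and writes, per the steps above."""
    results = Path("output") / problem        # training results land here
    trained = results / network / "models"    # models saved during training
    inference = Path("models")                # best models are copied here
    return results, trained, inference
```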

Benchmarking solutions

Use ndp_la_bm & ndp_ls_bm for benchmarking the NDP-LA and NDP-LS problems, respectively. They have a very similar structure, differing only in minor problem-specific parameters.

Just define scenario and variation parameters in scenarios (the first function) and run. Parameters for each method are specified in the first function in problem_method submodules (e.g., ndp_la_sorb). Current values reproduce the experiments in Madadi and Correia (2023).

The output for each case study (network) will be saved in "output/problem/network". There will be a summary and a separate directory for detailed results of each solution method.

I recommend running the benchmarks one case study at a time to avoid issues.
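Since the saved SORB results are plain csv files under "output/problem/network", they can be loaded for comparison with a sketch like this. The column names in the example test data are assumptions; the loader itself makes no assumption about them.

```python
import csv

def load_sorb_results(csv_path):
    """Load saved SORB benchmark results into a list of row dicts, so other
    methods can be benchmarked against them without a CPLEX license."""
    with open(csv_path, newline="") as f:
        return list(csv.DictReader(f))
```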

Code implementation order

  1. Dataset preparation with data_due_generate
    1. Generate datasets
    2. Convert datasets to dgl format (done within the same module)
  2. Training with train_test
    1. Hyperparameter tuning (optional)
    2. Train-test pipeline
  3. Computational experiments with ndp_la_bm & ndp_ls_bm (separately)
    1. Benchmark solutions
      1. SORB
      2. GIN-GA
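The order above can be sketched as a small driver. The module names come from this README, but the calling convention (a main() per module) is an assumption; in the repository these are run as scripts.

```python
import importlib

PIPELINE = ["data_due_generate", "train_test", "ndp_la_bm", "ndp_ls_bm"]

def run_pipeline(runner=None):
    """Run each stage in order; `runner` lets a test inject a stub instead of
    importing the real repository modules."""
    if runner is None:
        runner = lambda name: importlib.import_module(name).main()
    for name in PIPELINE:
        runner(name)
```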

