EBCF-CDEM

Description

A continuous DEM representation model. Please refer to our paper "A continuous digital elevation representation model for DEM super-resolution", published in the ISPRS Journal of Photogrammetry and Remote Sensing.

Citation

If you find our work useful in your research, please cite:

@article{YAO20241,
title = {A continuous digital elevation representation model for DEM super-resolution},
journal = {ISPRS Journal of Photogrammetry and Remote Sensing},
volume = {208},
pages = {1-13},
year = {2024},
issn = {0924-2716},
doi = {10.1016/j.isprsjprs.2024.01.001}
}

Installation

pip install -r requirements.txt

Instructions

Our code relies heavily on the "hydra" package; every experiment requires modifying the corresponding config file. We provide a debug template for the config files, and you can print any resolved config with:

python hydra_run.py --cfg job --resolve ${config_file}

I. Prepare Data

  1. Download the data: the TFASR30 and TFASR30to10 datasets, and the Pyrenees and Tyrol datasets.

  2. Split Pyrenees and Tyrol datasets:

python make_dataset.py --dirRawDataset ${data_path}/terrain/pyrenees
# This generates the split files in the "./temp" dir; move and rename the dir as needed. Process the Tyrol dataset the same way.
  3. Generate a json file for the datasets: make your specific changes in "generate_json_for_dataset.py".

  4. Modify the config file in "configs/datasets/${dataset_name}.yaml".

  5. Generate the training and testing data. Note that you should specify the "run_dir" in "configs/exp_interpolation.yaml":

python hydra_run.py --multirun \
        experiments=exp_interpolation \
        datasets@dataset_spec=tyrol,pyrenees,tfasr \
        model_spec.interpolation=identity \
        dataset_spec.test_dataset.scale_min=1
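The `model_spec.interpolation=identity` run exports the test tiles unchanged, so they can later serve as the ground truth ("gt_dir") when computing metrics; interpolation baselines can be produced the same way. For intuition, a nearest-neighbour interpolation baseline (a sketch, not the repository's implementation) amounts to:

```python
import numpy as np

def nearest_upsample(dem: np.ndarray, scale: int) -> np.ndarray:
    """Upsample a 2-D DEM tile by an integer factor with nearest-neighbour."""
    return np.repeat(np.repeat(dem, scale, axis=0), scale, axis=1)

tile = np.array([[10.0, 20.0],
                 [30.0, 40.0]])
print(nearest_upsample(tile, 2).shape)  # (4, 4)
```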

II. Test

Using our trained model

Use the TFASR30 checkpoint to generate super-resolution DEMs. Note that you should specify the "run_dir" in "configs/exp_test.yaml".

python hydra_run.py --multirun \
     experiments=exp_test \
     datasets@dataset_spec=tfasr \
     dataset_spec.test_dataset.scale_min=2,4 \
     exp_name="tfasr_edsrB-ebcf-nearest-pe16_best" \
     test_spec.model_pth="${abs_pth}/checkpoints/TFASR30_ebcf-nearest-pe16_best-epoch.pth" \
     device='cuda:0'

Similarly for Pyrenees:

python hydra_run.py --multirun \
     experiments=exp_test \
     datasets@dataset_spec=pyrenees \
     dataset_spec.test_dataset.scale_min=2,4,6,8 \
     exp_name="pyrenees_edsrB-ebcf-nearest-pe16_best" \
     test_spec.model_pth="${abs_pth}/checkpoints/Pyrenees_ebcf-nearest-pe16_best-epoch.pth" \
     device='cuda:0'

And for TFASR30to10:

python hydra_run.py --multirun \
     experiments=exp_test \
     datasets@dataset_spec=tfasr_30to10 \
     dataset_spec.test_dataset.scale_min=3 \
     exp_name="tfasr30to10_edsrB-ebcf-nearest-pe16_best" \
     test_spec.model_pth="${abs_pth}/checkpoints/TFASR30to10_ebcf-nearest-pe16_best-epoch.pth" \
     device='cuda:0'

Calculate metrics

To calculate metrics, specify the "gt_dir" and "sr_dir" in "sr-tif.yaml". Note that "gt_dir" refers to the "run_dir" from "configs/exp_interpolation.yaml", narrowed to the specific dataset, while "sr_dir" is the directory of results generated by the super-resolution model. Variables in "sr-tif.yaml" can also be defined in a flexible way; for details, refer to the usage of "OmegaConf". Then run:

python cal_dem_metrics.py
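The script compares each super-resolved GeoTIFF in "sr_dir" against its counterpart in "gt_dir". As a sketch of the kind of per-tile error metrics typically reported for DEM super-resolution (MAE and RMSE of the elevation error; not the script's exact implementation):

```python
import numpy as np

def dem_error_metrics(gt: np.ndarray, sr: np.ndarray) -> dict:
    """MAE and RMSE of the elevation difference between SR and ground truth."""
    diff = sr.astype(np.float64) - gt.astype(np.float64)
    return {
        "mae": float(np.mean(np.abs(diff))),
        "rmse": float(np.sqrt(np.mean(diff ** 2))),
    }

print(dem_error_metrics(np.zeros((2, 2)), np.full((2, 2), 3.0)))
# -> {'mae': 3.0, 'rmse': 3.0}
```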

III. Train your model

On the TFASR30 dataset

Without bias prediction:

python hydra_run.py \
        experiments=exp_ebcf \
        device='cuda:0' \
        datasets@dataset_spec=tfasr \
        model_spec.interp_mode='none' \
        dataset_spec.train_dataset.dataset.repeat=4 

Without positional encoding:

python hydra_run.py \
        experiments=exp_ebcf \
        device='cuda:0' \
        datasets@dataset_spec=tfasr \
        model_spec.interp_mode='nearest' \
        dataset_spec.train_dataset.dataset.repeat=4

With positional encoding:

python hydra_run.py \
        experiments=exp_ebcf-pe \
        device='cuda:0' \
        datasets@dataset_spec=tfasr \
        model_spec.interp_mode='nearest' \
        dataset_spec.train_dataset.dataset.repeat=4 \
        model_spec.posEmbeder.spec.n_harmonic_functions=16
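The `n_harmonic_functions=16` option sets the number of frequency bands in the positional embedding. A sketch of a standard harmonic (Fourier) embedding, assuming frequencies 2^k; the repository's embedder may scale or order the features differently:

```python
import numpy as np

def harmonic_embedding(coords: np.ndarray, n_harmonic_functions: int = 16) -> np.ndarray:
    """Map each coordinate x to [sin(2^k x), cos(2^k x)] for k = 0..n-1."""
    freqs = 2.0 ** np.arange(n_harmonic_functions)      # (n,)
    angles = coords[..., None] * freqs                  # (..., d, n)
    emb = np.concatenate([np.sin(angles), np.cos(angles)], axis=-1)
    return emb.reshape(*coords.shape[:-1], -1)          # (..., 2*d*n)

xy = np.zeros((5, 2))                                   # 5 points, 2-D coords
print(harmonic_embedding(xy, 16).shape)                 # (5, 64)
```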

On the TFASR30to10 dataset

python hydra_run.py \
        experiments=exp_ebcf-pe \
        device='cuda:1' \
        datasets@dataset_spec=tfasr_30to10 \
        model_spec.interp_mode='nearest' \
        model_spec.posEmbeder.spec.n_harmonic_functions=16

ebcf-cdem's Issues

Question about the code

Hello author, in the `pred` function of the EBCFF model, what is the role of `make_coord_local`? When it is True, the relative coordinate `rel_coord` is computed and then multiplied by the size of the LR image; I don't understand this.

Question about data in the paper's tables

[screenshot of a table from the paper]
What exactly does the notation in the red-highlighted part of the table mean? Does it give the maximum and minimum over multiple experimental runs? If so, isn't the error rather large? Also, the tfasr-origin entry does not match the original data in the TFASR paper. This has puzzled me for a long time; thank you!
