budyko-qgis's People

Contributors

j08lue, kittelc, radosuav, vlro

budyko-qgis's Issues

Automate SRTM download and preprocessing?

The steps are:

  1. Download relevant SRTM tiles from http://srtm.csi.cgiar.org/
  2. Merge the SRTM tiles in QGIS using Raster – Miscellaneous – Merge. Specify the no-data value as -32768! The algorithm sometimes crashes midway for reasons that are not understood, but still produces the merged layer; in that case you can proceed and manually add the new layer to the project.
  3. Reproject to a metric coordinate system (UTM) in QGIS using Raster – Projections – Warp. Specify the no-data value as -32768!
  4. Aggregate the SRTM data to a suitable spatial resolution using the GRASS command r.resamp.stats from the Processing Toolbox in QGIS.
  5. Cut out the region of interest in QGIS with Raster – Extraction – Clipper. Specify the no-data value as -32768!
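The download itself aside, steps 2–5 can be scripted with GDAL's command-line tools instead of the QGIS menus. A minimal sketch that only assembles the commands; the file names, the UTM zone, the target resolution, and the clip extent are placeholder assumptions:

```python
# Sketch of automating steps 2-5 with GDAL command-line tools.
# EPSG:32633, the 500 m resolution, and the clip extent are examples only.
NODATA = "-32768"  # SRTM no-data value -- must be set explicitly at every step

def srtm_commands(tiles, utm_epsg="EPSG:32633",
                  extent=(400000, 6000000, 500000, 6100000)):
    """Return the shell commands for merge, reproject, resample and clip."""
    xmin, ymin, xmax, ymax = extent
    merge = ["gdal_merge.py", "-o", "merged.tif",
             "-n", NODATA, "-a_nodata", NODATA, *tiles]
    warp = ["gdalwarp", "-t_srs", utm_epsg,
            "-srcnodata", NODATA, "-dstnodata", NODATA,
            "merged.tif", "utm.tif"]
    # aggregation to ~500 m; GRASS r.resamp.stats offers richer statistics
    resample = ["gdalwarp", "-tr", "500", "500", "-r", "average",
                "-dstnodata", NODATA, "utm.tif", "coarse.tif"]
    # -projwin takes upper-left x/y, then lower-right x/y
    clip = ["gdal_translate", "-projwin",
            str(xmin), str(ymax), str(xmax), str(ymin),
            "-a_nodata", NODATA, "coarse.tif", "clipped.tif"]
    return [merge, warp, resample, clip]
```

Each list can be passed to subprocess.run; note that the -32768 no-data value is set explicitly at every step, matching the warning below.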

OBS: Before running TauDEM algorithms, ensure that your no-data value is a float (e.g. -32768) and not “nan” or similar. TauDEM does read the GDAL no-data tag in the TIFF metadata, but it cannot handle non-float values, which can yield nonsensical results (particularly if you have ocean cells in your DEM).
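This check can be automated before each TauDEM run. A small, library-free sketch of the test implied above (with GDAL installed, the value itself would come from the band's GetNoDataValue()):

```python
# Verify that a raster's no-data value is a plain finite float before
# feeding the raster to TauDEM, per the OBS note above.
import math

def nodata_ok(value):
    """True if the no-data value is a finite number TauDEM can handle."""
    if value is None:          # no no-data tag set at all
        return False
    try:
        v = float(value)       # strings like "nan" parse to float('nan')
    except (TypeError, ValueError):
        return False
    return math.isfinite(v)
```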

Error in calibration due to Spotpy

Due to thouska/spotpy#200, the calibration searches for the worst model performance instead of the best, because of a sign error in the likelihood function. I have submitted a pull request; otherwise the code works with Spotpy 1.13.10 (but it would of course be better to use the newest version...).

I also looked into getting the algorithm to use spotpy_multiobjective.py's objective function (it actually uses the default RMSE, which amounts to the same thing, except that it produces a positive number), but because the "negative likelihood" is hardcoded and inconsistent, it seems hard to make a robust workaround. Apart from this issue, the calibration implementation in QGIS runs smoothly with the test data, aside from a few tweaks to the default calibration settings (which I will also update in the code).
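For reference, the sign issue boils down to this: a sampler that minimizes its objective must be given the negated value when the likelihood is "higher is better", or the search is driven toward the worst fit. A toy illustration with a local stand-in for the Nash-Sutcliffe efficiency (not spotpy's own code):

```python
# Illustration of the sign issue: minimizing a "higher is better"
# likelihood directly finds the WORST parameters; negate it instead.

def nse(evaluation, simulation):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit, lower is worse."""
    mean_obs = sum(evaluation) / len(evaluation)
    num = sum((e - s) ** 2 for e, s in zip(evaluation, simulation))
    den = sum((e - mean_obs) ** 2 for e in evaluation)
    return 1.0 - num / den

def objective_for_minimizer(evaluation, simulation):
    # flip the sign so that minimizing this value maximizes NSE
    return -nse(evaluation, simulation)
```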

Automate TauDEM preprocessing

The steps (from @KittelC's Budyko guide) are:

  1. From the QGIS processing toolbox run Pit Remove (TauDEM – Basic Grid Analysis tools – Pit Remove)

  2. From the QGIS processing toolbox run Flow directions (TauDEM – Basic Grid Analysis tools – D8 Flow Directions)

  3. From the QGIS processing toolbox run Contributing area (TauDEM – Basic Grid Analysis tools – D8 Contributing Area). Please note that the color stretch of the map is not adjusted automatically and needs to be manually adjusted in order to see something.

  4. Load or generate the requested outlet points in a point shapefile. Make sure the outlet points are located on top of rivers. Re-run step 8. You can create a new shapefile from a table of coordinates using Layer – Add Layer – Add Delimited Text Layer. Reprojection can be done with right-click – Save As and then changing the projection.

  5. From the QGIS processing toolbox run Stream definition (TauDEM – Stream Network Analysis Tools – Stream Definition by threshold)

  6. From the QGIS processing toolbox run Stream reach and watershed (TauDEM – Stream Network Analysis Tools – Stream Reach and Watershed)

  7. Polygonize the watershed file using the processing toolbox – GDAL Conversion – Polygonize. The output field name for the polygons is “DN”.

  8. Dissolve (merge) polygons with the same DN using processing toolbox – QGIS geoalgorithms – Vector geometry tools – Dissolve. Make sure you uncheck “Dissolve all”.

  9. Replace stream and catchment IDs with continuous IDs (IDs must start at 0). To do this, proceed in the following steps:

    a) Make a copy of the stream reaches file (in QGIS, right-click – export), retaining only the LINKNO field of the attribute table
    b) Add a new ID field to the attribute table. From the processing toolbox use QGIS – vector table tools – add autoincremental field
    c) Join the attribute tables of the stream reach file and the file generated in steps a–b (called the id file from here onwards) using QGIS – vector general tools – join attribute tables. Use “LINKNO” as the join field in both tables.
    d) Using the field calculator, replace the original “LINKNO” field with the id field in the joined table
    e) Join the attribute tables of the stream reach file and the id file again, using “DSLINKNO” as the join field in the stream reach file and “LINKNO” in the id file.
    f) Using the field calculator, replace the original “DSLINKNO” field with the id field in the joined table, for all values except -1 (use an if statement)
    g) Proceed the same way with “USLINKNO1” and “USLINKNO2”.
    h) Join the attribute tables of the watershed file and the id file (using “DN” and “LINKNO”) and replace the “DN” field with the id field in the joined table.

  10. All watershed delineation commands can be scripted, as shown in the example script by Cécile (StreamNet_Edit_Taudem.py); the script Stream_Subbasin_Cleanup.py shows how 0-length reaches can be removed and the reaches and subbasins renumbered automatically.

  11. Go to Raster – Zonal Statistics – Zonal Statistics and add a column with the mean elevation to the subcatchment shapefile's attribute table.

  12. Open the attribute table and click the field calculator button. Calculate a new field containing the catchment area, using the $area operator.

  13. Reproject the subcatchment shapefile into latitude-longitude coordinates: right-click the layer, choose Save As, and pick the EPSG:4326 projection.

  14. Create a point shapefile containing the centroids of the subcatchments using Vector – Geometry Tools – Polygon Centroids and save the result to a file, e.g. centroids_ll.shp.

  15. Add columns with latitude and longitude to the centroid shapefile's attribute table using Vector – Geometry Tools – Export/Add Geometry Columns, or using the field calculator with the $x and $y operators.
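Step 9 is the most mechanical part and lends itself to scripting. A sketch of the renumbering on plain dictionaries standing in for the attribute tables; the field names follow the TauDEM outputs (LINKNO, DSLINKNO, USLINKNO1, USLINKNO2), with -1 as the no-link sentinel:

```python
# Step 9 in pure Python: replace TauDEM's LINKNO-based IDs with continuous
# IDs starting at 0, the same remapping the attribute-table joins achieve.

def renumber_reaches(reaches):
    """reaches: list of dicts, one per stream reach attribute row."""
    id_map = {r["LINKNO"]: new for new, r in enumerate(reaches)}
    id_map[-1] = -1  # keep the "no upstream/downstream link" sentinel
    fields = ("LINKNO", "DSLINKNO", "USLINKNO1", "USLINKNO2")
    return [{f: id_map[r[f]] for f in fields} for r in reaches]
```

The dictionary built inside renumber_reaches is the same mapping that step 9h needs for the watershed file's “DN” field.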
