sdfidk / dhmqc
Processing suite for the Danish Digital Elevation Model
License: Other
We should update to laspy 2.x, but this is currently blocked on the lack of LAZ backend packages on conda-forge. Laspy 1.x handles LAZ files by piping them through the laszip executable from LAStools, but this is no longer supported in laspy 2.x. To use laspy 2.x, we would need to install a LAZ backend with pip in the environment, which is kind of hack-ish.
Tests currently fail under Linux with the following message. Presumably, one of the Conda packages has recently broken its dependency handling.
E
======================================================================
ERROR: Failure: ImportError (libpoppler.so.76: cannot open shared object file: No such file or directory)
----------------------------------------------------------------------
Traceback (most recent call last):
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/site-packages/nose/failure.py", line 39, in runTest
raise self.exc_val.with_traceback(self.tb)
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/site-packages/nose/loader.py", line 417, in loadTestsFromName
addr.filename, addr.module)
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/site-packages/nose/importer.py", line 47, in importFromPath
return self.importFromDir(dir_path, fqname)
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/site-packages/nose/importer.py", line 94, in importFromDir
mod = load_module(part_fqname, fh, filename, desc)
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/imp.py", line 234, in load_module
return load_source(name, filename, file)
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/imp.py", line 171, in load_source
module = _load(spec)
File "<frozen importlib._bootstrap>", line 696, in _load
File "<frozen importlib._bootstrap>", line 677, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 728, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/home/peter/develop/DHMQC/tests.py", line 6, in <module>
from qc.db import report
File "/home/peter/develop/DHMQC/qc/db/report.py", line 28, in <module>
from osgeo import ogr, osr, gdal
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/site-packages/osgeo/__init__.py", line 21, in <module>
_gdal = swig_import_helper()
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/site-packages/osgeo/__init__.py", line 17, in swig_import_helper
_mod = imp.load_module('_gdal', fp, pathname, description)
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/imp.py", line 242, in load_module
return load_dynamic(name, filename, file)
File "/home/peter/miniconda3/envs/dhmqc_test/lib/python3.7/imp.py", line 342, in load_dynamic
return _load(spec)
ImportError: libpoppler.so.76: cannot open shared object file: No such file or directory
----------------------------------------------------------------------
Ran 1 test in 0.010s
FAILED (errors=1)
qc_wrap appears to be sensitive to the filesystem location of the SQLite tile database used to coordinate processing (for example, it may hang when the database file is placed on a network drive). This is presumably caused by the individual processes simultaneously trying to obtain a lock on the file, which can deadlock when the file lives on a slow drive.
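As a stopgap (a sketch, not tested against qc_wrap itself), opening the coordination database with a generous busy timeout and WAL journaling makes lock contention fail gracefully instead of hanging. It does not fix network drives, where SQLite locking is inherently unreliable:

```python
import os
import sqlite3
import tempfile

def open_tile_db(path):
    # Wait up to 30 seconds for a lock instead of erroring out immediately.
    con = sqlite3.connect(path, timeout=30.0)
    # WAL mode lets readers proceed while a single writer holds the lock.
    # NOTE: WAL is explicitly unsafe on network filesystems, so the tile
    # database should live on a fast local disk regardless.
    con.execute("PRAGMA journal_mode=WAL")
    return con

# Illustrative usage with a throwaway database:
db_path = os.path.join(tempfile.mkdtemp(), "tiles.sqlite")
con = open_tile_db(db_path)
con.execute("CREATE TABLE IF NOT EXISTS tiles (name TEXT, status TEXT)")
con.commit()
```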
The glue code in delaunator_wrapper.cpp copies both the input vertex coordinates and the resulting vertex indices of the Delaunay triangulation, the latter to avoid manual memory management across the language boundary. This copying causes excessive memory usage.
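For reference, ctypes can expose a C-owned buffer to Python without copying (numpy.ctypeslib.as_array does the same with less ceremony, if numpy is acceptable on that code path). A minimal sketch of the idea, using a Python-allocated buffer as a stand-in for the triangulation output:

```python
import ctypes

# Stand-in for a buffer returned by the C++ wrapper: an array of
# triangle vertex indices owned by the native side.
n_indices = 6
buf = (ctypes.c_uint32 * n_indices)(0, 1, 2, 2, 1, 3)

# Reinterpret the raw pointer as a typed fixed-size array; no bytes
# are copied, 'view' aliases the same memory as 'buf'.
ptr = ctypes.cast(buf, ctypes.POINTER(ctypes.c_uint32))
view = ctypes.cast(ptr, ctypes.POINTER(ctypes.c_uint32 * n_indices)).contents

view[0] = 42
assert buf[0] == 42  # writes through the view are visible in the buffer
```

The caveat is exactly the one the copy works around: the view is only valid while the C side keeps the buffer alive, so ownership and a matching free function have to be managed explicitly.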
Windows: Python 3.8.1
Traceback (most recent call last):
File "C:\Users\b025527\.conda\envs\DHMQC\lib\site-packages\nose\failure.py", line 39, in runTest
raise self.exc_val.with_traceback(self.tb)
File "C:\Users\b025527\.conda\envs\DHMQC\lib\site-packages\nose\loader.py", line 416, in loadTestsFromName
module = self.importer.importFromPath(
File "C:\Users\b025527\.conda\envs\DHMQC\lib\site-packages\nose\importer.py", line 47, in importFromPath
return self.importFromDir(dir_path, fqname)
File "C:\Users\b025527\.conda\envs\DHMQC\lib\site-packages\nose\importer.py", line 94, in importFromDir
mod = load_module(part_fqname, fh, filename, desc)
File "C:\Users\b025527\.conda\envs\DHMQC\lib\imp.py", line 234, in load_module
return load_source(name, filename, file)
File "C:\Users\b025527\.conda\envs\DHMQC\lib\imp.py", line 171, in load_source
module = _load(spec)
File "<frozen importlib._bootstrap>", line 702, in _load
File "<frozen importlib._bootstrap>", line 671, in _load_unlocked
File "<frozen importlib._bootstrap_external>", line 783, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "C:\DHMQC\tests.py", line 8, in <module>
from qc.thatsDEM import triangle
File "C:\DHMQC\qc\thatsDEM\triangle.py", line 48, in <module>
delaunator_lib = ctypes.cdll.LoadLibrary(delaunator_lib_name)
File "C:\Users\b025527\.conda\envs\DHMQC\lib\ctypes\__init__.py", line 451, in LoadLibrary
return self._dlltype(name)
File "C:\Users\b025527\.conda\envs\DHMQC\lib\ctypes\__init__.py", line 373, in __init__
self._handle = _dlopen(self._name, mode)
FileNotFoundError: Could not find module 'C:\DHMQC\qc\thatsDEM\lib\libdelaunator.dll'. Try using the full path with constructor syntax.
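The likely culprit is Python 3.8's change to Windows DLL resolution: ctypes no longer consults PATH or the working directory, so a bare file name or relative path that worked under 3.7 now fails. A sketch of a fix (directory layout taken from the traceback; the helper name is hypothetical):

```python
import os
import sys

def delaunator_lib_path(module_dir):
    """Build an absolute path to the delaunator shared library.

    Python 3.8 on Windows stopped searching PATH and the working
    directory when loading DLLs, so ctypes must be handed an
    absolute path (and any directory containing dependent DLLs
    must be registered explicitly).
    """
    lib_dir = os.path.join(os.path.abspath(module_dir), "lib")
    lib_name = "libdelaunator.dll" if sys.platform == "win32" else "libdelaunator.so"
    if sys.platform == "win32" and hasattr(os, "add_dll_directory") and os.path.isdir(lib_dir):
        # Python 3.8+: make dependent DLLs in lib_dir resolvable too.
        os.add_dll_directory(lib_dir)
    return os.path.join(lib_dir, lib_name)

# In triangle.py, module_dir would be os.path.dirname(os.path.abspath(__file__)).
lib_path = delaunator_lib_path("qc/thatsDEM")
# delaunator_lib = ctypes.cdll.LoadLibrary(lib_path)  # load via the absolute path
```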
Change a lot of the prints to use the logging module. This should also remove the need for the hack-ish redirect_output class.
Triangle is a very nice library, but in terms of licensing it is horrible. A new gridding library would allow us to simplify the install script and let us distribute a pre-built package.
Alternatives to Triangle include:
We need to figure out whether any of the alternatives are usable. Are they fast enough? Do they produce the desired results? How big a change to the existing code base is needed? Are there any license issues?
dem_gen.py fails when trying to triangulate very large point clouds (> 45 million points). This is temporarily fixed by thinning the point cloud if it is too big: https://github.com/Kortforsyningen/DHMQC/blob/master/qc/dem_gen.py#L572-L578 . If it turns out that Triangle can't be fixed, a better way of downsampling the point cloud is needed.
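Even a simple uniform random decimation would be a reasonable baseline for that downsampling step. A sketch (the real code would operate on the LAS point arrays, and a grid-based thinning would preserve density better):

```python
import random

def thin_points(points, max_points, seed=0):
    """Randomly downsample a list of points to at most max_points.

    A uniform random sample keeps the overall spatial distribution
    roughly intact; the seed makes the thinning reproducible.
    """
    if len(points) <= max_points:
        return points
    rng = random.Random(seed)
    return rng.sample(points, max_points)

# Illustrative usage with synthetic (x, y, z) tuples:
pts = [(float(i), float(i % 100), 0.0) for i in range(200_000)]
thinned = thin_points(pts, 45_000)
```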
This proposal is two-fold:
There are a lot of constants used in the QC scripts, such as tile_size, ground_classes, database connection strings, etc. It would be better to put them in a shared setup file where the user can change them to their preference, instead of editing the actual code.
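A sketch of what such a shared settings module could look like (the file name, format, and defaults are illustrative; only the constant names come from the issue above):

```python
import json
import os

# Defaults live in one place; users override them in a JSON file
# instead of editing the QC scripts.
DEFAULTS = {
    "tile_size": 1000,       # metres
    "ground_classes": [2],   # LAS classification codes treated as ground
    "db_connection": "",     # e.g. a PostgreSQL connection string
}

def load_settings(path="dhmqc_settings.json"):
    """Return DEFAULTS, overridden by any keys found in the settings file."""
    settings = dict(DEFAULTS)
    if os.path.exists(path):
        with open(path) as f:
            settings.update(json.load(f))
    return settings
```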
A script like dem_gen.py can take a lot of arguments (23, in fact), which is a bit of a pain to do from the command line. For this reason, Python "job definitions" exist. They make life a lot easier, but also expose the code to the user in a way that is less than ideal. A general XML/YAML input format for the QC scripts would solve this problem.
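A sketch of such a job-definition format using JSON (YAML would look similar but needs a third-party parser). The argument names are illustrative; the single-dash flag style matches the existing scripts:

```python
import json

def load_job_definition(path):
    """Read script arguments from a JSON job definition.

    Flattens {"script": ..., "args": {...}} into an argv-style list so
    the existing argparse-based QC scripts can consume it unchanged.
    """
    with open(path) as f:
        job = json.load(f)
    argv = [job["script"]]
    for key, value in job.get("args", {}).items():
        argv.append("-" + key)
        if value is not True:  # boolean flags take no value
            argv.append(str(value))
    return argv
```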
The model-coverage argument should let you define an area (polygon) in which to calculate the DTM/DSM. Raster cells outside the model coverage would be set to NODATA.
One use case for this is border regions, for instance the Danish/German border: we are technically not allowed to map another country, which is why the models should not continue south of the border.
The model coverage could also be used in other circumstances, e.g. creating terrain models for a single municipality.
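A sketch of the masking step, using a pure-Python even-odd point-in-polygon test (the real implementation would use OGR geometries and the raster grid directly; the NODATA value is illustrative):

```python
NODATA = -9999.0

def point_in_polygon(x, y, polygon):
    """Even-odd (ray casting) test; polygon is a list of (x, y) vertices."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < (x2 - x1) * (y - y1) / (y2 - y1) + x1:
                inside = not inside
    return inside

def apply_coverage(grid, origin, cell_size, polygon):
    """Set cells whose centre falls outside the coverage polygon to NODATA.

    grid is a list of rows; origin is the (x, y) of the grid's corner.
    """
    x0, y0 = origin
    for row, cells in enumerate(grid):
        for col in range(len(cells)):
            cx = x0 + (col + 0.5) * cell_size
            cy = y0 + (row + 0.5) * cell_size
            if not point_in_polygon(cx, cy, polygon):
                cells[col] = NODATA
    return grid
```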
set_lake_z uses psycopg2 as its database handler. This is nice when you have access to a PostgreSQL database, but otherwise sucks. With GDAL as the database frontend, you would also be able to use other data sources, such as shapefiles or SpatiaLite databases.
The current interface for setting up the extra columns in the lake table is also very clumsy: use "db" as the las input together with "-db_action setup", after which a second, similar call has to be made, this time with "-db_action reset", to initialize the new columns with data.
The help text for set_lake_z specifically says that you can use the path to a shapefile as the db_connection. This is not possible with the way the script is currently set up.
Checks with z_accuracy_gcp fail under the new Conda/Python 3 toolchain with the following message:
[qc_wrap]: Traceback:
Traceback (most recent call last):
File "C:\dev\DHMQC\qc_wrap.py", line 126, in run_check
return_code = test_func(send_args)
File "C:\dev\DHMQC\qc\z_accuracy_gcp.py", line 83, in main
feats=vector_io.get_features(pointname,pargs.layername,pargs.layersql,extent)
File "C:\dev\DHMQC\qc\thatsDEM\vector_io.py", line 221, in get_features
layer.SetSpatialFilterRect(*extent)
File "C:\Users\[REDACTED]\.conda\envs\DHMQC\lib\site-packages\osgeo\ogr.py", line 1345, in SetSpatialFilterRect
return _ogr.Layer_SetSpatialFilterRect(self, *args)
NotImplementedError: Wrong number or type of arguments for overloaded function 'Layer_SetSpatialFilterRect'.
Possible C/C++ prototypes are:
OGRLayerShadow::SetSpatialFilterRect(double,double,double,double)
OGRLayerShadow::SetSpatialFilterRect(int,double,double,double,double)
time.clock() does not work in Python 3.8 (it was deprecated since 3.3 and removed in 3.8); the code works under 3.7.
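time.perf_counter() is the standard-library replacement for timing code and behaves the same on 3.7 and 3.8+:

```python
import time

# Before (fails with AttributeError on Python 3.8+):
#   t0 = time.clock()
# After:
t0 = time.perf_counter()
# ... do work ...
elapsed = time.perf_counter() - t0
print(f"elapsed: {elapsed:.3f} s")
```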
The 0.4 m value seems to be hardcoded into the program at the moment.
It would be nice if it could be made configurable, so we can run the code on Faroese and Greenlandic data as well.
Make a setup.py that uses Python's setuptools. Ideally, we want to get rid of the current build script and let setuptools build the code instead. This way it will be easier to distribute the package on PyPI.
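A minimal sketch of such a setup.py. The version, source paths, extension module name, and CLI entry point are illustrative; the real build flags from the current build script would need to be ported into the Extension definitions:

```python
from setuptools import Extension, find_packages, setup

setup(
    name="dhmqc",
    version="0.1.0",  # illustrative
    description="Processing suite for the Danish Digital Elevation Model",
    packages=find_packages(include=["qc", "qc.*"]),
    # The C/C++ helpers currently built by the custom build script
    # would become Extension modules, e.g.:
    ext_modules=[
        Extension(
            "qc.thatsDEM._libdelaunator",        # hypothetical module name
            sources=["src/delaunator_wrapper.cpp"],  # hypothetical path
        ),
    ],
    entry_points={
        "console_scripts": ["dhmqc=qc.cli:main"],  # hypothetical CLI module
    },
)
```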
We need to install dhmqc as a Python package that can be compiled via setup.py. A command-line app is also needed for this to make any sense. This way we can make a CLI that does something like:
dhmqc coverage *.las coverage.sqlite
and
dhmqc dem_gen coverage.sqlite jobdef.json
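A sketch of such a CLI entry point using argparse subcommands (the subcommand names come from the examples above; the options are illustrative):

```python
import argparse

def build_parser():
    parser = argparse.ArgumentParser(prog="dhmqc")
    sub = parser.add_subparsers(dest="command", required=True)

    cov = sub.add_parser("coverage", help="build a tile coverage database")
    cov.add_argument("las_files", nargs="+", help="input LAS/LAZ files")
    cov.add_argument("db", help="output SQLite coverage database")

    dem = sub.add_parser("dem_gen", help="generate DEM tiles")
    dem.add_argument("db", help="SQLite coverage database")
    dem.add_argument("jobdef", help="job definition file")

    return parser

# Illustrative invocation, mirroring "dhmqc coverage *.las coverage.sqlite":
args = build_parser().parse_args(["coverage", "a.las", "b.las", "coverage.sqlite"])
```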
There is no reason for us to maintain our own testing framework when several very good alternatives already exist. Migrating to a new framework is not a huge task, but some thought needs to be put into how the framework is set up.