
datacube-dataset-config's People

Contributors

alexgleith, alfredo-ama, ananthul, andrewdhicks, awalshie, bellemae, ceholden, cronosnull, gypsybojangles, harshurampur, jeannettestrand, jeremyh, kieranricardo, kirill888, mergify[bot], mickwelli, nikitagandhi, omad, pindge, rokrak1, santoshamohan, seffatchowdhury, simleo, simonaoliver, uchchwhash, v0lat1le, whatnick, woodcockr


datacube-dataset-config's Issues

Running stac-to-dc against https://earth-search.aws.element84.com/v0/ raises an exception

stac-to-dc \
    --catalog-href='https://earth-search.aws.element84.com/v0/' \
    --bbox='25,20,35,30' \
    --collections='sentinel-s2-l2a-cogs' \
    --datetime='2020-01-01/2020-03-31'

I am running this command for indexing, but I get the following output:

stac-to-dc --catalog-href='https://earth-search.aws.element84.com/v0/' --bbox='25,20,35,30' --collections='sentinel-s2-l2a-cogs' --datetime='2020-01-01/2020-03-31'
/home/bel/miniconda3/envs/cubeenv1/lib/python3.11/site-packages/pystac_client/client.py:186: NoConformsTo: Server does not advertise any conformance classes.
  warnings.warn(NoConformsTo())
Traceback (most recent call last):
  File "/home/bel/miniconda3/envs/cubeenv1/bin/stac-to-dc", line 8, in <module>
    sys.exit(cli())
             ^^^^^
  File "/home/bel/miniconda3/envs/cubeenv1/lib/python3.11/site-packages/click/core.py", line 1157, in __call__
    return self.main(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bel/miniconda3/envs/cubeenv1/lib/python3.11/site-packages/click/core.py", line 1078, in main
    rv = self.invoke(ctx)
         ^^^^^^^^^^^^^^^^
  File "/home/bel/miniconda3/envs/cubeenv1/lib/python3.11/site-packages/click/core.py", line 1434, in invoke
    return ctx.invoke(self.callback, **ctx.params)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bel/miniconda3/envs/cubeenv1/lib/python3.11/site-packages/click/core.py", line 783, in invoke
    return __callback(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bel/miniconda3/envs/cubeenv1/lib/python3.11/site-packages/odc/apps/dc_tools/stac_api_to_dc.py", line 302, in cli
    added, failed, skipped = stac_api_to_odc(
                             ^^^^^^^^^^^^^^^^
  File "/home/bel/miniconda3/envs/cubeenv1/lib/python3.11/site-packages/odc/apps/dc_tools/stac_api_to_dc.py", line 159, in stac_api_to_odc
    search = client.search(**config)
             ^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/bel/miniconda3/envs/cubeenv1/lib/python3.11/site-packages/pystac_client/client.py", line 591, in search
    raise DoesNotConformTo(
pystac_client.warnings.DoesNotConformTo: Server does not conform to ITEM_SEARCH, There is not fallback option available for search.

Clarification

Does the presence of common.py in datacube-dataset-config/blob/master/scripts indicate a change away from standalone ingestion scripts?

I was intending to contribute scripts for ingesting Sentinel-2 L2A data for both the original S2MSI2Ap and the current S2MSI2A PRODUCT_TYPE.

I'd appreciate a statement on this. Incidentally, I think extracting the commonalities will help maintainability.

It would also be useful to establish test cases for validating such scripts. For example, the L1C script outputs contain spectral band profiles which are not embedded in the L2A data.

generating metadata error

Hello,
I have downloaded a Landsat 7 scene from Earth Explorer, but when I run the Python script

python usgs_ls_ard_prepare.py /datacube/original_data/LE07_L1TP_231067_20150802_20161022_01_T1

it generates the following error:

2018-09-17 08:34:11,843 INFO Processing /datacube/original_data/LE07_L1TP_231067_20150802_20161022_01_T1
Traceback (most recent call last):
  File "usgs_ls_ard_prepare.py", line 320, in <module>
    main()
  File "/home/parul/Datacube/datacube_env/lib/python3.5/site-packages/click-6.7-py3.5.egg/click/core.py", line 722, in __call__
    return self.main(*args, **kwargs)
  File "/home/parul/Datacube/datacube_env/lib/python3.5/site-packages/click-6.7-py3.5.egg/click/core.py", line 697, in main
    rv = self.invoke(ctx)
  File "/home/parul/Datacube/datacube_env/lib/python3.5/site-packages/click-6.7-py3.5.egg/click/core.py", line 895, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/parul/Datacube/datacube_env/lib/python3.5/site-packages/click-6.7-py3.5.egg/click/core.py", line 535, in invoke
    return callback(*args, **kwargs)
  File "usgs_ls_ard_prepare.py", line 310, in main
    documents = prepare_datasets(path)
  File "usgs_ls_ard_prepare.py", line 286, in prepare_datasets
    r"(?P<collection_category>RT|T1|T2)"), nbar_path.stem).groupdict()
AttributeError: 'NoneType' object has no attribute 'groupdict'

Please suggest a solution. Thanks in advance.
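The AttributeError means the regular expression did not match the directory name, so re.match returned None and calling .groupdict() on it crashed. A minimal sketch of the failure and a defensive check (the pattern below is an illustrative fragment of a USGS Collection-1 scene-ID regex, not the script's full expression):

```python
import re

# Illustrative fragment of a USGS Landsat Collection-1 scene-ID pattern;
# the real prepare script uses a longer expression.
pattern = re.compile(
    r"(?P<sensor>LE07|LT05|LC08)_(?P<level>L1TP|L1GT|L1GS)_"
    r"(?P<path_row>\d{6})_(?P<acq_date>\d{8})_(?P<proc_date>\d{8})_"
    r"(?P<collection>\d{2})_(?P<collection_category>RT|T1|T2)"
)

def parse_scene_id(name):
    """Return the matched fields, or None if the name doesn't fit the pattern."""
    match = pattern.match(name)
    if match is None:
        # This is the situation that produced the AttributeError above:
        # calling .groupdict() directly on a failed match (None) crashes.
        return None
    return match.groupdict()

fields = parse_scene_id("LE07_L1TP_231067_20150802_20161022_01_T1")
print(fields["collection_category"])  # → T1
print(parse_scene_id("some_other_folder"))  # → None
```

If the scene directory was renamed or the data is from a different collection, the name will not match the pattern, so checking the input path's name against the expected convention is the first thing to verify.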

Error deleting a level-1 dataset

DELETE 0
psql:delete_odc_product.sql:96: ERROR:  update or delete on table "dataset" violates foreign key constraint "fk_dataset_source_source_dataset_ref_dataset" on table "dataset_source"
DETAIL:  Key (id)=(08d52c4f-42f7-5511-8b31-9138025ef3e2) is still referenced from table "dataset_source".
psql:delete_odc_product.sql:103: ERROR:  update or delete on table "dataset_type" violates foreign key constraint "fk_dataset_dataset_type_ref_dataset_type" on table "dataset"
DETAIL:  Key (id)=(82) is still referenced from table "dataset".
psql:delete_odc_product.sql:111: NOTICE:  view "dv_s2b_l1c_aws_pds_dataset" does not exist, skipping
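These errors are about delete order: rows in dataset_source still reference the datasets being deleted, and datasets still reference the product row, so the referencing rows must be removed first. A self-contained SQLite sketch of the same foreign-key situation (the table names mirror the ODC schema for readability, but this is not the real schema):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite needs FK enforcement enabled
conn.execute("CREATE TABLE dataset (id INTEGER PRIMARY KEY)")
conn.execute(
    "CREATE TABLE dataset_source "
    "(source_dataset_ref INTEGER REFERENCES dataset(id))"
)
conn.execute("INSERT INTO dataset VALUES (1)")
conn.execute("INSERT INTO dataset_source VALUES (1)")

try:
    # Fails: the row is still referenced from dataset_source,
    # just like the psql errors above.
    conn.execute("DELETE FROM dataset WHERE id = 1")
except sqlite3.IntegrityError as err:
    print("blocked:", err)

# Deleting the referencing rows first lets the parent delete succeed.
conn.execute("DELETE FROM dataset_source WHERE source_dataset_ref = 1")
conn.execute("DELETE FROM dataset WHERE id = 1")
print("remaining:", conn.execute("SELECT COUNT(*) FROM dataset").fetchone()[0])
```

The same ordering applies to the SQL script: clear dataset_source (and dataset_location, if present) before dataset, and dataset before dataset_type.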

Add ESRI Global land cover dataset

Kick off a project to index the ESRI Land Cover dataset from a list of tiles, like this:

https://soilspackage-useast.s3.amazonaws.com/EsriLandCover/01C_20200101-20210101.tif
https://soilspackage-useast.s3.amazonaws.com/EsriLandCover/01G_20200101-20210101.tif
https://soilspackage-useast.s3.amazonaws.com/EsriLandCover/01K_20200101-20210101.tif
https://soilspackage-useast.s3.amazonaws.com/EsriLandCover/01L_20200101-20210101.tif
https://soilspackage-useast.s3.amazonaws.com/EsriLandCover/01U_20200101-20210101.tif
https://soilspackage-useast.s3.amazonaws.com/EsriLandCover/01W_20200101-20210101.tif
https://soilspackage-useast.s3.amazonaws.com/EsriLandCover/02C_20200101-20210101.tif

explorer product-audit page broken

  • After ODC dataset deletion, the Explorer product-audit page is broken.
stderr
KeyError: 'Unknown dataset type id 181'

This is due to undeleted entries in the cubedash.dataset_spatial table. The materialized view cubedash.mv_dataset_spatial_quality is built on dataset_spatial and returns an error because the dataset_type_ref has been deleted from agdc.dataset_type.

Incorrect bounds computation in `index_from_s3.py`

  1. The MTL document lists extents as they apply to pixel centres, but datacube expects extents that cover the whole raster edge to edge:

def get_geo_ref_points(info):
    return {
        'ul': {'x': info['CORNER_UL_PROJECTION_X_PRODUCT'], 'y': info['CORNER_UL_PROJECTION_Y_PRODUCT']},
        'ur': {'x': info['CORNER_UR_PROJECTION_X_PRODUCT'], 'y': info['CORNER_UR_PROJECTION_Y_PRODUCT']},
        'll': {'x': info['CORNER_LL_PROJECTION_X_PRODUCT'], 'y': info['CORNER_LL_PROJECTION_Y_PRODUCT']},
        'lr': {'x': info['CORNER_LR_PROJECTION_X_PRODUCT'], 'y': info['CORNER_LR_PROJECTION_Y_PRODUCT']},
    }

So the recorded extent falls 15 metres (half a pixel) short on each side of the actual span.
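A possible correction is to shift each corner outward by half a pixel. This is a sketch assuming a 30 m Landsat pixel and northing increasing upward; the helper name and the example coordinates are mine, not from the script:

```python
HALF_PIXEL = 15.0  # half of a 30 m Landsat pixel, in metres

def expand_to_pixel_edges(points, half=HALF_PIXEL):
    """Shift corner coordinates from pixel centres to the outer pixel edges,
    so the recorded extent covers the raster edge to edge."""
    return {
        'ul': {'x': points['ul']['x'] - half, 'y': points['ul']['y'] + half},
        'ur': {'x': points['ur']['x'] + half, 'y': points['ur']['y'] + half},
        'll': {'x': points['ll']['x'] - half, 'y': points['ll']['y'] - half},
        'lr': {'x': points['lr']['x'] + half, 'y': points['lr']['y'] - half},
    }

# Made-up UTM-style corner-centre coordinates for illustration.
corners = {
    'ul': {'x': 300000.0, 'y': 7200000.0},
    'ur': {'x': 500000.0, 'y': 7200000.0},
    'll': {'x': 300000.0, 'y': 7000000.0},
    'lr': {'x': 500000.0, 'y': 7000000.0},
}
fixed = expand_to_pixel_edges(corners)
print(fixed['ur']['x'] - fixed['ul']['x'])  # → 200030.0 (one pixel wider overall)
```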

  2. The usual Lon/Lat bounding-box error: the code assumes that projecting the bounding box's four corners to Lon/Lat gives the same result as densifying the whole boundary with extra points, projecting it, and then taking the bounding box:

def get_coords(geo_ref_points, spatial_ref):
    t = osr.CoordinateTransformation(spatial_ref, spatial_ref.CloneGeogCS())

    def transform(p):
        lon, lat, z = t.TransformPoint(p['x'], p['y'])
        return {'lon': lon, 'lat': lat}

    return {key: transform(p) for key, p in geo_ref_points.items()}
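A self-contained illustration of why corner-only projection underestimates the box: the stand-in transform below bends straight edges the way a real map projection can (it is a made-up non-affine mapping, not osr), so the corner-derived bounding box misses the bulge along the edges while a densified boundary catches it.

```python
def transform(x, y):
    # Made-up curved mapping: leaves the corners fixed but bows the
    # horizontal edges upward, like projection distortion.
    return x, y + 0.001 * x * (100 - x)

def bbox(points):
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return min(xs), min(ys), max(xs), max(ys)

corners = [(0, 0), (100, 0), (100, 50), (0, 50)]

# Naive approach: project only the four corners.
corner_bbox = bbox([transform(x, y) for x, y in corners])

# Correct approach: densify each edge with extra points, then project.
dense = []
for (x0, y0), (x1, y1) in zip(corners, corners[1:] + corners[:1]):
    for t in range(21):
        f = t / 20
        dense.append((x0 + f * (x1 - x0), y0 + f * (y1 - y0)))
dense_bbox = bbox([transform(x, y) for x, y in dense])

print(corner_bbox[3], dense_bbox[3])  # densified max-y is larger
```

The same idea applied to get_coords would mean sampling points along all four edges of the raster before calling TransformPoint, then taking the min/max of the results.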

Instructions to index Sentinel-2 COGS not working anymore

The instructions based on the element84 v0 STAC Catalog do not work anymore: https://github.com/opendatacube/datacube-dataset-config/blob/main/sentinel-2-l2a-cogs.md

Please refer to the issue opened by @pierocampa in the cube-in-a-box repo for more details:
opendatacube/cube-in-a-box#64

There are multiple other places where this STAC Catalog is used: https://github.com/search?q=org%3Aopendatacube+earth-search.aws.element84.com%2Fv0%2F&type=code
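To my knowledge, the v0 endpoint has been retired in favour of Element 84's earth-search v1, where the collection was renamed. A sketch of the updated invocation (the v1 URL and the sentinel-2-l2a collection name are assumptions that should be verified against the current Element 84 documentation; running it also requires a provisioned datacube with a matching product):

```shell
# Assumed fix: earth-search v1 endpoint and its renamed collection.
stac-to-dc \
    --catalog-href='https://earth-search.aws.element84.com/v1/' \
    --bbox='25,20,35,30' \
    --collections='sentinel-2-l2a' \
    --datetime='2020-01-01/2020-03-31'
```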

Sen2cor Product Definition Document

Hi all, I am trying to index Sentinel-2 Level-2A imagery processed through sen2cor (the same imagery you get from the Copernicus Open Access Hub).

I am using this product definition YAML, ga_s2_ard.yaml, and this dataset preparation script, sen2cor_prepare.py, but I find that they do not match. Is there any product definition YAML that matches Sentinel-2 L2A imagery processed through sen2cor? Or where can I find the information needed to create one?

ls_public_bucket.py depends on non-existent create_dataset function

Expected behaviour

All of the imports used in ls_public_bucket.py exist somewhere.

Actual behaviour

from datacube.scripts.dataset import create_dataset, parse_match_rules_options

refers to a non-existent function, which was removed from datacube.scripts.dataset in commit 0f71e70f58a25f0e473fa872ff5938d8a95749b9.

create_dataset first appears in release 1.6.0. It is removed in release 1.6.1 and then the ls_public_bucket.py script which imports create_dataset was added in release 1.7.0.

Steps to confirm the behaviour

git checkout tags/datacube-1.7
git log -G create_dataset

commits of interest are:
638ff76eb22c5f5c61549a58b54c59c901baa822
0f71e70f58a25f0e473fa872ff5938d8a95749b9

Environment information

  • Which datacube --version are you using?

Open Data Cube core, version 1.7+247.g8d517db0

  • What datacube deployment/environment are you running against?
    N/A; the issue exists in the repository itself.

Dataset creation time is older than start date

I'm trying to prepare dataset documents to index into the datacube, but I get only this:

$ python  /home/denis/git/datacube-dataset-config/scripts/s2prepare_SAFE_zip_L1C_S2A.PY S2A_MSIL2A_20190513T104031_N0212_R008_T31TDG_20190513T144546.zip --no-checksum
2019-10-24 15:20:22,566 INFO Dataset creation time 2019-06-19 16:24:14 is older than start date 2019-10-24 15:20:22.565625...SKIPPING

What am I overlooking here?

Error indexing Sentinel-2 COGs when following the datacube-dataset-config guide (Indexing Sentinel-2 Cloud-Optimised GeoTIFFs) @alexgleith

When I used the command in the document, I got errors like this:

--stac --no-sign-request \

s3://sentinel-cogs/sentinel-s2-l2a-cogs/37/M/CS/2017/10/**/S2A_37MCS_20171016_0_L2A.json s2_l2a

ValueError: Region name is not supplied and default can not be found

stac-to-dc --catalog-href='https://earth-search.aws.element84.com/v0/' --bbox='25,20,35,30' --collections='sentinel-s2-l2a-cogs' --datetime='2020-01-01/2020-01-31'

pystac_client.warnings.DoesNotConformTo: Server does not conform to ITEM_SEARCH, There is not fallback option available for search.

@alexgleith

Open Data Cube core, version 1.8.16.dev22+g584db42c
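The "Region name is not supplied" ValueError comes from boto3 finding no AWS region configured. Exporting a default region before running the indexing command usually resolves it; the sentinel-cogs bucket is, to my knowledge, hosted in us-west-2 (verify against the bucket's documentation):

```shell
# Assumption: the sentinel-cogs bucket lives in us-west-2.
export AWS_DEFAULT_REGION=us-west-2
echo "$AWS_DEFAULT_REGION"   # prints: us-west-2

# Then re-run the indexing command from the guide unchanged.
```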
