worldbank / blackmarblepy

Georeferenced Rasters and Statistics of Nightlights from NASA Black Marble

Home Page: https://worldbank.github.io/blackmarblepy/

License: Mozilla Public License 2.0

Python 0.33% Jupyter Notebook 99.67%
nasa-data nighttime-lights raster-data nasa nasa-earth-data zonal-statistics viirs worldbank blackmarble nightlights

blackmarblepy's People

Contributors

dependabot[bot], g4brielvs, pre-commit-ci[bot], ramarty


blackmarblepy's Issues

Read Operation Timeout

Description

I ran this code before and it worked, although processing a whole year's worth of data took a long time, so I split the job by month. Now, when I rerun it, even after adjusting the timeout and reducing the request size to a single day, it runs for hours and then stalls at the same percentage of "PROCESSING TASKS". Is there a way to handle this error? Moreover, file_skip_if_exists doesn't work: I reran the same query hoping it would skip tiles with data already available, but it just restarts the whole process.

Reproducibility

  • The bug is reproducible.
  • The bug is intermittent.
  • The bug occurs only under specific conditions.

Screenshots / Error Messages (if applicable)

image

Here is a screenshot of rerunning the same code. The file should have been skipped, since it already had xKB of data on disk, but the download resets to:
image

Environment

  • Operating System: Windows 10

Bar chart visualization label improvement

In the charts toward the end of the example notebook that show trends in NTL by admin area over time, is it possible to automate the x-axis date labels so that they show the represented dates rather than just "date"?

image
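Matplotlib's date locators and formatters can produce the requested labels. A minimal sketch (the data here is a toy stand-in for the notebook's NTL-by-admin series, not the actual notebook code):

```python
import matplotlib.dates as mdates
import matplotlib.pyplot as plt
import pandas as pd

# Toy stand-in for the NTL-by-admin trend data (assumed shape: dates vs. values)
dates = pd.date_range("2022-01-01", "2022-12-01", freq="MS")
values = range(len(dates))

fig, ax = plt.subplots()
ax.plot(dates, values)

# Show actual dates on the x-axis instead of a generic "date" label
ax.xaxis.set_major_locator(mdates.MonthLocator(interval=2))
ax.xaxis.set_major_formatter(mdates.DateFormatter("%b %Y"))
ax.set_xlabel("")
fig.autofmt_xdate()
```

`fig.autofmt_xdate()` rotates the tick labels so adjacent dates do not overlap.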

Title name in the documentation ("blackmarble package")

Is your feature request related to a problem? Please describe.

It took me some time to notice that the API interface is under "blackmarble package".

Describe the solution you'd like
It might be more intuitive if it was named something like "API Interface"

Describe alternatives you've considered
"Reference"

Additional context
N/A

[Documentation] Add information on building documentation and running tests locally

It'd be great if the documentation included "instructions for developers" on things like:

  • Setting up a "developer" environment with an editable installation of the package
  • Building the documentation locally
  • Running the tests and calculating the code coverage locally

Current guidance on the contributing page does a good job of covering how contributions should be made. It is unfortunate that the hyperlink in the README (this) does not work, but since it works in the hosted documentation, I wonder whether changing it in the README would break that link.

Remove `wget` dependency

In the current implementation, wget is invoked in a subprocess. Ideally, we can remove this dependency and use Python dependencies only (e.g., httpx).

Update logo

This issue is for updating the logo and favicon.

Expand Testing

We currently lack comprehensive unit tests. This issue is created to track and coordinate efforts towards implementing unit tests for the package.

Enhance Error Handling

This issue tracks a proposal to enhance error handling in a more Pythonic way (instead of print statements everywhere).
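One possible direction, sketched with Python's standard logging module (the function and exception names are hypothetical, not existing package API):

```python
import logging

logger = logging.getLogger("blackmarble")


class TileNotAvailableError(RuntimeError):
    """Hypothetical exception type for tiles that cannot be retrieved."""


def report_skipped_date(date: str, reason: Exception) -> None:
    # Log with context instead of print(); callers can configure handlers,
    # filter by level, or promote warnings to errors as they see fit.
    logger.warning("Skipping %s due to error. Data may not be available. (%s)", date, reason)
```

Using a named logger lets downstream users silence or redirect the package's messages without touching library code.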

Issue loading package / calling function

I'm getting an error loading the package / calling the function.

See error loading package
Screen Shot 2023-12-07 at 10 31 54 AM

This worked:

from blackmarble import bm_extract
from blackmarble import bm_raster

But then I get this error calling the function

load_raster

I installed the latest version from pip, and am using Python 3.11.5.

NTL on specific city or municipality

Hello, I would like to ask for help with my research. I already have a visualization of the NTL radiance of a specific city or municipality, but I am not satisfied with the results: when the map is zoomed in, the NTL radiance is pixelated. What is the proper solution for this? Thank you.

"Data may not be available" error

Hello,

I followed the exact steps from the documentation/README but running bm_raster or bm_extract always returns the following error:

Skipping XXX-XX-XX due to error. Data may not be available.

Using gadm install by default geopandas 0.9.0

Description

Hi, I get this error: AttributeError: 'GeoSeries' object has no attribute 'iteritems' when I run the code below, as seen in the blackmarblepy documentation.

Reproducibility

from gadm import GADMDownloader
from blackmarble.extract import bm_extract
from blackmarble.raster import bm_raster

gdf = GADMDownloader(version="4.0").get_shape_data_by_country_name(
    country_name="GHA", ad_level=1
)
r_202110 = bm_raster(gdf, product_id="VNP46A4", date_range="2023-01-01", bearer="token")

Possible Fix

When I install gadm, it installs geopandas==0.9.0 by default, which is the source of the error; when I reinstall geopandas==0.10.2, the issue is solved.

If you confirm this is an issue, it might be worth mentioning in the documentation.

ntl_mean

data = bm_extract(
    roi=gdf,
    product_id="VNP46A2",
    date_range=pd.date_range("2021-10-02", "2021-11-01", freq="D"),
    bearer=bearer,
)

When downloading data, it provides an ntl_mean value.
I have found some additional information on what ntl_mean means, but I'm not sure.
My question is: what is the exact interpretation of ntl_mean?
Thank you in advance.

Add instructions, argument, or function to get cloud cover

Black Marble includes information on the cloud-free observations that produced the NTL data, which is especially relevant for monthly and daily data.

Right now it should be possible to get this information by changing the variable argument to the relevant cloud cover variable. However, getting NTL and cloud coverage would require two function calls, which could be simplified when using bm_extract. For example, we could add an argument to bm_extract that, when set to True, adds a variable for cloud-free observations (e.g., the average number of cloud-free observations across pixels). A user would then get a dataset with, for example, (1) average NTL and (2) cloud-free observations, which would enable findings such as: "oh, NTL is low because of clouds!"
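As an illustration of the proposed combined output, here is a sketch that joins two zonal-statistics tables with pandas. The column names (region, date, ntl_mean, cloud_free_obs) are assumptions for illustration, not the package's actual output schema:

```python
import pandas as pd


def combine_ntl_and_cloud_cover(ntl: pd.DataFrame, clouds: pd.DataFrame) -> pd.DataFrame:
    """Join NTL zonal statistics with cloud-free-observation statistics
    on region and date, so low NTL can be cross-checked against clouds."""
    return ntl.merge(clouds, on=["region", "date"], how="left")
```

With this join, a row with low ntl_mean and few cloud-free observations flags "low NTL because of clouds" directly in the output.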

Windows 11 - NotADirectoryError : [WinError 267] The directory name is not valid

Bug Report

Description

I'm starting to use the library and trying the blackmarblepy.ipynb tutorial.

When I execute the line corresponding to r_20210205 = bm_raster(), this error appears at the end of the file download: NotADirectoryError: [WinError 267] The directory name is not valid

The same error occurs with both installation methods (from PyPI and from source).

Reproducibility

  • The bug is reproducible.
  • The bug is intermittent.
  • The bug occurs only under specific conditions.

Screenshots / Error Messages (if applicable)

image

image

Environment

  • Operating System: Windows 11
  • Browser (if applicable): N/A
  • Application Version/Commit: N/A
  • Additional Environment Details: N/A

Additional Context

N/A

Possible Fix

N/A

Issues running bm_raster on Anaconda

Bug Report

Description

The issue arises when replicating the daily data example.
There are no problems up to gdf.explore(), but the bm_raster() function fails to process the files.

Reproducibility

This is the code:

gdf = geopandas.read_file(
    "https://geodata.ucdavis.edu/gadm/gadm4.1/json/gadm41_BHS_1.json.zip"
)
gdf.explore()

bearer = os.getenv(BLACKMARBLE_TOKEN)

r_20210205 = bm_raster(
    gdf, product_id="VNP46A2", date_range="2021-02-05", bearer=BLACKMARBLE_TOKEN
)

The output:

GETTING MANIFEST...: 100%
2/2 [00:00<00:00, 2.77it/s]
QUEUEING TASKS | Downloading...: 100%
2/2 [00:00<00:00, 331.33it/s]
PROCESSING TASKS | Downloading...: 100%
2/2 [00:35<00:00, 15.54s/it]

[2024-05-17 16:14:14 - backoff:105 - INFO] Backing off _download_file(...) for 0.7s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:14 - backoff:105 - INFO] Backing off _download_file(...) for 0.4s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:19 - backoff:105 - INFO] Backing off _download_file(...) for 0.1s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:19 - backoff:105 - INFO] Backing off _download_file(...) for 0.1s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:25 - backoff:105 - INFO] Backing off _download_file(...) for 0.8s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:25 - backoff:105 - INFO] Backing off _download_file(...) for 1.5s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:31 - backoff:105 - INFO] Backing off _download_file(...) for 5.3s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:31 - backoff:105 - INFO] Backing off _download_file(...) for 7.0s (httpx.ReadTimeout: The read operation timed out)

COLLECTING RESULTS | Downloading...: 100%
2/2 [00:00<00:00, 399.69it/s]
COLLATING RESULTS | Processing...: 0%
0/1 [00:00<?, ?it/s]

---------------------------------------------------------------------------
CPLE_BaseError                            Traceback (most recent call last)
File rasterio\\crs.pyx:775, in rasterio.crs.CRS.from_user_input()

File rasterio\\_err.pyx:209, in rasterio._err.exc_wrap_ogrerr()

CPLE_BaseError: OGR Error code 6

During handling of the above exception, another exception occurred:

CRSError                                  Traceback (most recent call last)
Cell In[9], line 4
      1 BLACKMARBLE_TOKEN = "[SUPRESSED]"
      2 bearer = os.getenv(BLACKMARBLE_TOKEN)
----> 4 r_20210205 = bm_raster(
      5     gdf, product_id="VNP46A2", date_range="2021-02-05", bearer=BLACKMARBLE_TOKEN
      6 )

File ~\anaconda3\Lib\site-packages\pydantic\validate_call_decorator.py:59, in validate_call.<locals>.validate.<locals>.wrapper_function(*args, **kwargs)
     57 @functools.wraps(function)
     58 def wrapper_function(*args, **kwargs):
---> 59     return validate_call_wrapper(*args, **kwargs)

File ~\anaconda3\Lib\site-packages\pydantic\_internal\_validate_call.py:81, in ValidateCallWrapper.__call__(self, *args, **kwargs)
     80 def __call__(self, *args: Any, **kwargs: Any) -> Any:
---> 81     res = self.__pydantic_validator__.validate_python(pydantic_core.ArgsKwargs(args, kwargs))
     82     if self.__return_pydantic_validator__:
     83         return self.__return_pydantic_validator__(res)

File ~\anaconda3\Lib\site-packages\blackmarble\raster.py:283, in bm_raster(gdf, product_id, date_range, bearer, variable, quality_flag_rm, check_all_tiles_exist, file_directory, file_prefix, file_skip_if_exists)
    279 filenames = _pivot_paths_by_date(pathnames).get(date)
    281 try:
    282     # Open each GeoTIFF file as a DataArray and store in a list
--> 283     da = [
    284         rioxarray.open_rasterio(
    285             h5_to_geotiff(
    286                 f,
    287                 variable=variable,
    288                 quality_flag_rm=quality_flag_rm,
    289                 output_prefix=file_prefix,
    290                 output_directory=d,
    291             ),
    292         )
    293         for f in filenames
    294     ]
    295     ds = merge_arrays(da)
    296     ds = ds.rio.clip(gdf.geometry.apply(mapping), gdf.crs, drop=True)

File ~\anaconda3\Lib\site-packages\blackmarble\raster.py:285, in <listcomp>(.0)
    279 filenames = _pivot_paths_by_date(pathnames).get(date)
    281 try:
    282     # Open each GeoTIFF file as a DataArray and store in a list
    283     da = [
    284         rioxarray.open_rasterio(
--> 285             h5_to_geotiff(
    286                 f,
    287                 variable=variable,
    288                 quality_flag_rm=quality_flag_rm,
    289                 output_prefix=file_prefix,
    290                 output_directory=d,
    291             ),
    292         )
    293         for f in filenames
    294     ]
    295     ds = merge_arrays(da)
    296     ds = ds.rio.clip(gdf.geometry.apply(mapping), gdf.crs, drop=True)

File ~\anaconda3\Lib\site-packages\blackmarble\raster.py:129, in h5_to_geotiff(f, variable, quality_flag_rm, output_directory, output_prefix)
    121 height, width = data.shape
    122 transform = from_origin(
    123     left,
    124     top,
    125     (right - left) / width,
    126     (top - bottom) / height,
    127 )
--> 129 with rasterio.open(
    130     output_path,
    131     "w",
    132     driver="GTiff",
    133     height=height,
    134     width=width,
    135     count=1,
    136     dtype=data.dtype,
    137     crs="EPSG:4326",
    138     transform=transform,
    139 ) as dst:
    140     dst.write(data, 1)
    141     dst.update_tags(**attrs)

File ~\anaconda3\Lib\site-packages\rasterio\env.py:451, in ensure_env_with_credentials.<locals>.wrapper(*args, **kwds)
    448     session = DummySession()
    450 with env_ctor(session=session):
--> 451     return f(*args, **kwds)

File ~\anaconda3\Lib\site-packages\rasterio\__init__.py:327, in open(fp, mode, driver, width, height, count, crs, transform, dtype, nodata, sharing, **kwargs)
    325 writer = get_writer_for_driver(driver)
    326 if writer is not None:
--> 327     dataset = writer(
    328         path,
    329         mode,
    330         driver=driver,
    331         width=width,
    332         height=height,
    333         count=count,
    334         crs=crs,
    335         transform=transform,
    336         dtype=dtype,
    337         nodata=nodata,
    338         sharing=sharing,
    339         **kwargs
    340     )
    341 else:
    342     raise DriverCapabilityError(
    343         "Writer does not exist for driver: %s" % str(driver)
    344     )

File rasterio\\_io.pyx:1563, in rasterio._io.DatasetWriterBase.__init__()

File rasterio\\_io.pyx:1592, in rasterio._io.DatasetWriterBase._set_crs()

File rasterio\\crs.pyx:777, in rasterio.crs.CRS.from_user_input()

CRSError: The WKT could not be parsed. OGR Error code 6

Environment

  • Operating System: Windows 11
  • Additional Environment Details: Jupyter Lab, Anaconda3

I would appreciate any advice.

OSError: Unable to synchronously open file (file signature not found)

Bug Report

Description

When the bm_extract() or bm_raster() methods are used, they cannot generate the data.

Reproducibility

  • The bug is reproducible.

Steps to Reproduce

Calling the methods in a Jupyter notebook produces this error. I tried both on my computer and on Google Colab. It looks like an OS error related to h5py.


OSError                                   Traceback (most recent call last)
Cell In[16], line 2
      1 # f.close()
----> 2 ntl_r = bm_raster(
      3     continental_us,
      4     product_id="VNP46A2",
      5     date_range="2023-01-01",
      6     bearer=bearer,
      7     variable="Gap_Filled_DNB_BRDF-Corrected_NTL",
      8 )

File /opt/anaconda3/lib/python3.11/site-packages/pydantic/validate_call_decorator.py:60, in validate_call.<locals>.validate.<locals>.wrapper_function(*args, **kwargs)
     58 @functools.wraps(function)
     59 def wrapper_function(*args, **kwargs):
---> 60     return validate_call_wrapper(*args, **kwargs)

File /opt/anaconda3/lib/python3.11/site-packages/pydantic/_internal/_validate_call.py:96, in ValidateCallWrapper.__call__(self, *args, **kwargs)
     95 def __call__(self, *args: Any, **kwargs: Any) -> Any:
---> 96     res = self.__pydantic_validator__.validate_python(pydantic_core.ArgsKwargs(args, kwargs))
     97     if self.__return_pydantic_validator__:
     98         return self.__return_pydantic_validator__(res)

File /opt/anaconda3/lib/python3.11/site-packages/blackmarble/raster.py:355, in bm_raster(gdf, product_id, date_range, bearer, variable, drop_values_by_quality_flag, check_all_tiles_exist, output_directory, output_skip_if_exists)
    351 filenames = _pivot_paths_by_date(pathnames).get(date)
    353 try:
    354     # Open each GeoTIFF file as a DataArray and store in a list
--> 355     da = [
    356         rioxarray.open_rasterio(
    357             h5_to_geotiff(
    358                 f,
    359                 variable=variable,
    360                 drop_values_by_quality_flag=drop_values_by_quality_flag,
    361                 output_directory=d,
    362             ),
    363         )
    364         for f in filenames
    365     ]
    366     ds = merge_arrays(da)
    367     clipped_dataset = ds.rio.clip(
    368         gdf.geometry.apply(mapping), gdf.crs, drop=True
    369     )

File /opt/anaconda3/lib/python3.11/site-packages/blackmarble/raster.py:357, in <listcomp>(.0)
    351 filenames = _pivot_paths_by_date(pathnames).get(date)
    353 try:
    354     # Open each GeoTIFF file as a DataArray and store in a list
    355     da = [
    356         rioxarray.open_rasterio(
--> 357             h5_to_geotiff(
    358                 f,
    359                 variable=variable,
    360                 drop_values_by_quality_flag=drop_values_by_quality_flag,
    361                 output_directory=d,
    362             ),
    363         )
    364         for f in filenames
    365     ]
    366     ds = merge_arrays(da)
    367     clipped_dataset = ds.rio.clip(
    368         gdf.geometry.apply(mapping), gdf.crs, drop=True
    369     )

File /opt/anaconda3/lib/python3.11/site-packages/blackmarble/raster.py:177, in h5_to_geotiff(f, variable, drop_values_by_quality_flag, output_directory)
    174 if variable is None:
    175     variable = VARIABLE_DEFAULT.get(product_id)
--> 177 with h5py.File(f, "r") as h5_data:
    178     attrs = h5_data.attrs
    179     data_field_key = "HDFEOS/GRIDS/VNP_Grid_DNB/Data Fields"

File /opt/anaconda3/lib/python3.11/site-packages/h5py/_hl/files.py:567, in File.__init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, fs_strategy, fs_persist, fs_threshold, fs_page_size, page_buf_size, min_meta_keep, min_raw_keep, locking, alignment_threshold, alignment_interval, meta_block_size, **kwds)
    558     fapl = make_fapl(driver, libver, rdcc_nslots, rdcc_nbytes, rdcc_w0,
    559                      locking, page_buf_size, min_meta_keep, min_raw_keep,
    560                      alignment_threshold=alignment_threshold,
    561                      alignment_interval=alignment_interval,
    562                      meta_block_size=meta_block_size,
    563                      **kwds)
    564     fcpl = make_fcpl(track_order=track_order, fs_strategy=fs_strategy,
    565                      fs_persist=fs_persist, fs_threshold=fs_threshold,
    566                      fs_page_size=fs_page_size)
--> 567     fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
    569 if isinstance(libver, tuple):
    570     self._libver = libver

File /opt/anaconda3/lib/python3.11/site-packages/h5py/_hl/files.py:231, in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
    229     if swmr and swmr_support:
    230         flags |= h5f.ACC_SWMR_READ
--> 231     fid = h5f.open(name, flags, fapl=fapl)
    232 elif mode == 'r+':
    233     fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)

File h5py/_objects.pyx:54, in h5py._objects.with_phil.wrapper()

File h5py/_objects.pyx:55, in h5py._objects.with_phil.wrapper()

File h5py/h5f.pyx:106, in h5py.h5f.open()

OSError: Unable to open file (file signature not found)

Environment

  • Operating System: macOS, GoogleColab
  • Browser: Google Chrome
  • Application Version/Commit: 2024.8.1

Additional Context

Possible Fix

Initially, I moved the project folder to the desktop in case of read/write permission issues. It worked on the first run, then the error returned.

Geopandas package installation

In the example notebook, when installing geopandas, pip automatically installed 0.9.0, but it seems at least version 0.10.2 is required for .explore() to work. Consider noting the minimum geopandas version requirement as a comment or in the !pip install command.

Issue with `quality_flag_rm`

It seems there's an issue with removing cells based on quality.

For example, this:

ntl_r = bm_raster(
    gdf,
    product_id="VNP46A2",
    date_range="2023-01-01",
    bearer=bearer,
    variable="DNB_BRDF-Corrected_NTL",
    quality_flag_rm=[2, 255],
)

fig, ax = plt.subplots()

# Plot
ntl_r["DNB_BRDF-Corrected_NTL"].sel(time="2023-01-01").plot(
    ax=ax, cmap=cc.cm.bmy, robust=True
)
cx.add_basemap(ax, crs=gdf.crs.to_string(), source=cx.providers.CartoDB.Positron)

plt.axis("off")
plt.tight_layout()

displays this:

Screen Shot 2023-12-07 at 11 11 56 AM

But when removing "0" as well, the figure looks the same, even though most of these pixels are good quality (0), so they should also be removed / set to NA.

ntl_r = bm_raster(
    gdf,
    product_id="VNP46A2",
    date_range="2023-01-01",
    bearer=bearer,
    variable="DNB_BRDF-Corrected_NTL",
    quality_flag_rm=[0, 2, 255],
) 

Same plotting code as above, and figure looks the same.

@g4brielvs I'll take a look here, but not sure if you have any quick thoughts?
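A quick way to check whether the extra flag actually changed anything is to compare the share of masked pixels between the two rasters. A small sketch (masked_fraction is a helper written for this illustration, not part of the package):

```python
import numpy as np


def masked_fraction(values) -> float:
    """Fraction of pixels set to NaN. If quality filtering works, the raster
    built with quality_flag_rm=[0, 2, 255] should have a strictly higher
    fraction than the one built with [2, 255]."""
    array = np.asarray(values, dtype=float)
    return float(np.isnan(array).mean())
```

Comparing the two fractions numerically avoids relying on a visual check, since `robust=True` plotting can hide small differences.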

Broken NASA link

The https://ladsweb.modaps.eosdis.nasa.gov/archive/allData/5000/VNP46A3 link in:

Before downloading and extracting Black Marble data, define the [NASA LAADS archive](https://ladsweb.modaps.eosdis.nasa.gov/archive/allData/5000/VNP46A3/) `bearer` token, and define a region of interest (i.e., `gdf` as a [`geopandas.GeoDataFrame`](https://geopandas.org/en/stable/docs/reference/api/geopandas.GeoDataFrame.html)).

Gives me this page with a 500 error. Is that a temporary error? Or is it something I am doing wrong?

image

For a future FAQ section -- note on Jupyter Lab versioning.

While everything in the notebook was working well, I would receive this recurring error in place of the progress bars:

image

To fix it, I needed to update my Jupyter Lab installation (after trying various other fixes). Maybe this can be included in a future FAQ or troubleshooting guide.

`_retrieve_manifest` is slow

The current implementation of _retrieve_manifest is really slow, depending on the number of files to retrieve. Also, an error is raised if any manifest doesn't exist, so we could make it a bit more user-friendly.

OSError: Unable to synchronously open file (file signature not found)

Describe the bug
OSError: Unable to synchronously open file (file signature not found)

To Reproduce
Steps to reproduce the behavior:
Just running a simple bm_raster call from the examples gives this error:

GETTING MANIFEST...: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:03<00:00, 1.10it/s]
QUEUEING TASKS | Downloading...: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████| 360/360 [00:00<00:00, 2926.85it/s]
PROCESSING TASKS | Downloading...: 75%|█████████████████████████████████████████[2024-02-10 22:28:56 - backoff:105 - INFO] Backing off _download_file(...) for 0.0s (httpx.ConnectTimeout: timed out)
PROCESSING TASKS | Downloading...: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████| 360/360 [00:30<00:00, 11.77it/s]
COLLECTING RESULTS | Downloading...: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 360/360 [00:00<?, ?it/s]
COLLATING RESULTS | Processing...: 0%| | 0/90 [00:00<?, ?it/s]
Traceback (most recent call last):
  File "C:\pythonTest\blackMarble.py", line 37, in <module>
    daily = bm_raster(
  File "C:\Users\Python310\lib\site-packages\pydantic\validate_call_decorator.py", line 58, in wrapper_function
    return validate_call_wrapper(*args, **kwargs)
  File "C:\Users\Python310\lib\site-packages\pydantic\_internal\_validate_call.py", line 81, in __call__
    res = self.__pydantic_validator__.validate_python(pydantic_core.ArgsKwargs(args, kwargs))
  File "C:\Users\Python310\lib\site-packages\blackmarble\raster.py", line 283, in bm_raster
    da = [
  File "C:\Users\Python310\lib\site-packages\blackmarble\raster.py", line 285, in <listcomp>
    h5_to_geotiff(
  File "C:\Users\Python310\lib\site-packages\blackmarble\raster.py", line 66, in h5_to_geotiff
    with h5py.File(f, "r") as h5_data:
  File "C:\Users\Python310\lib\site-packages\h5py\_hl\files.py", line 562, in __init__
    fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
  File "C:\Users\Python310\lib\site-packages\h5py\_hl\files.py", line 235, in make_fid
    fid = h5f.open(name, flags, fapl=fapl)
  File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
  File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
  File "h5py\h5f.pyx", line 102, in h5py.h5f.open
OSError: Unable to synchronously open file (file signature not found)

Any help is appreciated


Issues when downloading many files

I was downloading data for multiple days and the retrieval failed a few times. The function still worked, as it's designed to retry, but I'm not sure if something is going on where it tries to download too much at one time. Would limiting to, say, 2-3 concurrent downloads help? (I never seem to get these red bars when downloading just a couple of tiles at once.)

Screen Shot 2023-12-20 at 6 43 16 PM
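The suggested cap on simultaneous downloads can be expressed with an asyncio.Semaphore. A sketch under the assumption that the downloader is async (fetch_all and fetch are illustrative names, not the package's API):

```python
import asyncio


async def fetch_all(urls, fetch, max_concurrency: int = 3):
    """Run downloads with at most max_concurrency requests in flight at once."""
    semaphore = asyncio.Semaphore(max_concurrency)

    async def fetch_one(url):
        # Each task waits for a free slot before starting its download
        async with semaphore:
            return await fetch(url)

    return await asyncio.gather(*(fetch_one(url) for url in urls))
```

All tasks are still scheduled up front, but the semaphore ensures only a few hit the server concurrently, which should reduce the timeouts seen when many tiles download at once.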

Add Quality flag for cloud mask

Bit 0: Day/Night
  0: Night
  1: Day
Bits 1-3: Land/Water Background
  0: Land & Desert
  1: Land no Desert
  2: Inland Water
  3: Sea Water
  5: Coastal
Bits 4-5: Cloud Mask Quality
  0: Poor
  1: Low
  2: Medium
  3: High
Bits 6-7: Cloud Detection Results & Confidence Indicator
  0: Confident Clear
  1: Probably Clear
  2: Probably Cloudy
  3: Confident Cloudy
Bit 8: Shadow Detected
  0: No
  1: Yes
Bit 9: Cirrus Detection (IR) (BTM15 - BTM16)
  0: No cloud
  1: Cloud
Bit 10: Snow/Ice Surface
  0: No Snow/Ice
  1: Snow/Ice
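For reference, the bit fields listed above can be unpacked from a quality-flag integer with plain bit shifts. A sketch (the function name and dictionary keys are mine, not package API):

```python
def decode_cloud_mask_qf(qf: int) -> dict:
    """Unpack the cloud-mask quality-flag bit fields from a single integer."""
    return {
        "day_night": qf & 0b1,                       # bit 0
        "land_water_background": (qf >> 1) & 0b111,  # bits 1-3
        "cloud_mask_quality": (qf >> 4) & 0b11,      # bits 4-5
        "cloud_detection": (qf >> 6) & 0b11,         # bits 6-7
        "shadow_detected": (qf >> 8) & 0b1,          # bit 8
        "cirrus_detection": (qf >> 9) & 0b1,         # bit 9
        "snow_ice_surface": (qf >> 10) & 0b1,        # bit 10
    }
```

For example, a pixel flagged as sea water, high cloud-mask quality, and confident cloudy would decode to land_water_background=3, cloud_mask_quality=3, cloud_detection=3.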
