xee's Introduction

Xee: Xarray + Google Earth Engine

An Xarray extension for Google Earth Engine.

How to use

Install with pip:

pip install --upgrade xee

Install with conda:

conda install -c conda-forge xee

Then, authenticate Earth Engine:

earthengine authenticate --quiet

Now, in your Python environment, make the following imports:

import ee
import xarray

Next, initialize the EE client with the high volume API:

ee.Initialize(opt_url='https://earthengine-highvolume.googleapis.com')

Open any Earth Engine ImageCollection by specifying the Xarray engine as 'ee':

ds = xarray.open_dataset('ee://ECMWF/ERA5_LAND/HOURLY', engine='ee')

Open all bands in a specific projection (not the Xee default):

ds = xarray.open_dataset('ee://ECMWF/ERA5_LAND/HOURLY', engine='ee',
                         crs='EPSG:4326', scale=0.25)

Open an ImageCollection (optionally with EE-side filtering or processing):

ic = ee.ImageCollection('ECMWF/ERA5_LAND/HOURLY').filterDate('1992-10-05', '1993-03-31')
ds = xarray.open_dataset(ic, engine='ee', crs='EPSG:4326', scale=0.25)

Open an ImageCollection with a specific EE projection or geometry:

ic = ee.ImageCollection('ECMWF/ERA5_LAND/HOURLY').filterDate('1992-10-05', '1993-03-31')
leg1 = ee.Geometry.Rectangle(113.33, -43.63, 153.56, -10.66)
ds = xarray.open_dataset(
    ic,
    engine='ee',
    projection=ic.first().select(0).projection(),
    geometry=leg1
)

Open multiple ImageCollections into one xarray.Dataset, all with the same projection:

ds = xarray.open_mfdataset(['ee://ECMWF/ERA5_LAND/HOURLY', 'ee://NASA/GDDP-CMIP6'],
                           engine='ee', crs='EPSG:4326', scale=0.25)

Open a single Image by wrapping it in an ImageCollection:

i = ee.ImageCollection(ee.Image("LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318"))
ds = xarray.open_dataset(i, engine='ee')

Open any Earth Engine ImageCollection to match an existing transform:

raster = rioxarray.open_rasterio(...)  # assume crs + transform are set
ds = xarray.open_dataset(
    'ee://ECMWF/ERA5_LAND/HOURLY',
    engine='ee',
    geometry=tuple(raster.rio.bounds()), # must be in EPSG:4326
    projection=ee.Projection(
        crs=str(raster.rio.crs), transform=raster.rio.transform()[:6]
    ),
)

See examples or docs for more uses and integrations.

How to run integration tests

The Xee integration tests only pass on Xee branches (no forks). Please run the integration tests locally before sending a PR. To run the tests locally, authenticate using earthengine authenticate and run the following:

USE_ADC_CREDENTIALS=1 python -m unittest xee/ext_integration_test.py

License

This is not an official Google product.

Copyright 2023 Google LLC

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at

    https://www.apache.org/licenses/LICENSE-2.0

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

xee's People

Contributors

12rambau, alxmrs, arunsathiya, boothmanrylan, dabhicusp, dependabot[bot], giswqs, kmarkert, ljstrnadiii, mahrsee1997, naschmitz, raybellwaves, schwehr, shoyer, tylere


xee's Issues

Fill out the documentation outline.

Here my documentation plans:

  • Why Xee?
  • Core features
    • open_dataset()
    • open_mfdatasets()
    • Projections & Geometry
    • Xarray slicing & indexing 101
    • Combining ee.ImageCollection and Xarray APIs.
    • Plotting
    • Lazy Evaluation & load()
  • Advanced projections
  • Performance tuning: A tale of two chunks
  • Walkthrough: calculating NDVI
  • Integration with Xarray-Beam
  • Integration with ML pipeline clients

Add support for Python 3.8

The xee package requires Python >= 3.9. Do all xee dependencies require Python >= 3.9? If not, it would be great to relax the lower bound to 3.8. That would make it easier for downstream packages to add xee as a dependency. Python 3.8 security support ends on 14 Oct 2024, and there are still users on Python 3.8.

unable to open LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318

Thanks a lot for your work here @alxmrs !

I just started playing with this and I wanted to understand the scale parameter. I came across some information at https://developers.google.com/earth-engine/guides/scale

I was wondering what a Python version (using Xee) of the JavaScript code snippet on that page would look like:

var image = ee.Image('LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318').select('B4');

var printAtScale = function(scale) {
  print('Pixel value at '+scale+' meters scale',
    image.reduceRegion({
      reducer: ee.Reducer.first(),
      geometry: image.geometry().centroid(),
      // The scale determines the pyramid level from which to pull the input
      scale: scale
  }).get('B4'));
};

printAtScale(10); // 0.10394100844860077
printAtScale(30); // 0.10394100844860077
printAtScale(50); // 0.09130698442459106
printAtScale(70); // 0.1150854229927063
printAtScale(200); // 0.102478988468647
printAtScale(500); // 0.09072770178318024
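A rough Python (earthengine-api) equivalent of the JavaScript snippet above might look like this. It is an untested sketch; it assumes ee.Initialize() has already been called, and print_at_scale is my own hypothetical name:

```python
def print_at_scale(image, scale, band='B4'):
    """Rough Python port of the JavaScript printAtScale above.

    `image` is an ee.Image with the given band present; requires the
    earthengine-api package and an initialized Earth Engine client.
    """
    import ee  # imported here so the sketch stays self-contained
    value = image.reduceRegion(
        reducer=ee.Reducer.first(),
        geometry=image.geometry().centroid(),
        # The scale determines the pyramid level from which to pull the input.
        scale=scale,
    ).get(band)
    print(f'Pixel value at {scale} meters scale', value.getInfo())

# Usage (requires Earth Engine access):
# image = ee.Image('LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318').select('B4')
# print_at_scale(image, 30)
```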

I'm having trouble opening the asset LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318.

I've tried

i = ee.Image("LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318")
ds = xarray.open_dataset(i, engine="ee")

and got

>>> ds = xarray.open_dataset(i, engine="ee")
Traceback (most recent call last):
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/ee/data.py", line 354, in _execute_cloud_call
    return call.execute(num_retries=num_retries)
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/googleapiclient/http.py", line 938, in execute
    raise HttpError(resp, content, uri=self.uri)
googleapiclient.errors.HttpError: <HttpError 400 when requesting https://earthengine-highvolume.googleapis.com/v1/projects/ee-rayjohnbell0/value:compute?prettyPrint=false&alt=json returned "ImageCollection.load: ImageCollection asset 'ee.Image({  "functionInvocationValue": {    "functionName": "Image.load",    "arguments": {      "id": {        "constantValue": "LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318"      }    }  }})' not found (does not exist or caller does not have access).". Details: "ImageCollection.load: ImageCollection asset 'ee.Image({  "functionInvocationValue": {    "functionName": "Image.load",    "arguments": {      "id": {        "constantValue": "LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318"      }    }  }})' not found (does not exist or caller does not have access).">

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/xarray/backends/api.py", line 573, in open_dataset
    backend_ds = backend.open_dataset(
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/xee/ext.py", line 928, in open_dataset
    store = EarthEngineStore.open(
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/xee/ext.py", line 149, in open
    return cls(
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/xee/ext.py", line 186, in __init__
    self.n_images = self.get_info['size']
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/functools.py", line 981, in __get__
    val = self.func(instance)
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/xee/ext.py", line 290, in get_info
    info = ee.List([rpc for _, rpc in rpcs]).getInfo()
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/ee/computedobject.py", line 105, in getInfo
    return data.computeValue(self)
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/ee/data.py", line 1021, in computeValue
    return _execute_cloud_call(
  File "/Users/ray/miniforge3/envs/test_env/lib/python3.10/site-packages/ee/data.py", line 356, in _execute_cloud_call
    raise _translate_cloud_exception(e)  # pylint: disable=raise-missing-from
ee.ee_exception.EEException: ImageCollection.load: ImageCollection asset 'ee.Image({  "functionInvocationValue": {    "functionName": "Image.load",    "arguments": {      "id": {        "constantValue": "LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318"      }    }  }})' not found (does not exist or caller does not have access).

The traceback alludes to access issues, but I can access other LANDSAT/LC08/C02/T1_TOA data:

ic = ee.ImageCollection("LANDSAT/LC08/C02/T1_TOA")
ds = xarray.open_dataset(ic, engine="ee")

`FileNotFoundError: [Errno 2] No such file or directory: 'ECMWF/ERA5_LAND/HOURLY'`

When using engine='ee' rather than engine=xee.EarthEngineBackendEntrypoint, and following the main README docs, we get the following error:

Traceback (most recent call last):
  File "/Users/alxrsngrtn/test.py", line 7, in <module>
    ds = xarray.open_dataset('ECMWF/ERA5_LAND/HOURLY', engine='ee', chunks={'time': 10})
  File "/Users/alxrsngrtn/miniconda3/envs/xee/lib/python3.10/site-packages/xarray/backends/api.py", line 579, in open_dataset
    ds = _dataset_from_backend_dataset(
  File "/Users/alxrsngrtn/miniconda3/envs/xee/lib/python3.10/site-packages/xarray/backends/api.py", line 371, in _dataset_from_backend_dataset
    ds = _chunk_ds(
  File "/Users/alxrsngrtn/miniconda3/envs/xee/lib/python3.10/site-packages/xarray/backends/api.py", line 325, in _chunk_ds
    mtime = _get_mtime(filename_or_obj)
  File "/Users/alxrsngrtn/miniconda3/envs/xee/lib/python3.10/site-packages/xarray/backends/api.py", line 230, in _get_mtime
    mtime = os.path.getmtime(os.path.expanduser(filename_or_obj))
  File "/Users/alxrsngrtn/miniconda3/envs/xee/lib/python3.10/genericpath.py", line 55, in getmtime
    return os.stat(filename).st_mtime
FileNotFoundError: [Errno 2] No such file or directory: 'ECMWF/ERA5_LAND/HOURLY'
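Judging from the traceback (an assumption on my part), when chunks is set xarray treats the bare asset ID as a local file path. Prefixing the ID with ee://, as the README examples do, sidesteps the os.path.getmtime call. A tiny hypothetical helper:

```python
def as_ee_uri(asset_id: str) -> str:
    """Prefix an Earth Engine asset ID with 'ee://' so that xarray's
    chunking machinery does not mistake it for a local file path."""
    return asset_id if asset_id.startswith('ee://') else 'ee://' + asset_id

# Usage sketch (requires earthengine-api and an initialized client):
# ds = xarray.open_dataset(as_ee_uri('ECMWF/ERA5_LAND/HOURLY'),
#                          engine='ee', chunks={'time': 10})
```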

Issue with default ee_mask_value with float32 datasets

XEE's open_dataset() method uses a default integer mask value as commented in the code below

      ee_mask_value (optional): Value to mask to EE nodata values. By default,
        this is 'np.iinfo(np.int32).max' i.e. 2147483647.

For image bands that are integer or float64 this works fine, but for float32 bands the nodata values end up as the large number 2.14748365e+09, causing all sorts of problems. There are two workarounds that fix this.

Workaround 1: Specify a custom nodata value in open_dataset() such as ee_mask_value=-9999

ds = xarray.open_dataset(
    withNdvi.select('ndvi'),
    engine='ee',
    crs='EPSG:3857',
    scale=10,
    geometry=geometry,
    ee_mask_value=-9999
)

Workaround 2: Cast the input bands to float64 using toDouble()

ndviCasted = withNdvi.select('ndvi').map(lambda img: img.toDouble())
ds = xarray.open_dataset(
    ndviCasted,
    engine='ee',
    crs='EPSG:3857',
    scale=10,
    geometry=geometry,
)

It would be good to fix this so the workaround is not required. Below is a notebook demonstrating the problem and the workaround.
https://colab.research.google.com/drive/1jDcgQ3LDsHxyvTmAmnqyXAGYLUgwRwAN?usp=sharing

Error when calling `to_netcdf()`: `Invalid value for attr 'data_type': {'type': 'PixelType', 'precision': 'double'}.`

TypeError                                 Traceback (most recent call last)
[<ipython-input-6-1a770b2cec3c>](https://localhost:8080/#) in <cell line: 1>()
----> 1 birthday.to_netcdf('birthday.nc')

3 frames
[/usr/local/lib/python3.10/dist-packages/xarray/backends/api.py](https://localhost:8080/#) in check_attr(name, value, valid_types)
    192 
    193         if not isinstance(value, valid_types):
--> 194             raise TypeError(
    195                 f"Invalid value for attr {name!r}: {value!r}. For serialization to "
    196                 "netCDF files, its value must be of one of the following types: "

TypeError: Invalid value for attr 'data_type': {'type': 'PixelType', 'precision': 'double'}. For serialization to netCDF files, its value must be of one of the following types: str, Number, ndarray, number, list, tuple
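A possible user-side workaround (a hypothetical helper, not part of Xee) is to strip attrs that netCDF cannot serialize, such as the dict-valued 'data_type' attr, before writing:

```python
import numbers

def clean_attrs(attrs):
    """Keep only attr values that netCDF serialization accepts
    (str, numbers, lists, tuples); dict-valued attrs like Xee's
    'data_type' are dropped."""
    ok = (str, numbers.Number, list, tuple)
    return {k: v for k, v in attrs.items() if isinstance(v, ok)}

# Usage sketch: clean the dataset and every variable before writing.
# ds.attrs = clean_attrs(ds.attrs)
# for name in ds.variables:
#     ds[name].attrs = clean_attrs(ds[name].attrs)
# ds.to_netcdf('birthday.nc')
```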

Default chunks seem to exceed 48 MB request limit: `EEException: Total request size (56623104 bytes) must be less than or equal to 50331648 bytes.`

I wrote a small test in a colab to write a bit of data to a local file (via to_netcdf()) and got an EE error about the request limit:

---------------------------------------------------------------------------
HttpError                                 Traceback (most recent call last)
[/usr/local/lib/python3.10/dist-packages/ee/data.py](https://localhost:8080/#) in _execute_cloud_call(call, num_retries)
    353   try:
--> 354     return call.execute(num_retries=num_retries)
    355   except googleapiclient.errors.HttpError as e:

30 frames
HttpError: <HttpError 400 when requesting https://earthengine-highvolume.googleapis.com/v1/projects/earthengine-legacy/image:computePixels? returned "Total request size (56623104 bytes) must be less than or equal to 50331648 bytes.". Details: "Total request size (56623104 bytes) must be less than or equal to 50331648 bytes.">

During handling of the above exception, another exception occurred:

EEException                               Traceback (most recent call last)
[/usr/local/lib/python3.10/dist-packages/ee/data.py](https://localhost:8080/#) in _execute_cloud_call(call, num_retries)
    354     return call.execute(num_retries=num_retries)
    355   except googleapiclient.errors.HttpError as e:
--> 356     raise _translate_cloud_exception(e)  # pylint: disable=raise-missing-from
    357 
    358 

EEException: Total request size (56623104 bytes) must be less than or equal to 50331648 bytes.
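Until the defaults are tuned, one workaround may be to request smaller chunks from open_dataset (e.g. chunks={'time': 24, 'lon': 512, 'lat': 256}, hypothetical values; I am not certain exactly how Xee maps chunk sizes to request sizes). A small stdlib helper, not part of Xee, can sanity-check whether a chunk fits under the cap:

```python
from math import prod

# Approximate bytes per pixel for common Earth Engine pixel types
# (assumption: one request transfers raw pixels at these widths).
_ITEMSIZE = {'int8': 1, 'int16': 2, 'int32': 4, 'float32': 4, 'float64': 8}

def request_bytes(chunk_shape, dtype='int32'):
    """Rough size of one pixel request for a chunk, to compare against
    the 50331648-byte (48 MiB) Earth Engine request limit."""
    return prod(chunk_shape) * _ITEMSIZE[dtype]

# e.g. a hypothetical chunk of 24 time steps x 512 x 256 float32 pixels:
# request_bytes((24, 512, 256), 'float32') -> 12582912, under the limit.
```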

`SerializationWarning: saving variable None with floating point data as an integer dtype without any _FillValue to use for NaNs`

The following code from our micro benchmark produces the warning in the title:

def open_and_write() -> None:
  with tempfile.TemporaryDirectory() as tmpdir:
    ds = xarray.open_dataset(
        'NASA/GPM_L3/IMERG_V06',
        crs='EPSG:4326',
        scale=0.25,
        chunks={'time': 24, 'lon': 1440, 'lat': 720},
        engine=xee.EarthEngineBackendEntrypoint,
    )
    ds = ds.isel(time=slice(0, 24))
    ds.to_zarr(os.path.join(tmpdir, 'imerg.zarr'))

Warning:

xarray/core/dataset.py:2060: SerializationWarning: saving variable None with floating point data as an integer dtype without any _FillValue to use for NaNs

This should not happen by default and should be configurable by the end user.
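As a possible user-side workaround (an assumption, not a documented Xee feature), an explicit _FillValue can be supplied through xarray's encoding argument; fill_value_encoding below is a hypothetical helper:

```python
def fill_value_encoding(var_names, fill=float('nan')):
    """Build an encoding mapping that sets an explicit _FillValue for
    each named variable, suitable for Dataset.to_zarr(encoding=...)
    or Dataset.to_netcdf(encoding=...)."""
    return {name: {'_FillValue': fill} for name in var_names}

# Usage sketch:
# ds.to_zarr(path, encoding=fill_value_encoding(ds.data_vars))
```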

A DataArray within a Dataset has different CRS

Hello!

I'm accessing the Dataset MODIS/061/MCD64A1 (MODIS Burned Area Monthly Global 500m) and am noticing the Dataset and DataArray report different projections. From what I understand, the Dataset appears to be EPSG:4326, but the DataArray contained within it is reported as SR-ORG:6974. This seems strange as the default appears to be EPSG:4326.

The output seems to be getting closer to usable after the commit "Add 'm' as another synonym for 'meter'".

Is this intentional behaviour or am I misunderstanding or doing something silly here?

Here's a minimum example:

import xarray as xr
import rioxarray
import ee


ee.Initialize(project="FILL ME")


ic = ee.ImageCollection('MODIS/061/MCD64A1') \
    .filterDate('2019-09-01', '2020-03-01') \
    .select('BurnDate')

ds = xr.open_dataset(ic, engine='ee', scale=0.25)

print(f"Dataset:\n{ds}")

print(f"Dataset CRS: {ds.crs}")
print(f"DataArray CRS: {ds['BurnDate'].crs}")

Standard out:

Dataset:
<xarray.Dataset>
Dimensions:   (time: 6, lon: 1440, lat: 720)
Coordinates:
  * time      (time) datetime64[ns] 2019-09-01 2019-10-01 ... 2020-02-01
  * lon       (lon) float32 -179.9 -179.6 -179.4 -179.1 ... 179.4 179.6 179.9
  * lat       (lat) float32 89.88 89.62 89.38 89.12 ... -89.38 -89.62 -89.88
Data variables:
    BurnDate  (time, lon, lat) int32 ...
Attributes:
    crs:      EPSG:4326
Dataset CRS: EPSG:4326
DataArray CRS: SR-ORG:6974

When written using ds.to_netcdf(..), and examining it with gdalinfo:

Driver: netCDF/Network Common Data Format
Files: via_xarray.nc
Size is 720, 1440
Metadata:
  BurnDate#bounds={-180,-90,180,90}
  BurnDate#crs=SR-ORG:6974
  BurnDate#crs_transform={463.3127165279165,0,-20015109.354,0,-463.3127165279167,7783653.637664001}
  BurnDate#data_type={'type': 'PixelType', 'precision': 'int', 'min': -32768, 'max': 32767}
  BurnDate#dimensions={86400,31200}
  BurnDate#id=BurnDate
  BurnDate#scale_factor=0.25
  lat#_FillValue=nan
  lon#_FillValue=nan
  NC_GLOBAL#crs=EPSG:4326
  NETCDF_DIM_EXTRA={time}
  NETCDF_DIM_time_DEF={6,10}
  NETCDF_DIM_time_VALUES={0,30,61,91,122,153}
  time#calendar=proleptic_gregorian
  time#units=days since 2019-09-01 00:00:00
Corner Coordinates:
Upper Left  (    0.0,    0.0)
Lower Left  (    0.0, 1440.0)
Upper Right (  720.0,    0.0)
Lower Right (  720.0, 1440.0)
Center      (  360.0,  720.0)
Band 1 Block=720x1 Type=Int32, ColorInterp=Undefined
  NoData Value=-2147483647
  Offset: 0,   Scale:0.25
  Metadata:
    bounds={-180,-90,180,90}
    crs=SR-ORG:6974
    crs_transform={463.3127165279165,0,-20015109.354,0,-463.3127165279167,7783653.637664001}
    data_type={'type': 'PixelType', 'precision': 'int', 'min': -32768, 'max': 32767}
    dimensions={86400,31200}
    id=BurnDate
    NETCDF_DIM_time=0
    NETCDF_VARNAME=BurnDate
    scale_factor=0.25

** TRUNCATED, REMAINING BANDS ARE SIMILAR**

Displaying it in QGIS reports "Layer has no coordinate reference set", so forcing EPSG:4326:
[screenshot]

Versions:

Any hints for me?

export to local directly

This tool is pretty awesome, but I have no GCP project.
Just wondering whether it is possible to export data directly to a local file.

Thanks.
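For what it's worth, because Xee fetches pixels through the Earth Engine API rather than through Export tasks, writing locally should work without a GCP bucket. A hedged sketch (export_local is a hypothetical helper; it assumes the dataset has a time dimension):

```python
def export_local(ds, path, time_slice=slice(0, 24)):
    """Write a slice of an Xee-backed xarray.Dataset to a local netCDF
    file. Pixels are fetched via the Earth Engine API and written by
    xarray; no Cloud export task or bucket is involved."""
    subset = ds.isel(time=time_slice)
    subset.load()           # pull the pixels from Earth Engine into memory
    subset.to_netcdf(path)  # then write locally
    return path

# Usage sketch:
# ds = xarray.open_dataset('ee://ECMWF/ERA5_LAND/HOURLY', engine='ee')
# export_local(ds, 'era5_land_sample.nc')
```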

Documentation on how to initialize/authenticate on distributed cluster

When I try to run xarray.open_dataset with chunks set and an active distributed dask cluster, it correctly distributes the chunks to workers, but then it fails because each individual worker needs to call ee.Initialize().

Is this a supported use case, and is there documentation or examples on how to set up the authentication/initialization across the whole cluster?
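In the meantime, one approach that may work is to run ee.Initialize on each worker explicitly. A hypothetical sketch using dask.distributed's Client.run (initialize_workers is not an official API; workers that join later would not be covered, for which a WorkerPlugin would be needed):

```python
def initialize_workers(client, project=None):
    """Call ee.Initialize on every worker in a dask.distributed cluster.

    `client` is a dask.distributed.Client. Assumes each worker already
    has Earth Engine credentials available (e.g. application default
    credentials).
    """
    def _init():
        import ee
        ee.Initialize(
            project=project,
            opt_url='https://earthengine-highvolume.googleapis.com',
        )
    client.run(_init)  # executes _init once on each current worker

# Usage sketch:
# from dask.distributed import Client
# client = Client()
# initialize_workers(client, project='my-ee-project')
```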

xarray.open_mfdataset() example in Xee README fails

Is dask required for the open_mfdataset() example to work?

To reproduce

mamba create -n xee-test -c conda-forge python=3.11 xee 

Open a python shell and run the open_mfdataset example:

(xee-test) tylere@Tylers-MBP xee-test % python
Python 3.11.6 | packaged by conda-forge | (main, Oct  3 2023, 10:37:07) [Clang 15.0.7 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import ee
>>> import xarray
>>> ee.Initialize(opt_url='https://earthengine-highvolume.googleapis.com')
>>> ds = xarray.open_mfdataset(['ee://ECMWF/ERA5_LAND/HOURLY', 'ee://NASA/GDDP-CMIP6'],
...                            engine='ee', crs='EPSG:4326', scale=0.25)

Traceback

/Users/tylere/mambaforge/envs/xee-test/lib/python3.11/site-packages/xee/ext.py:565: UserWarning: Converting non-nanosecond precision datetime values to nanosecond precision. This behavior can eventually be relaxed in xarray, as it is an artifact from pandas which is now beginning to support non-nanosecond precision values. This warning is caused by passing non-nanosecond np.datetime64 or np.timedelta64 values to the DataArray or Variable constructor; it can be silenced by converting the values to nanosecond precision ahead of time.
  xarray.Variable(self.primary_dim_name, primary_coord),
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/tylere/mambaforge/envs/xee-test/lib/python3.11/site-packages/xarray/backends/api.py", line 1027, in open_mfdataset
    datasets = [open_(p, **open_kwargs) for p in paths]
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tylere/mambaforge/envs/xee-test/lib/python3.11/site-packages/xarray/backends/api.py", line 1027, in <listcomp>
    datasets = [open_(p, **open_kwargs) for p in paths]
                ^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tylere/mambaforge/envs/xee-test/lib/python3.11/site-packages/xarray/backends/api.py", line 579, in open_dataset
    ds = _dataset_from_backend_dataset(
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tylere/mambaforge/envs/xee-test/lib/python3.11/site-packages/xarray/backends/api.py", line 371, in _dataset_from_backend_dataset
    ds = _chunk_ds(
         ^^^^^^^^^^
  File "/Users/tylere/mambaforge/envs/xee-test/lib/python3.11/site-packages/xarray/backends/api.py", line 319, in _chunk_ds
    chunkmanager = guess_chunkmanager(chunked_array_type)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/Users/tylere/mambaforge/envs/xee-test/lib/python3.11/site-packages/xarray/core/parallelcompat.py", line 93, in guess_chunkmanager
    raise ValueError(
ValueError: unrecognized chunk manager dask - must be one of: []
>>>

Environment:

Pip package versions:

earthengine-api               0.1.377
xarray                        2023.10.1
xee                           0.0.2

ee to_xarray() method

I know this is outside the scope of this project, and it wasn't obvious how to make a feature request to https://github.com/google/earthengine-api.

I think a good long-term goal would be to add a .to_xarray() method to ee objects, e.g. ee.Image('LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318').select('B4').to_xarray(), which I would imagine uses xee under the hood.
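Until such a method exists upstream, a small wrapper can approximate it. to_xarray below is a hypothetical helper, not part of the earthengine-api or Xee:

```python
def to_xarray(ee_object, **kwargs):
    """Open an ee.Image or ee.ImageCollection as an xarray.Dataset via
    Xee. Requires earthengine-api, xee, and an initialized EE client."""
    import ee
    import xarray
    if isinstance(ee_object, ee.Image):
        # Xee opens collections, so wrap a single image first
        # (matching the single-image example in the README).
        ee_object = ee.ImageCollection(ee_object)
    return xarray.open_dataset(ee_object, engine='ee', **kwargs)

# Usage sketch:
# ds = to_xarray(ee.Image('LANDSAT/LC08/C02/T1_TOA/LC08_044034_20140318'))
```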

XEE does not return expected coordinate ranges/coordinates are offset

Thanks to #57 the returned coordinates are now correct. However, the coordinate ranges are not what would be expected. Take these two examples.
[screenshot]
Which returns the following coordinates:
[screenshot]

And

[screenshot] which returns [screenshot]

(I think we can ignore the 0.07 difference in lat here, probably an EE rounding issue?)

First issues here:

  1. In the first example, lat only goes back to 88.5 instead of 89.5 (which would be expected for a global geometry).
  2. In the second example, even though the geometry is the same in lat/lon, the shape is (11, 10).

I believe this is due to https://github.com/google/Xee/blob/main/xee/ext.py#L239 which changes y_max. Removing the + self.scale_y produces the expected output for the first example, i.e. lat = [89.5, ..., -89.5], and returns an (11, 11) shape for the second example with the following coordinates:
[screenshot]
However, I am not sure exactly why this was added in the first place and don't know if this would fail in other cases.

Second issue is about the 0.5 offset with respect to the specified geometry:

  1. In the first example with a bbox of (-180.0, -90.0, 180.0, 90.0), the array starts at -179.5 and 89.5.
  2. In the second example with a bbox of (9.0, 10.0, 20.0, 20.070308685302734), the array starts at 9.5 and 19.5

I am wondering what the x_start and y_start parameters in https://github.com/google/Xee/blob/main/xee/ext.py#L422 should do. In my cases they are always zero. I played around with self.scale_x * 0.5, etc. but the issue is that the direction of the offset is different depending on whether the bbox is at the edge (e.g. -180) or not (e.g. 10). At the edge the offset is positive (-180 + 0.5 = -179.5) while in other cases it is negative (10 - 0.5 = 9.5).

Note that I can "fix" the second case myself by specifying the following geometry: ee.Geometry.Rectangle(10.5, 10, 20, 20.5). But this doesn't work for the global case.

Why is the offset there in the first place? Is there a way to specify in the grid that you'd actually want the values at the given grid points rather than in the middle?

Alternatively, one could write some logic that computes whether the offset should be negative or positive.

PS: In the end, I would love to get an array that matches, e.g., the WB2 Zarr files starting at -180 and 90 (or -90).

Issue installing Xee

Hi all,

First of all, this looks great.

My issue is probably very basic. I'm trying to install Xee:

I run pip install --upgrade xee (I also tried pip install xee)

and I get: ERROR: Could not find a version that satisfies the requirement xee (from versions: none)
ERROR: No matching distribution found for xee

Any help would be appreciated.

`EEException: Invalid value at 'file_format'`

Summary

Unable to fetch pixels from an image asset using the format NUMPY_NDARRAY with Earth Engine.

Description

When trying to fetch pixels from an image asset using the NUMPY_NDARRAY format with Earth Engine through xarray.open_dataset, an HttpError occurs. The error points to an invalid value in the 'file_format'.

The key error message returned is:

Invalid value at 'file_format' (type.googleapis.com/google.earthengine.v1alpha.ImageFileFormat), "NUMPY_NDARRAY"

Steps to Reproduce

  1. Initialize an ee.Geometry.Rectangle object for a specified AOI.
aoi = ee.Geometry.Rectangle(77, 12, 79, 15)
  2. Attempt to open the dataset with the 'ee' engine using xarray.open_dataset.
ds = xarray.open_dataset(
    maskedCol,
    engine='ee',
    crs='EPSG:32643',
    scale=250,
    geometry=aoi
)

Expected Behavior

Pixels from the specified image asset should be fetched in the NUMPY_NDARRAY format without any errors.

Actual Behavior

An HttpError is raised indicating an invalid value for 'file_format' related to NUMPY_NDARRAY.

Error Traceback

HttpError: <HttpError 400 when requesting https://earthengine-highvolume.googleapis.com/v1alpha/projects/earthengine-legacy/image:computePixels? returned "Invalid value at 'file_format' (type.googleapis.com/google.earthengine.v1alpha.ImageFileFormat), "NUMPY_NDARRAY"".
...
EEException: Invalid value at 'file_format' (type.googleapis.com/google.earthengine.v1alpha.ImageFileFormat), "NUMPY_NDARRAY"

Environment

  • Python Version: 3.9

  • Library Version:

    • earthengine-api==0.1.374
    • xarray==2023.6.0
    • xee==0.0.1
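The traceback above goes through the v1alpha endpoint, which hints at a version mismatch between xee and earthengine-api. A quick way to confirm what's actually installed (a sketch; `report` is just a helper name, and the strings are PyPI distribution names):

```python
from importlib.metadata import PackageNotFoundError, version

def report(pkgs):
    """Return the installed version of each distribution, or None if absent."""
    out = {}
    for pkg in pkgs:
        try:
            out[pkg] = version(pkg)
        except PackageNotFoundError:
            out[pkg] = None
    return out

print(report(['earthengine-api', 'xarray', 'xee']))
```

Upgrading both packages (`pip install -U earthengine-api xee`) is worth trying before digging further.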

`UserWarning: Converting non-nanosecond precision datetime values to nanosecond precision`

On following the examples in the README, we get the following warning:

/Users/alxrsngrtn/git/xee/xee/ext.py:339: UserWarning: Converting non-nanosecond precision datetime values to nanosecond precision. This behavior can eventually be relaxed in xarray, as it is an artifact from pandas which is now beginning to support non-nanosecond precision values. This warning is caused by passing non-nanosecond np.datetime64 or np.timedelta64 values to the DataArray or Variable constructor; it can be silenced by converting the values to nanosecond precision ahead of time.

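As the warning itself says, converting the timestamps to nanosecond precision ahead of time silences it. A minimal illustration with plain NumPy (no EE involved):

```python
import numpy as np

# Millisecond-precision timestamps, like those built from system:time_start:
t = np.array(['2014-06-09T14:08:03.992'], dtype='datetime64[ms]')

# Cast to nanosecond precision before handing the values to xarray:
t_ns = t.astype('datetime64[ns]')
print(t_ns.dtype)  # datetime64[ns]
```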

Opening a dataset with a projected CRS does not convert the dataset's coordinates

This seems to lead to errors later on when attempting to write results back to a geotiff using rioxarray.

Full notebook to reproduce: https://colab.research.google.com/drive/149vNtVZ0lGJn5PXxfn7oaTS1VoJX0Qec?usp=sharing

Opening a dataset without specifying a crs:

unprojected_ds = xr.open_dataset(
    col,
    engine="ee",
    scale=0.0025,  # degrees
    geometry=geometry,
)

And then writing to a geotiff using:

import rioxarray  # noqa: F401 -- registers the .rio accessor

def to_raster(ds, fname, crs="epsg:32610", x_dim="X", y_dim="Y"):
    ds = ds.transpose(y_dim, x_dim)
    ds = ds.rio.set_spatial_dims(x_dim=x_dim, y_dim=y_dim)
    ds = ds.rio.write_crs(crs)
    ds = ds.rio.reproject(crs)
    ds.rio.to_raster(fname)

to_raster(
    unprojected_ds.mean(dim="time"),
    "unprojected_dataset.tif",
    crs="epsg:4326",
    x_dim="lon",
    y_dim="lat",
)

And then uploading that geotiff to earth engine produces an image that has the right dimensions and is located properly:
[screenshot: xarray_unprojected_on_map]

Opening a dataset with a specific crs:

ds = xr.open_dataset(
    col,
    engine="ee",
    scale=10,
    crs="epsg:32610",
    geometry=geometry,
)
to_raster(ds.mean(dim="time"), "dataset.tif")

Writing to geotiff and then uploading to earth engine produces an image that is located in the wrong location (over the equator in the pacific ocean, should be over san francisco), has the wrong dimensions (225x181 vs 203x206), and the wrong scale (0.000103m vs 10m):
[screenshot: xarray_original]

Transforming the coordinates of a dataset opened with a crs:

import numpy as np
import pyproj

def transform_coords(dataset, crs="epsg:32610"):
    x = dataset.coords["X"].values
    y = dataset.coords["Y"].values

    _x = np.linspace(np.min(x), np.max(x), y.shape[0])
    _y = np.linspace(np.min(y), np.max(y), x.shape[0])

    transformer = pyproj.Transformer.from_crs("epsg:4326", crs, always_xy=True)
    x_prime = transformer.transform(x, _y)[0]
    y_prime = transformer.transform(_x, y)[1]

    return dataset.assign_coords({"X": x_prime, "Y": y_prime})

transformed_ds = transform_coords(ds)
to_raster(transformed_ds.mean(dim="time"), "transformed_dataset.tif")

And then uploading to earth engine, produces an image that is properly located and has the right dimensions (the scale is slightly off, but I think that has to do with the hacky method I'm using to transform the coordinates here):
[screenshot: xarray_transformed_on_map]

Looks like fun!

This project is along similar lines to what I am trying to do in Intake, and I think we can collaborate. Specifically: Intake should let you list the available datasets and fetch their metadata for searching, and then pass them on to your code to do the actual opening once the user has found what they want.

Do you have any interest in this kind of thing?

handling discontinuous temporal datasets

This may be more of a constraint of gee than xee. I find it interesting that:

ic = ee.ImageCollection("LANDSAT/LC08/C02/T1_TOA")
ds = xarray.open_dataset(ic, engine="ee")

provides data for 2014-06-09T14:08:03.992 to 2015-03-03T14:26:15.166:

>>> ds["time"]
<xarray.DataArray 'time' (time: 1715353)>
array(['2014-06-09T14:08:03.992000000', '2014-06-25T14:08:05.834000000',
       '2014-07-11T14:08:14.718000000', ..., '2015-03-03T14:25:27.235000000',
       '2015-03-03T14:25:51.203000000', '2015-03-03T14:26:15.166000000'],
      dtype='datetime64[ns]')
Coordinates:
  * time     (time) datetime64[ns] 2014-06-09T14:08:03.992000 ... 2015-03-03T...

Other data exists:

ic = ee.ImageCollection("LANDSAT/LC08/C02/T1_TOA").filterDate("2023-01-01", "2023-01-31")
ds = xarray.open_dataset(ic, engine="ee")

and I'm guessing the prior data comes from the window of continuously available data in time (~2014 to ~2015)?

This may go hand-in-hand with #47 with an internal (or intake) data catalog tool.
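One way to see where a collection's time coordinate is discontinuous, sketched with synthetic timestamps (for a real dataset you would pass `ds['time'].values`):

```python
import numpy as np

# Synthetic acquisition times with a large hole in the middle:
times = np.array(['2014-06-09', '2014-06-25', '2015-03-03', '2023-01-05'],
                 dtype='datetime64[D]')

# Flag any step larger than a nominal revisit cycle as a discontinuity:
gaps = np.diff(times)
threshold = np.timedelta64(32, 'D')
print(times[:-1][gaps > threshold])  # last image before each gap
```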

Improve performance of Xee

I've noticed a performance regression in Xee since we introduced micro benchmarks.

Before:

open_dataset():avg=11.89,std=3.99,best=6.71,worst=22.73
open_and_chunk():avg=11.44,std=3.40,best=7.46,worst=20.65
open_and_write():avg=58.49,std=12.15,best=48.75,worst=84.94

Today:

open_dataset():avg=58.46,std=11.26,best=44.83,worst=76.46
open_and_chunk():avg=51.79,std=8.26,best=39.26,worst=69.36
open_and_write():avg=102.41,std=11.80,best=90.74,worst=129.80
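For context, stats in that shape can be reproduced with a small harness along these lines (a sketch; `open_dataset` here is a stand-in for the real benchmark target):

```python
import statistics
import time

def bench(fn, runs=10):
    """Time fn() repeatedly and report avg/std/best/worst in seconds."""
    samples = []
    for _ in range(runs):
        start = time.perf_counter()
        fn()
        samples.append(time.perf_counter() - start)
    return (f'{fn.__name__}():avg={statistics.mean(samples):.2f},'
            f'std={statistics.stdev(samples):.2f},'
            f'best={min(samples):.2f},worst={max(samples):.2f}')

def open_dataset():  # stand-in for xarray.open_dataset(..., engine='ee')
    time.sleep(0.01)

print(bench(open_dataset, runs=3))
```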

cache=True not working when calling xarray.open_dataset

I think there is a bug in the caching, or I am misunderstanding how it should work. I am creating a dataset as shown below; note the arg cache=True. However, every time I access ds.B8.values it takes a handful of seconds to complete. To get around this, I am creating a new dataset directly from the values with the cache_ds method below.

Additionally, note that I tried two versions of creating the uncached dataset. In the first I didn't filter by the area first, and it took much, much longer. I think this is probably expected, but I was a bit surprised, since xr.open_dataset is querying over a specific geometry.

import ee
import xarray as xr

ee.Initialize()

IC = ee.ImageCollection("COPERNICUS/S2_HARMONIZED")
GEOM = ee.Geometry.Rectangle(
    -92.38201846776445, 34.10974829658343,
    -92.38097240624865, 34.11021909957634)
SCALE = 10
EE_CRS = 'EPSG:3857'

IC = (IC.filterDate('2021-01-01', '2022-01-01')
        .filterBounds(GEOM)
        .map(lambda im: ee.Image(im)
             .normalizedDifference(['B8', 'B4'])
             .rename(['ndvi'])))


def get_ee_xrr(ic, geom):
    xrr = xr.open_dataset(
        ic,
        engine='ee',
        crs=EE_CRS,
        scale=SCALE,
        geometry=geom,
        cache=True)
    return xrr.chunk().sortby('time')


def cache_ds(ds, bands=['ndvi']):
    attrs = ds.attrs
    coords = ds.coords
    data = {}
    for b in bands:
        data[b] = xr.DataArray(
            attrs=attrs,
            coords=coords,
            data=ds[b].values)
    return xr.Dataset(data)


%time ds = get_ee_xrr(IC, GEOM)
# Wall time: 1.38 s
%time ds_cached = cache_ds(ds)
# Wall time: 863 ms
%timeit ds.ndvi.values
# 558 ms ± 57.2 ms per loop (mean ± std. dev. of 7 runs, 1 loop each)
%timeit ds_cached.ndvi.values
# 5.79 µs ± 19.8 ns per loop (mean ± std. dev. of 7 runs, 100,000 loops each)

This is a huge increase. Note that if I only needed the values once, it would be faster not to cache, and in this toy example waiting a fraction of a second doesn't make much of a difference; but once you start using larger geometries and doing cloud masking, accessing the data multiple times becomes a big problem.

Is this expected behavior? Am I misunderstanding the cache param in xr.open_dataset? Is there another way to keep the downloaded data without explicitly recreating the dataset?

Thanks,
Brookie
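For what it's worth, xarray's built-in way to pull everything into memory once is `Dataset.load()` (or `.compute()` for a copy), which replaces the lazy backend arrays with in-memory ones; after that, repeated `.values` accesses don't go back to the backend. A minimal illustration with a synthetic dataset (it's a no-op for data already in memory, but against an Xee-backed dataset it should trigger the download exactly once):

```python
import numpy as np
import xarray as xr

ds = xr.Dataset({'ndvi': ('time', np.random.rand(4))})

# After .load(), every variable is backed by a plain in-memory array:
ds = ds.load()
print(ds.ndvi.values.shape)  # (4,)
```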

XEE doesn't recognize new bands added to existing images

Discovered this serious issue with how XEE deals with image bands. A typical GEE workflow is to add new bands to existing images after computing indices or other dependent variables for classification. When one adds a new band, the 'system:id' doesn't change, and XEE uses that to fetch the list of bands for further processing. For example, consider the following snippet:

def addNDVI(image):
  ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi')
  return image.addBands(ndvi)

# Map the function over the collection
withNdvi = filtered.map(addNDVI)

This results in errors such as

EEException: Image.addBands: Cannot add band 'ndvi' because it doesn't appear in srcImg.

If one modifies the image in any way, the system:id gets overwritten and this problem doesn't arise. See example below

# Modify the function to create a 'new' image
def addNDVI(image):
  ndvi = image.normalizedDifference(['B8', 'B4']).rename('ndvi')
  return image.multiply(0.0001).addBands(ndvi).copyProperties(image, ['system:time_start'])

# Map the function over the collection
withNdvi = filtered.map(addNDVI)

My guess is that this happens because XEE gets the metadata based on 'system:id'. This behavior is problematic and can cause a lot of very hard-to-debug errors.

Here's a full notebook that reproduces this error https://colab.research.google.com/drive/1MvayipJAwiYWWyMfGwiRiPy8RmUaCqc6?usp=sharing

Test Xee example e2e on Dataflow

Let's make sure that Xee works well with Dataflow in addition to internal infrastructure. It would also be nice to list specific performance characteristics; for example, how fast can we export 20 TiBs of IMERG to Zarr on GCP?

Estimating EECU hours

This is definitely a nice-to-have, but I am wondering if there is a reliable way to estimate the number of EECU-hours. Considering we pay for these hours, it would be nice to avoid the "oops, just spent $1k+ on an experimental dataset" issue.

We currently export data to fixed-size tiles and take a sample of tiles, run an export task to cloud storage, and get summary stats of the eecu-hours with ee.data.getOperation(f"projects/earthengine-legacy/operations/{task_id}"). This allows us to roughly estimate the cost of "ingest".

I think this is hard in the general case, but maybe we could build a recipe to sample/slice in time/x/y/dims in order to build an estimate of eecu-cost? In reality, this would be a nice-to-have function on image collections themselves, but I am guessing ee.data.getPixels, export to cloud storage, or other options vary in eecu-time. Thoughts?
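The sampling approach boils down to simple extrapolation: average the EECU-hours over the sampled tiles and scale by the total tile count. A sketch with hypothetical numbers (in practice the per-tile costs would come from the ee.data.getOperation summary stats described above):

```python
def estimate_total_eecu_hours(sample_costs, total_tiles):
    """Extrapolate total EECU-hours from a sample of per-tile costs."""
    per_tile = sum(sample_costs) / len(sample_costs)
    return per_tile * total_tiles

# Hypothetical: 5 sampled tiles out of 2000 total.
sample = [0.12, 0.10, 0.15, 0.11, 0.13]
print(estimate_total_eecu_hours(sample, 2000))  # ≈244 EECU-hours
```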

Publish the package on conda-forge

I am trying to get xee published on conda-forge. However, the source distribution of the package is not available on PyPI. It would be great to include it.


Pixel grid dimensions (40818x1) must be less than or equal to 32768.

Repro:

import ee
import xarray
import numpy as np

ee.Initialize()

geom = ee.Geometry.Rectangle([-61, -7, -50, -18])
county = ee.Feature(geom)
image = ee.Image('projects/mapbiomas-workspace/TRANSVERSAIS/ZONACOSTEIRA6/mosaic_1985').clip(county)
coll = ee.ImageCollection(image)
ds = xarray.open_dataset(
    coll.select(['NDVI']), crs='EPSG:3857', scale=30, geometry=geom,
    chunks={'time': 1, 'X': 512, 'Y': 512}, engine='ee')
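The numbers line up with simple arithmetic: the geometry spans 11 degrees of longitude, which at ~111320 m per degree (an equatorial approximation for EPSG:3857) and a 30 m scale gives the 40818-pixel width in the error, well over the 32768 cap. A back-of-the-envelope sketch:

```python
import math

METERS_PER_DEGREE = 111320  # approximate, at the equator

def grid_width_px(lon_min, lon_max, scale_m):
    """Pixel columns needed to cover a longitude span at a given scale."""
    return math.ceil((lon_max - lon_min) * METERS_PER_DEGREE / scale_m)

width = grid_width_px(-61, -50, 30)
print(width, width <= 32768)  # 40818 False
```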

Add feature flag that allows users to interpolate the primary dimension.

This is related to #29. There is a single costly EE RPC call that we make that may not be essential. Right now, we need to access all the system:time_start properties from each image in the collection, which is slow. If instead we read the first and last (few) values and interpolated the rest, we could save time in the EE backend and avoid the biggest bottleneck for Xee. The tradeoff is that the interpolated values may differ from the actual values, which would cause data errors.

A good scenario seems to be that we add this capability behind a feature flag (name TBD). For users that understand their data well and know that this is safe, they will get a faster means of opening data. For datasets that are hard to interpolate time (or another primary dimension), users have a fallback.

Once this feature flag exists, we'd also need to rely on the fallback for slicing image ids in _slice_collection().
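The interpolation itself is straightforward; a sketch assuming evenly spaced images, reconstructing the time coordinate from only the first and last system:time_start values:

```python
import numpy as np

def interpolate_times(first, last, n):
    """Linearly interpolate n timestamps from first to last, inclusive."""
    first = np.datetime64(first, 'ns')
    last = np.datetime64(last, 'ns')
    span_ns = (last - first).astype(np.int64)
    offsets = np.linspace(0, span_ns, n).astype(np.int64).astype('timedelta64[ns]')
    return first + offsets

times = interpolate_times('2020-01-01', '2020-01-05', 5)
print(times[2])  # midpoint: 2020-01-03
```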

Support reading more than three dimensions from Google Earth Engine

This is a really common use case for climate and weather data. For example, for surface forecasts we will want to read 4d data (time, lat, lon, init_time, valid_time). Atmospheric data will inherently be more than 3d, e.g. (time, lat, lon, level, init_time, valid_time).

`EEException: Invalid value at 'file_format'`

I am pretty stoked on the idea of this repo! Thanks for putting this out there 🙏 . This could be a huge game changer in the way people use earth engine!

Anyways, here is my setup and issue:

import ee
import xarray as xr

ee.Initialize(opt_url='https://earthengine-highvolume.googleapis.com')

bbox = ee.Geometry.Rectangle(113.33, -43.63, 153.56, -10.66)

ds = xr.open_dataset(
    "ECMWF/ERA5_LAND/HOURLY", 
    engine='ee',
    geometry=bbox, 
    crs="EPSG:4326", 
)

# get some reasonable subset
subset = ds.isel(lon=slice(0,100), lat=slice(0,100))

and then I select a subset to run the mean to test actual data streaming:

subset[['surface_solar_radiation_downwards_hourly']].mean()

and I get EEException: Invalid value at 'file_format' (type.googleapis.com/google.earthengine.v1.ImageFileFormat), "NUMPY_NDARRAY"

Any ideas?
