
planetary-computer-apis's Introduction

Microsoft Planetary Computer APIs

Note: This repository serves as a reference implementation for deploying APIs on Azure. This code supports the production deployment of the Planetary Computer APIs. This repository is not meant to be reusable in other situations without significant modification, and the repository maintainers will not provide any support for non-development deployments of this code.

That said, feel free to crib any code that is useful!

STAC API + Tiler

This repository contains two components of the Planetary Computer APIs: the STAC API and the Tiler. They are implementations of the open source stac-fastapi and titiler projects, and titiler-pgstac connects the tiler to the database.

The pcstac project provides a STAC API that indexes Microsoft's publicly available geospatial data and supports searching through this large collection. The pctiler project provides visualization and data access capabilities for the data in the Planetary Computer.
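As a sketch of how clients query a STAC API, a search is a POST of a JSON body to the `/search` endpoint. The bbox and datetime values below are illustrative only; the `naip` collection is the one loaded into the development environment.

```python
import json

# Minimal STAC API item-search body (illustrative values, not part of
# this repository).
search_body = {
    "collections": ["naip"],
    "bbox": [-84.0, 33.0, -83.0, 34.0],  # west, south, east, north
    "datetime": "2018-01-01T00:00:00Z/2019-01-01T00:00:00Z",
    "limit": 10,
}

# This would be POSTed to the STAC root, e.g.:
#   requests.post("http://localhost:8080/stac/search", json=search_body)
print(json.dumps(search_body, indent=2))
```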

Azure Functions

This repository also contains Azure Functions that provide additional endpoints for working with Planetary Computer data and metadata. These include Function endpoints for generating images and animations based on STAC searches, using the tiler to render mosaicked data from Collections.

Collection configuration

See collection config for more on developing collection configurations.

Deployment

This repository hosts the code that is deployed in the Planetary Computer. It contains deployment code for hosting these services in Azure by running the published Docker images and Helm charts in Azure Kubernetes Service (AKS), which we use to stand up a development version of the services. The production deployment code is not contained in this repository.

For documentation of how you can deploy your own test version of these services, refer to docs/01-deployment.md.

Development URLs

After building the project locally using the instructions below, you can access the development version of the services by pointing your browser to the following URLs:

STAC API (via nginx) http://localhost:8080/stac
Tiler (via nginx) http://localhost:8080/data
Funcs (via nginx) http://localhost:8080/f/image, etc.
STAC API (direct) http://localhost:8081
Tiler (direct) http://localhost:8082
Funcs (direct) http://localhost:8083

To see the HTTP endpoints available for FastAPI servers, visit the OpenAPI documentation for each service:

STAC API http://localhost:8080/stac/docs
Tiler API http://localhost:8080/data/docs

The development data only includes a single collection, naip, with a few items in it. You can verify the data is loaded correctly by visiting the following URL:

http://localhost:8080/stac/collections/naip
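The nginx routes above can be captured in a small helper for local testing; `dev_url` is a hypothetical convenience, not part of the repository.

```python
# Map the local nginx route prefixes from the development URL table above.
NGINX_ROOT = "http://localhost:8080"
PREFIXES = {"stac": "/stac", "tiler": "/data", "funcs": "/f"}

def dev_url(service: str, path: str = "") -> str:
    """Build a development URL for one of the services proxied by nginx."""
    return f"{NGINX_ROOT}{PREFIXES[service]}{path}"

print(dev_url("stac", "/collections/naip"))
# http://localhost:8080/stac/collections/naip
```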

Building and Testing Locally

Requirements

The development environment runs almost entirely in Docker containers. Developing locally requires docker-compose v1.27+.

Running the Planetary Computer API services in a local development environment

This project uses a variation on the "scripts to rule them all" pattern.

Environment setup and building images

To set up a local environment, use

> ./scripts/setup

This will build containers, apply database migrations, and load the development data.

After migrations and development database loading are in place, you can rebuild the docker images with

> ./scripts/update

pip dependencies in setup.py are collected and installed through requirements files. If you modify dependencies, run ./scripts/generate-requirements to regenerate the requirements-*.txt files used by the Dockerfiles; otherwise your dependency change will not take effect.

Running the services

A local proxy service provides "managed identity" functionality in development, running as your local Azure identity. Before starting the servers, make sure to run

az login

To run the servers, use

> ./scripts/server

This will bring up the development database, STAC API, Tiler, Azure Functions, and other services.

Testing and formatting

To run tests, use

./scripts/test

To format code, use

./scripts/format

Changing environments

By default, the stac, tiler, funcs, and supporting services run against the development containers brought up by scripts/server. It can sometimes be convenient to test against other services, e.g. a test database deployed on Azure. To do that, create a new environment file for the services based on ./pc-stac.dev.env, ./pc-tiler.dev.env, and/or ./pc-funcs.dev.env. Any similarly named environment file will be .gitignore'd, so you can leave it in your local clone without committing it (e.g. ./pc-stac.testing.env). Then set the PC_STAC_ENV_FILE, PC_TILER_ENV_FILE, and PC_FUNCS_ENV_FILE environment variables to the files you want to use before running scripts/server. Note: be careful not to run migrations against a non-dev database; avoid scripts/setup, or ensure the migration connection still uses the local dev database even when using a remote test database.

Published images, charts, and functions

This project publishes Docker images and Helm charts, which are used in the deployment of the Planetary Computer.

Images

The following images are hosted in the Microsoft Container Registry:

  • mcr.microsoft.com/planetary-computer-apis/stac
  • mcr.microsoft.com/planetary-computer-apis/tiler

Only tagged builds are published to MCR; untagged builds are only published to the internal ACR pcccr.

Charts

See the Helm chart repository published to GitHub pages for the published charts.

Functions

See the Function package repository published to GitHub pages for the published Azure Functions.

planetary-computer-apis's People

Contributors

alexamici, aliasmrchips, dependabot[bot], gadomski, ghidalgo3, jisantuc, joshimai, kylemann16, lossyrob, m-cappi, microsoftopensource, mmcfarland, moradology, pholleway, pjhartzell, tomaugspurger, vincentsarago, yuvalherziger


planetary-computer-apis's Issues

Unable to dump collection from azurite container

I've successfully deployed the dev environment and checked that the naip collection exists. I tried to dump the collection from the inner azurite storage, following the steps described in docs/collection-config.md, using pcapis from ./scripts/console

pcapis dump -t collection --account=devstoreaccount1 --table=collectionconfig --sas=$SAS --output=collectionconfig.json --account-url=http://*.*.*.*:10002/devstoreaccount1

and got the following error:

root@ed21df10fee0:/opt/src# pcapis dump -t collection --account=devstoreaccount1 --table=collectionconfig --sas=$SAS --output=collectionconfig.json --account-url=http://*.*.*.*:10002/devstoreaccount1
Traceback (most recent call last):
  File "/usr/local/lib/python3.9/site-packages/azure/data/tables/_models.py", line 363, in _get_next_cb
    return self._command(
  File "/usr/local/lib/python3.9/site-packages/azure/data/tables/_generated/operations/_table_operations.py", line 386, in query_entities
    raise HttpResponseError(response=response, model=error)
azure.core.exceptions.HttpResponseError: Operation returned an invalid status 'Bad Request'
Content: {"odata.error":{"code":"InvalidInput","message":{"lang":"en-US","value":"The query condition specified in the request is invalid.\nRequestId:83d710d6-6f7e-4f91-98f6-d64218398d4a\nTime:2022-07-22T09:19:56.753Z"}}}

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/local/bin/pcapis", line 33, in <module>
    sys.exit(load_entry_point('pccommon', 'console_scripts', 'pcapis')())
  File "/opt/src/pccommon/pccommon/cli.py", line 234, in cli
    return dump(**args)
  File "/opt/src/pccommon/pccommon/cli.py", line 67, in dump
    for (_, collection_id, col_config) in col_config_table.get_all():
  File "/opt/src/pccommon/pccommon/tables.py", line 204, in get_all
    for entity in table_client.query_entities(""):
  File "/usr/local/lib/python3.9/site-packages/azure/core/paging.py", line 128, in __next__
    return next(self._page_iterator)
  File "/usr/local/lib/python3.9/site-packages/azure/core/paging.py", line 76, in __next__
    self._response = self._get_next(self.continuation_token)
  File "/usr/local/lib/python3.9/site-packages/azure/data/tables/_models.py", line 372, in _get_next_cb
    _process_table_error(error)
  File "/usr/local/lib/python3.9/site-packages/azure/data/tables/_error.py", line 153, in _process_table_error
    _reraise_error(decoded_error)
  File "/usr/local/lib/python3.9/site-packages/azure/data/tables/_error.py", line 145, in _reraise_error
    raise decoded_error.with_traceback(exc_traceback)
  File "/usr/local/lib/python3.9/site-packages/azure/data/tables/_models.py", line 363, in _get_next_cb
    return self._command(
  File "/usr/local/lib/python3.9/site-packages/azure/data/tables/_generated/operations/_table_operations.py", line 386, in query_entities
    raise HttpResponseError(response=response, model=error)
azure.core.exceptions.HttpResponseError: The query condition specified in the request is invalid.
RequestId:83d710d6-6f7e-4f91-98f6-d64218398d4a
Time:2022-07-22T09:19:56.753Z
ErrorCode:InvalidInput
Content: {"odata.error":{"code":"InvalidInput","message":{"lang":"en-US","value":"The query condition specified in the request is invalid.\nRequestId:83d710d6-6f7e-4f91-98f6-d64218398d4a\nTime:2022-07-22T09:19:56.753Z"}}}

The pcapis load function works correctly:

pcapis load -t collection --account=devstoreaccount1 --table=collectionconfig --sas=$SAS --file=alos-palsar-config.json --account-url=http://localhost:10002/devstoreaccount1

I can see that the collection appeared in the collectionconfig table in the inner storage, but it is invisible in the Explorer and also missing from the STAC API request http://*.*.*.*:8080/stac/collections/alos-palsar-mosaic

{"code":"NotFoundError","description":"No collection with id 'alos-palsar-mosaic' found!"}

P.S. I'm still having a lot of trouble due to the lack of documentation. Maybe I can contact someone from your team and contribute a better tutorial for beginners in the future?

Errors during deployment process

We encountered several bugs during the deployment described in README.md.

  1. The stac and tiler images in docker compose are linked to the docker.io hub instead of mcr.microsoft.com.
  2. The pccommon util is missing from the stac API image and needs to be installed via Python.
  3. The STAC API functions return "not found" instead of JSON collections.

remove pypgstac from pcstac requirements

previously raised in #25

Right now pypgstac is added to the pcstac requirements but is only really useful in testing. It's also use to

ps: I've done it in a previous (closed) PR: developmentseed#1

setup script fails with psycopg connection error

Describe the bug
Running ./scripts/setup fails with error connecting in 'pool-1': connection is bad: Temporary failure in name resolution. Per git bisect, this problem exists from #147 onward.

Partial shell output:

... snip ...
[+] Building 19.8s (9/9) FINISHED                                                                                                                                                                                               
 => [internal] load build definition from Dockerfile.dev                                                                                                                                                                   0.0s 
 => => transferring dockerfile: 194B                                                                                                                                                                                       0.0s 
 => [internal] load .dockerignore                                                                                                                                                                                          0.0s 
 => => transferring context: 2B                                                                                                                                                                                            0.0s
 => [internal] load metadata for docker.io/library/pc-apis-tiler:latest                                                                                                                                                    0.0s
 => [internal] load build context                                                                                                                                                                                          0.0s
 => => transferring context: 42B                                                                                                                                                                                           0.0s
 => [1/4] FROM docker.io/library/pc-apis-tiler                                                                                                                                                                             0.2s
 => [2/4] COPY requirements-dev.txt requirements-dev.txt                                                                                                                                                                   0.1s
 => [3/4] RUN pip install -r requirements-dev.txt                                                                                                                                                                         13.9s
 => [4/4] RUN pip install -e ./pccommon -e ./pctiler                                                                                                                                                                       4.7s 
 => exporting to image                                                                                                                                                                                                     0.8s 
 => => exporting layers                                                                                                                                                                                                    0.8s 
 => => writing image sha256:a7ed39a9cc859d4bf5209d73bdb50c18f2ec65de2b219018ed58e5e7d0b25d46                                                                                                                               0.0s 
 => => naming to docker.io/library/pc-apis-tiler-dev                                                                                                                                                                       0.0s 
migrating...                                                                                                                                                                                                                    
WARN[0000] The "APP_INSIGHTS_INSTRUMENTATION_KEY" variable is not set. Defaulting to a blank string.                                                                                                                            
WARN[0000] The "APP_INSIGHTS_INSTRUMENTATION_KEY" variable is not set. Defaulting to a blank string. 
[+] Running 4/4
 ⠿ Network pc-apis-dev-network                Created                                                                                                                                                                      0.0s
 ⠿ Container planetary-computer-apis-redis-1  Created                                                                                                                                                                      0.3s
 ⠿ Container pc-stac-db                       Created                                                                                                                                                                      0.3s
 ⠿ Container pcapis-azurite                   Created                                                                                                                                                                      0.3s
[+] Running 3/3
 ⠿ Container pcapis-azurite                   Started                                                                                                                                                                      1.2s
 ⠿ Container pc-stac-db                       Started                                                                                                                                                                      1.3s
 ⠿ Container planetary-computer-apis-redis-1  Started                                                                                                                                                                      1.0s
error connecting in 'pool-1': connection is bad: Temporary failure in name resolution
error connecting in 'pool-1': connection is bad: Temporary failure in name resolution
error connecting in 'pool-1': connection is bad: Temporary failure in name resolution
error connecting in 'pool-1': connection is bad: Temporary failure in name resolution
error connecting in 'pool-1': connection is bad: Temporary failure in name resolution
error connecting in 'pool-1': connection is bad: Temporary failure in name resolution
error connecting in 'pool-1': connection is bad: Temporary failure in name resolution
error connecting in 'pool-1': connection is bad: Temporary failure in name resolution
Traceback (most recent call last):
  File "/usr/local/bin/pypgstac", line 8, in <module>
    sys.exit(cli())
  File "/usr/local/lib/python3.9/site-packages/pypgstac/pypgstac.py", line 121, in cli
    fire.Fire(PgstacCLI)
  File "/usr/local/lib/python3.9/site-packages/fire/core.py", line 141, in Fire
    component_trace = _Fire(component, args, parsed_flag_args, context, name)
  File "/usr/local/lib/python3.9/site-packages/fire/core.py", line 466, in _Fire
    component, remaining_args = _CallAndUpdateTrace(
  File "/usr/local/lib/python3.9/site-packages/fire/core.py", line 681, in _CallAndUpdateTrace
    component = fn(*varargs, **kwargs)
  File "/usr/local/lib/python3.9/site-packages/pypgstac/pypgstac.py", line 50, in pgready
    self._db.wait()
  File "/usr/local/lib/python3.9/site-packages/pypgstac/db.py", line 125, in wait
    raise psycopg.errors.CannotConnectNow
psycopg.errors.CannotConnectNow

Additional context
os = Windows 11 / WSL2
docker version = Docker version 20.10.23, build 7155243
docker-compose version = Docker Compose version v2.15.1

setup script requires private images

I tried running the setup script per the instructions, but it's failing because some of the images are private, e.g. pc-apis-tiler-dev internal

$ ./scripts/setup
==Building images...
[+] Building 5.0s (22/30)
 => [pc-apis-nginx internal] load build definition from Dockerfile                                                                                                0.0s
 => => transferring dockerfile: 32B                                                                                                                               0.0s
 => [pc-apis-tiler internal] load build definition from Dockerfile                                                                                                0.0s
 => => transferring dockerfile: 1.39kB                                                                                                                            0.0s
 => [pc-apis-stac-dev internal] load build definition from Dockerfile.dev                                                                                         0.0s
 => => transferring dockerfile: 192B                                                                                                                              0.0s
 => [pc-apis-tiler-dev internal] load build definition from Dockerfile.dev                                                                                        0.0s
 => => transferring dockerfile: 194B                                                                                                                              0.0s
 => [pc-apis-funcs internal] load build definition from Dockerfile                                                                                                0.1s
 => => transferring dockerfile: 627B                                                                                                                              0.0s
 => [pc-apis-stac internal] load build definition from Dockerfile                                                                                                 0.1s
 => => transferring dockerfile: 32B                                                                                                                               0.0s
 => [pc-apis-stac-db internal] load build definition from Dockerfile                                                                                              0.1s
 => => transferring dockerfile: 32B                                                                                                                               0.0s
 => [pc-apis-nginx internal] load .dockerignore                                                                                                                   0.1s
 => => transferring context: 2B                                                                                                                                   0.0s
 => [pc-apis-tiler internal] load .dockerignore                                                                                                                   0.0s
 => => transferring context: 2B                                                                                                                                   0.0s
 => [pc-apis-stac-dev internal] load .dockerignore                                                                                                                0.0s
 => => transferring context: 2B                                                                                                                                   0.0s
 => [pc-apis-tiler-dev internal] load .dockerignore                                                                                                               0.1s
 => => transferring context: 2B                                                                                                                                   0.0s
 => [pc-apis-funcs internal] load .dockerignore                                                                                                                   0.0s
 => => transferring context: 2B                                                                                                                                   0.0s
 => [pc-apis-stac internal] load .dockerignore                                                                                                                    0.0s
 => => transferring context: 2B                                                                                                                                   0.0s
 => [pc-apis-stac-db internal] load .dockerignore                                                                                                                 0.0s
 => => transferring context: 2B                                                                                                                                   0.0s
 => CANCELED [pc-apis-nginx internal] load metadata for docker.io/library/nginx:1.10                                                                              4.5s
 => CANCELED [pc-apis-tiler internal] load metadata for docker.io/library/python:3.9-slim                                                                         4.4s
 => CANCELED [pc-apis-stac-dev internal] load metadata for docker.io/library/pc-apis-stac:latest                                                                  4.4s
 => ERROR [pc-apis-tiler-dev internal] load metadata for docker.io/library/pc-apis-tiler:latest                                                                   4.3s
 => [pc-apis-funcs internal] load metadata for mcr.microsoft.com/azure-functions/python:4-python3.8                                                               0.3s
 => CANCELED [pc-apis-stac-db internal] load metadata for docker.io/library/postgres:13                                                                           4.3s
 => CANCELED [pc-apis-funcs 1/9] FROM mcr.microsoft.com/azure-functions/python:4-python3.8@sha256:89a82768164b22834854b97c194aff9d4494d26969f3673fe4b3a462c36236  4.1s
 => => resolve mcr.microsoft.com/azure-functions/python:4-python3.8@sha256:89a82768164b22834854b97c194aff9d4494d26969f3673fe4b3a462c3623682                       0.0s
 => => sha256:7ae7bb2a534b251b3501f6c25d5f154e848e923441e08454f10dfaa75d62dfdb 10.05kB / 10.05kB                                                                  0.0s
 => => sha256:a603fa5e3b4127f210503aaa6189abf6286ee5a73deeaab460f8f33ebc6b64e2 7.34MB / 31.41MB                                                                   4.2s
 => => sha256:b00aaacf759c581712fa578a6b4e8e0b9fc780919a5d835a168457b754755644 1.08MB / 1.08MB                                                                    1.1s
 => => sha256:372d780866a7e569457582348fbf850edc018b6b015335a4a56403fe299ff04b 7.34MB / 11.34MB                                                                   4.2s
 => => sha256:89a82768164b22834854b97c194aff9d4494d26969f3673fe4b3a462c3623682 2.43kB / 2.43kB                                                                    0.0s
 => => sha256:feb836cf9ff261a0d9feb57f8808540cbb140d6d9e957af5daad2767d65fec36 232B / 232B                                                                        1.3s
 => => sha256:6a0e4abca74a97205cba7ecb141fd4210bfab67dddb55e3a1fdaa6bcefbc44de 3.18MB / 3.18MB                                                                    3.0s
 => => sha256:f185cb5182255c6a037ffb84ac09b3826bebf5f56ed56c2ea08d3ac5211120eb 1.05MB / 131.85MB                                                                  4.2s
 => [pc-apis-funcs internal] load build context                                                                                                                   0.1s
 => => transferring context: 1.93kB                                                                                                                               0.0s
------
 > [pc-apis-tiler-dev internal] load metadata for docker.io/library/pc-apis-tiler:latest:
------
failed to solve: rpc error: code = Unknown desc = failed to solve with frontend dockerfile.v0: failed to create LLB definition: pull access denied, repository does not exist or may require authorization: server message: insufficient_scope: authorization failed

Possibly incorrect `self` link on `collections/<collection-id>/items` page

Spotted by @geospatial-jeff, the "self" link at https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items?limit=1 links to the collection "https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a".

❯ curl --silent 'https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items?limit=1' | jq .links[3]
{
  "rel": "self",
  "type": "application/json",
  "href": "https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a"
}

Should that instead be the URL itself, like https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items?limit=1?
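The expected behavior can be stated as a small check. This is only a sketch; `self_link_ok` is a hypothetical helper, and the links list reproduces the response shown above.

```python
# Check that a paged /items response's "self" link points back at the
# items URL that was requested, rather than at the parent collection.
items_url = "https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items?limit=1"

def self_link_ok(response_links: list, request_url: str) -> bool:
    self_hrefs = [l["href"] for l in response_links if l.get("rel") == "self"]
    return bool(self_hrefs) and self_hrefs[0] == request_url

# With the response shown above, the check fails:
links = [{"rel": "self", "type": "application/json",
          "href": "https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a"}]
print(self_link_ok(links, items_url))  # False
```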

Avoid $ref in /queryables

Neither a bug nor really a feature, so opening a blank issue.

The /queryables endpoints refer to external schemas using $ref. This leads to a lot of downstream HTTP requests in clients, e.g. STAC Browser. For example, requesting https://planetarycomputer.microsoft.com/api/stac/v1/queryables has a single field defined, which pretty much just refers to a very simple JSON Schema of about 30 bytes: { "type": "string", "minLength": 1 }. For this simple schema, the client has to send 8(!) HTTP requests returning a total of nearly 10,000 bytes. Could this be improved?

Two alternative solutions could be considered:

  1. Inline at least the simple schemas.
  2. Bundle the schema on the server-side and return the bundled schema.

I'd prefer option 2, as it is much more lightweight on the clients, which then don't need to carry a full $ref parser library. $refs themselves are also pretty ill-defined, with a lot of different implementations, so interoperability is pretty bad. The JSON Schema community is currently discussing this, but that will take more time: https://github.com/json-schema-org/referencing https://phil.tech/2022/bundling-openapi-with-javascript/
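Server-side bundling could look roughly like the sketch below: walk the queryables schema and replace each $ref-only object with its resolved target. `fetch_schema` and the registry URL are hypothetical stand-ins for a real HTTP resolver; there is no cycle protection.

```python
# In-memory stand-in for remote schema documents.
REGISTRY = {
    "https://example.com/id-schema.json": {"type": "string", "minLength": 1},
}

def fetch_schema(url: str) -> dict:
    # In practice this would be an HTTP GET plus a cache.
    return REGISTRY[url]

def inline_refs(schema):
    """Recursively replace {"$ref": url} objects with the resolved schema."""
    if isinstance(schema, dict):
        if set(schema) == {"$ref"}:
            return inline_refs(fetch_schema(schema["$ref"]))
        return {k: inline_refs(v) for k, v in schema.items()}
    if isinstance(schema, list):
        return [inline_refs(v) for v in schema]
    return schema

queryables = {"properties": {"id": {"$ref": "https://example.com/id-schema.json"}}}
print(inline_refs(queryables))
# {'properties': {'id': {'type': 'string', 'minLength': 1}}}
```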

Search based on temporal extent not working?

Describe the bug
I'm requesting https://planetarycomputer.microsoft.com/api/stac/v1/collections/sentinel-2-l2a/items?limit=12&datetime=2022-04-30T00:00:00Z/2022-05-30T00:00:00Z&sortby=-properties.datetime with a temporal extent covering mostly May 2022, but I'm getting data from June as a result (the first Item right now has 2022-06-05T01:27:21.024000Z as its datetime)

To reproduce
See above.

Expected behavior
I'd expect that only data up until 2022-05-30T00:00:00Z is returned.
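While the server-side filter is in question, the expected behavior can be reproduced as a client-side guard. This is an illustrative sketch; the item dicts mimic the STAC Item shapes described in the report.

```python
from datetime import datetime

def parse(dt: str) -> datetime:
    # fromisoformat doesn't accept a trailing "Z" on older Pythons.
    return datetime.fromisoformat(dt.replace("Z", "+00:00"))

start, end = parse("2022-04-30T00:00:00Z"), parse("2022-05-30T00:00:00Z")

items = [
    {"id": "a", "properties": {"datetime": "2022-05-01T10:00:00Z"}},
    {"id": "b", "properties": {"datetime": "2022-06-05T01:27:21.024000Z"}},  # outside the interval
]
# Keep only items whose datetime falls inside the requested interval.
in_range = [i for i in items if start <= parse(i["properties"]["datetime"]) <= end]
print([i["id"] for i in in_range])  # ['a']
```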

API error when trying to download various datasets from planetary computer APIs

I am trying to download the sentinel-2 dataset via the STAC API, but midway through the download I am getting an error like this:

APIError: <!DOCTYPE html PUBLIC '-//W3C//DTD XHTML 1.0 Transitional//EN' '[http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd'><html](http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd'%3E%3Chtml) xmlns='[http://www.w3.org/1999/xhtml'><head><meta](http://www.w3.org/1999/xhtml'%3E%3Chead%3E%3Cmeta) content='text/html; charset=utf-8' http-equiv='content-type'/><style type='text/css'>body {font-family:Arial; margin-left:40px; }img { border:0 none; }#content { margin-left: auto; margin-right: auto }#message h2 { font-size: 20px; font-weight: normal; color: #000000; margin: 34px 0px 0px 0px }#message p { font-size: 13px; color: #000000; margin: 7px 0px 0px0px}#errorref { font-size: 11px; color: #737373; margin-top: 41px }</style><title>Service unavailable</title></head><body><div id='content'><div id='message'><h2>Our services aren't available right now</h2><p>We're working to restore all services as soon as possible. Please check back soon.</p></div><div id='errorref'><span>0CURnZAAAAACaXluYute5QouGLuN7eMExQlJVMzBFREdFMTExNgA5MjdhYmZhNi0xOWY2LTRhZjEtYTA5ZC1jOTU5ZDlhMWU2NDQ=</span></div></div></body></html>

I have downloaded the dataset the previous day without any issue, but I am not able to do it now. And the error appears at random places midway through the download. This error appears when trying to download many such datasets like JRC-GSW, TerraClimate, COP-DEM-GLO-30 and ESRI-10 meter Land cover.

Can someone please help me with this error?

Any help to resolve this issue would be greatly appreciated.

Thank You.

Update titiler-pgstac to 2.2.3 to avoid breaking change from starlette/Fastapi

Starting with 0.93, FastAPI updated its Starlette requirement, which broke both titiler and titiler-pgstac (because Request.url_for now returns a URL object, not a string); we fixed this upstream in titiler and titiler-pgstac.

"titiler.pgstac==0.2.2",

ref: stac-utils/titiler-pgstac#80

Note: the FastAPI/Starlette requirement is set in multiple places in the project (pctiler, pcstac, pccommon), so extra care should be taken when changing it.
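To illustrate the breaking change, here is a minimal sketch; the URL class below is a stand-in for starlette.datastructures.URL, which is what Request.url_for now returns:

```python
class URL:
    """Stand-in for starlette.datastructures.URL (illustration only)."""

    def __init__(self, raw: str) -> None:
        self._raw = raw

    def __str__(self) -> str:
        return self._raw


def tilejson_href(base) -> str:
    # Code written against older Starlette did `base + "?assets=data"`,
    # which raises TypeError once url_for returns a URL object; the
    # upstream fix is an explicit str() cast.
    return str(base) + "?assets=data"


print(tilejson_href(URL("https://example.com/mosaic/tiles")))
```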

STAC API and Tiler not using gunicorn worker and worker configuration

I just noticed that the Tiler and STAC APIs launch uvicorn directly to start FastAPI, rather than running under Gunicorn with server workers.

This is fine for development, but for a production FastAPI instance it creates a queueing bottleneck for concurrent requests.

Secondly, there is no default worker-process or thread configuration for Gunicorn. Configuring this would significantly improve throughput of the application (on the order of a 4x improvement in requests/sec).

See https://www.uvicorn.org/deployment/#using-a-process-manager and https://fastapi.tiangolo.com/deployment/server-workers/

Spotted this in these places:

https://github.com/microsoft/planetary-computer-apis/blob/main/pctiler/Dockerfile#L47

https://github.com/microsoft/planetary-computer-apis/blob/main/pcstac/Dockerfile#L17

https://github.com/microsoft/planetary-computer-apis/blob/main/docker-compose.yml#L19

https://github.com/microsoft/planetary-computer-apis/blob/main/docker-compose.yml#L38

Happy to submit a PR with benchmarks, unless I'm missing something.
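Per the deployment docs linked above, one option is to run Uvicorn workers under a Gunicorn process manager. A sketch of what the Dockerfile CMD could become (the module path and worker count are assumptions, not the project's actual settings):

```shell
# Assumed app module path; tune --workers to the host's CPU count.
gunicorn pcstac.main:app \
    --worker-class uvicorn.workers.UvicornWorker \
    --workers 4 \
    --bind 0.0.0.0:8081 \
    --timeout 120
```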

Request: Add sensor viewing angles to Sentinel-2 scene properties from STAC

We are currently using pystac-client to retrieve Sentinel-2 (L2A) metadata from the Planetary Computer and its STAC. While we found the mean solar azimuth angle (s2:mean_solar_azimuth) and mean solar zenith angle (s2:mean_solar_zenith) in the scene properties, the mean sensor azimuth and mean sensor zenith angles seem to be missing.

In our project, however, we'd require scene-wide angular information to run radiative transfer model simulations. In addition, we think that providing the entire angular data might be very helpful also for other users.

So, the proposal is to add the sensor azimuth and sensor zenith angle data as further entries to the scene properties. We are aware that the STAC specifications are still evolving, so this feature request might even be misplaced here, but we think it is an important addition to the current catalog.

We are looking forward to your answer. Thanks in advance.

Search error messages should be JSON, not plain text

Describe the bug
As described in stac-utils/stac-fastapi#463 (comment), the error response from the /search endpoint (and others) should be JSON. It looks like the Planetary Computer is opting in to plain-text responses:

@app.exception_handler(RequestValidationError)
async def validation_exception_handler(
    request: Request, exc: RequestValidationError
) -> PlainTextResponse:
    return PlainTextResponse(str(exc), status_code=400)

Is there a reason to not return JSON here?

To reproduce

$ curl -i --json @query.json https://planetarycomputer.microsoft.com/api/stac/v1/search
HTTP/2 400 
content-length: 124
content-type: text/plain; charset=utf-8
strict-transport-security: max-age=15724800; includeSubDomains
access-control-allow-origin: *
access-control-allow-credentials: true
x-cache: CONFIG_NOCACHE
x-azure-ref: 0UNLaYwAAAADvTHj17Z8kSZvAjSBV+G7DV1NURURHRTA4MTUAOTI3YWJmYTYtMTlmNi00YWYxLWEwOWQtYzk1OWQ5YTFlNjQ0
date: Wed, 01 Feb 2023 20:57:52 GMT

1 validation error for Request
body -> intersects
  intersects and bbox parameters are mutually exclusive (type=value_error)

Expected behavior

$ curl -si --json @query.json http://localhost:8080/search
HTTP/1.1 400 Bad Request
date: Thu, 02 Feb 2023 14:32:46 GMT
server: uvicorn
content-length: 176
content-type: application/json

{"code":"RequestValidationError","description":"1 validation error for Request\nbody -> intersects\n  intersects and bbox parameters are mutually exclusive (type=value_error)"}

Search by id in CQL TEXT

Describe the bug
I request a specific Item via cql2-text encoding:
https://planetarycomputer.microsoft.com/api/stac/v1/search?filter-lang=cql2-text&filter=id%20=%20%22MCD43A4.A2021189.h30v12.061.2021198054449%22

Instead of a single result, I get an error:

"column "MCD43A4.A2021189.h30v12.061.2021198054449" does not exist"

Similarly, I'd assume it would work for items?

https://planetarycomputer.microsoft.com/api/stac/v1/collections/modis-43A4-061/items?filter-lang=cql2-text&filter=id+%3D+%22MCD43A4.A2021189.h30v12.061.2021198054449%22

I know there are other ways to filter by ID, but I'm looking at CQL right now in my client implementations.

To reproduce
See above.

Expected behavior
See above.
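A note on why this fails: in the CQL2 text encoding, double quotes delimit identifiers (property names) while single quotes delimit string literals, so the double-quoted id is handed to pgstac as a column name. A sketch of building the same request with a string literal instead:

```python
from urllib.parse import urlencode

params = {
    "filter-lang": "cql2-text",
    # Single quotes make the id a string literal rather than an identifier.
    "filter": "id = 'MCD43A4.A2021189.h30v12.061.2021198054449'",
}
url = (
    "https://planetarycomputer.microsoft.com/api/stac/v1/search?"
    + urlencode(params)
)
print(url)
```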

Search by datetime in CQL Text

Describe the bug
As datetime is not supported directly (see #100), I'm now trying to use CQL, which was mentioned as an alternative.

I want to filter by timestamp on the datetime field, e.g.
https://planetarycomputer.microsoft.com/api/stac/v1/collections/chloris-biomass/items?limit=12&filter-lang=cql2-text&filter=datetime < TIMESTAMP('2010-01-01T01%3A00%3A00.000Z') (also tried the URL encoded variant)

Instead of only results older than 2010, I also get results newer than 2010. It looks like the filter is not taken into account at all.

Is CQL not supported on the items endpoint? Some of the conformance classes around filtering are a bit of a mess in stac-fastapi and stac-api-spec currently, so I'm just guessing...

To reproduce
See above.

Expected behavior
See above.
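An alternative worth trying is the cql2-json encoding in a POST body, which sidesteps the URL-encoding pitfalls of the text form. A sketch of the same temporal predicate (shape per the OGC CQL2 draft; whether this endpoint accepts it is exactly what's in question here):

```python
import json

# The "<" comparison between the datetime property and a timestamp literal.
body = {
    "filter-lang": "cql2-json",
    "limit": 12,
    "filter": {
        "op": "<",
        "args": [
            {"property": "datetime"},
            {"timestamp": "2010-01-01T01:00:00Z"},
        ],
    },
}
print(json.dumps(body, indent=2))
```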

serving tiles in react frontend

Hello,

I am wondering if it is possible to use the Planetary Computer API as a WMS endpoint to serve Sentinel & Landsat tiles in a front-end page.

Ali

Queryables link missing

Describe the bug

The queryables link is missing from the root endpoint, and it seems also from individual collections.
It has been there before (with a wrong rel type, but anyway).

Expected behavior
Add links with relation type http://www.opengis.net/def/rel/ogc/1.0/queryables again to the root endpoint and the individual collections.
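A sketch of the link entry to restore (the rel type is the one named above, per OGC API - Features Part 3; the href shown is an assumption about where the queryables document would live):

```python
queryables_link = {
    "rel": "http://www.opengis.net/def/rel/ogc/1.0/queryables",
    "type": "application/schema+json",
    "title": "Queryables",
    # Hypothetical location for the root-level queryables document.
    "href": "https://planetarycomputer.microsoft.com/api/stac/v1/queryables",
}
print(queryables_link["rel"])
```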

Do not alter database entries when running tests

When running pcstac tests, we assume

docker-compose \
-f docker-compose.yml \
run --rm \
stac \
python /opt/src/pcstac/tests/loadtestdata.py
has been run first.

This could be replaced by something like: https://github.com/developmentseed/planetary-computer-apis/blob/refactorTests/pcstac/tests/conftest.py#L19-L115

Note: the approach above also removes the code duplication with pcstac.main

This is also linked to #31, because loadtestdata.py assumes pypgstac is available in the environment (which should only be the case in the dev environment)

500 from GEOS error during spatial search

Note: This repository contains a reference implementation; as such, bugs in deployment or usage of this project in other environments besides the Microsoft Planetary Computer have no guarantees of support. See SUPPORT.md for more information.

Describe the bug
The following conditions cause a GEOS error in PostGIS:

  • GET Search with bbox=100.0,0.0,0.0,105.0,1.0,1.0
  • POST Search with bbox:[100.0, 0.0, 0.0, 105.0, 1.0, 1.0]

Expected behavior
A 500 should not occur.
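For context, the failing values are 6-element 3D bboxes ([xmin, ymin, zmin, xmax, ymax, zmax] per the STAC spec). One mitigation sketch (illustrative only, not the project's actual fix) is to drop the elevation values before building the PostGIS geometry, so GEOS only ever sees a 2D box:

```python
def to_2d_bbox(bbox):
    """Flatten a 6-element (3D) STAC bbox to 2D; pass 4-element through."""
    if len(bbox) == 6:
        xmin, ymin, _zmin, xmax, ymax, _zmax = bbox
        return [xmin, ymin, xmax, ymax]
    if len(bbox) == 4:
        return list(bbox)
    raise ValueError("bbox must have 4 or 6 coordinates")


# The bbox from the bug report above:
print(to_2d_bbox([100.0, 0.0, 0.0, 105.0, 1.0, 1.0]))
```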

CI build break due to unpinned lxml version

We don't pin the version of lxml that we use, and over the weekend that package was upgraded with a breaking change.

Describe the bug
Trying to start either the tiler or stac server produces this import error:

Traceback (most recent call last):
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/multiprocessing/process.py", line 315, in _bootstrap
tiler_1        |     self.run()
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/multiprocessing/process.py", line 108, in run
tiler_1        |     self._target(*self._args, **self._kwargs)
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/site-packages/uvicorn/subprocess.py", line 76, in subprocess_started
tiler_1        |     target(sockets=sockets)
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/site-packages/uvicorn/server.py", line 60, in run
tiler_1        |     return asyncio.run(self.serve(sockets=sockets))
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/asyncio/runners.py", line 44, in run
tiler_1        |     return loop.run_until_complete(main)
tiler_1        |   File "uvloop/loop.pyx", line 1517, in uvloop.loop.Loop.run_until_complete
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/site-packages/uvicorn/server.py", line 67, in serve
tiler_1        |     config.load()
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/site-packages/uvicorn/config.py", line 458, in load
tiler_1        |     self.loaded_app = import_from_string(self.app)
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/site-packages/uvicorn/importer.py", line 24, in import_from_string
tiler_1        |     raise exc from None
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/site-packages/uvicorn/importer.py", line 21, in import_from_string
tiler_1        |     module = importlib.import_module(module_str)
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/importlib/__init__.py", line 127, in import_module
tiler_1        |     return _bootstrap._gcd_import(name[level:], package, level)
tiler_1        |   File "<frozen importlib._bootstrap>", line 1030, in _gcd_import
tiler_1        |   File "<frozen importlib._bootstrap>", line 1007, in _find_and_load
tiler_1        |   File "<frozen importlib._bootstrap>", line 986, in _find_and_load_unlocked
tiler_1        |   File "<frozen importlib._bootstrap>", line 680, in _load_unlocked
tiler_1        |   File "<frozen importlib._bootstrap_external>", line 850, in exec_module
tiler_1        |   File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
tiler_1        |   File "/opt/src/pctiler/pctiler/main.py", line 29, in <module>
tiler_1        |     from pctiler.endpoints import (
tiler_1        |   File "/opt/src/pctiler/pctiler/endpoints/item.py", line 5, in <module>
tiler_1        |     from html_sanitizer.sanitizer import Sanitizer
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/site-packages/html_sanitizer/sanitizer.py", line 8, in <module>
tiler_1        |     import lxml.html.clean
tiler_1        |   File "/opt/conda/envs/myenv/lib/python3.9/site-packages/lxml/html/clean.py", line 18, in <module>
tiler_1        |     raise ImportError(
tiler_1        | ImportError: lxml.html.clean module is now a separate project lxml_html_clean.
tiler_1        | Install lxml[html_clean] or lxml_html_clean directly.

To reproduce
Steps to reproduce the behavior:

  1. Start the development servers from main after 3/30/2024.

Expected behavior
Servers start.

Screenshots and shell session dumps
No need.

Additional context
No need.
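Per lxml's own error message, either of the following resolves the import error (the exact pin boundary is an assumption based on the breakage date):

```shell
# Option 1: the extra that pulls in the split-out lxml_html_clean package
pip install "lxml[html_clean]"
# Option 2: pin lxml below the release that removed lxml.html.clean
pip install "lxml<5.2"
```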
