OpenAQ API
Home Page: https://api.openaq.org
From a community member:
The OpenAQ system is recognizing -999 values as concentrations, when they actually represent missing data. An example is the Addis Ababa central site (https://openaq.org/#/location/9714 ) where the average concentration shows up as -150 ug/m3 due to the inclusion of the -999's in the calculation. Would it be possible to ignore all ‘-999’ values in calculations?
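A minimal sketch of the requested behavior in Python (the sentinel constant and helper name are illustrative, not from the OpenAQ codebase): drop the -999 missing-data sentinel before averaging.

```python
# Illustrative sketch only: MISSING and mean_excluding_missing are
# hypothetical names, not part of the OpenAQ codebase.
MISSING = -999

def mean_excluding_missing(values):
    """Average concentrations while ignoring the -999 missing-data sentinel."""
    valid = [v for v in values if v != MISSING]
    return sum(valid) / len(valid) if valid else None

# Including the sentinel skews the average badly:
readings = [12.0, 15.0, MISSING, 18.0]
naive = sum(readings) / len(readings)       # -238.5, nonsense concentration
cleaned = mean_excluding_missing(readings)  # 15.0
```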
orjson.JSONDecodeError: trailing characters in db.py
When trying to use `/measurements` to get more than ~10,000 results like so:
(v1) https://api.openaq.org/v1/measurements?limit=6000&page=8
(v2) https://api.openaq.org/v2/measurements?limit=6000&page=4&sensorType=reference%20grade
the API returns the following error:
message: "Service Unavailable"
When an API request with just the `limit` parameter is used like so:
https://api.openaq.org/v2/measurements?sensorType=reference%20grade&limit=10000
it works fine, but not when the limit is increased to 20000. Then we just get:
detail : ""
Perhaps the DB is timing out?
I like the schema used by OpenAddresses for licensing info here:
https://github.com/openaddresses/openaddresses/blob/master/CONTRIBUTING.md#optional-address-tags
I think we could return a JSON object very similar to this from the `providers` endpoint. From the database side, I'm thinking we may want the option to support multiple licenses per provider (or project). This could be relevant when a license changes over time, e.g. Provider A had license CC BY 4.0 from 2012-2016 and CC BY 1.0 from 2016 onward. I would think a separate DB table `licenses` with columns mapping to those listed on OpenAddresses would facilitate this. @caparker
Not sure if the former part is actually a bug, but I couldn't find any information on it. Was support for `order_by=distance` removed?
Also, I can no longer get any query involving `coordinates` to work. I've tried using the UI here too: https://docs.openaq.org/#/v1/locationsv1_get_v1_locations_get.
Sample URL which used to work:
https://api.openaq.org/v1/locations?coordinates=35.8001%2C-78.7119&limit=5&order_by=distance
If you remove `order_by=distance`, it just gives an unspecified server error.
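As a client-side workaround while `order_by=distance` is unavailable, results could be sorted locally by great-circle distance from the query point. A sketch (the function name is mine, not part of any OpenAQ client):

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# e.g., sort fetched locations by distance from (35.8001, -78.7119):
# locations.sort(key=lambda l: haversine_km(35.8001, -78.7119,
#     l["coordinates"]["latitude"], l["coordinates"]["longitude"]))
```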
If I understand it correctly a response like this shouldn't be possible:
{
"meta": {
"name": "openaq-api",
"license": "CC BY 4.0d",
"website": "api.openaq.org",
"page": 2,
"limit": 10000,
"found": 2216436
},
"results": []
}
This result is produced by following this link: https://api.openaq.org/v2/measurements?coordinates=42.698029%2C23.322718&radius=10000&page=2&limit=10000&parameter=pm25&order_by=datetime
I don't know whether the `found` value is wrong or pagination is broken.
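A quick consistency check (hypothetical helper, for illustration): if `found` is accurate, any page whose starting offset is below `found` should return a non-empty results array.

```python
def page_should_have_results(found, limit, page):
    """True if the page's starting offset falls within the reported total."""
    return (page - 1) * limit < found

# The response above reports found=2216436 with limit=10000 and page=2:
# offset 10000 is far below 2216436, so an empty results array
# contradicts the reported count.
```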
This may not be an issue, depending on what you are going for, but I thought I would point it out. Based on what I can see in the file with the rollup methods, it looks like the daily rollups are done directly on the measurements table, but the monthly and annual rollups are done on the daily and monthly values respectively. This would give you the monthly average of the daily averages and the annual average of the monthly averages, but not the daily, monthly and annual averages of the measurements themselves.
Which is fine of course, if that is what you were after, but I couldn't find anything stating what the aggregate values were specifically expected to be, so I wanted to point it out.
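A toy illustration of why the two definitions differ (plain Python, nothing from the actual rollup SQL): when days have unequal sample counts, the mean of daily means is not the mean of the raw measurements.

```python
def mean(xs):
    return sum(xs) / len(xs)

day1 = [10.0]               # one measurement
day2 = [20.0, 20.0, 20.0]   # three measurements

# Rolling up from daily averages weights each day equally:
monthly_from_dailies = mean([mean(day1), mean(day2)])  # 15.0
# Averaging the raw measurements weights each sample equally:
monthly_from_raw = mean(day1 + day2)                   # 17.5
```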
Some API queries are returning negative numbers in the meta `found` field, e.g.
/v2/measurements?date_from=2020-10-20T00%3A00%3A00%2B00%3A00&date_to=2022-11-21T16%3A49%3A00%2B00%3A00&limit=2000&page=1&offset=0&sort=desc&radius=1000&city=Chicago&order_by=datetime
returns:
{
"meta": {
"name": "openaq-api",
"license": "CC BY 4.0d",
"website": "test.openaq.org",
"page": 1,
"limit": 2000,
"found": -8371
},
"results": []
}
Averaging period value is null for all locations in v1 and v2
Example: https://api.openaq.org/v1/latest
https://api.openaq.org/v2/measurements?include_fields=averagingPeriod
The v2/averages API is extremely useful. I am able to ask for monthly location averages to get a historic view of a bunch of locations. I am using `temporal=month` and `spatial=location`, so I can get a year of monthly location averages for a parameter quickly (about 40s) -- this is awesome!
Once I have the averages, I want to get the coordinates and metadata of the locations. The `name` field seems to correspond to `locations.id`, which generally makes sense. Right now, I use the v2/locations API to find all locations that measure the parameter. Then I can join averages based on `averages.name == locations.id`. Getting the location data takes about 6 minutes, which I suspect is limited by the fact that it returns the latest observations.
It seems odd that I can get a year of averages faster than I can get the coordinates. Any recommendations on how to get the coordinates and metadata faster?
P.S. In addition to the speed, I suspect I am putting unnecessary pressure on the server.
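The client-side join described above can be sketched like this (field names are assumptions based on the report: `averages.name` holds the location id as a string, matched against `locations.id`):

```python
def join_averages_to_locations(averages, locations):
    """Attach location coordinates to average records via name == id."""
    by_id = {loc["id"]: loc for loc in locations}
    joined = []
    for avg in averages:
        # averages.name is a stringified locations.id in this sketch
        loc = by_id.get(int(avg["name"]))
        if loc is not None:
            joined.append({**avg, "coordinates": loc["coordinates"]})
    return joined
```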
upgrade codebase to python 3.9
I added additional environmental variables into the docker-compose.yml
in order for the build to pass:
- DATABASE_READ_USER=postgres
- DATABASE_READ_PASSWORD=postgres
- DATABASE_WRITE_USER=postgres
- DATABASE_WRITE_PASSWORD=postgres
- DATABASE_HOST=172.17.0.2
- DATABASE_PORT=5432
- DATABASE_DB=postgres
- DOMAIN_NAME="api.openaq.org"
I am trying to build the `db` and the `openaqapi` containers locally via the instructions provided in the "Historic Method" section of the README.
After running the following:
cd .devcontainer
docker-compose build
docker-compose up
The API docker container keeps exiting with:
RuntimeError: Directory '/usr/local/lib/python3.9/site-packages/openaq_fastapi/static' does not exist
I have tried the following:
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)
logger.info("relative path of static folder: %s", static_dir)
this returns:
relative path of static folder: /usr/local/lib/python3.9/site-packages/openaq_fastapi/static
After commenting out line 159 in openaq-api-v2/openaq_fastapi/openaq_fastapi/main.py, I get the following traceback:
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 691, in _create_ssl_connection
api_1 | tr, pr = await loop.create_connection(
api_1 | File "/usr/local/lib/python3.9/asyncio/base_events.py", line 1041, in create_connection
api_1 | sock = await self._connect_sock(
api_1 | File "/usr/local/lib/python3.9/asyncio/base_events.py", line 955, in _connect_sock
api_1 | await self.sock_connect(sock, address)
api_1 | File "/usr/local/lib/python3.9/asyncio/selector_events.py", line 502, in sock_connect
api_1 | return await fut
api_1 | asyncio.exceptions.CancelledError
api_1 |
api_1 | During handling of the above exception, another exception occurred:
api_1 |
api_1 | Traceback (most recent call last):
api_1 | File "/usr/local/lib/python3.9/asyncio/tasks.py", line 492, in wait_for
api_1 | fut.result()
api_1 | asyncio.exceptions.CancelledError
api_1 |
api_1 | The above exception was the direct cause of the following exception:
api_1 |
api_1 | Traceback (most recent call last):
api_1 | File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 635, in lifespan
api_1 | async with self.lifespan_context(app):
api_1 | File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 530, in __aenter__
api_1 | await self._router.startup()
api_1 | File "/usr/local/lib/python3.9/site-packages/starlette/routing.py", line 612, in startup
api_1 | await handler()
api_1 | File "/usr/local/lib/python3.9/site-packages/openaq_fastapi/main.py", line 113, in startup_event
api_1 | app.state.pool = await db_pool(None)
api_1 | File "/usr/local/lib/python3.9/site-packages/openaq_fastapi/db.py", line 47, in db_pool
api_1 | pool = await asyncpg.create_pool(
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/pool.py", line 413, in _async__init__
api_1 | await self._initialize()
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/pool.py", line 441, in _initialize
api_1 | await first_ch.connect()
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/pool.py", line 133, in connect
api_1 | self._con = await self._pool._get_new_connection()
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/pool.py", line 511, in _get_new_connection
api_1 | con = await connection.connect(
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/connection.py", line 2085, in connect
api_1 | return await connect_utils._connect(
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 895, in _connect
api_1 | raise last_error
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 881, in _connect
api_1 | return await _connect_addr(
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 781, in _connect_addr
api_1 | return await __connect_addr(params, timeout, True, *args)
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/connect_utils.py", line 825, in __connect_addr
api_1 | tr, pr = await compat.wait_for(connector, timeout=timeout)
api_1 | File "/usr/local/lib/python3.9/site-packages/asyncpg/compat.py", line 66, in wait_for
api_1 | return await asyncio.wait_for(fut, timeout)
api_1 | File "/usr/local/lib/python3.9/asyncio/tasks.py", line 494, in wait_for
api_1 | raise exceptions.TimeoutError() from exc
api_1 | asyncio.exceptions.TimeoutError
api_1 | ERROR: Application startup failed. Exiting.
api_1 | [2022-06-12 15:19:56,665] ERROR [uvicorn.error:56] Application startup failed. Exiting.
Not sure what is up with the `COPY` of the static files. Thanks!
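One possible workaround, assuming the install simply omits the empty folder (this is a guess, not a confirmed diagnosis, and the helper name is mine): create the static directory at startup before mounting it.

```python
from pathlib import Path

def ensure_static_dir(package_root):
    """Create <package_root>/static if the install omitted it.

    Hypothetical workaround for the RuntimeError above; it does not
    restore any missing static assets, it only satisfies the directory
    check so the app can start.
    """
    static_dir = Path(package_root) / "static"
    static_dir.mkdir(parents=True, exist_ok=True)
    return static_dir
```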
Building the local development version fails on OSX. It successfully builds and runs on Linux. The error seems to occur while building the `jq` wheel during the pip install process.
#17 118.8 ERROR: Could not build wheels for jq, which is required to install pyproject.toml-based projects
Might be connected to the slim-buster image
https://api.openaq.org/v2/latest?city=London
does not return any data, only the following error:
{
"loc": [
"response",
"results",
"15",
"coordinates"
],
"msg": "none is not an allowed value",
"type": "type_error.none.not_allowed"
}
update the /v1/cities endpoint to match the v1 schema and fully deprecate the `name` property:
{
"$schema": "http://json-schema.org/draft-07/schema#",
"title": "Generated schema for Root",
"type": "object",
"properties": {
"country": {
"type": "string"
},
"city": {
"type": "string"
},
"count": {
"type": "number"
},
"locations": {
"type": "number"
}
},
"required": [
"country",
"city",
"count",
"locations"
],
"additionalProperties": false
}
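A minimal hand-rolled check (illustrative only, not a full JSON Schema validator, and not part of the codebase) of what the schema above enforces, including `additionalProperties: false`, which rejects the deprecated `name` property:

```python
# Required keys and their JSON types, per the schema above.
REQUIRED = {"country": str, "city": str, "count": (int, float), "locations": (int, float)}

def matches_cities_schema(entry):
    """True if entry has exactly the required keys with the right types."""
    return (set(entry) == set(REQUIRED)
            and all(isinstance(entry[k], t) for k, t in REQUIRED.items()))
```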
It would be nice to have bounding-box queries in addition to the lat,lng + radius option.
remove jq dependency and map v2 data models to v1 data models instead
decorators.py accesses `pandas.io.json` for the function `json_normalize`, but the function can no longer be accessed this way.
When I change the code to call `json_normalize` directly, like so:
import pandas as pd
pd.json_normalize(resp)
it works.
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
File [.../decorators.py:57), in pandasize..decorator..decorated_function(*args, **kwargs)
53 d.append(tmp)
55 resp = d
---> 57 data = pd.io.json.json_normalize(resp)
59 # If there are any datetimes, make them datetimes!
60 for each in [i for i in data.columns if 'date' in i]:
AttributeError: module 'pandas.io.json' has no attribute 'json_normalize'
update /v2/parameters endpoint queries to work on v3 database schema
Since moving to this new version of the API, some of the information that existed on the old API docs did not make it over.
We need to update the current docs to make sure the API is easy to use:
- the default section
- the Schemas section (or provide more documentation about it)
The cool thing about the Swagger docs is that they are created directly from the API code. For example, to modify the `cities` endpoint description you would start at the city router code. Also check out the Swagger documentation for more info.
To make the AWS open data bucket a bit easier to navigate I think there should be a way to get a list of countries by provider since the folder structure goes:
/provider={provider_name}/country={country_code}
We could either provide a query parameter or a subresource,
e.g.
countries?providers_id=42
or
countries/42/providers (this option feels odd to me)
The other option would be to include an array of countries in the providers response object:
"countries": ["us", "uk", "gh"]
which is similar to the existing `bounds` key but a bit more descriptive.
There will likely also be an inverse situation where a user is interested in a country but first must know the providers that cover that country. We currently provide a `providersCount` key in the country response. Should we also include a providers array? Or provide separate endpoint-based solutions similar to the ideas above?
Any thoughts @caparker @majesticio
Update the v1 and v2 queries for `sources` to the new v3 schema.
The entire nodes.py file seems to be unused. Let's double-check usage and possible future utility; if it's not relevant, we can remove it.
In my case, I can't get 'parameter_id' or 'parameter' to work. I am trying to get PM2.5 but also getting O3. I was ok with that initially, but the order in which they appear varies. Sometimes PM2.5 is first, and others, it is O3.
Here are my queries
https://api.openaq.org/v2/latest/2211?limit=1&page=1&offset=0&sort=desc&parameter_id=2&radius=1000&order_by=lastUpdated
https://api.openaq.org/v2/latest/2211?limit=1&page=1&offset=0&sort=desc&parameter_id=2&parameter=pm25&radius=1000&order_by=lastUpdated
https://api.openaq.org/v2/latest/2211?limit=1&page=1&offset=0&sort=desc&parameter=pm25&radius=1000&order_by=lastUpdated
I have a similar issue with /v2/measurements which I will log separately.
Thanks.
Python 3.10 is now available as a runtime for AWS Lambda:
https://aws.amazon.com/about-aws/whats-new/2023/04/aws-lambda-python-3-10/
I am running into an issue where the `averages` interface works for most requests but fails for a very specific record. I started by paging with a limit of 1000 per page. The second page failed, but the rest did not. I then changed the page size (500, 250, 50, etc.). I am only able to reproduce the problem with page sizes of 3 or more. The URL below shows the problem with just 3 records.
https://api.openaq.org/v2/averages?date_from=2020-01-01&date_to=2020-12-31&parameter=pm25&limit=3&page=594&offset=0&sort=desc&spatial=location&temporal=month&group=false
Returns
{"message":"internal server error"}
Note that the same URL with page=593 or page=595 works. The version below reproduces the problem with 1000 items per page.
https://api.openaq.org/v2/averages?date_from=2020-01-01&date_to=2020-12-31&parameter=pm25&limit=1000&page=2&offset=0&sort=desc&spatial=location&temporal=month&group=false
Returns
{"message":"internal server error"}
Note that the same URL with page=1 or page=3 works just fine. I don't know if this is related to a corrupt record or to the API itself.
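The failing pages at the two different limits are consistent with a single bad record. A small helper (hypothetical, for illustration) maps a limit/page pair to the global record indices that the page covers:

```python
def page_range(limit, page):
    """Inclusive (start, end) global record indices covered by a page."""
    start = (page - 1) * limit
    return start, start + limit - 1

# limit=3, page=594 covers records 1779-1781, which lie inside
# limit=1000, page=2 (records 1000-1999) -- consistent with both
# failing URLs pointing at the same record.
```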
update the v1 and v2 measurements endpoint queries to work with the new v3 database schema
V2 locations is erroring:
ValueError: column "sources" does not exist
I know it's quite a new API and probably far from finished, but I was playing with it and the tile calls end up being rejected because of the CORS policy. The rest of the calls seem to work fine.
Compare
, which returns the requested 5 results, compared to
, which returns none.
I am receiving "Internal Server Error" when trying to get averages using country code "DE".
Is it a bug? Are there any plans to fix it?
Using Pydantic for validating and building the query parameter objects has been causing some issues, mostly around `@validator` steps. Raising a `ValueError` results in an HTTP 500 instead of an HTTP 422. This has been encountered by other users, e.g.:
One solution is to raise a FastAPI `HTTPException`, as discussed in the issue, but this seems to have some reusability issues.
This issue seems to present some other solutions that may be worth exploring:
fastapi/fastapi#1474 (comment)
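A framework-free sketch of the desired behavior (both function names and the limit bounds are hypothetical, for illustration): a `ValueError` raised while validating a query parameter should surface as a 422, never as a generic 500.

```python
def validate_limit(raw):
    """Parse and range-check a hypothetical `limit` query parameter."""
    limit = int(raw)  # raises ValueError on non-numeric input
    if not 0 < limit <= 100000:
        raise ValueError("limit must be between 1 and 100000")
    return limit

def handle_limit_param(raw):
    """Map validation failures to HTTP 422 instead of a generic 500."""
    try:
        return 200, validate_limit(raw)
    except ValueError as exc:
        # Client input problem: report 422 with the validation message.
        return 422, str(exc)
```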
Adding `format=csv` to a query does not return CSV data, only JSON data.
When taking, for example, the following query on the API: https://api.openaq.org/v2/measurements?country=LU&city=Luxemburg&location=LU0102A&parameter=o3&date_to=2021-05-10&date_from=2021-05-08, only 21 data points are shown in the output, but according to the meta section it should include 107 data points:
"meta": {
"name": "openaq-api",
"license": "CC BY 4.0d",
"website": "https://u50g7n0cbj.execute-api.us-east-1.amazonaws.com/",
"page": 1,
"limit": 100,
"found": 107
}
Not sure where the 107 comes from; it seems faulty as well, since in my opinion there should be one data point per hour, so the result should contain 48 data points.
I see the same for almost every other day: some of the hourly data points are not returned.
What am I missing here?
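A quick way to sanity-check the counts (illustrative helper, not from the codebase): for hourly data, the expected number of points is simply the number of hours in the query window.

```python
from datetime import datetime

def expected_hourly_points(date_from, date_to):
    """Number of whole hours between two ISO-format dates."""
    delta = datetime.fromisoformat(date_to) - datetime.fromisoformat(date_from)
    return int(delta.total_seconds() // 3600)

# 2021-05-08 to 2021-05-10 spans 48 hours, so neither the 21 returned
# points nor the reported found=107 matches the expected ~48.
```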
My queries were working a few days ago but now I'm getting 422 errors on requests.
yields:
{"detail":[{"loc":["results","3","city"],"msg":"none is not an allowed value","type":"type_error.none.not_allowed"},{"loc":["results","10","city"],"msg":"none is not an allowed value","type":"type_error.none.not_allowed"},{"loc":["results","11","city"],"msg":"none is not an allowed value","type":"type_error.none.not_allowed"},{"loc":["results","14","city"],"msg":"none is not an allowed value","type":"type_error.none.not_allowed"},{"loc":["results","15","city"],"msg":"none is not an allowed value","type":"type_error.none.not_allowed"}]}
The 'sort' of ASC or DESC does not work, and 'order_by' also does not work.
In the following example, PM2.5 results start from date_from, followed by O3 results also starting from date_from.
https://api.openaq.org/v2/measurements?location_id=2211&parameter_id=2&date_from=2023-06-07T14:12:39-04:00&date_to=2023-06-08T14:12:39-04:00&limit=100&sort=desc&order_by=datetime
My Python is not great, but I see that 'order_by' and 'sort' are not being used in the SQL statement in the source code (measurements.py).
Thanks
The `date` field (as seen in the beta version) is no longer there. Instead you see `year`, `month`, or `day`, depending on the `temporal` query parameter, and the value of that field is always in the format YYYY-MM-DD.
Part of the response:
{
"id": 58364,
"name": "MN",
"unit": "µg/m³",
"year": "2021-01-01",
"average": 91.8671,
"subtitle": "Mongolia",
"parameter": "pm25",
"displayName": "PM2.5",
"parameterId": 2,
"measurement_count": 277960
}
The date field in the response should stay as `date`, regardless of the temporal parameter.
The date format should correspond to the temporal parameter. For example:
- year: "date": "2021"
- month: "date": "2021-01"
- day: "date": "2021-01-01"
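The proposed formatting can be sketched as follows (the function name is mine, not from the API code):

```python
def format_average_date(iso_date, temporal):
    """Trim a YYYY-MM-DD date to match the `temporal` parameter."""
    if temporal == "year":
        return iso_date[:4]    # e.g. "2021"
    if temporal == "month":
        return iso_date[:7]    # e.g. "2021-01"
    return iso_date[:10]       # e.g. "2021-01-01" for day
```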
Temporal parameter: Year
https://api.openaq.org/v2/averages?spatial=country&temporal=year&country=MN&parameter=pm25
Temporal parameter: Month
https://api.openaq.org/v2/averages?spatial=country&temporal=month&country=MN&parameter=pm25
Temporal parameter: Day
https://api.openaq.org/v2/averages?spatial=country&temporal=day&country=MN&parameter=pm25
It seems that the API docs located at https://docs.openaq.org/ don't include any schemas for returned data.
Each response links to a relevant schema for successful returns, but these seem to only include the meta fields; the results field is either left blank or left as an array with no details of its contents.
Am I missing something here, or is something broken?
When requesting data from the latest endpoint, the results are returned in a seemingly arbitrary order.
Hello,
I am using the v2 api to collect historical measurements in a specific area. For some location_ids (e.g. 8567) the API returns a greater number of results than the limit. For example, when the limit is 5, 8 results are returned.
Then the second page (page=2) returns no results, even though many measurements are found:
{"meta":{"name":"openaq-api","license":"CC BY 4.0d","website":"api.openaq.org","page":2,"limit":5,"found":61293},"results":[]}
However, for other location_ids, such as 14, the API behaves as expected, returning 5 results on both page 1 and 2.
This seems to happen in both v1 and v2
Thanks
I am currently unable to get averagingPeriod with v2 and JSON output. I believe this was previously operational.
For replication, I am copying the URLs built with the "try it" documentation interface (https://api.openaq.org/docs). Each is designed to output one (the same) measurement. Both request averagingPeriod, but only v1 receives it.
{
"meta": {
"name": "openaq-api",
"license": "CC BY 4.0d",
"website": "api.openaq.org",
"page": 1,
"limit": 1,
"found": 73204
},
"results": [
{
"location": "St. Maries",
"parameter": "pm25",
"value": 10,
"date": {
"utc": "2022-10-03T16:00:00Z",
"local": "2022-10-03T09:00:00-07:00"
},
"unit": "µg/m³",
"coordinates": {
"latitude": 47.3167,
"longitude": -116.570297
},
"country": "US",
"city": "BENEWAH",
"averagingPeriod": {
"unit": "seconds",
"value": 3600
}
}
]
}
{
"meta": {
"name": "openaq-api",
"license": "CC BY 4.0d",
"website": "api.openaq.org",
"page": 1,
"limit": 1,
"found": 73204
},
"results": [
{
"locationId": 1774,
"location": "St. Maries",
"parameter": "pm25",
"value": 10,
"date": {
"utc": "2022-10-03T16:00:00+00:00",
"local": "2022-10-03T09:00:00-07:00"
},
"unit": "µg/m³",
"coordinates": {
"latitude": 47.3167,
"longitude": -116.570297
},
"country": "US",
"city": "BENEWAH",
"isMobile": false,
"isAnalysis": false,
"entity": "government",
"sensorType": "reference grade"
}
]
}
Thanks for any advice.
Pydantic is adding a new major version with what sounds like some nice performance enhancements:
https://pydantic-docs.helpmanual.io/blog/pydantic-v2/
Once 2.0 is available, let's update.
When I was trying to hit this API:
https://api.openaq.org/v1/measurements?country=IN&include_fields=sourceName&format=csv
Hello @sruti,
In the Measurements GET API, `include_fields` is not shown as expected when we select format equal to CSV. Without format, the API works fine.
Thank You
Heet Bhimani
India
I was wondering about mentioning what values should be substituted for each environment variable when setting the project up locally, and how to create or find them, as well as whether any other repo is needed to set up openaq-api-v2 locally. It would help beginners.
update the v1 and v2 locations endpoint queries to work with the new v3 database schema
Some errors are raising an HTTP 500 when they should raise a 422 for incorrect query parameters. This seems to originate from raising a ValueError, which then turns into a ValidationError rather than a RequestValidationError.
See some discussion about this issue:
and possible solutions:
Update the /v2/summary endpoint query to work with the new v3 database schema.