
openeo-r-client's Introduction

openeo: Client Interface for openEO Servers


openEO Background

The amount of available Earth Observation data (EO data) is steadily increasing due to space missions such as Landsat and Sentinel. The resulting data products are often too large to be processed locally and therefore require new processing tools and functionalities. The core concept of "openEO" is related to big data processing strategies: "openEO" defines a unified API for back-end and client software, as well as a number of common processes for manipulating spatio-temporal data cubes. The basic idea is to separate computation (back-end server) from workflow definition (client software). While some back-ends were developed in the original openEO project, others are currently being improved or developed within the ESA project openEO Platform. These back-ends offer access to their data collections and processing platforms, while the client software (e.g., R, Python, JavaScript, QGIS) helps create processing workflows in a programming environment familiar to the user.

openEO client in R

This R package contains functions and classes that allow interaction with an openEO back-end server. The main goals of this package are to:

  • enable R users to explore openEO back-ends, their available data and implemented operations.
  • help R users create processing workflows on EO data that can be executed on openEO back-ends.
  • retrieve results for further analysis in R.

Installation

The most recent code is located on GitHub. To install it, you can use the following code:

# install the development version from GitHub
if (!require(devtools)) {
  install.packages("devtools", dependencies = TRUE)
  library(devtools)
}
install_github(repo = "Open-EO/openeo-r-client", dependencies = TRUE)
library(openeo)

Alternatively, you can install the latest stable version from CRAN:

install.packages("openeo")
library(openeo)

If you want to use a different package version, we recommend using the parameter ref. Set this parameter to "master", "develop" or another version specified in the releases.
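As a sketch, installing a specific branch or release tag via the ref parameter might look as follows (this assumes the "develop" branch exists in the GitHub repository at the time of installation):

```r
# install a specific branch or tag via `ref`
# (assumes the "develop" branch currently exists on GitHub)
devtools::install_github(repo = "Open-EO/openeo-r-client",
                         ref = "develop",
                         dependencies = TRUE)
```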

Currently, the package complies with the major openEO API version 1.1.x. It is also possible to manually install older versions that comply with API version 0.4.2. This is not recommended, since most - if not all - back-ends no longer support that version. The old versions are listed here for historical reasons. Starting with the stable API version 1.0.0, the package will remain backward compatible within semantic versioning.

openeo R client version   openEO API version   openEO API status
v1.3.x                    v1.1.x               stable
v1.2.x                    v1.1.x               stable
v1.1.x                    v1.1.x               stable
v1.0.x                    v1.0.x               stable
v0.6.x                    v0.4.2               deprecated
v0.5.x                    v0.4.2               deprecated
v0.4.x                    v0.4.2               deprecated

Requirements

The 'openeo' package won't process anything on the local machine; it always interacts with a designated back-end. Data storage and computations are performed directly on the back-end. Therefore, please make sure that you are registered with one of the available openEO back-ends in order to obtain credentials and access URLs (see the openEO Hub for an overview of available back-ends).

Getting Started

After installing and loading the package, you need to connect to the openEO back-end you want to use. The object returned by the connect function is essential for the interaction with this particular back-end. Afterwards, users can explore the data and processes and start creating processing workflows, free of charge. To start processing data or publishing web services, however, the user needs to be registered and authenticated with the openEO back-end provider. The provider offers different execution plans from which the user can choose. These may include free-of-charge plans or other pricing concepts.
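The flow described above might be sketched as follows; the host URL is a placeholder that must be replaced with the URL of an actual back-end, and the exact login mechanism (OIDC, basic) depends on the provider:

```r
library(openeo)

# connect anonymously to explore the back-end (placeholder URL)
con = connect(host = "https://openeo.example.org")

# exploration is possible without credentials
colls = list_collections()
procs = list_processes()

# authentication is only required for actual processing;
# without arguments the provider's default login mechanism is used
login()
```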

Exemplary back-end providers are:

The Google Earth Engine (GEE) interface for openEO is not actively maintained. Credentials for access and testing are included in the demo section of the openEO GEE GitHub repository. Please bear in mind that access is free, but Google might revoke the rights if the processing load is too high. Use it only for experimenting with the different openEO clients, not for production purposes. "openeo.cloud" is the link to ESA's "openEO Platform" project, for which you have to be signed up via EGI and openEO Platform.

Examples

The following code sample shows how to create a processing workflow that calculates the minimum NDVI of a spatial and temporal subset of Sentinel-2 data and performs a linear scaling to store the results as a PNG file.

library(openeo)
connect(host="https://earthengine.openeo.org")

# list collection and processes
colls = list_collections()
list_processes()

# get detailed descriptions
describe_collection("COPERNICUS/S2")
describe_process("load_collection")

# create a process graph / task
p = processes()

data = p$load_collection(id = colls$`COPERNICUS/S2`,
                         spatial_extent = list(
                           west = 16.1,
                           east = 16.6,
                           north = 48.6,
                           south = 47.2
                         ),
                         temporal_extent = list(
                           "2018-04-01", "2018-05-01"
                         ),
                         bands = list("B8", "B4"))

spectral_reduce = p$reduce_dimension(data = data, dimension = "bands",reducer = function(data,context) {
  B08 = data[1]
  B04 = data[2]
  return((B08-B04)/(B08+B04))
})

temporal_reduce = p$reduce_dimension(data=spectral_reduce,dimension = "t", reducer = function(x,y){
  p$min(x)
})

apply_linear_transform = p$apply(data=temporal_reduce,process = function(value,...) {
  p$linear_scale_range(x = value, 
                           inputMin = -1, 
                           inputMax = 1, 
                           outputMin = 0, 
                           outputMax = 255)
})

result = p$save_result(data=apply_linear_transform,format="PNG")

At this point (at the latest) you need to log in with your personal credentials for further computations.

login(user="",password="")
job_id = create_job(graph=result, title="Example graph", description="This graph is just a general example",format="png")

start_job(job_id)

result_obj = list_results(job_id)

download_results(job = job_id, folder = ".")
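For small workflows, the package also offers synchronous execution as an alternative to the batch job shown above. A sketch, reusing the workflow object result from the example (the output file name is arbitrary):

```r
# synchronous alternative to the batch job above: the back-end
# processes the workflow immediately and the result is stored locally
file = compute_result(graph = result, output_file = "ndvi_min.png")
```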

To get an overview of the functions offered by the package and to access the function documentation, you can navigate to the "Packages" tab in RStudio, select the "openeo" package and click on the function you are interested in. Another option is to use the following commands:

library(help="openeo")

# ?<function_name>, e.g.
?connect

If you are interested in more information, have a look at some example scripts that were created during the proof-of-concept phase to get a feeling for how to use the package. Some of the scripts are outdated and will be replaced in the future.

Contributions

The authors acknowledge the financial support for the development of this package during the H2020 project "openEO" (Oct 2017 to Sept 2020) by the European Union, funded by call EO-2-2017: EO Big Data Shift, under grant number 776242. We also acknowledge the financial support received from ESA for the project "R4openEO" (Sept 2021 to Sept 2022).

This package received major contributions from the following organizations:

  • EFTAS
  • WWU Münster
  • Wageningen University

openeo-r-client's People

Contributors

edzer, flahn, greatemerald, m-mohr, olivroy, przell


openeo-r-client's Issues

run_udf: path to udf code file fails, setwd() works

Just a guess, but does the file exist in your working directory (list.files())? If you can't find the script there, then maybe adapt your working directory (setwd()), use an absolute path or add the folders :)

But it is more likely that I forgot the standard return of a String argument... I will fix it quickly, you can check the latest develop version tomorrow.

Originally posted by @flahn in Open-EO/openeo-usecases#1 (comment)

Parameter ordering

At the moment there has been no real use case that required a specific parameter order. But there are some use cases where a specific order might be useful:

  1. when creating ProcessNodes using the process graph builder functions
  2. in cases where a back-end really needs the values ordered and cannot make use of named parameters

References @m-mohr in #43 (comment)

Job workflow handling

Right now all job-related functions seem to take the job ID. While that can be extracted from listJobs(), it's not something you want users to be doing manually. So maybe have listJobs() return an actual list of Job objects that can then be passed to the other functions instead of the job ID?
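The proposal above might be sketched as follows; all function names here (listJobs, describeJob, downloadJob) and the idea that they accept Job objects directly are hypothetical, describing the suggested interface rather than the implemented one:

```r
# hypothetical sketch of the proposed interface:
# listJobs() returns Job objects rather than raw IDs
jobs = listJobs(con)
job = jobs[[1]]

# other functions would then accept the Job object directly
describeJob(job)
downloadJob(job, folder = ".")
```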

Connect to latest version

Migrated from Open-EO/openeo.org#18:

I should be able to use connect(host = "https://earthengine.openeo.org") to connect to GEE according to the API spec:

By default, a client SHOULD connect to the most recent production-ready version it supports. If not available, the most recent supported version of all versions SHOULD be connected to.

`describe` requires specifying whether it is a product or a process

This is a small discussion point: at the moment, the describe function requires the user to state via arguments whether they want a description of a product or a process. However, this is not very convenient, because one always needs to pass product_id when describing products (which is not needed when describing processes). And while it's neat to be able to describe several things at a time, is that functionality really useful?

Perhaps it would be better to have something like separate describeProcess and describeProduct functions to make this more uniform.

Load and require raster package

The client should load and require the raster package to be able to download and return a Raster* object when executing a synchronous job execution.
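A minimal sketch of the intended behaviour, assuming the synchronous result has been saved as a GeoTIFF file (the file name is illustrative):

```r
# sketch: load a downloaded result into a Raster* object,
# guarding against the raster package not being installed
if (requireNamespace("raster", quietly = TRUE)) {
  r = raster::raster("result.tif")  # hypothetical result file
  raster::plot(r)
}
```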

Give users the full process instead of just the graph

In general, but especially also in the QGIS plugin and the Web Editor, openEO processes in JSON need to be fully defined including their potential metadata (see example 1), i.e. it is not enough to just list the graph nodes (see example 2). Unfortunately, the R client and Python client export the graphs without their metadata. This reportedly leads to some confusion when users try to copy/paste the "incomplete" processes into the Web Editor, QGIS Plugin or the Hub, for example, which then can't handle them properly. Therefore, I'd really like to see the R and Python clients print out the full process (see example 1).

Example 1:

{
  "process_graph": {
    "1": {
      "process_id": "add",
      "arguments": {
        "x": 1,
        "y": 2
      },
      "result": true
    }
  }
}

Example 2:

{
  "1": {
    "process_id": "add",
    "arguments": {
      "x": 1,
      "y": 2
    },
    "result": true
  }
}

eurac uc1 example for R broken

Version mismatches to tidyverse packages? Can one simply drop tidyverse and proceed without having to jump through their ever-moving hoops?

> library(openeo)
Loading required package: magrittr
Loading required package: jsonlite
Loading required package: gdalUtils
Loading required package: raster
Loading required package: sp

Attaching package: ‘raster’

The following object is masked from ‘package:magrittr’:

    extract

Loading required package: rgdal
rgdal: version: 1.3-7, (SVN revision 777)
 Geospatial Data Abstraction Library extensions to R successfully loaded
 Loaded GDAL runtime: GDAL 2.4.0, released 2018/12/14
 Path to GDAL shared files: /usr/local/share/gdal
 GDAL binary built with GEOS: TRUE 
 Loaded PROJ.4 runtime: Rel. 5.2.0, September 15th, 2018, [PJ_VERSION: 520]
 Path to PROJ.4 shared files: (autodetected)
 Linking to sp version: 1.3-1 
Loading required package: tibble
Loading required package: future

Attaching package: ‘future’

The following object is masked from ‘package:raster’:

    values

Loading required package: lubridate

Attaching package: ‘lubridate’

The following object is masked from ‘package:base’:

    date

Loading required package: dplyr

Attaching package: ‘dplyr’

The following objects are masked from ‘package:lubridate’:

    intersect, setdiff, union

The following objects are masked from ‘package:raster’:

    intersect, select, union

The following objects are masked from ‘package:stats’:

    filter, lag

The following objects are masked from ‘package:base’:

    intersect, setdiff, setequal, union

Loading required package: Rcpp
Loading required package: rlang

Attaching package: ‘rlang’

The following objects are masked from ‘package:jsonlite’:

    flatten, unbox

The following object is masked from ‘package:magrittr’:

    set_names

> 
> euracHost = "http://saocompute.eurac.edu/openEO_0_3_0/openeo/"
> 
> eurac = connect(host = euracHost,disable_auth = TRUE)
Registered host
> eurac %>% listProcesses()
Error: All columns in a tibble must be 1d or 2d objects:
* Column `process_id` is NULL
* Column `description` is NULL
Call `rlang::last_error()` to see a backtrace
> rlang::last_error()
<error>
message: All columns in a tibble must be 1d or 2d objects:
* Column `process_id` is NULL
* Column `description` is NULL
class:   `rlang_error`
backtrace:
  1. openeo::listProcesses(.)
  9. con$listProcesses()
 10. tibble::add_row(., process_id = process$process_id, description = process$description)
 11. tibble::tibble(...)
 19. tibble:::lst_to_tibble(xlq$output, .rows, .name_repair, lengths = xlq$lengths)
 20. tibble:::check_valid_cols(x)
 21. openeo::listProcesses(.)
 22. con$listProcesses()
Call `rlang::last_trace()` to see the full backtrace
> sessionInfo()
R version 3.5.2 (2018-12-20)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Fedora 29 (Workstation Edition)

Matrix products: default
BLAS: /home/rsb/topics/R/R352-share/lib64/R/lib/libRblas.so
LAPACK: /home/rsb/topics/R/R352-share/lib64/R/lib/libRlapack.so

locale:
 [1] LC_CTYPE=en_GB.UTF-8       LC_NUMERIC=C               LC_TIME=en_GB.UTF-8       
 [4] LC_COLLATE=en_GB.UTF-8     LC_MONETARY=en_GB.UTF-8    LC_MESSAGES=en_GB.UTF-8   
 [7] LC_PAPER=en_GB.UTF-8       LC_NAME=C                  LC_ADDRESS=C              
[10] LC_TELEPHONE=C             LC_MEASUREMENT=en_GB.UTF-8 LC_IDENTIFICATION=C       

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
 [1] openeo_0.2.2       rlang_0.3.1        Rcpp_1.0.0         dplyr_0.8.0       
 [5] lubridate_1.7.4    future_1.11.1.1    tibble_2.0.1       rgdal_1.3-7       
 [9] raster_2.8-19      sp_1.3-1           gdalUtils_2.0.1.14 jsonlite_1.6      
[13] magrittr_1.5      

loaded via a namespace (and not attached):
 [1] rstudioapi_0.9.0  tidyselect_0.2.5  lattice_0.20-38   R6_2.4.0         
 [5] foreach_1.4.4     httr_1.4.0        stringr_1.4.0     globals_0.12.4   
 [9] tools_3.5.2       parallel_3.5.2    grid_3.5.2        R.oo_1.22.0      
[13] iterators_1.0.10  assertthat_0.2.0  yaml_2.2.0        digest_0.6.18    
[17] crayon_1.3.4      purrr_0.3.0       R.utils_2.8.0     codetools_0.2-16 
[21] curl_3.3          glue_1.3.0        stringi_1.3.1     compiler_3.5.2   
[25] pillar_1.3.1      R.methodsS3_1.7.1 listenv_0.7.0     pkgconfig_2.0.2  

EURAC uc1 example for R Client is giving an error

In the example code "eurac-uc1-example.R", the line "job_id = eurac %>% defineJob(task=task,format="tiff")" gives an error related to a JSONException. I also tried to run it using format=GTiff, but got the same error.

create_job: missing $location header in response

Hi Florian,
I have experienced an issue when using the function create_job:

job_id = conn %>% create_job(graph=graph, 
                             title="ndvi", 
                             description="Example ndvi", 
                             format="png")

Job was sucessfully registered on the backend.
non-character argument

I looked into the function and found that the variable response does not include the header $location. Instead, the header $openeo-identifier contains an ID. Using this ID the job could be created and executed. What I did to fix it manually was to replace:

locationHeader = headers(response)$location

by

locationHeader = headers(response)$`openeo-identifier`

Best,
Peter

Performance of describeCollection across OSs

There are some odd issues with the performance of the describeCollection function. On Linux, with rgdal installed, it is instantaneous. On Windows without rgdal installed, it is extremely slow. On Windows with rgdal installed, it is also instantaneous. And on macOS with rgdal installed, it is slow (not quite as slow as on Windows, seemingly, but still uncomfortably slow).

What could be causing that and is there anything that could be done about it?

Extent Viewer for collections

Another convenience function would be to have a function like view_extent(collection_name). It could take the coordinates from the metadata and create a mapview for example. Would be helpful for researchers when prototyping so that they don't select regions outside of the bounding box of the collection. This would reduce the possible error source of going out of extent.
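The proposed helper might be sketched as follows; view_extent() itself is hypothetical, and the structure of the collection metadata (extent$spatial$bbox as xmin, ymin, xmax, ymax in WGS84) is an assumption:

```r
# hypothetical sketch of view_extent(); the metadata layout
# (meta$extent$spatial$bbox) is an assumption, not the package API
view_extent = function(collection_name) {
  meta = describe_collection(collection_name)
  bbox = meta$extent$spatial$bbox[[1]]  # assumed: xmin, ymin, xmax, ymax

  # build a polygon from the bounding box and show it on a web map
  poly = sf::st_as_sfc(sf::st_bbox(c(xmin = bbox[1], ymin = bbox[2],
                                     xmax = bbox[3], ymax = bbox[4]),
                                   crs = 4326))
  mapview::mapview(poly)
}
```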

Error: Cannot access data endpoint

I'm trying to update the Jupyter Notebook example to API v0.0.2, and somehow it seems that all communications with the local R backend fail with this error:

Error in private$GET(endpoint, type = "application/json"): Cannot access data endpoint
Traceback:

1. listProcesses(conn)
2. .listToDataFrame(con$listProcesses())
3. con$listProcesses()
4. private$GET(endpoint, type = "application/json")
5. stop("Cannot access data endpoint")

I'm not sure if this is an issue with the client or with the server. connect() seems to succeed, though:

Registered 'http://localhost:8000/api' as host
Login successful.

And I can access the __swagger__ page through the browser fine, too.

I'm using the devel branch versions of both packages.

callback function does not recognize nested callback arguments in anyOf

The code below produces the error ("No callbacks found in the parameters of the stated process"), possibly related to the parsing of the anyOf field in the reduce process description:

library(magrittr)
library(openeo)
library(tibble)

conn = connect(host="https://openeo.eurac.edu",user="aceo",password="aceo_123",login_type="basic")
graph = conn %>% process_graph_builder()

data1 = graph$load_collection(id = graph$data$openEO_S2_32632_10m_L2A, spatial_extent = list(west = 11.2792, south = 46.4643, east = 11.4072, north = 46.5182), temporal_extent = c("2018-06-04T00:00:00Z","2018-06-23T00:00:00Z"))
ndvi = graph$ndvi()
reducer = graph$reduce(data = ndvi, dimension = "temporal")
cb1_graph = conn %>% callback(reducer, parameter = "reducer")

@przell

Support for authentication-less connections

The connect function should make the user and password parameters optional, in case the endpoint does not support authentication. This seems to be already supported in the R6 class.

Error during collection creation with a different id_name

When you create a collection for a process graph with a customized id name, an error occurs:

 Error in process(., "filter_daterange", prior.name = "imagery", from = "2017-04-01",  : 
  Chain corrupted. prior elemente is neither a process or a collection 

It means that a specific keyword for process or collection cannot be found. I will change this implementation to use an attribute, like the following, to allow independent type checks.

attr(x, "type") <- "process"
attr(y, "type") <- "collection"
attr(z, "type") <- "udf"
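A minimal sketch of how such attribute-based type checks could look (the helper names is_process/is_collection are illustrative):

```r
# minimal sketch of type checks based on the "type" attribute
is_process    = function(x) identical(attr(x, "type"), "process")
is_collection = function(x) identical(attr(x, "type"), "collection")

x = list()
attr(x, "type") = "process"
is_process(x)     # TRUE
is_collection(x)  # FALSE
```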

Update Examples

It seems like the examples in the README (and in the examples folder) are for older versions of the R client.
E.g. filter_daterange still has the arguments "from" and "to", but if I call it, it only takes extent and imagery as arguments.

oidc login on eurac not working

Hi Florian,
we have implemented open id connect authentication for our backend. It works on the webeditor.
I am trying to login via the r-client (v1.0.0) but I get a "Login failed.". I have written you an e-mail containing the conf information.
Here's what I have tried so far:
Login via connect() function:

euracHost = "https://openeo.eurac.edu"
conf = list(client_id = "xxx", secret = "xxx")

eurac = connect(host = euracHost) # connect to get oidc providers
prov = list_oidc_providers()
prov$Eurac_EDP_Keycloak

eurac = connect(host = euracHost, 
                            login_type = "OIDC", 
                            provider = prov$Eurac_EDP_Keycloak, # "Eurac_EDP_Keycloak", 
                            config = conf)

I get:

Connected to service:  https://openeo.eurac.edu 
Please check the terms of service (terms_of_service()) and the privacy policy (privacy_policy()). By further usage of this service, you acknowledge and agree to those terms and policies.
Warning: Connected host is considered unstable and not production-ready. Unexpected errors might occur.
Login failed.

Login via login():

euracHost = "https://openeo.eurac.edu"
conf = list(client_id = "xxx", secret = "xxx")

eurac = connect(host = euracHost) # connect to get oidc providers
prov = list_oidc_providers()
prov$Eurac_EDP_Keycloak

login(login_type = "oidc", provider = prov$Eurac_EDP_Keycloak, config = conf, con = eurac)

I get (I looked into the code authentication.R, there I see that the error "Login failed." is only thrown by the object BasicAuth, so I don't know where it goes wrong):

Login failed.

The provider object looks like this:

> prov$Eurac_EDP_Keycloak
$id
[1] "Eurac_EDP_Keycloak"

$issuer
[1] "https://edp-portal.eurac.edu/auth/realms/edp"

$scopes
NULL

$title
[1] "Eurac EDP Keycloak"

$description
[1] "Keycloak server linking to the eurac active directory. This service can be used with Eurac and general MS accounts"

$links
$links[[1]]
$links[[1]]$href
[1] "https://edp-portal.eurac.edu/auth"



attr(,"class")
[1] "Provider"

I can't identify the problem :(
I can't identify the problem :(
When turning on debug(), connecting no longer works like this: eurac = connect(host = euracHost). It gives "missing value where TRUE/FALSE needed" and "Invalid openEO host stated. Please use an URL pointing to a valid openEO webservice implementation.". When turning on debug() after the connection is established and trying login(login_type = "oidc", provider = prov$Eurac_EDP_Keycloak, config = conf, con = eurac), I still only get the "Login failed." error.

Have you connected successfully to any backend using OIDC yet?

All the best,
Peter

reducer callback from_argument: x or data

Hi Florian,
I just tried to do a temporal reduce with a mean as the reducer.
In Reducer -> callback -> data -> from_argument, the r-client inserts "x", whereas the webeditor inserts "data" by default. @prateekbudhwar also checked the python client and it gives "data" as well.
Here is a snippet from the r code I used:

vv_mean = p$reduce(data = vv, dimension = "temporal", reducer = p$mean)

Here is the process graph produced by the r-client:

  "reduce_TWBUX5887C": {
    "process_id": "reduce",
    "arguments": {
      "data": {
        "from_node": "filter_bands_GMGTA1945Y"
      },
      "reducer": {
        "callback": {
          "mean_OASAU8940N": {
            "result": true,
            "process_id": "mean",
            "arguments": {
              "data": {
                "from_argument": "x"
              },
              "ignore_nodata": {
                "from_argument": "y"
              }
            }
          }
        }
      },
      "dimension": "temporal"
    }
  }

And the one from the webeditor:

  "2": {
    "process_id": "reduce",
    "arguments": {
      "data": {
        "from_node": "3"
      },
      "reducer": {
        "callback": {
          "2": {
            "result": true,
            "process_id": "mean",
            "arguments": {
              "data": {
                "from_argument": "data"
              }
            }
          }
        }
      },
      "dimension": "temporal"
    }
  }

Changing the from_argument to "data" in the graph produced by the r-client makes it work in the webeditor!
The r-client also inserts the ignore_nodata part. This doesn't break the graph, but it is not present in the webeditor's graph. Is this intended?
I'm using openeo 0.6.1 (dev)

@prateekbudhwar thanks for checking up on this!

Implement queueTask() and related functions

Some endpoints currently only support async calls to the /jobs endpoint (R backend included). There should be support for sending those jobs via the queueTask() function. cancelJob() would also be useful to help those backends clear up job descriptors. And then orderResult() is also just a variant of queueTask() that requests a download (link?).

Can't connect if server doesn't add production flag to well-known discovery (and tibble is not installed?)

I recently was made aware of an issue with the R client:

(screenshot of the error omitted)

I was able to track it down; the difference is:

  1. the user had not installed tibble, and
  2. the server did not have the production flag in GET /.well-known/openeo

So I think the issue is in or around this line:

return(table[c("api_version", "production", "url")])

I'm not sure how exactly data frames work, but it seems that this code expects a production flag and throws an error if there's no such flag available. The flag is optional in the API, so this seems to be a bug in the R client.

Edit: Feedback from the user indicates that the issue may also exist for tibbles.
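A defensive fix, sketched under the assumption that the quoted line lives in a function where table is a data frame built from the well-known document, might look like this:

```r
# sketch of a defensive fix inside the client function:
# default the optional "production" flag before subsetting,
# so the column selection cannot fail when the flag is absent
if (!"production" %in% names(table)) {
  table$production = NA
}
return(table[c("api_version", "production", "url")])
```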

README Instructions

In the example of the README, there is still "describeProduct" instead of "describeCollection" (at least in the v0.3.0 branch).
Also, the function downloadJob does not exist; I am not quite sure what it is called now.

Rework the process execution behaviour

With API v0.0.2 there are some major changes in process management. POST /jobs no longer has an evaluate parameter. The former sync call is now POST /execute, batch is now called with PATCH /jobs/{job_id}/queue, and lazy is done via POST /services.

The new 'batch' and 'lazy' calls require an already uploaded process graph and optional output parameters. This is now done by POST /jobs.

These changes require some renaming and new functions in the R client. I suggest that job creation (without determining whether to run 'batch' or 'lazy') is called createJob. Then, to invoke 'lazy' evaluation, we call the function toService, and queueJob will match the similarly named call on the back-end for 'batch' evaluation.
This means the following functions will be removed: queueTask, orderResult.

The naming can be changed and I'm open to suggestions.

connect: Version parameter is confusing

From D34:

The code for connecting to the backend is straightforward by providing username and password. The version parameter is unclear, but likely useful to assure the compatibility of the code for different back-ends.

Users seem to not understand the version parameter in the connect method.

  1. It should be clarified that this is the openEO API version implemented by the back-end, which usually doesn't need to be specified unless the user knows what they're doing.
  2. Also, it's not clear where to get the version number from, a reference to api_versions() should be added.
  3. The parameter should be removed from examples that don't explicitly document the parameter itself.
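Point 2 above might be illustrated with a sketch like the following; the host URL is a placeholder, and the exact signature of api_versions() may differ between package versions:

```r
# sketch: inspect which openEO API versions a back-end offers
# (via its well-known document) before deciding on a version;
# placeholder URL, and the api_versions() signature is an assumption
api_versions(url = "https://openeo.example.org")
```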

Connect to highest available api_version as default

con = connect(host = gee_host_url, version="0.4.2", user = user, password = pwd, login_type = "basic")

Why do I have to set the version? 0.4.2 is the "default" for GEE and should be discovered through the well known discovery mechanism.

Yes, you are right. I have to look at this.

I'm not sure, but I couldn't find anything about a default implementation - neither in the 0.4.2 nor the 1.0.0 API. There is something about "production" readiness, but apart from that there is nothing.

In the R client this is currently handled in the following ways:

  1. if you pass a direct link (where capabilities are called) as host you don't need a version
  2. if you have the link to the well-known doc you may pass a version, then the direct link is read from the document
  3. you don't pass a version in 2., then you will get an overview printed about the available versions

Originally posted by @flahn in #43 (comment)

help: i can't run executeTask

I'm a postdoc in atmospheric sciences at the University of São Paulo. I was trying to run the examples. I can do everything fine, but I'm getting errors in executeTask:

library(openeo)
conn = connect(host="http://saocompute.eurac.edu/openEO_WCPS_Driver/openeo",
               disable_auth = T, rbackend = TRUE)
conn$is_rserver = TRUE
conn %>% listCapabilities()
str(conn %>% listFormats())
conn %>% listCollections() # s2a_prd_msil1c
str(describeCollection(con = conn,
 collection_id = "s1a_t117_epsg3035_20m_VV")
)
conn %>% listProcesses()
str(conn %>% describeProcess("filter_daterange"))
str(conn %>% describeProcess("max_time"))
str(conn %>% describeProcess("filter_bbox"))
str(conn %>% describeProcess("NDVI"))
str(conn %>% describeProcess("min_time"))


task = collection("s1a_t117_epsg3035_20m_VV") %>%
  process("filter_bbox", prior.name = "imagery",
          left = 652000, right = 672000, 
          bottom = 5181000, top = 5161000,
          srs = "EPSG:32632") %>%
  process("filter_daterange", prior.name = "imagery",
          from = "2017-01-01", to = "2017-01-31") %>%
  process("NDVI", prior.name = "imagery", red = "B04", nir = "B8A") %>%
  process("min_time", prior.name = "imagery")

taskToJSON(task)
job_id = conn %>% defineJob(task)
No encoding supplied: defaulting to UTF-8.
Error in private$POST(endpoint = endpoint, authorized = TRUE, data = job) : 
  An error occured while performing an SQL-query: Unable to run insert stmt on object class JobFull {
    jobId: 1dba775d-533c-4ee7-817e-e877655ea289
    status: submitted
    processGraph: {process_id=min_time, args={imagery={process_id=NDVI, args={imagery={process_id=filter_daterange, args={imagery={process_id=filter_bbox, args={imagery={collection_id=s1a_t117_epsg3035_20m_VV}, left=652000, right=672000, bottom=5181000, top=5161000, srs=EPSG:32632}}, from=2017-01-01, to=2017-01-31}}, red=B04, nir=B8A}}}}
    output: {format={}}
    submitted: 18 Apr 2018 13:20:48 GMT
    updated: null
    userId: null
    consumedCredits: null
}: INSERT INTO `jobs` (`jobId` ,`status` ,`processGraph` ,`output` ,`submitted` ,`updated` ,`userId` ,`consumedCredits` ) VALUES (?,?,?,?,?,?,?,?)

conn %>% downloadJob(job_id)

file = conn %>% executeTask(task = task,
                            format = "tiff",
                            output_file = "test.tiff")
No encoding supplied: defaulting to UTF-8.
Error in private$POST(endpoint, authorized = TRUE, data = job, encodeType = "json",  : 
  An error occured when retrieving query result from WCPS endpoint: http://10.8.246.64:8080/rasdaman/ows?SERVICE=WCS&VERSION=2.0.1&REQUEST=ProcessCoverages&QUERY=for%20%24c1%20in%20%28%20s1a_t117_epsg3035_20m_VV%20%29%20return%20encode%20%28%20condense%20min%20over%20%24pm%20t%20%28imageCrsDomain%28%24c1%5BDATE%28%222017-01-01%22%3A%222017-01-31%22%29%5D%2CDATE%29%29%20using%20%28%28double%29%24c1.B8A%5BE%28652000%3A672000%29%2CN%285161000%3A5181000%29%2CDATE%28%24pm%29%5D%20-%20%24c1.B04%5BE%28652000%3A672000%29%2CN%285161000%3A5181000%29%2CDATE%28%24pm%29%5D%29%20%2F%20%28%28double%29%24c1.B8A%5BE%28652000%3A672000%29%2CN%285161000%3A5181000%29%2CDATE%28%24pm%29%5D%20%2B%20%24c1.B04%5BE%28652000%3A672000%29%2CN%285161000%3A5181000%29%2CDATE%28%24pm%29%5D%29%2C%20%22tiff%22%20%29

I'm running this in RStudio with the following configuration:

> sessionInfo()
R version 3.4.2 (2017-09-28)
Platform: x86_64-pc-linux-gnu (64-bit)
Running under: Ubuntu 17.10

Matrix products: default
BLAS: /usr/lib/x86_64-linux-gnu/openblas/libblas.so.3
LAPACK: /usr/lib/x86_64-linux-gnu/libopenblasp-r0.2.20.so

locale:
 [1] LC_CTYPE=en_US.UTF-8       LC_NUMERIC=C               LC_TIME=en_US.UTF-8       
 [4] LC_COLLATE=en_US.UTF-8     LC_MONETARY=en_US.UTF-8    LC_MESSAGES=en_US.UTF-8   
 [7] LC_PAPER=en_US.UTF-8       LC_NAME=C                  LC_ADDRESS=C              
[10] LC_TELEPHONE=C             LC_MEASUREMENT=en_US.UTF-8 LC_IDENTIFICATION=C       

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base     

other attached packages:
[1] openeo_0.2.0      rgdal_1.2-18      raster_2.6-7      sp_1.2-7          gdalUtils_2.0.1.7
[6] jsonlite_1.5      magrittr_1.5     

loaded via a namespace (and not attached):
 [1] Rcpp_0.12.16      units_0.5-1       lattice_0.20-35   R6_2.2.2          foreach_1.4.4    
 [6] udunits2_0.13     stringr_1.3.0     httr_1.3.1        tools_3.4.2       grid_3.4.2       
[11] R.oo_1.21.0       e1071_1.6-8       DBI_0.8           iterators_1.0.9   class_7.3-14     
[16] yaml_2.1.18       sf_0.6-2          R.utils_2.6.0     codetools_0.2-15  curl_3.1         
[21] stringi_1.1.6     compiler_3.4.2    R.methodsS3_1.7.1 classInt_0.1-24   lubridate_1.7.4  

Thanks

Better error messages

Sometimes the R client returns error messages that are not very helpful, mostly just reporting an error in the sending function (POST, GET, ...). For example, running the third task of the hackathon without the max_time process leads to:

No encoding supplied: defaulting to UTF-8.
Error in private$POST(endpoint, authorized = TRUE, data = job, encodeType = "json",  : 

With the old API version 0.0.2 it is admittedly not easy to return something meaningful, but for the next version we should make sure that a message like the one above cannot appear. Even with the old version it would be better to state what went wrong in POST, e.g. that the server returned HTTP code 400.
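As a sketch of what a friendlier failure could look like, a helper along these lines (the function name and message format are made up for illustration, not part of the client) could translate the HTTP status and any response body into a readable error:

```r
# Hypothetical helper: turn an HTTP status code and optional response body
# into a human-readable error message instead of a bare POST failure.
describe_http_error <- function(status, body = NULL) {
  msg <- sprintf("Server returned HTTP %d", status)
  if (!is.null(body) && nzchar(body)) {
    msg <- paste0(msg, ": ", body)
  }
  msg
}

describe_http_error(400, "Invalid process graph")
# "Server returned HTTP 400: Invalid process graph"
```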

Fail to connect locally hosted openEO backend - Not connecting due to Self Signed Certificate

Hi Florian,
we have locally hosted an openEO backend on top of rasdaman at EURAC for a big data course.
I try to access it through the openeo-r-client like this (the x's mask the real values):

driver_url = "https://xx.x.xxx.xxx:xxxx/" 
user = "xxx"
password = "xxx"
conn = connect(host = driver_url, 
               user = user, 
               password = password, 
               login_type = "basic")

I get the error:
SSL certificate problem: self signed certificateError in con$connect(url = host, version = version)$login(user = user, :
attempt to apply non-function

When I access directly in the browser I get the warning:
Your connection is not private
NET::ERR_CERT_AUTHORITY_INVALID

But I can choose to Proceed to xx.x.xxx.xxx (unsafe)

Would it be possible to include the Proceed to xx.x.xxx.xxx (unsafe) somehow into the r package?
Best,
Peter
@prateekbudhwar
@aljacob
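Until the back-end gets a proper certificate, one session-level workaround (my own suggestion, not an official feature of the package, and only advisable for trusted test servers) is to relax curl's certificate checks via httr, which the client uses under the hood, before connecting:

```r
library(httr)

# Disable peer/host certificate verification for this R session only.
# This is the programmatic equivalent of "Proceed (unsafe)" in the browser.
set_config(config(ssl_verifypeer = 0L, ssl_verifyhost = 0L))

# ... now call connect(host = driver_url, user = user, password = password,
#                      login_type = "basic") as above ...

# Restore strict verification when done:
reset_config()
```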

newest dev version 0.6.1, built 3.6.1: error `i` must have one dimension, not 2.

Hi Florian,
I have downloaded the newest dev version. It creates some errors that weren't there before.
This is the setup:

devtools::install_github(repo="Open-EO/openeo-r-client",ref="develop",dependencies=TRUE)
library(openeo)
library(dplyr)
openeo_v = as.data.frame(installed.packages()) %>% filter(grepl(pattern = "openeo", x = Package, ignore.case = T))
openeo_v

# connection to openEO Eurac backend 
driver_url = "https://openeo.eurac.edu"
user = ***
password = ***

conn = connect(host = driver_url, 
               user = user, 
               password = password, 
               login_type = "basic")

These queries throw the same error for example:

list_udf_runtimes(conn)
conn %>% openeo::list_collections()

Error:
i must have one dimension, not 2.

Just to let you know.

package version:

  Package                                                    LibPath Version Priority    Depends                                           Imports LinkingTo
1  openeo /home/[email protected]/R/x86_64-pc-linux-gnu-library/3.6   0.6.1     <NA> R (>= 3.3) jsonlite, httr, methods, R6, lubridate, base64enc      <NA>
                 Suggests Enhances      License License_is_FOSS License_restricts_use OS_type MD5sum NeedsCompilation Built
1 tibble, testthat, knitr     <NA> file LICENSE            <NA>                  <NA>    <NA>   <NA>               no 3.6.0

Our Github Action fails to install devtools package

We are working on a workflow to connect to our openEO back-end with the R client. Now we are facing the issue that the devtools package fails to install properly. Please have a look at our Action for further information. During our research we noticed that other people ran into the same problem, but we could not yet find a solution that works for us. Any recommendations are much appreciated!

Github examples don't work

I'm new to the package and trying to test out its functionality. The example provided by the package documentation as well as the example on the main page of the GitHub repository don't work. I installed version 0.4.2 and also tried 0.3.1.

from the documentation:
con = connect(host='http://example.openeo.org',version='0.4.2')

Could not resolve host: example.openeo.org

from the example on the GitHub page
conn = connect(host="http://backend1.openeo.org/",user="test",password="test",login_type="basic")

Could not resolve host: backend1.openeo.orgError in con$connect(url = host, version = version, exchange_token = exchange_token)$login(user = user, :
attempt to apply non-function

I also can't seem to find anything in the GitHub walkthrough on how to get my own 'user' and 'password' credentials for the API. I've been searching to figure it out myself, but no luck. The documentation says (optional), so maybe this isn't necessary anymore?

Making things a bit clearer would help a lot!

Thanks for putting this package together, can't wait to try it out!

Auto-completion for compute_result file extension

Title Auto-completion for compute_result file extension
Date 2021-05-12
Issue #62
Category Visualisation, Reporting
Description Adjusting the correct names and short names for file extensions is tedious and a source of error, since it depends on the back-end how the extension is expressed (e.g., .netcdf vs. .nc). Querying the back-end for the possible extensions and making them available via auto-completion would ease the process significantly.
Dependencies
Links Facilitates local prototyping
Priority Medium
Impact Medium

An idea to increase usability is to auto-complete the file extension according to the format chosen in compute_result. This would be convenient when changing output formats, so the format doesn't have to be specified in three places. I think this is done in the Python client.

#' load a collection
data = p$load_collection(id = ado_colls_names$ADO_NDVI_MODIS_231m_3035, 
                         spatial_extent = bbox,
                         temporal_extent = time_range)

#' check the file formats available. 
list_file_formats()

#' save the result. for a point ts json is suitable.
result = p$save_result(data = data, format="JSON") #1

#' ## compute the result -------------------------------------------------------

#' synchronous (result is computed directly)
compute_result(result,
               format = "JSON", #2
               output_file = "openeo_interactive_point.json",  #3
               con = eurac)
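Until such auto-completion exists, the duplication above can be reduced with a tiny helper; everything here is illustrative (a real implementation would build the lookup table from list_file_formats() instead of hard-coding it):

```r
# Map an output format name to its file extension (illustrative table only;
# the authoritative list should come from list_file_formats()).
ext_for_format <- function(format) {
  lookup <- list(JSON = ".json", GTiff = ".tif", netCDF = ".nc", PNG = ".png")
  ext <- lookup[[format]]
  if (is.null(ext)) stop("No known extension for format: ", format)
  ext
}

fmt <- "JSON"
out <- paste0("openeo_interactive_point", ext_for_format(fmt))
# the format now only needs to be stated once:
# compute_result(result, format = fmt, output_file = out, con = eurac)
```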

CRAN?

I'm updating the homepage and wondering whether there's an official CRAN release? If not, we should consider one for 0.4 or, at the latest, for 1.0.

Update example scripts

We have script and Jupyter examples, but they still target API v0.0.1. They should probably be updated to v0.0.2 (though this also depends on which back-ends support that...).

How should we structure them? The Python client has one notebook/example per use case, which might make things simpler. I guess we could have one per use case, with a different back-end for each to demonstrate that it works across back-ends, plus one for the generic functions for listing processes, collections, etc.?

Also, there's a bit of duplication between the Jupyter and plain-text examples. At the same time, they serve slightly different purposes: the text examples can be run as-is in a regular R session, whereas Jupyter is good for users learning step by step.

Running the R client on computers with restrictions to install software

Public authorities, NGOs, and companies may have restrictions on installing software (e.g., Python, R) on their office computers. A remotely hosted openEO Web Editor is one way for them to use openEO, but is it also possible to run the openEO client from a portable R installation? It would be nice to check this and add a hint to the documentation.

Simplifications?

I'm looking at the GEE example and I'm confused about some things:

library(openeo)
library(magrittr)
library(tibble)
library(jsonlite)

It doesn't seem that I'm directly using any of the non-openeo dependencies. Why do I need to import them myself? Wouldn't it be easier if library(openeo) simply imported everything needed?

con = connect(host = gee_host_url, version="0.4.2", user = user, password = pwd, login_type = "basic")

Why do I have to set the version? 0.4.2 is the default for GEE and should be discovered through the well-known discovery mechanism.

graph = process_graph_builder(con = con)

Could this be: con$process_graph_builder()? Also, can we get rid of the process graph word for the next version?

data = graph$load_collection(graph$data$`COPERNICUS/S2`, ...)

What is this graph$data$`COPERNICUS/S2`? It seems not very intuitive; I'd expect "COPERNICUS/S2" as a string.

spectral_reduce = graph$reduce(data = data, dimension = "bands",reducer = function(x) {
  B08 = x[1]
  B04 = x[2]
  B02 = x[3]
  (2.5 * (B08 - B04)) / sum(B08, 6 * B04, -7.5 * B02, 1)
})

Why can one simply use sum in the callback, but not in the "top-level" code? There it would be graph$sum, I guess? That seems counter-intuitive. On the other hand, the linear transform below uses graph$linear_scale_range and not just linear_scale_range. Confusing.

temporal_reduce = graph$reduce(data = spectral_reduce,dimension = "temporal", reducer = function(x) {
  min(x)
})

Would it be possible to write something like reducer = min(x), or even reducer = min if the parameters match?

  graph$linear_scale_range(x = value, 
                           inputMin = -1, 
                           inputMax = 1, 
                           outputMin = 0, 
                           outputMax = 255)

A general R question: Could this simply be graph$linear_scale_range(value, -1, 1, 0, 255) as an alternative if one likes it more compact?

compute_result(con=con,graph=graph,format = "PNG",output_file = "gee_evi_example.png")

Comparing to the example above with graph$data$COPERNICUS/S2, could I use something like format = graph$output_format$PNG here? (Not saying this is better than simply "PNG".)

Thanks :-)

Arrays in process graph definitions

I'm not sure if this is an issue per se, but currently the R client generates graphs in which each collections argument is a single object, whereas the example "tree"-type process graph uses an array instead. Examples:

Current implementation:

{
  "process_id": "max_time",
  "args": {
    "collections": {
      "process_id": "NDVI",
      "args": {
        "collections": {
          "process_id": "filter_daterange",
          "args": {
            "collections": {
              "process_id": "filter_bbox",
              "args": {
                "collections": {
                  "collection_id": "S2_L2A_T32TPS_20M"
                },
                "left": 652000,
                "right": 672000,
                "top": 5161000,
                "bottom": 5181000,
                "srs": "EPSG:32632"
              }
            },
            "from": "2017-01-01",
            "to": "2017-01-31"
          }
        },
        "red": "B04",
        "nir": "B8A"
      }
    }
  }
} 

"Tree" graph example:

{
  "process_graph":{
    "process_id":"zonal_statistics",
    "args":{
      "collections":[
        {
          "process_id":"filter_daterange",
          "args":{
            "collections":[
              {
                "process_id":"filter_bbox",
                "args":{
                  "collections":[
                    {
                      "process_id":"filter_bands",
                      "args":{
                        "collections":[
                          {
                            "product_id":"Sentinel2-L1C"
                          }
                        ],
                        "bands":8
                      }
                    }
                  ],
                  "left":16.1,
                  "right":16.6,
                  "top":48.6,
                  "bottom":47.2,
                  "srs":"EPSG:4326"
                }
              }
            ],
            "from":"2017-01-01",
            "to":"2017-01-31"
          }
        }
      ],
      "regions":"/users/me/files/",
      "func":"avg",
      "outformat":"GPKG"
    }
  }
}

Note the use of [] after each collections.
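For what it's worth, on the R side the difference is just whether the argument is serialized as an object or as a one-element array; with jsonlite (which the client already depends on), wrapping the node in an extra unnamed list produces the [...] form:

```r
library(jsonlite)

# A named list serializes to a JSON object; wrapping it in an unnamed
# one-element list serializes to a JSON array, as in the "tree" example.
node <- list(collection_id = "S2_L2A_T32TPS_20M")

toJSON(list(collections = node), auto_unbox = TRUE)
# {"collections":{"collection_id":"S2_L2A_T32TPS_20M"}}

toJSON(list(collections = list(node)), auto_unbox = TRUE)
# {"collections":[{"collection_id":"S2_L2A_T32TPS_20M"}]}
```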

Download raster time series from Google Earth Engine

I am trying to retrieve raster time series of NDWI for a given area of interest and time period. My code runs smoothly when using the VITO backend:

library(openeo)
library(raster)

## area of interest
aoi = list(
  west = 5.61
  , east = 5.66
  , south = 51.97
  , north = 51.99
)


### TERRASCOPE ====

#+ connect
vito = connect(
  host = "https://openeo.vito.be"
  , user = "group"
  , password = "group123"
)

#+ process
p1 = processes()

dat1 = p1$load_collection(
  id = "TERRASCOPE_S2_TOC_V2"
  , spatial_extent = aoi
  , temporal_extent = list("2020-04-01", "2020-04-10")
)

spectral_reduce1 = p1$apply(
  dat1
  , function(x, context) {
    p1$normalized_difference(x[8], x[11])
  }
)

r1 = p1$save_result(
  data = spectral_reduce1
  , format = "NetCDF"
)

#+ download
ofl1 = "vito_s2_ndwi.ncdf"
compute_result(
  r1,
  format = "NetCDF",
  output_file = ofl1
)

#+ import
rst1 = brick(ofl1)
dim(rst1)
# 236 (nrow) 336 (ncol) 4 (nlayers)

Unfortunately, my area of interest lies in the United States, for which VITO doesn't seem to provide Sentinel-2 data. At least, I get

SERVER-ERROR: no fitting raster sources found

from compute_result() when specifying a bounding box located in the US.

Therefore, I modified the above code to use Google Earth Engine instead of VITO. Here, the apply()-based approach to get raster time series for the same target extent as above no longer works:

### GOOGLE EARTH ENGINE ====

#+ connect
gee = connect(
  host = "https://earthengine.openeo.org"
  , user = "group1"
  , password = "test123"
)

#+ process_1
p2 = processes()

dat2 = p2$load_collection(
  id = "COPERNICUS/S2"
  , spatial_extent = aoi
  , temporal_extent = list("2020-04-01", "2020-04-10")
)

spectral_reduce2.1 = p2$apply(dat2, function(x, context) {
  p2$normalized_difference(x["B8"], x["B12"])
})

r2.1 = p2$save_result(
  data = spectral_reduce2.1
  , format = "GTIFF-ZIP"
)

#+ download_1
ofl2 = "gee_s2_ndwi.zip"
compute_result(
  r2.1,
  format = "GTIFF-ZIP",
  output_file = ofl2
)

SERVER-ERROR: Server error: Dimension 'undefined' does not exist.

Using a reduce_dimension() based approach instead, the program finishes successfully, but the resulting GeoTIFF doesn't have the expected four layers (i.e., four overpasses) and, in addition, has a different spatial resolution:

#+ process_2
spectral_reduce2.2 = p2$reduce_dimension(
  data = dat2
  , reducer = function(x, context) {
    p2$normalized_difference(x["B8"], x["B12"])
  }
  , dimension = "bands"
)

r2.2 = p2$save_result(
  data = spectral_reduce2.2
  , format = "GTIFF-ZIP"
)

#+ download_2
compute_result(
  r2.2,
  format = "GTIFF-ZIP",
  output_file = ofl2
)

#+ import_2
rst2.2 = raster::brick(
  unzip(
    ofl2
    , exdir = tempdir()
  )
)
dim(rst2.2)
# 401 (nrow) 1000 (ncol) 1 (nlayers)

My question here is two-fold:

  1. Is it possible at all to extract raster time series as shown in the VITO based example from Google Earth Engine? Of course, I could create a daily loop over the desired time period and calculate the NDWI on a day-to-day basis, but this involves a considerable overhead regarding recurrent backend queries.
  2. Where exactly does the higher spatial resolution in the Earth Engine based approach come from? Sentinel-2 bands 8 and 12 have different spatial resolutions of 10 and 20 m, respectively. Does VITO favor the lower resolution, while Earth Engine uses the higher?

Making required packages optional

It would be useful to review the packages that the client requires and see whether they can be made optional (much like the raster package suggests, rather than requires, rgdal). At this point the client is really heavy on dependencies, some of which even require external system libraries (udunits2 for sf, and cairo for another dependency, are the main concerns). If the user doesn't have root access, installing packages with external system dependencies may be outright impossible, so making those requirements optional would help usability a lot.
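The usual mechanism for this in R packaging is to move a heavy package from Imports to Suggests in DESCRIPTION and guard each use at runtime with requireNamespace(). A generic sketch (the helper name is made up for illustration):

```r
# Run code only when an optional package is installed; otherwise fall back.
# Thanks to R's lazy evaluation, 'code' is never evaluated when pkg is absent.
use_if_available <- function(pkg, code, otherwise = NULL) {
  if (requireNamespace(pkg, quietly = TRUE)) code else otherwise
}

# e.g. only touch sf when it is actually present:
# use_if_available("sf", sf::st_as_sf(pts, coords = c("x", "y")),
#                  otherwise = stop("Please install 'sf' for this feature."))
```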

VITO server error during compute_result()

I get

SERVER-ERROR: An error occurred while calling o3247140.datacube_seq.
: java.lang.NullPointerException
	at org.openeo.geotrellissentinelhub.PyramidFactory.datacube_seq(PyramidFactory.scala:100)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:244)
	at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
	at py4j.Gateway.invoke(Gateway.java:282)
	at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:132)
	at py4j.commands.CallCommand.execute(CallCommand.java:79)
	at py4j.GatewayConnection.run(GatewayConnection.java:238)
	at java.lang.Thread.run(Thread.java:748)

upon running VITO-examples.Rmd. The error occurs at compute_result().

This used to work just fine only last week. Other backends (tested with GEE) don't seem to be affected. Any ideas?

Prevent user from seeing files upon save_result

Title Prevent user from seeing files upon save_result
Date 2019-10-30
Issue #39
Category Usability
Description The process of downloading and manually loading results from openEO back-ends could be streamlined by taking this complexity and source of error away from the user, so that they can stay within the R environment for subsequent analysis.
Dependencies Standardized result naming and metadata on openEO backends following the STAC catalogue.
Links Facilitates also local prototyping
Priority High
Impact High

Hi Florian,
during two presentations of the R client, potential users asked for a way to read the results of a process graph directly into an R variable.
We had talked about this before, and with your hints I came up with a very simple use case. It works for the JSON format and a single layer.

This code block is an example request...

# establish the connection
driver_url = "https://openeo.eurac.edu"
user = "guest"
password = "guest_123"

conn = connect(host = driver_url, 
               user = user, 
               password = password, 
               login_type = "basic")

# build process graph
graph = conn %>% process_graph_builder()

# subset options
aoi = list(west = 11.63, south = 46.532, east = 11.631, north = 46.5325)
timespan = c("2016-07-01T00:00:00.000Z", "2016-07-15T00:00:00.000Z")
bands = c("B04", "B08")

# subset
data1 = graph$load_collection(id = graph$data$`SAO_S2_ST_DEM_BRDF_10m_L2A`,     
                              spatial_extent = aoi,
                              temporal_extent = timespan,  
                              bands = bands)

# filter bands
b_red = graph$filter_bands(data = data1,bands = bands[1])
b_nir = graph$filter_bands(data = data1,bands = bands[2])

# calc ndvi
ndvi = graph$normalized_difference(band1 = b_red, band2 = b_nir)

# get maximum value in timespan
reducer = graph$reduce(data = ndvi, dimension = "temporal")
cb_graph = conn %>% callback(reducer, parameter = "reducer", choice_index = 1)
cb_graph$max(data = cb_graph$data$data) %>% cb_graph$setFinalNode()

# set final node of the graph
graph$save_result(data = reducer, format = "json") %>%  # "netcdf" "GTiff"
  graph$setFinalNode()

Here is the part that reads the result directly into R...

tmp = openeo::compute_result(con = conn, graph = graph, format = "json")
tmp_char = rawToChar(tmp)
tmp_json = jsonlite::fromJSON(tmp_char)

Is there a sensible way to generalize this idea so that it works for all formats and also multi-layer objects? Concerning the size of the result, we could implement a limit so that R doesn't run into problems with results that are too large.

Best, Peter
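A possible generalization (purely a sketch: the function name and dispatch table are made up, the binary branches assume the raster package is installed, and real back-ends may report formats under different names) would dispatch on the format and write binary results to a temporary file first:

```r
# Read a raw compute_result() payload straight into an R object.
read_openeo_result <- function(raw, format) {
  switch(toupper(format),
    JSON = jsonlite::fromJSON(rawToChar(raw)),
    GTIFF = {
      f <- tempfile(fileext = ".tif")
      writeBin(raw, f)
      raster::brick(f)
    },
    NETCDF = {
      f <- tempfile(fileext = ".nc")
      writeBin(raw, f)
      raster::brick(f)
    },
    stop("No reader implemented for format: ", format)
  )
}

# tmp = compute_result(con = conn, graph = graph, format = "json")
# result = read_openeo_result(tmp, "json")
```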
