worldbank / blackmarblepy
Georeferenced Rasters and Statistics of Nightlights from NASA Black Marble
Home Page: https://worldbank.github.io/blackmarblepy/
License: Mozilla Public License 2.0
This issue tracks the release of this package on PyPI. We will create a workflow using GitHub Actions.
I ran the code before and it worked, although processing a whole year's worth of data took a long time, so I split the code by month. Now, when I rerun it, I have tried several adjustments (increasing the timeout, reducing the request size to a single day), but it usually runs for hours and then stalls at the same percentage of "PROCESSING TASKS". Is there a way to handle this error? Moreover, file_skip_if_exists doesn't seem to work: I reran the same query hoping it would skip tiles with data already available, but it restarts the whole process instead.
Here is a screenshot from rerunning the same code; the file should have been skipped since it previously had xKB of size, but instead it resets to:
This issue is for adding AllContributors.
Is your feature request related to a problem? Please describe.
It took me some time to notice that the API interface is under "blackmarble package".
Describe the solution you'd like
It might be more intuitive if it were named something like "API Interface".
Describe alternatives you've considered
"Reference"
Additional context
N/A
It'd be great if the documentation included "instructions for developers" on things like:
The current guidance on the contributing page does a good job of covering how contributions should be made. Unfortunately, the hyperlink in the README (this) does not work; since it works in the hosted documentation, I wonder whether changing it in the README would break that link.
This issue is for updating the logo and favicon.
We currently lack comprehensive unit tests. This issue is created to track and coordinate efforts towards implementing unit tests for the package.
Following the success of blackmarbler, both @ramarty and @g4brielvs started Python packages. It would be beneficial to join efforts and maintain blackmarblepy alone. This issue tracks contributions to improve blackmarblepy in terms of readability and compatibility, and to make it more pythonic. We will create PRs to address each issue separately and prepare for the v0.1 release in December.
This issue tracks a proposal to handle errors in a more pythonic way (instead of print statements everywhere).
NTL values need to be scaled by 0.1. This is already implemented in blackmarbler; replicate it for blackmarblepy:
https://github.com/worldbank/blackmarbler/blob/main/R/blackmarbler.R#L61
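The scaling fix amounts to an element-wise multiplication by the scale factor. A minimal sketch, assuming the raw layer is loaded as a numpy array (the array contents here are made up for illustration):

```python
import numpy as np

# Black Marble NTL layers store scaled values; multiplying by the 0.1
# scale factor recovers radiance. Array contents are illustrative only.
SCALE_FACTOR = 0.1

raw = np.array([[120.0, 45.0], [0.0, 7.0]])
scaled = raw * SCALE_FACTOR  # element-wise scaling
```

The same one-line multiplication applies unchanged to an xarray DataArray holding the opened raster.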
The blackmarblepy repository was started by @ramarty. One option is to start from scratch; another is to transfer ownership to the World Bank.
All Contributors is failing
This issue is for adding Google Colab badge and example.
Hello, I would like to ask for help with my research. I already have a visualization of the NTL radiance of a specific city or municipality, but I was not satisfied with the results because the map is zoomed in and the NTL radiance is pixelated. What is the proper solution for this? Thank you.
Hello,
I followed the exact steps from the documentation/README, but running bm_raster or bm_extract always returns the following error:
Skipping XXX-XX-XX due to error. Data may not be available.
Hi, I get the error AttributeError: 'GeoSeries' object has no attribute 'iteritems' when I run the code below, taken from the blackmarblepy documentation.
from gadm import GADMDownloader
from blackmarble.extract import bm_extract
from blackmarble.raster import bm_raster
gdf = GADMDownloader(version="4.0").get_shape_data_by_country_name(
    country_name="GHA", ad_level=1
)
r_202110 = bm_raster(gdf, product_id="VNP46A4", date_range="2023-01-01", bearer="token")
When I install gadm, it installs geopandas==0.9.0 by default, which is the source of the error; reinstalling geopandas==0.10.2 solves the issue.
If you can confirm this as an issue, it may be worth mentioning in the documentation.
data = bm_extract(
    roi=gdf,
    product_id="VNP46A2",
    date_range=pd.date_range("2021-10-02", "2021-11-01", freq="D"),
    bearer=bearer,
)
When downloading data, it provides an ntl_mean value. I have found some additional information about what it means, but I'm not sure. My question is: what is the exact interpretation of ntl_mean?
Thank you in advance,
BlackMarble includes information on cloud free observations that produced NTL data, which is especially relevant for monthly and daily data.
Right now it should be possible to get this information by changing the variable argument to the relevant cloud-cover variable. However, getting NTL and cloud coverage requires two function calls, which could be simplified in bm_extract. For example, an argument could be added to bm_extract that, when set to True, adds a variable for cloud-free observations (e.g., the average number of cloud-free observations across pixels). A user would then get a dataset with, for example, (1) average NTL and (2) cloud-free observations, which would enable findings such as: "oh, NTL is low because of clouds!"
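One way the combined output could look, sketched with pandas. The column names ntl_mean and cf_obs_mean are illustrative assumptions, not the package's actual schema:

```python
import pandas as pd

# Hypothetical results of two separate bm_extract calls: one for NTL and
# one for a cloud-free-observations variable. Column names are made up.
ntl = pd.DataFrame({"date": ["2023-01", "2023-02"], "ntl_mean": [5.1, 0.4]})
cf_obs = pd.DataFrame({"date": ["2023-01", "2023-02"], "cf_obs_mean": [21.0, 2.0]})

# A built-in option could return this merged frame from a single call.
merged = ntl.merge(cf_obs, on="date")
```

A row with low ntl_mean alongside few cloud-free observations (like 2023-02 here) is exactly the "NTL is low because of clouds" signal described above.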
Add a download progress bar to the download function.
This issue tracks the creation of BlackMarbleDownloader for parallel downloading.
It seems that the documentation file is currently unavailable due to a broken link: https://worldbank.github.io/blackmarblepy/examples/blackmarblepy.html. Is there an alternative link or another source where I might find the documentation?
This issue is for refactoring the collating to process in memory.
LAADS DAAC is migrating to AWS, so we should add a module to retrieve data from AWS S3 instead. The package, however, would only work in AWS us-west-2.
Add Documentation based on the Template. The documentation will be published via GitHub Pages.
I'm starting to use the library and trying the blackmarblepy.ipynb tutorial.
When I execute the line corresponding to r_20210205 = bm_raster(), at the end of the file download this error appears: NotADirectoryError: [WinError 267] The directory name is not valid.
The same error occurs for both installation methods (from PyPI and from source).
N/A
N/A
The issue arises when replicating the daily data example.
There are no problems up to gdf.explore(), but the bm_raster() function fails to process the files.
This is the code:
gdf = geopandas.read_file(
    "https://geodata.ucdavis.edu/gadm/gadm4.1/json/gadm41_BHS_1.json.zip"
)
gdf.explore()

bearer = os.getenv(BLACKMARBLE_TOKEN)
r_20210205 = bm_raster(
    gdf, product_id="VNP46A2", date_range="2021-02-05", bearer=BLACKMARBLE_TOKEN
)
The output:
GETTING MANIFEST...: 100%
2/2 [00:00<00:00, 2.77it/s]
QUEUEING TASKS | Downloading...: 100%
2/2 [00:00<00:00, 331.33it/s]
PROCESSING TASKS | Downloading...: 100%
2/2 [00:35<00:00, 15.54s/it]
[2024-05-17 16:14:14 - backoff:105 - INFO] Backing off _download_file(...) for 0.7s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:14 - backoff:105 - INFO] Backing off _download_file(...) for 0.4s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:19 - backoff:105 - INFO] Backing off _download_file(...) for 0.1s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:19 - backoff:105 - INFO] Backing off _download_file(...) for 0.1s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:25 - backoff:105 - INFO] Backing off _download_file(...) for 0.8s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:25 - backoff:105 - INFO] Backing off _download_file(...) for 1.5s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:31 - backoff:105 - INFO] Backing off _download_file(...) for 5.3s (httpx.ReadTimeout: The read operation timed out)
[2024-05-17 16:14:31 - backoff:105 - INFO] Backing off _download_file(...) for 7.0s (httpx.ReadTimeout: The read operation timed out)
COLLECTING RESULTS | Downloading...: 100%
2/2 [00:00<00:00, 399.69it/s]
COLLATING RESULTS | Processing...: 0%
0/1 [00:00<?, ?it/s]
---------------------------------------------------------------------------
CPLE_BaseError Traceback (most recent call last)
File rasterio\\crs.pyx:775, in rasterio.crs.CRS.from_user_input()
File rasterio\\_err.pyx:209, in rasterio._err.exc_wrap_ogrerr()
CPLE_BaseError: OGR Error code 6
During handling of the above exception, another exception occurred:
CRSError Traceback (most recent call last)
Cell In[9], line 4
1 BLACKMARBLE_TOKEN = "[SUPRESSED]"
2 bearer = os.getenv(BLACKMARBLE_TOKEN)
----> 4 r_20210205 = bm_raster(
5 gdf, product_id="VNP46A2", date_range="2021-02-05", bearer=BLACKMARBLE_TOKEN
6 )
File ~\anaconda3\Lib\site-packages\pydantic\validate_call_decorator.py:59, in validate_call.<locals>.validate.<locals>.wrapper_function(*args, **kwargs)
57 @functools.wraps(function)
58 def wrapper_function(*args, **kwargs):
---> 59 return validate_call_wrapper(*args, **kwargs)
File ~\anaconda3\Lib\site-packages\pydantic\_internal\_validate_call.py:81, in ValidateCallWrapper.__call__(self, *args, **kwargs)
80 def __call__(self, *args: Any, **kwargs: Any) -> Any:
---> 81 res = self.__pydantic_validator__.validate_python(pydantic_core.ArgsKwargs(args, kwargs))
82 if self.__return_pydantic_validator__:
83 return self.__return_pydantic_validator__(res)
File ~\anaconda3\Lib\site-packages\blackmarble\raster.py:283, in bm_raster(gdf, product_id, date_range, bearer, variable, quality_flag_rm, check_all_tiles_exist, file_directory, file_prefix, file_skip_if_exists)
279 filenames = _pivot_paths_by_date(pathnames).get(date)
281 try:
282 # Open each GeoTIFF file as a DataArray and store in a list
--> 283 da = [
284 rioxarray.open_rasterio(
285 h5_to_geotiff(
286 f,
287 variable=variable,
288 quality_flag_rm=quality_flag_rm,
289 output_prefix=file_prefix,
290 output_directory=d,
291 ),
292 )
293 for f in filenames
294 ]
295 ds = merge_arrays(da)
296 ds = ds.rio.clip(gdf.geometry.apply(mapping), gdf.crs, drop=True)
File ~\anaconda3\Lib\site-packages\blackmarble\raster.py:285, in <listcomp>(.0)
279 filenames = _pivot_paths_by_date(pathnames).get(date)
281 try:
282 # Open each GeoTIFF file as a DataArray and store in a list
283 da = [
284 rioxarray.open_rasterio(
--> 285 h5_to_geotiff(
286 f,
287 variable=variable,
288 quality_flag_rm=quality_flag_rm,
289 output_prefix=file_prefix,
290 output_directory=d,
291 ),
292 )
293 for f in filenames
294 ]
295 ds = merge_arrays(da)
296 ds = ds.rio.clip(gdf.geometry.apply(mapping), gdf.crs, drop=True)
File ~\anaconda3\Lib\site-packages\blackmarble\raster.py:129, in h5_to_geotiff(f, variable, quality_flag_rm, output_directory, output_prefix)
121 height, width = data.shape
122 transform = from_origin(
123 left,
124 top,
125 (right - left) / width,
126 (top - bottom) / height,
127 )
--> 129 with rasterio.open(
130 output_path,
131 "w",
132 driver="GTiff",
133 height=height,
134 width=width,
135 count=1,
136 dtype=data.dtype,
137 crs="EPSG:4326",
138 transform=transform,
139 ) as dst:
140 dst.write(data, 1)
141 dst.update_tags(**attrs)
File ~\anaconda3\Lib\site-packages\rasterio\env.py:451, in ensure_env_with_credentials.<locals>.wrapper(*args, **kwds)
448 session = DummySession()
450 with env_ctor(session=session):
--> 451 return f(*args, **kwds)
File ~\anaconda3\Lib\site-packages\rasterio\__init__.py:327, in open(fp, mode, driver, width, height, count, crs, transform, dtype, nodata, sharing, **kwargs)
325 writer = get_writer_for_driver(driver)
326 if writer is not None:
--> 327 dataset = writer(
328 path,
329 mode,
330 driver=driver,
331 width=width,
332 height=height,
333 count=count,
334 crs=crs,
335 transform=transform,
336 dtype=dtype,
337 nodata=nodata,
338 sharing=sharing,
339 **kwargs
340 )
341 else:
342 raise DriverCapabilityError(
343 "Writer does not exist for driver: %s" % str(driver)
344 )
File rasterio\\_io.pyx:1563, in rasterio._io.DatasetWriterBase.__init__()
File rasterio\\_io.pyx:1592, in rasterio._io.DatasetWriterBase._set_crs()
File rasterio\\crs.pyx:777, in rasterio.crs.CRS.from_user_input()
CRSError: The WKT could not be parsed. OGR Error code 6
I would appreciate any advice.
When the bm_extract() or bm_raster() methods are used, they cannot generate the data.
Calling the methods in a Jupyter notebook produces the error below. I tried both on my computer and on Google Colab. It looks like an OS error related to h5py.
OSError Traceback (most recent call last)
Cell In[16], line 2
1 # f.close()
----> 2 ntl_r = bm_raster(
3 continental_us,
4 product_id="VNP46A2",
5 date_range="2023-01-01",
6 bearer=bearer,
7 variable="Gap_Filled_DNB_BRDF-Corrected_NTL",
8 )
File /opt/anaconda3/lib/python3.11/site-packages/pydantic/validate_call_decorator.py:60, in validate_call.<locals>.validate.<locals>.wrapper_function(*args, **kwargs)
58 @functools.wraps(function)
59 def wrapper_function(*args, **kwargs):
---> 60 return validate_call_wrapper(*args, **kwargs)
File /opt/anaconda3/lib/python3.11/site-packages/pydantic/_internal/_validate_call.py:96, in ValidateCallWrapper.__call__(self, *args, **kwargs)
95 def __call__(self, *args: Any, **kwargs: Any) -> Any:
---> 96 res = self.__pydantic_validator__.validate_python(pydantic_core.ArgsKwargs(args, kwargs))
97 if self.__return_pydantic_validator__:
98 return self.__return_pydantic_validator__(res)
File /opt/anaconda3/lib/python3.11/site-packages/blackmarble/raster.py:355, in bm_raster(gdf, product_id, date_range, bearer, variable, drop_values_by_quality_flag, check_all_tiles_exist, output_directory, output_skip_if_exists)
351 filenames = _pivot_paths_by_date(pathnames).get(date)
353 try:
354 # Open each GeoTIFF file as a DataArray and store in a list
--> 355 da = [
356 rioxarray.open_rasterio(
357 h5_to_geotiff(
358 f,
359 variable=variable,
360 drop_values_by_quality_flag=drop_values_by_quality_flag,
361 output_directory=d,
362 ),
363 )
364 for f in filenames
365 ]
366 ds = merge_arrays(da)
367 clipped_dataset = ds.rio.clip(
368 gdf.geometry.apply(mapping), gdf.crs, drop=True
369 )
File /opt/anaconda3/lib/python3.11/site-packages/blackmarble/raster.py:357, in <listcomp>(.0)
351 filenames = _pivot_paths_by_date(pathnames).get(date)
353 try:
354 # Open each GeoTIFF file as a DataArray and store in a list
355 da = [
356 rioxarray.open_rasterio(
--> 357 h5_to_geotiff(
358 f,
359 variable=variable,
360 drop_values_by_quality_flag=drop_values_by_quality_flag,
361 output_directory=d,
362 ),
363 )
364 for f in filenames
365 ]
366 ds = merge_arrays(da)
367 clipped_dataset = ds.rio.clip(
368 gdf.geometry.apply(mapping), gdf.crs, drop=True
369 )
File /opt/anaconda3/lib/python3.11/site-packages/blackmarble/raster.py:177, in h5_to_geotiff(f, variable, drop_values_by_quality_flag, output_directory)
174 if variable is None:
175 variable = VARIABLE_DEFAULT.get(product_id)
--> 177 with h5py.File(f, "r") as h5_data:
178 attrs = h5_data.attrs
179 data_field_key = "HDFEOS/GRIDS/VNP_Grid_DNB/Data Fields"
File /opt/anaconda3/lib/python3.11/site-packages/h5py/_hl/files.py:567, in File.__init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, fs_strategy, fs_persist, fs_threshold, fs_page_size, page_buf_size, min_meta_keep, min_raw_keep, locking, alignment_threshold, alignment_interval, meta_block_size, **kwds)
558 fapl = make_fapl(driver, libver, rdcc_nslots, rdcc_nbytes, rdcc_w0,
559 locking, page_buf_size, min_meta_keep, min_raw_keep,
560 alignment_threshold=alignment_threshold,
561 alignment_interval=alignment_interval,
562 meta_block_size=meta_block_size,
563 **kwds)
564 fcpl = make_fcpl(track_order=track_order, fs_strategy=fs_strategy,
565 fs_persist=fs_persist, fs_threshold=fs_threshold,
566 fs_page_size=fs_page_size)
--> 567 fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
569 if isinstance(libver, tuple):
570 self._libver = libver
File /opt/anaconda3/lib/python3.11/site-packages/h5py/_hl/files.py:231, in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
229 if swmr and swmr_support:
230 flags |= h5f.ACC_SWMR_READ
--> 231 fid = h5f.open(name, flags, fapl=fapl)
232 elif mode == 'r+':
233 fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)
File h5py/_objects.pyx:54, in h5py._objects.with_phil.wrapper()
File h5py/_objects.pyx:55, in h5py._objects.with_phil.wrapper()
File h5py/h5f.pyx:106, in h5py.h5f.open()
OSError: Unable to open file (file signature not found)
Initially, I moved the project folder to the desktop because of possible read/write permission issues. It worked on the first run; then the error persisted.
In the example notebook, when installing geopandas, pip automatically installed 0.9.0, but it seems at least version 0.10.2 is required for .explore() to work. Consider noting the minimum geopandas version requirement in a comment or in the !pip install command.
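One way to make the requirement explicit in the notebook's install cell; the exact minimum version is an assumption based on the report above:

```shell
pip install "geopandas>=0.10.2"
```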
It seems there's an issue with removing cells based on quality.
For example, this:
ntl_r = bm_raster(
gdf,
product_id="VNP46A2",
date_range="2023-01-01",
bearer=bearer,
variable="DNB_BRDF-Corrected_NTL",
quality_flag_rm=[2, 255],
)
fig, ax = plt.subplots()
# Plot
ntl_r["DNB_BRDF-Corrected_NTL"].sel(time="2023-01-01").plot(
ax=ax, cmap=cc.cm.bmy, robust=True
)
cx.add_basemap(ax, crs=gdf.crs.to_string(), source=cx.providers.CartoDB.Positron)
plt.axis("off")
plt.tight_layout()
displays this:
But when removing "0" as well, it looks the same, even though most of these pixels are good quality (0), so they should also be removed / set to NA.
ntl_r = bm_raster(
    gdf,
    product_id="VNP46A2",
    date_range="2023-01-01",
    bearer=bearer,
    variable="DNB_BRDF-Corrected_NTL",
    quality_flag_rm=[0, 2, 255],
)
Same plotting code as above, and figure looks the same.
@g4brielvs I'll take a look here, but not sure if you have any quick thoughts?
The https://ladsweb.modaps.eosdis.nasa.gov/archive/allData/5000/VNP46A3 link in:
Line 35 in 5a4c040
gives me a page with a 500 error. Is that a temporary error, or is it something I am doing wrong?
When querying multiple rasters (i.e., multiple dates), add a parameter that allows interpolating NA values.
Implemented in blackmarbler (for both bm_raster and bm_extract): https://github.com/worldbank/blackmarbler/blob/main/R/blackmarbler.R#L935
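A minimal sketch of the interpolation for a single pixel over time, using plain numpy with made-up values. For the package's xarray output, DataArray.interpolate_na(dim="time") would be the idiomatic equivalent:

```python
import numpy as np

# Hypothetical per-date values for one pixel, with one missing observation.
dates = np.array([0.0, 1.0, 2.0])
values = np.array([10.0, np.nan, 30.0])

# Fill NaNs by linear interpolation against the valid observations.
mask = np.isnan(values)
filled = values.copy()
filled[mask] = np.interp(dates[mask], dates[~mask], values[~mask])
```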
The current implementation of _retrieve_manifest is really slow depending on the number of files to retrieve. Also, an error is raised if any "manifest" doesn't exist, so we could make it a bit more user friendly.
Clean up the documentation about parameters; some of the formatting is a bit odd:
https://worldbank.github.io/blackmarblepy/api/blackmarble.html
Describe the bug
OSError: Unable to synchronously open file (file signature not found)
To Reproduce
Steps to reproduce the behavior:
Just running a simple bm_raster call from the examples is giving this error:
GETTING MANIFEST...: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 4/4 [00:03<00:00, 1.10it/s]
QUEUEING TASKS | Downloading...: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████| 360/360 [00:00<00:00, 2926.85it/s]
PROCESSING TASKS | Downloading...: 75%|█████████████████████████████████████████[2024-02-10 22:28:56 - backoff:105 - INFO] Backing off _download_file(...) for 0.0s (httpx.ConnectTimeout: timed out)
PROCESSING TASKS | Downloading...: 100%|█████████████████████████████████████████████████████████████████████████████████████████████████| 360/360 [00:30<00:00, 11.77it/s]
COLLECTING RESULTS | Downloading...: 100%|███████████████████████████████████████████████████████████████████████████████████████████████████████| 360/360 [00:00<?, ?it/s]
COLLATING RESULTS | Processing...: 0%| | 0/90 [00:00<?, ?it/s]
Traceback (most recent call last):
File "C:\pythonTest\blackMarble.py", line 37, in <module>
daily = bm_raster(
File "C:\Users\Python310\lib\site-packages\pydantic\validate_call_decorator.py", line 58, in wrapper_function
return validate_call_wrapper(*args, **kwargs)
File "C:\Users\Python310\lib\site-packages\pydantic\_internal\_validate_call.py", line 81, in __call__
res = self.__pydantic_validator__.validate_python(pydantic_core.ArgsKwargs(args, kwargs))
File "C:\Users\Python310\lib\site-packages\blackmarble\raster.py", line 283, in bm_raster
da = [
File "C:\Users\Python310\lib\site-packages\blackmarble\raster.py", line 285, in <listcomp>
h5_to_geotiff(
File "C:\Users\Python310\lib\site-packages\blackmarble\raster.py", line 66, in h5_to_geotiff
with h5py.File(f, "r") as h5_data:
File "C:\Users\Python310\lib\site-packages\h5py\_hl\files.py", line 562, in __init__
fid = make_fid(name, mode, userblock_size, fapl, fcpl, swmr=swmr)
File "C:\Users\Python310\lib\site-packages\h5py\_hl\files.py", line 235, in make_fid
fid = h5f.open(name, flags, fapl=fapl)
File "h5py\_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py\_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py\h5f.pyx", line 102, in h5py.h5f.open
OSError: Unable to synchronously open file (file signature not found)
Any help is appreciated
The file_skip_if_exists option was removed and needs to be reimplemented.
I was downloading data for multiple days and the "retrieving" step failed a few times. The function still worked, as it's designed to retry, but I wonder whether it's trying to download too much at one time. Would limiting it to, say, 2-3 concurrent downloads help? (I never seem to get these red bars when downloading just a couple of tiles at once.)
Fix PermissionError: [WinError 32] The process cannot access the file because it is being used by another process
Originally posted by @guerreroda in #72 (comment)
Bit 0 (Day/Night): 0 = Night, 1 = Day
Bits 1-3 (Land/Water Background): 0 = Land & Desert, 1 = Land no Desert, 2 = Inland Water, 3 = Sea Water, 5 = Coastal
Bits 4-5 (Cloud Mask Quality): 0 = Poor, 1 = Low, 2 = Medium, 3 = High
Bits 6-7 (Cloud Detection Results & Confidence Indicator): 0 = Confident Clear, 1 = Probably Clear, 2 = Probably Cloudy, 3 = Confident Cloudy
Bit 8 (Shadow Detected): 0 = No, 1 = Yes
Bit 9 (Cirrus Detection (IR) (BTM15 - BTM16)): 0 = No Cloud, 1 = Cloud
Bit 10 (Snow/Ice Surface): 0 = No Snow/Ice, 1 = Snow/Ice
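The bit layout above can be decoded with simple shifting and masking. A minimal sketch; the helper name and the example flag word are made up for illustration:

```python
def qf_bits(qf, start, length):
    """Extract the bit field [start, start + length) from a quality-flag word.

    Works on plain ints; the same expression also vectorizes over numpy arrays.
    """
    return (qf >> start) & ((1 << length) - 1)

# Illustrative flag word: Day, Land & Desert, High cloud-mask quality,
# Confident Clear, cirrus cloud detected, snow/ice surface present.
qf = 0b0000_0110_0011_0001
```

For example, qf_bits(qf, 4, 2) reads the two-bit Cloud Mask Quality field.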
Thanks for the great work, but files are repeatedly downloaded even when the file_skip_if_exists=True parameter is passed to the bm_raster function as in the sample code and a file_directory destination is provided. Inspection of the download.py module doesn't show any logic that checks for existing local files.
Originally posted by @ArieClaassens in #38 (comment)
This issue is to handle the case where the GeoDataFrame has an index, so that zonal statistics are returned correctly.
This issue tracks the refactoring of the modules into an object-oriented and more pythonic implementation.
Replace print with logging.