
NCEPLIBS-ncio

Introduction

This is a library used by the NCEP GSI system to read GFS forecast files for use in data assimilation. It is also used by enkf_chgres_recenter_nc, which reads a template output file and an input file, and regrids the input file to the resolution of the template output file.

For more detailed documentation see https://noaa-emc.github.io/NCEPLIBS-ncio/.

NCEPLIBS-ncio is part of the NCEPLIBS project.

To submit bug reports, feature requests, or other code-related issues including installation and usage questions, please create a GitHub issue. For general NCEPLIBS inquiries, contact Edward Hartnett (secondary point of contact Alex Richert).

Authors

Jeff Whitaker, Cory Martin

Code managers: Edward Hartnett, Hang Lei

Prerequisites

This package requires:

  • CMake (to build)
  • the NetCDF-C and NetCDF-Fortran libraries

Installing

mkdir build
cd build
cmake -DCMAKE_INSTALL_PREFIX=/path/to/install ..
make -j2
make install
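
A downstream CMake project can then link against the installed library. A hypothetical sketch (the exported target name ncio::ncio is an assumption; check the CMake config files under the install prefix for the actual name):

```cmake
cmake_minimum_required(VERSION 3.15)
project(myapp LANGUAGES Fortran)

# Point CMake at the install prefix used above, e.g.
#   cmake -DCMAKE_PREFIX_PATH=/path/to/install ..
find_package(ncio REQUIRED)

add_executable(myapp myapp.f90)
target_link_libraries(myapp PRIVATE ncio::ncio)
```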

Using

This library contains a module for reading and writing netCDF gridded data. API documentation: https://noaa-emc.github.io/NCEPLIBS-ncio/.

Examples

  • open a Dataset.
use module_ncio
type(Dataset) :: ds
ds = open_dataset('gfs.t00z.atmf240.nc')
  • read an attribute.
real(4), allocatable, dimension(:) :: ak,bk
character(len=32) charatt
! ak,bk are allocated and filled.
call read_attribute(ds, 'ak', ak)
call read_attribute(ds, 'bk', bk)
! read character variable attribute
call read_attribute(ds, 'long_name', charatt, 'vgrd')
  • read a variable.
real(4), allocatable, dimension(:) :: lats,lons
real(4), allocatable, dimension(:,:,:) :: psfc
! arrays must be of correct rank (but not necessarily
! the same type). They are allocated and filled.
! The entire variable is read at once.
call read_vardata(ds,'latitudes',lats)
call read_vardata(ds,'longitudes',lons)
call read_vardata(ds,'pressfc',psfc)
  • create a new dataset from a template dataset.
type(Dataset) :: dso
! copy_vardata is optional (default .false.); .false. means copy
! only variables, dimensions, attributes and coordinate variable
! data (don't copy all variable data).
dso = create_dataset('gfs.t00z.atmf240_copy.nc', ds, copy_vardata=.true.)
  • write a variable.
real(8), allocatable, dimension(:) :: times
call read_vardata(dso, 'time', times)
! times is now allocated and filled with values from template dataset.
! now overwrite with new values and write back.
times = times + 6 ! add six hours.
call write_vardata(dso, 'time', times)
  • quantize variable data before writing for better compression.
! Lossy compression controlled by parameter nbits (1-31).
! The floating point data is quantized to improve compression
! See doi:10.5194/gmd-10-413-2017.  The method employed
! here is identical to the 'scaled linear packing' method in
! that paper, except that the data are scaled into an arbitrary
! range (2**nbits - 1, not just 2**16 - 1) and are stored as re-scaled floats
! instead of short integers. The zlib algorithm does almost as
! well packing the rescaled floats as it does the scaled
! integers, and this avoids the need for the client to apply the
! rescaling (plus it allows the ability to adjust the packing range).
real(4) :: compress_err
integer :: nbits
data_save = data   ! data, data_save: allocatable real(4) arrays
nbits = 14
call quantize_data(data_save, data, nbits, compress_err)
! compress_err is the max abs compression error (can be saved
! as a variable attribute).
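
The scaled-linear-packing arithmetic described above can be sketched language-agnostically. A minimal Python illustration (a hypothetical stand-in, not the library's quantize_data implementation):

```python
def quantize(data, nbits):
    """Scaled linear packing: snap values onto 2**nbits - 1 discrete
    levels spanning [min, max], then store the results as floats."""
    dmin, dmax = min(data), max(data)
    if dmax == dmin:           # constant field: nothing to quantize
        return list(data), 0.0
    scale = (2 ** nbits - 1) / (dmax - dmin)
    quantized = [round((x - dmin) * scale) / scale + dmin for x in data]
    # max abs compression error, bounded by half a quantization step
    err = max(abs(q - x) for q, x in zip(quantized, data))
    return quantized, err

vals = [0.0, 0.25, 0.5, 0.75, 1.0]
packed, err = quantize(vals, nbits=14)
# err is at most 0.5 * (max - min) / (2**14 - 1)
```

Because the quantized values share a small set of mantissa patterns, a generic compressor like zlib packs them nearly as well as scaled integers, which is the trade-off the comment block above describes.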
  • write an attribute.
charatt = 'hours since 2016-01-04 06:00:00'
call write_attribute(dso, 'units', charatt, 'time')
  • access Variable and Dimension derived data types.
type(Variable) :: var
type(Dimension) :: dim
! see module_ncio.f90 for type definitions.
! type members can be used to call the netCDF Fortran-90 interface
! directly.
var = get_var(ds, 'ugrd')
dim = get_dim(ds, 'time')
  • close a dataset.
call close_dataset(ds)
call close_dataset(dso)

Contributors

aerorahul, alexanderrichert-noaa, briancurtis-noaa, corymartin-noaa, edwardhartnett, hang-lei-noaa, jswhit, jswhit2, kgerheiser


Issues

MacOS CI workflow stopped working due to problems in brew update command


Removing brew update solves the problem and also makes the workflow a lot faster. brew update was updating all the packages on the system, most of which are never used by our testing. The problems seem to occur for the php update.

doxygen warnings...

There are some doxygen warnings:

/home/ed/NCEPLIBS-ncio/src/module_ncio.f90:838: warning: argument 'filename' of command @param is not found in the argument list of module_ncio::close_dataset(type(dataset), intent(inout) dset, integer, intent(out), optional errcode)
/home/ed/NCEPLIBS-ncio/src/module_ncio.f90:845: warning: The following parameter of module_ncio::close_dataset(type(dataset), intent(inout) dset, integer, intent(out), optional errcode) is not documented:
  parameter 'dset'
/home/ed/NCEPLIBS-ncio/src/module_ncio.f90:488: warning: argument 'copyvardata' of command @param is not found in the argument list of module_ncio::create_dataset(character(len= *), intent(in) filename, type(dataset), intent(in) dsetin, logical, intent(in), optional copy_vardata, logical, intent(in), optional paropen, logical, intent(in), optional nocompress, integer, intent(in), optional mpicomm, integer, intent(out), optional errcode)
/home/ed/NCEPLIBS-ncio/src/module_ncio.f90:506: warning: The following parameter of module_ncio::create_dataset(character(len= *), intent(in) filename, type(dataset), intent(in) dsetin, logical, intent(in), optional copy_vardata, logical, intent(in), optional paropen, logical, intent(in), optional nocompress, integer, intent(in), optional mpicomm, integer, intent(out), optional errcode) is not documented:
  parameter 'copy_vardata'
Generating graph info page...

Improve CI

  • Create a developer build which does address sanitizer, doxygen build, warning check, code coverage, and a Debug build. Turn these settings off in all other builds. Use the develop branch of all required NCEPLIBS libraries and the latest version of jasper.
  • Create a linux_versions workflow that checks with different versions of dependencies.
  • Create a linux_settings workflow that tests different build options with the most recent versions of dependencies.
  • Use ubuntu-latest/macos-latest everywhere.
  • Use caches everywhere they can be used.

Add automatic test coverage

@kgerheiser figured this out and put it in place in the UFS_UTILS project.

We need to measure test coverage automatically; the resulting batch of HTML files can then be uploaded and viewed on a local machine.

We need to do this for this code, since we have some basic testing, and we need to see where the gaps are.

Intel CI not using ifort

The Intel CI doesn't install the Intel Fortran compilers, so I'm pretty confident they're not being used. Also, given R&D use cases, it would make sense to switch from MPICH to Intel MPI.

Failure in GSI using ncio/1.1.0

While running the GSI using the ncio/1.1.0 stack module, the code fails with the following error message:

 calling general_read_gfsatm_nc
  in g_create_egrid2agrid, rlone(nlone) + dlone not within tolerance for 0 meridian
  incorrect input grid coordinates in subroutine g_create_egrid2agrid, program stops
application called MPI_Abort(MPI_COMM_WORLD, 999) - process 47

I added prints to src/gsi/general_read_gfsatm.f90. The following calls to read_vardata

       call read_vardata(atmges, 'grid_xt', rlons_tmp)
       call read_vardata(atmges, 'grid_yt', rlats_tmp)

return invalid values. All entries of rlons_tmp are 0.0. All entries of rlats_tmp are 0.0 except the first, which is 89.2842275325136.

I was able to get around this failure by adding:

ncerr = nf90_get_var(dset%ncid, dset%variables(nvar)%varid, values)

back into read_vardata_code_1d.f90:

    if (present(ncstart) .and. present(nccount)) then
       allocate(values(nccount(1)))
       start(1)=ncstart(1); count(1)=nccount(1)
       if (dset%variables(nvar)%ndims == 2) then
         start(2)=1; count(2)=1
       end if
    else
       if (dset%variables(nvar)%ndims == 2) then
          allocate(values(dimlen))
       else
          allocate(values(dset%variables(nvar)%dimlens(1)))
          ncerr = nf90_get_var(dset%ncid, dset%variables(nvar)%varid, values)
       end if
    end if
    ncerr = nf90_get_var(dset%ncid, dset%variables(nvar)%varid, values,&
                         start=start, count=count)

tests fail address sanitizer

================================================================
==3311==ERROR: LeakSanitizer: detected memory leaks

Direct leak of 8388608 byte(s) in 1 object(s) allocated from:
    #0 0x7f52e7a56867 in __interceptor_malloc ../../../../src/libsanitizer/asan/asan_malloc_linux.cpp:145
    #1 0x5584bf873c20 in tst_ncio /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:95
    #2 0x5584bf887878 in main /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:4
    #3 0x7f52e4af9d8f  (/lib/x86_64-linux-gnu/libc.so.6+0x29d8f)

Direct leak of 8388608 byte(s) in 1 object(s) allocated from:
    #0 0x7f52e7a56867 in __interceptor_malloc ../../../../src/libsanitizer/asan/asan_malloc_linux.cpp:145
    #1 0x5584bf9dce93 in __module_ncio_MOD_read_vardata_4d_r4 ../src/read_vardata_code_4d.f90:84
    #2 0x5584bf882625 in tst_ncio /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:261
    #3 0x5584bf887878 in main /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:4
    #4 0x7f52e4af9d8f  (/lib/x86_64-linux-gnu/libc.so.6+0x29d8f)

Direct leak of 262144 byte(s) in 1 object(s) allocated from:
    #0 0x7f52e7a56867 in __interceptor_malloc ../../../../src/libsanitizer/asan/asan_malloc_linux.cpp:145
    #1 0x5584bf9d65bf in __module_ncio_MOD_read_vardata_5d_r4 ../src/read_vardata_code_5d.f90:60
    #2 0x5584bf877079 in tst_ncio /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:113
    #3 0x5584bf887878 in main /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:4
    #4 0x7f52e4af9d8f  (/lib/x86_64-linux-gnu/libc.so.6+0x29d8f)

Direct leak of 131072 byte(s) in 1 object(s) allocated from:
    #0 0x7f52e7a56867 in __interceptor_malloc ../../../../src/libsanitizer/asan/asan_malloc_linux.cpp:145
    #1 0x5584bf872f49 in tst_ncio /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:92
    #2 0x5584bf887878 in main /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:4
    #3 0x7f52e4af9d8f  (/lib/x86_64-linux-gnu/libc.so.6+0x29d8f)

Direct leak of 131072 byte(s) in 1 object(s) allocated from:
    #0 0x7f52e7a56867 in __interceptor_malloc ../../../../src/libsanitizer/asan/asan_malloc_linux.cpp:145
    #1 0x5584bf9e79e9 in __module_ncio_MOD_read_vardata_2d_r4 ../src/read_vardata_code_2d.f90:78
    #2 0x5584bf8847ff in tst_ncio /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:280
    #3 0x5584bf887878 in main /home/runner/work/NCEPLIBS-ncio/NCEPLIBS-ncio/tests/tst_ncio.F90:4
    #4 0x7f52e4af9d8f  (/lib/x86_64-linux-gnu/libc.so.6+0x29d8f)


probably don't want to use NF90_CLASSIC_MODEL in only one nf90_create() call, other create questions...

In src/module_fv3gfs_ncio.f90:

    ! create netcdf file
    if (dsetin%ishdf5) then
       if (dset%isparallel) then
          if (present(mpicomm)) then
             ncerr = nf90_create(trim(filename), &
                     cmode=IOR(NF90_CLOBBER,NF90_NETCDF4), ncid=dset%ncid, &
                     comm = mpicomm, info = mpi_info_null)
          else
             ncerr = nf90_create(trim(filename), &
                     cmode=IOR(NF90_CLOBBER,NF90_NETCDF4), ncid=dset%ncid, &
                     comm = mpi_comm_world, info = mpi_info_null)
          end if
       else
          ncerr = nf90_create(trim(filename), &
                  cmode=IOR(IOR(NF90_CLOBBER,NF90_NETCDF4),NF90_CLASSIC_MODEL), &
                  !cmode=IOR(NF90_CLOBBER,NF90_NETCDF4), &
                  ncid=dset%ncid)
       end if
       dset%ishdf5 = .true.
    else
       ncerr = nf90_create(trim(filename), &
               cmode=IOR(IOR(NF90_CLOBBER,NF90_64BIT_OFFSET),NF90_SHARE), &
               ncid=dset%ncid)
       dset%ishdf5 = .false.
    endif

Not sure what is going on with NF90_CLASSIC_MODEL. What it does is add a secret attribute in the file, and set a flag within the netcdf-c library, to disallow using any netCDF-4 metadata that does not work in netCDF classic. So, for example, if NF90_CLASSIC_MODEL is used, and we attempt to define an NF90_INT64 variable, we will get an error.

The intent of CLASSIC_MODEL is to allow us to write code that is guaranteed to work with any netCDF file format, so the same code can easily be used for netCDF-4/HDF5 and classic files.

Perhaps this is what you want, but then probably you would want it everywhere, not just in one nf90_create() call.

NF90_SHARE should probably not be there. Do you have a specific reason for adding it?

NF90_64BIT_OFFSET only raises the maximum variable/record size to 4 GB. The NF90_CDF5 binary format, based on the work of the pnetcdf team and integrated with netcdf-c, provides a much less restrictive classic format that may be preferable.

rename module module_fv3gfs_ncio to just module_ncio.

This module is generic to netCDF and not specific to the FV3GFS; there is no requirement that this library be tied to the FV3GFS. It provides generic, user-friendly interfaces to the netCDF Fortran API, similar to netcdf4-python.

Add --output-on-failure to ctest run

@kgerheiser is it safe to say we always want --output-on-failure on every CI run?

We also have verbose, is there any other option we should be using? If we can come up with a canonical solution we can apply it everywhere.

Create 1.1.1 or 1.2.0 release for time_iso variable addition and fix for issue #58

@kgerheiser @Hang-Lei-NOAA Would it be possible to create a new 1.1.1 or 1.2.0 release for NCEPLIBS-ncio and have this version propagated through the stack? While the update in ncio/1.1.0 does correct issues encountered in getsfcensmeanp, it leads to failures in the GSI itself. The latest update, ac37269, contains changes that will allow the GSI to run without issue.

I would be greatly appreciative if you could create a new release from this version and build it in the stack on the various machines.

Thank you very much for your time.

Add Spack-based CI with OpenMPI

Add Spack-based CI tests, and while we're at it, it would be good to test with OpenMPI which gets used on at least some of the R&D systems (e.g., Hera).

GSI DA app errors with ncio/1.1.1 create_dataset

NOAA-EMC/GSI DA apps built using ncio/1.1.1 do not correctly copy all data from the input netcdf file to the output file via the create_dataset call with copy_vardata=.true.

grid_xt, grid_yt, lat, lon, pfull, and phalf are incorrect in the output file opened by create_dataset. For example, while the input file has

 grid_xt = 0, 1.875, 3.75, 5.625, 7.5, 9.375, 11.25, 13.125, 15, 16.875,
    18.75, 20.625, 22.5, 24.375, 26.25, 28.125, 30, 31.875, 33.75, 35.625,
    37.5, 39.375, 41.25, 43.125, 45, 46.875, 48.75, 50.625, 52.5, 54.375,
    56.25, 58.125, 60, 61.875, 63.75, 65.625, 67.5, 69.375, 71.25, 73.125,
    75, 76.875, 78.75, 80.625, 82.5, 84.375, 86.25, 88.125, 90, 91.875,
    93.75, 95.625, 97.5, 99.375, 101.25, 103.125, 105, 106.875, 108.75, ...

the output file created by create_dataset contains

 grid_xt = 0, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, _,
    _, _ ;

This comment applies both to atmfXXX.nc and sfcfXXX.nc files created by the current operational gfs.v16.1.8, and to atm and sfc nc files created by ufs_model tag Prototype-P8c.

This issue is opened to report this issue and track its resolution.

Create 1.1.2 release for time_iso variable addition and fixes to both read_vardata_code_1d and write_vardata_code

@kgerheiser @Hang-Lei-NOAA @edwardhartnett Would it be possible to create a new 1.1.1 or 1.1.2 release for NCEPLIBS-ncio and have this version propagated through the stack? While the update in ncio/1.1.1 does correct issues encountered in getsfcensmeanp, as well as issues while dealing with the increment, it leads to failures in the analysis due to write_vardata_code. The latest update, 95a0d9b, contains changes that will allow the GSI to run in full analysis mode without issue.

I would be greatly appreciative if you could create a new release from this version and build it in the stack on the various machines.

Thank you very much for your time.

many undocumented functions

There are many undocumented functions in module_ncio.f90, for example read_vardata_1d_r4(). The problem is that there are many functions all tied to an interface, and doxygen does not document interfaces; it documents functions. So this does not work:

  !> Write data (in array values) to variable varname in dataset dset.
  !!
  !! @param[in] dset Input dataset instance returned by open_dataset/create_dataset.
  !! @param[in] varname Input string name of variable.
  !! @param values Array with variable data. Must be
  !!          an allocatable array with same rank
  !!          as variable varname (or 1 dimension less).
  !! @param nslice optional index along dimension slicedim
  !! @param slicedim optional, if nslice is set, index of which dimension to slice with
  !!          nslice, default is ndims
  !! @param ncstart optional, if ncstart and nccount are set, manually specify the
  !!          start and count of netCDF write
  !! @param nccount optional, if ncstart and nccount are set, manually specify the
  !!          start and count of netCDF write
  !! @param errcode optional error return code. If not specified,
  !!          program will stop if a nonzero error code returned
  !!          from netcdf library.
  !! @returns dataset dset
  !! @author jeff whitaker 
  interface write_vardata
     module procedure write_vardata_1d_r4, write_vardata_2d_r4, write_vardata_3d_r4,&
          write_vardata_4d_r4, write_vardata_1d_r8, write_vardata_2d_r8, write_vardata_3d_r8,&
          write_vardata_4d_r8, write_vardata_1d_int, write_vardata_2d_int, &
          write_vardata_3d_int, write_vardata_4d_int, &
          write_vardata_5d_int, write_vardata_5d_r4, write_vardata_5d_r8, &
          write_vardata_1d_short, write_vardata_2d_short, write_vardata_3d_short, &
          write_vardata_4d_short, write_vardata_5d_short, &
          write_vardata_1d_byte, write_vardata_2d_byte, write_vardata_3d_byte, &
          write_vardata_4d_byte, write_vardata_5d_byte, &
          write_vardata_1d_char, write_vardata_2d_char, write_vardata_3d_char, &
          write_vardata_4d_char, write_vardata_5d_char
  end interface write_vardata

Doxygen wants documentation for each of these functions. The fact that there are so many similar functions is a consequence of Fortran's lack of a void pointer. (And yet Fortran does have something like a void pointer. @jswhit should we be using the transfer intrinsic here? But then, that copies the data...)

I will try to figure out how to handle this in doxygen without repeating all the documentation blocks.
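
One Doxygen pattern that avoids repeating the block is to document one representative procedure fully and pull its block onto the others with the @copydoc command (a later issue on this page shows @copydoc already in use in this repo). A sketch only, with abbreviated hypothetical doc text:

```fortran
!> Read data from variable varname in dataset dset.
!! @param[in] dset Dataset returned by open_dataset/create_dataset.
!! @param[in] varname Name of the variable to read.
!! @param values Allocatable array; allocated and filled on return.
!! @param errcode Optional error return code.
subroutine read_vardata_5d_r4(dset, varname, values, errcode)
  real(4), allocatable, dimension(:,:,:,:,:), intent(inout) :: values
  include "read_vardata_code_5d.f90"
end subroutine read_vardata_5d_r4

!> @copydoc read_vardata_5d_r4
subroutine read_vardata_5d_r8(dset, varname, values, errcode)
  real(8), allocatable, dimension(:,:,:,:,:), intent(inout) :: values
  include "read_vardata_code_5d.f90"
end subroutine read_vardata_5d_r8
```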

Question about testing of read_vardata() subroutine...

What I see in the test is this:

  print *,'*** Test read of variable data...'
  call read_vardata(dset, 'pressfc', values_3d)
  call read_vardata(dset, 'vgrd', values_4d)
  values_3d=1.013e5
  values_4d=99.

So seemingly we are reading two variables, but then not actually checking whether we received the correct values.

Is that correct?

missing documentation for some functions

Guys, documentation is important, please don't neglect it. What may seem obvious to you can lead to a costly or dangerous mistake when a future programmer looks at the code and does not understand.

Some functions need parameters documented. I have left "???" where you need to fill in answers. In future, all code submitted to any NCEPLIBS or UFS_UTILS must be fully documented (see NOAA-EMC/NCEPLIBS#121); please note how this is done in doxygen so you can do it yourself in future work.

Look at module_ncio.f90 and search for "???".

Also, please use proper capitalization and punctuation. The NCEPLIBS documentation should not look like it was written by second-graders! ;-)

why do 5d read_vardata functions have fewer parameters?

@jswhit here's a question about the ncio library code.

There are a bunch of functions like this:

  interface read_vardata
     module procedure read_vardata_1d_r4, read_vardata_2d_r4, read_vardata_3d_r4,&
          read_vardata_4d_r4, read_vardata_5d_r4, &
          read_vardata_1d_r8, read_vardata_2d_r8, read_vardata_3d_r8,&
          read_vardata_4d_r8, read_vardata_5d_r8, &
          read_vardata_1d_int, read_vardata_2d_int, &
          read_vardata_3d_int, read_vardata_4d_int, read_vardata_5d_int, &
          read_vardata_1d_short, read_vardata_2d_short, &
          read_vardata_3d_short, read_vardata_4d_short, read_vardata_5d_short , &
          read_vardata_1d_byte, read_vardata_2d_byte, &
          read_vardata_3d_byte, read_vardata_4d_byte, read_vardata_5d_byte, &
          read_vardata_1d_char, read_vardata_2d_char, &
          read_vardata_3d_char, read_vardata_4d_char, read_vardata_5d_char
  end interface read_vardata

All these functions except the 5d ones have 8 parameters:

  subroutine read_vardata_1d_r4(dset, varname, values, nslice, slicedim, ncstart, nccount, errcode)
    real(4), allocatable, dimension(:), intent(inout) :: values
    include "read_vardata_code_1d.f90"
  end subroutine read_vardata_1d_r4

But the 5d functions do not include the slice and start/count parameters:

  !> @copydoc read_vardata_4param
  subroutine read_vardata_5d_byte(dset, varname, values, errcode)
    integer(1), allocatable, dimension(:,:,:,:,:), intent(inout) :: values
    include "read_vardata_code_5d.f90"
  end subroutine read_vardata_5d_byte

How come the 5D vars don't get these parameters?

Turn on branch protections for develop branch

@Hang-Lei-NOAA you should not have been able to change the README without a PR and without a review. ;-)

So can you change permissions on this repo so that:

  • develop branch is protected
  • pull requests are always required
  • these rules apply to admins
  • CI tests must pass

Let me know if you need help with these settings, they can be a bit tricky. But they should be the same on all NCEPLIBS.

Empty test coverage results

The current test coverage output is empty even though the CI doesn't fail anymore. I can resolve the issue by removing the -fprofile-abs-path from CMAKE_Fortran_FLAGS and copying the source files into build/src/CMakeFiles/ncio.dir/, but I still don't know what the issue is.
