ZstdNet's Introduction

ZstdNet


ZstdNet is a wrapper of the Zstd native library for .NET languages. It has the following features:

  • Compression and decompression of byte arrays
  • Streaming compression and decompression
  • Generation of dictionaries from a collection of samples

Take a look at the library reference or the unit tests to explore its behavior in different situations.

Zstd

Zstd, short for Zstandard, is a fast lossless compression algorithm that provides both a good compression ratio and good speed for standard compression needs. "Standard" translates into everyday situations which neither look for the highest possible ratio (which LZMA and ZPAQ cover) nor for extreme speeds (which LZ4 covers). Zstandard is licensed under the BSD 3-Clause License.

Zstd was initially developed by Yann Collet, and the source is available at https://github.com/facebook/zstd

The motivation for developing the algorithm, its use cases, and its properties are explained in the blog post that introduces the library: http://fastcompression.blogspot.com/2015/01/zstd-stronger-compression-algorithm.html

The benefits of the dictionary mode are described here: http://fastcompression.blogspot.ru/2016/02/compressing-small-data.html

Reference

Requirements

ZstdNet requires libzstd >= v1.4.0. Both 32-bit and 64-bit versions are supported. The corresponding DLLs are included in this repository, cross-compiled using (i686|x86_64)-w64-mingw32-gcc -DZSTD_MULTITHREAD -DZSTD_LEGACY_SUPPORT=0 -pthread -s. Note that ZSTD_LEGACY_SUPPORT=0 means "do not support legacy formats" and minimizes the binary size.

Exceptions

The wrapper throws ZstdException in case of malformed data or an error inside libzstd. If the given destination buffer is too small, a ZstdException with the ZSTD_error_dstSize_tooSmall error code is thrown. Check zstd_errors.h for more info.
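
For example, a minimal sketch of handling such an error; sourceData and tooSmallBuffer are hypothetical placeholders:

    try
    {
        using var compressor = new Compressor();
        compressor.Wrap(sourceData, tooSmallBuffer, 0); // may not fit
    }
    catch (ZstdException e)
    {
        // e.Message carries the libzstd error description,
        // e.g. "Destination buffer is too small" for ZSTD_error_dstSize_tooSmall
        Console.WriteLine($"Compression failed: {e.Message}");
    }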

Compressor class

Block compression implementation. Instances of this class are not thread-safe. Consider using ThreadStatic or a pool of compressors for bulk processing.

  • The constructor allows specifying compression options. Otherwise, default CompressionOptions values will be used.

    Compressor();
    Compressor(CompressionOptions options);

    Options will be exposed in the read-only Options field.

    Note that the Compressor class implements IDisposable. If you use a lot of instances of this class, it's recommended to call Dispose to avoid extra load on the finalizer thread. For example:

    using var compressor = new Compressor();
    var compressedData = compressor.Wrap(sourceData);
  • Wrap compresses data and saves it in a new or an existing buffer (in the latter case, the length of the saved data is returned).

    byte[] Wrap(byte[] src);
    byte[] Wrap(ArraySegment<byte> src);
    byte[] Wrap(ReadOnlySpan<byte> src);
    int Wrap(byte[] src, byte[] dst, int offset);
    int Wrap(ArraySegment<byte> src, byte[] dst, int offset);
    int Wrap(ReadOnlySpan<byte> src, byte[] dst, int offset);
    int Wrap(ReadOnlySpan<byte> src, Span<byte> dst);

    Note that on buffers close to 2 GB Wrap tries its best, but if src is incompressible and too large, ZSTD_error_dstSize_tooSmall will be thrown. A Wrap call is only reliable for buffer sizes such that GetCompressBoundLong(size) <= 0x7FFFFFC7. Consider using the streaming compression API for large inputs.

  • GetCompressBound returns the required destination buffer size for source data of size size (see the sketch after this list).

    static int GetCompressBound(int size);
    static ulong GetCompressBoundLong(ulong size);
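
For illustration, a minimal sketch of compressing into a preallocated buffer; sourceData is a hypothetical placeholder:

    // Allocate a destination large enough for any input of this size,
    // then compress into it; Wrap returns the number of bytes actually written.
    using var compressor = new Compressor();
    var dst = new byte[Compressor.GetCompressBound(sourceData.Length)];
    var compressedLength = compressor.Wrap(sourceData, dst, 0);
    // Only the first compressedLength bytes of dst contain compressed data.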

CompressionStream class

Implementation of streaming compression. The stream is write-only.

  • Constructor

    CompressionStream(Stream stream);
    CompressionStream(Stream stream, int bufferSize);
    CompressionStream(Stream stream, CompressionOptions options, int bufferSize = 0);

    Options:

    • Stream stream — output stream for writing compressed data.
    • CompressionOptions options — compression options. Default is CompressionOptions.Default with the default compression level.
    • int bufferSize — the size of the compression buffer. Default is the result of calling ZSTD_CStreamOutSize, which guarantees that at least one complete compressed block can be flushed (currently ~128 KB).

    The buffer for compression is allocated using ArrayPool<byte>.Shared.Rent().

    Note that the CompressionStream class implements IDisposable and IAsyncDisposable. If you use a lot of instances of this class, it's recommended to call Dispose or DisposeAsync to avoid extra load on the finalizer thread. For example:

    await using var compressionStream = new CompressionStream(outputStream, zstdBufferSize);
    await inputStream.CopyToAsync(compressionStream, copyBufferSize);

CompressionOptions class

Stores compression options and "digested" (for compression) information from a compression dictionary, if present. Instances of this class are thread-safe. They can be shared across threads to avoid performance and memory overhead.

  • Constructor

    CompressionOptions(int compressionLevel);
    CompressionOptions(byte[] dict, int compressionLevel = DefaultCompressionLevel);
    CompressionOptions(byte[] dict, IReadOnlyDictionary<ZSTD_cParameter, int> advancedParams, int compressionLevel = DefaultCompressionLevel);

    Options:

    • byte[] dict — compression dictionary. It can be read from a file or generated with DictBuilder class. Default is null (no dictionary).
    • int compressionLevel — compression level. It should be in the range from CompressionOptions.MinCompressionLevel to CompressionOptions.MaxCompressionLevel (currently 22). Default is CompressionOptions.DefaultCompressionLevel (currently 3).
    • IReadOnlyDictionary<ZSTD_cParameter, int> advancedParams — the advanced API provides a way to set specific parameters during compression. For example, it allows you to compress with multiple threads, enable long distance matching mode and more (see the sketch at the end of this section). Check zstd.h for additional info.

    Specified options will be exposed in read-only fields.

    Note that the CompressionOptions class implements IDisposable. If you use a lot of instances of this class, it's recommended to call Dispose to avoid extra load on the finalizer thread. For example:

    using var options = new CompressionOptions(dict, compressionLevel: 5);
    using var compressor = new Compressor(options);
    var compressedData = compressor.Wrap(sourceData);
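
For example, a sketch of enabling multithreaded compression through the advanced API; the enum member name ZSTD_c_nbWorkers is assumed to mirror zstd.h, so verify it against the ZSTD_cParameter enum in your version:

    // Assumed member name (mirrors zstd.h): ZSTD_c_nbWorkers enables multithreaded compression.
    var advancedParams = new Dictionary<ZSTD_cParameter, int>
    {
        [ZSTD_cParameter.ZSTD_c_nbWorkers] = 4 // use 4 worker threads
    };
    using var options = new CompressionOptions(dict: null, advancedParams: advancedParams, compressionLevel: 5);
    using var compressor = new Compressor(options);
    var compressedData = compressor.Wrap(sourceData);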

Decompressor class

Block decompression implementation. Instances of this class are not thread-safe. Consider using ThreadStatic or a pool of decompressors for bulk processing.

  • The constructor allows specifying decompression options. Otherwise, default DecompressionOptions values will be used.

    Decompressor();
    Decompressor(DecompressionOptions options);

    Options will be exposed in the read-only Options field.

    Note that the Decompressor class implements IDisposable. If you use a lot of instances of this class, it's recommended to call Dispose to avoid extra load on the finalizer thread. For example:

    using var decompressor = new Decompressor();
    var decompressedData = decompressor.Unwrap(compressedData);
  • Unwrap decompresses data and saves it in a new or an existing buffer (in the latter case, the length of the saved data is returned).

    byte[] Unwrap(byte[] src, int maxDecompressedSize = int.MaxValue);
    byte[] Unwrap(ArraySegment<byte> src, int maxDecompressedSize = int.MaxValue);
    byte[] Unwrap(ReadOnlySpan<byte> src, int maxDecompressedSize = int.MaxValue);
    int Unwrap(byte[] src, byte[] dst, int offset, bool bufferSizePrecheck = true);
    int Unwrap(ArraySegment<byte> src, byte[] dst, int offset, bool bufferSizePrecheck = true);
    int Unwrap(ReadOnlySpan<byte> src, byte[] dst, int offset, bool bufferSizePrecheck = true);
    int Unwrap(ReadOnlySpan<byte> src, Span<byte> dst, bool bufferSizePrecheck = true);

    Data can be saved to a new buffer only if a field with the decompressed data size is present in the compressed data. You can limit the size of the new buffer with the maxDecompressedSize parameter (this is necessary for untrusted data); see the sketch after this list.

    If the bufferSizePrecheck flag is set and the decompressed data size field is present, the size of the destination buffer will be checked before actual decompression.

    Note that even if this field is malformed (i.e. less than the actual decompressed data size), libzstd still does not allow a buffer overflow to happen during decompression.

  • GetDecompressedSize reads the field with the decompressed data size stored in the compressed data.

    static ulong GetDecompressedSize(byte[] src);
    static ulong GetDecompressedSize(ArraySegment<byte> src);
    static ulong GetDecompressedSize(ReadOnlySpan<byte> src);
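
For illustration, a sketch of decompressing untrusted data with a size limit, and of decompressing into a preallocated buffer; compressedData is a hypothetical placeholder:

    using var decompressor = new Decompressor();

    // New buffer, but refuse to allocate more than 1 MB for untrusted input.
    var decompressedData = decompressor.Unwrap(compressedData, maxDecompressedSize: 1024 * 1024);

    // Existing buffer, sized from the decompressed-size field stored in the compressed data.
    var dst = new byte[(int)Decompressor.GetDecompressedSize(compressedData)];
    var decompressedLength = decompressor.Unwrap(compressedData, dst, 0);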

DecompressionStream class

Implementation of streaming decompression. The stream is read-only.

  • Constructor

    DecompressionStream(Stream stream);
    DecompressionStream(Stream stream, int bufferSize);
    DecompressionStream(Stream stream, DecompressionOptions options, int bufferSize = 0);

    Options:

    • Stream stream — input stream for reading compressed data.
    • DecompressionOptions options — decompression options. Default is null (no dictionary).
    • int bufferSize — the size of the decompression buffer. Default is the result of calling ZSTD_DStreamInSize, the recommended input buffer size (currently ~128 KB).

    The buffer for decompression is allocated using ArrayPool<byte>.Shared.Rent().

    Note that the DecompressionStream class implements IDisposable and IAsyncDisposable. If you use a lot of instances of this class, it's recommended to call Dispose or DisposeAsync to avoid extra load on the finalizer thread. For example:

    await using var decompressionStream = new DecompressionStream(inputStream, zstdBufferSize);
    await decompressionStream.CopyToAsync(outputStream, copyBufferSize);

DecompressionOptions class

Stores decompression options and "digested" (for decompression) information from a compression dictionary, if present. Instances of this class are thread-safe. They can be shared across threads to avoid performance and memory overhead.

  • Constructor

    DecompressionOptions();
    DecompressionOptions(byte[] dict);
    DecompressionOptions(byte[] dict, IReadOnlyDictionary<ZSTD_dParameter, int> advancedParams);

    Options:

    • byte[] dict — compression dictionary. It can be read from a file or generated with DictBuilder class. Default is null (no dictionary).
    • IReadOnlyDictionary<ZSTD_dParameter, int> advancedParams — advanced decompression API that allows you to set parameters like maximum memory usage. Check zstd.h for additional info.

    Specified options will be exposed in read-only fields.

    Note that the DecompressionOptions class implements IDisposable. If you use a lot of instances of this class, it's recommended to call Dispose to avoid extra load on the finalizer thread. For example:

    using var options = new DecompressionOptions(dict);
    using var decompressor = new Decompressor(options);
    var decompressedData = decompressor.Unwrap(compressedData);

DictBuilder static class

  • TrainFromBuffer generates a compression dictionary from a collection of samples (see the sketch below).

    static byte[] TrainFromBuffer(IEnumerable<byte[]> samples, int dictCapacity = DefaultDictCapacity);

    Options:

    • int dictCapacity — maximal dictionary size in bytes. Default is DictBuilder.DefaultDictCapacity, currently 110 KiB (the default in the zstd utility).
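
For illustration, a minimal sketch of training a dictionary and using it on both sides; samples is a hypothetical IEnumerable<byte[]> and sourceData a placeholder:

    // Train a dictionary on representative samples, then reuse it for compression and decompression.
    var dict = DictBuilder.TrainFromBuffer(samples);

    using var compressionOptions = new CompressionOptions(dict);
    using var compressor = new Compressor(compressionOptions);
    var compressedData = compressor.Wrap(sourceData);

    using var decompressionOptions = new DecompressionOptions(dict);
    using var decompressor = new Decompressor(decompressionOptions);
    var decompressedData = decompressor.Unwrap(compressedData);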

Wrapper Authors

Copyright (c) 2016-present SKB Kontur

ZstdNet is distributed under the BSD 3-Clause License.

ZstdNet's People

Contributors

algorithmsarecool, borzunov, chipitsine, dscheg, neoskye, nerai, troublenadiagirl


ZstdNet's Issues

Memory access violation with dictionary + max compression level

Using version 1.4.5 from NuGet.
To reproduce:

using ZstdNet;

var stm = File.OpenRead("outp");
var compressed = new MemoryStream();
var dict = await File.ReadAllBytesAsync("dict");

using var compressor = new CompressionStream(compressed, new CompressionOptions(dict, CompressionOptions.MaxCompressionLevel));
await stm.CopyToAsync(compressor); // <<< AccessViolationException

Results in:

System.AccessViolationException: 'Attempted to read or write protected memory. This is often an indication that other memory is corrupt.'

[Managed to Native Transition]
ZstdNet.dll!ZstdNet.CompressionStream.WriteInternalAsync(System.ReadOnlyMemory<byte> buffer, System.Threading.CancellationToken cancellationToken)
ZstdNet.dll!ZstdNet.CompressionStream.WriteAsync(System.ReadOnlyMemory<byte> buffer, System.Threading.CancellationToken cancellationToken)
System.Private.CoreLib.dll!System.IO.Stream.CopyToAsync.__Core|29_0(System.IO.Stream source, System.IO.Stream destination, int bufferSize, System.Threading.CancellationToken cancellationToken)
[Resuming Async Method]
System.Private.CoreLib.dll!System.Runtime.CompilerServices.AsyncTaskMethodBuilder<System.Threading.Tasks.VoidTaskResult>.AsyncStateMachineBox<System.IO.Stream.<<CopyToAsync>g__Core|29_0>d>.ExecutionContextCallback(object s)
System.Private.CoreLib.dll!System.Threading.ExecutionContext.RunInternal(System.Threading.ExecutionContext executionContext, System.Threading.ContextCallback callback, object state)
System.Private.CoreLib.dll!System.Runtime.CompilerServices.AsyncTaskMethodBuilder<System.Threading.Tasks.VoidTaskResult>.AsyncStateMachineBox<System.IO.Stream.<<CopyToAsync>g__Core|29_0>d>.MoveNext(System.Threading.Thread threadPoolThread)
System.Private.CoreLib.dll!System.Runtime.CompilerServices.AsyncTaskMethodBuilder<System.Threading.Tasks.VoidTaskResult>.AsyncStateMachineBox<System.IO.Stream.<<CopyToAsync>g__Core|29_0>d>.MoveNext()
System.Private.CoreLib.dll!System.Threading.ThreadPool..cctor.AnonymousMethod__87_0(object state)
System.Private.CoreLib.dll!System.Threading.Tasks.Sources.ManualResetValueTaskSourceCore<long>.SignalCompletion()
System.Private.CoreLib.dll!Microsoft.Win32.SafeHandles.SafeFileHandle.ThreadPoolValueTaskSource.ExecuteInternal()
System.Private.CoreLib.dll!Microsoft.Win32.SafeHandles.SafeFileHandle.ThreadPoolValueTaskSource.System.Threading.IThreadPoolWorkItem.Execute()
System.Private.CoreLib.dll!System.Threading.ThreadPoolWorkQueue.Dispatch()
System.Private.CoreLib.dll!System.Threading.PortableThreadPool.WorkerThread.WorkerThreadStart()
System.Private.CoreLib.dll!System.Threading.Thread.StartCallback()
[Async] System.Private.CoreLib.dll!System.IO.Strategies.BufferedFileStreamStrategy.CopyToAsyncCore(System.IO.Stream destination, int bufferSize, System.Threading.CancellationToken cancellationToken)
>	[Async] zstdtest.dll!Program.<Main>$(string[] args) Line 8	C#

outp and dict files: zstdtest.zip

Span<T> support

Now that System.Memory is available, would you be willing to accept some PRs adding Span overloads to get the API closer to 0-copy?

Shared Compressor fails in multithreaded code

I compiled the sources of ZstdNet for .NET Framework 4.8. Plain usage:

var compressor = new Compressor();
data = File.ReadAllBytes(f.Filename);
ComprData = compressor.Wrap(data);

...and immediately fails:

System.AccessViolationException
HResult=0x80004003
Message=Attempted to read or write protected memory. This is often an indication that other memory is corrupt.
Source=
StackTrace:
ZStdLibFW.dll!ZstdNet.Compressor.Wrap(System.ReadOnlySpan src, System.Span dst) Line 90 C#
ZStdLibFW.dll!ZstdNet.Compressor.Wrap(System.ReadOnlySpan src) Line 56 C#
ZStdLibFW.dll!ZstdNet.Compressor.Wrap(byte[] src) Line 43 C#

It happens in Compressor.cs, line 90:

var dstSize = Options.AdvancedParams != null
    ? ExternMethods.ZSTD_compress2(cctx, dst, (size_t)dst.Length, src, (size_t)src.Length)
    : Options.Cdict == IntPtr.Zero
        ? ExternMethods.ZSTD_compressCCtx(cctx, dst, (size_t)dst.Length, src, (size_t)src.Length, Options.CompressionLevel)
        : ExternMethods.ZSTD_compress_usingCDict(cctx, dst, (size_t)dst.Length, src, (size_t)src.Length, Options.Cdict);

Any idea why memory management fails?

ThreadSafety issue in DictBuilder.TrainFromBuffer

DictBuilder.TrainFromBuffer appears to deadlock occasionally when called concurrently.

Example that causes deadlocks fairly reliably:

var anArr = Enumerable.Range(0, 200000).Select(n => (byte)(n * n)).ToArray();
var arrs = Enumerable.Range(0, 200).Select(i => anArr.Skip(i * 3).Take(i * 5 + 1000).ToArray()).ToArray();
Console.WriteLine("start");
var dicts = Enumerable.Range(0, 20)
    .AsParallel() // relevant line A
    .Select(dictI => {
        var subset = arrs
            .Skip(dictI * 7 % 37).Take(100 + dictI * 37 % 53).ToArray(); // relevant line B
        return DictBuilder.TrainFromBuffer(subset, 9812);
    }).ToArray();
Console.WriteLine("sometimes deadlocks");
Console.WriteLine("sometimes deadlocks");

I pretty much never see the program reach "sometimes deadlocks".

Executing the program without parallelism (comment out line A) works reliably, as does building the same dictionary in all threads (comment out line B).

Please add the possibility to create a dictionary from a stream

I would like to train dictionaries with datasets that are many gigabytes in size and sometimes consist of millions of files. I keep them in 7z files and decompress them linearly on the fly using SharpCompress. It would be awesome to be able to feed that data straight into dictionary creation using a custom stream or something similar.

As it stands, the function that does the training copies an array into a stream, which for large datasets is a waste of memory when the needed stream could be supplied directly. The function that accepts the stream is in a class marked internal, so I can't access it directly.

Ideally I'd love to be able to load up a 7z file with hundreds of gigabytes of data and stream that into the dictionary creation without running out of RAM (because I don't want to keep that many files unpacked on my hard drive, wasting space and system resources).

System.DllNotFoundException: libzstd

When I used ZstdNet in web applications, it threw System.DllNotFoundException: libzstd. I reviewed the native DLL loading code (the ExternMethods class) and found a bug in how the SetWinDllDirectory method resolves the DLL directory:
var location = Assembly.GetExecutingAssembly().Location;
In a web application this returns the temporary ASP.NET files directory, not the directory containing the native DLL.
I suggest modifying the code as follows:

using System.Web;

// In a web application the executing assembly is shadow-copied, so resolve
// the native DLL relative to the application's bin directory instead.
var isWebApplication = HttpContext.Current != null;
string location;
if (isWebApplication)
{
    // get the web application's bin directory
    location = $"{AppDomain.CurrentDomain.BaseDirectory}bin/";
}
else
{
    location = Assembly.GetExecutingAssembly().Location;
}

Combination of CompressionStream and Decompressor fails to Unwrap on multiple iterations

    [TestCase(new byte[0], 0, 0)]
    [TestCase(new byte[] { 1, 2, 3 }, 1, 2)]
    [TestCase(new byte[] { 1, 2, 3 }, 0, 2)]
    [TestCase(new byte[] { 1, 2, 3 }, 1, 1)]
    [TestCase(new byte[] { 1, 2, 3 }, 0, 3)]
    public void StreamCompressAndRegularDecompress(byte[] data, int offset, int count)
    {
        var tempStream = new MemoryStream();
        using (var compressionStream = new CompressionStream(tempStream))
            compressionStream.Write(data, offset, count);

        byte[] decompressedBytes;
        using (Decompressor decompressor = new Decompressor())
        {
            decompressedBytes = decompressor.Unwrap(tempStream.ToArray());
        }

        var dataToCompress = new byte[count];
        Array.Copy(data, offset, dataToCompress, 0, count);

        Assert.AreEqual(dataToCompress, decompressedBytes);
    }

.NET 5 on Windows - Not finding DLL

Imported this as a NuGet package into a .NET 5 program. It isn't finding the DLL in the x64 folder. It does work if the file is in the same folder.

I want it embedded in a single file anyway, so I'll probably have to do something custom.

Question: Verifying we can safely reuse the context?

Hello, thanks for the great library!

We wanted to verify we aren't running into any UB here.

We are using an ObjectPool of compressors like in this comment #31 (comment), with a small tweak: we're using a custom PooledObjectPolicy and passing in a dictionary:

new DefaultObjectPool<Compressor>(
    new CustomPolicy(
        new CompressionOptions(dictionaryBytes)));

We then call Wrap multiple times serially on the same compressor.

The question is whether any sort of reset of the context is necessary. Following the Wrap calls down, this eventually calls:

ExternMethods.ZSTD_compress_usingCDict(this.cctx, dst, (UIntPtr)checked((ulong)dst.Length), src, (UIntPtr)checked((ulong)src.Length), this.Options.Cdict).EnsureZstdSuccess();

The zstd docs seem to say that a ZSTD_CDict* can be reused with no issue in the bulk-processing dictionary API, and at the beginning they say a context can be reused multiple times:

  When compressing many times,
  it is recommended to allocate a context just once,
  and re-use it for each successive compression operation.
  This will make workload friendlier for system's memory.
  Note : re-using context is just a speed / resource optimization.
         It doesn't change the compression ratio, which remains identical.
  Note 2 : In multi-threaded environments,
         use one different context per thread for parallel execution.
 
typedef struct ZSTD_CCtx_s ZSTD_CCtx;
ZSTD_CCtx* ZSTD_createCCtx(void);
size_t     ZSTD_freeCCtx(ZSTD_CCtx* cctx);  /* accept NULL pointer */

Just making sure this is an accurate understanding of how the lib is meant to be used?
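
For reference, a minimal sketch of the kind of pooling policy described above, assuming Microsoft.Extensions.ObjectPool; CompressorPolicy here is hypothetical, not the issue author's CustomPolicy:

using Microsoft.Extensions.ObjectPool;
using ZstdNet;

// Hypothetical policy: all pooled Compressor instances share one CompressionOptions
// (which is thread-safe), while each Compressor is used by one thread at a time.
public sealed class CompressorPolicy : PooledObjectPolicy<Compressor>
{
    private readonly CompressionOptions options;

    public CompressorPolicy(CompressionOptions options) => this.options = options;

    public override Compressor Create() => new Compressor(options);

    public override bool Return(Compressor obj) => true; // keep the instance in the pool
}

Sharing the options instance is fine because CompressionOptions is documented above as thread-safe; the pool exists because Compressor itself is not.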

zstdnet, version=1.4.50, does not have a strong name.

Hi,
I am trying to use the ZstdNet NuGet package, but it seems the assemblies present in the package are not strong-name signed.

Due to this, I can't use this package in production.

Kindly update the package with signed binaries.

Thanks,
Satish Chandra

DictBuilder.TrainFromBuffer throws an Error (generic) exception

Am I doing this correctly? I want to build my own dictionary for compression, but I cannot build the dictionary successfully.

            List<string> samples = new List<string>();
            samples.Add("Testing1");
            samples.Add("Testing2");
            samples.Add("Testing3");
            samples.Add("Testing4");
            samples.Add("Testing5");
            samples.Add("Testing6");
            samples.Add("Testing7");
            var buffer = samples.Select(s => Encoding.UTF8.GetBytes(s)).ToArray();
            var trainedParams = DictBuilder.TrainFromBuffer(buffer); // <-- throws ZstdNet.ZstdException: 'Error (generic)'

Frame requires too much memory for decoding (+ fix)

I've had some issues decompressing a single file; the others went fine.

My fix was the following:

using FileStream fileStreamInput = File.OpenRead("TheInputPath");
using FileStream fileStreamOutput = File.OpenWrite("TheOutputPath");

// This is the fix
IReadOnlyDictionary<ZSTD_dParameter, int> advancedParams = new Dictionary<ZSTD_dParameter, int>([
    new KeyValuePair<ZSTD_dParameter, int>(ZSTD_dParameter.ZSTD_d_windowLogMax, 31)
]);

DecompressionOptions decompressionOptions = new(null, advancedParams);
await using var decompressionStream = new DecompressionStream(fileStreamInput, decompressionOptions);
await decompressionStream.CopyToAsync(fileStreamOutput);

I took the 31 from ZSTD_WINDOWLOG_MAX_64:
https://github.com/facebook/zstd/blob/v1.4.5/lib/zstd.h#L1049

I have no idea what it all means, but I hope it helps someone stuck with the same issue 😃

PInvokeStackImbalance when calling from x86

When calling from x86 code, the Managed Debug Assistant breaks on PInvokeStackImbalance. This appears to be because the DllImport attributes in ExternMethods.cs do not specify CallingConvention = CallingConvention.Cdecl.
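
For illustration, a hedged sketch of the suggested change; the exact entry points and signatures in ExternMethods.cs may differ:

// Hypothetical example: declare a libzstd entry point with an explicit Cdecl
// calling convention so x86 P/Invoke calls keep the stack balanced.
[DllImport("libzstd", CallingConvention = CallingConvention.Cdecl)]
internal static extern UIntPtr ZSTD_compressBound(UIntPtr srcSize);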

Streaming support

Is streaming support planned, or was it left out due to technical limitations?
