
pump's Introduction

pump

pump is a small node module that pipes streams together and destroys all of them if one of them closes.

npm install pump

What problem does it solve?

When using standard source.pipe(dest), source will not be destroyed if dest emits close or an error. You are also not able to provide a callback to tell when the pipe has finished.

pump does these two things for you.

Usage

Simply pass the streams you want to pipe together to pump, and add an optional callback:

var pump = require('pump')
var fs = require('fs')

var source = fs.createReadStream('/dev/random')
var dest = fs.createWriteStream('/dev/null')

pump(source, dest, function(err) {
  console.log('pipe finished', err)
})

setTimeout(function() {
  dest.destroy() // when dest is closed pump will destroy source
}, 1000)

You can use pump to pipe more than two streams together as well:

var transform = someTransformStream()

pump(source, transform, anotherTransform, dest, function(err) {
  console.log('pipe finished', err)
})

If source, transform, anotherTransform, or dest closes, all of them will be destroyed.

Similarly to stream.pipe(), pump() returns the last stream passed in, so you can do:

return pump(s1, s2) // returns s2

Note that pump attaches error handlers to the streams to do internal error handling, so if s2 emits an error in the above scenario, it will not trigger process.on('uncaughtException'), even if you do not listen for the error yourself.

If you want to return a stream that combines both s1 and s2 into a single stream, use pumpify instead.
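
A short sketch of the pumpify alternative (assuming pumpify's documented usage, where pumpify(s1, s2) returns a single duplex stream):

var pumpify = require('pumpify')

// pipeline is a duplex stream: writing to it writes to s1, reading from it
// reads from s2, and an error in either stream destroys the whole pipeline.
var pipeline = pumpify(s1, s2)
return pipeline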

License

MIT

Related

pump is part of the mississippi stream utility collection, which includes more useful stream modules similar to this one.

For enterprise

Available as part of the Tidelift Subscription.

The maintainers of pump and thousands of other packages are working with Tidelift to deliver commercial support and maintenance for the open source dependencies you use to build your applications. Save time, reduce risk, and improve code health, while paying the maintainers of the exact dependencies you use. Learn more.

pump's People

Contributors

dignifiedquire, feross, mafintosh, pgte, phated, shinnn

pump's Issues

How to propagate errors in pipe using pump?

Hello,

I am trying to propagate errors that may occur in a pipe, but it is not working for me, and I am wondering if someone can help. I am trying to replace the _handleError(err) method by using pump. Please see the code snippet below:
[screenshot: code snippet, 2018-10-23]
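
For illustration, a minimal sketch of replacing a custom _handleError(err) with pump's single error-first callback (the file paths and streams here are hypothetical):

var pump = require('pump')
var fs = require('fs')

var source = fs.createReadStream('input.txt')    // hypothetical source stream
var dest = fs.createWriteStream('output.txt')    // hypothetical destination stream

// pump forwards the first error from any stream in the pipeline to one
// callback and destroys the other streams, so no per-stream _handleError
// wiring is needed.
pump(source, dest, function (err) {
  if (err) console.error('pipeline failed', err)
  else console.log('pipeline finished')
})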

Callback is called when destination file is still open if source stream emits error

Consider the following code:

const fs = require('fs')
const stream = require('stream')
const pump = require('pump')

fs.mkdirSync('foo')
const source = new stream.Readable({
    read() { this.emit('error', new Error('some error')) }
})
const dest = fs.createWriteStream('foo/dest')
pump(source, dest, err => {
    if (err) {
        fs.unlinkSync('foo/dest')
        fs.rmdirSync('foo') // Throws ENOTEMPTY
    }
})

It looks like the callback is called while the destination file is still open, which prevents the directory "foo" from being deleted.

As a workaround I can move the cleanup code a bit:

const fs = require('fs')
const stream = require('stream')
const pump = require('pump')

fs.mkdirSync('foo')
const source = new stream.Readable({
    read() { this.emit('error', new Error('some error')) }
})
const dest = fs.createWriteStream('foo/dest')
dest.on('close', () => {
    fs.unlinkSync('foo/dest')
    fs.rmdirSync('foo') // OK
})
pump(source, dest, () => {})

This works but the code gets really messy.

Any suggestions would be highly appreciated!

Tested on Windows 10 with node 9.1.0 and 9.5.0.

Question: Is pump compatible with the request library?

Greetings!

I'm trying to learn how to better manage stream-related errors in a web application. I'm handling file uploads to a web server using the request library (https://www.npmjs.com/package/request).

Example:

const readStream = fs.createReadStream('path/to/file.jpg');
const writeStream = request.post('http://localhost/uploads');
readStream.pipe(writeStream);

For the most part, this works. However, error handling is a little tricky and I suspect I'm not handling errors and closing all streams properly. What I want to do is make sure that any issues with either stream result in both being closed and cleaned up properly, to prevent leaking file descriptors.

I'm trying to use pump to help with that, as it seems like it should take care of that scenario for me. However, I observe something unexpected when executing the following code:

const readStream = fs.createReadStream('path/to/file.jpg');
const writeStream = request.post('http://localhost/uploads');
writeStream.on('end', () => console.log('the request stream has ended'));
pump(readStream, writeStream, (err) => console.log('PUMP has finished'));

When I run this example, I see "PUMP has finished" appear long before "the request stream has ended". This would indicate that pump thinks the stream is done before we get back an http response.

This poses a problem for me, since I am relying on pump to tell me when everything is finished, but its callback fires before this asynchronous task is completed.

Is this expected? Am I doing something wrong?

Versions:

pump: 3.0.0
request: 2.81.0
node: 8.9.3

pump on "unclosable" stream

According to the Node.js documentation, not all streams emit a close event; process.stdout is an example.

In the classic situation like this:

pump(readableStream, transformStream, process.stdout, function (err) {
  console.log('END', err)
})

the log is never printed

To make a workaround I wrote:

pump(stream, transform, function (err) {
  console.log('END', err)
}).pipe(outputStream)

Is there a more elegant way to write this?

pump with series tasks

Hello, many thanks for this module.
I use pump in the sass, compileSrc, compileLibs, and compileHtml tasks:

gulp.task('build',
    function (cb) {
        runSequence(
            'clean',
            'sass',
            'compileSrc',
            'compileLibs',
            'compileHtml',
            cb
        );
    }
);

Result of running build:

[17:05:37] Starting 'clean'...
[17:05:37] Finished 'clean' after 36 ms
[17:05:37] Starting 'sass'...
[17:05:39] Finished 'sass' after 1.52 s
[17:05:39] Starting 'compileSrc'...

Process finished with exit code 0

But when I run these tasks manually and separately, or use pipe instead, the result is OK.

Async errors not caught in Node < 10

I'm seeing different behaviour in Node 8 and 9 vs 10, 11 and 12.

The Node 8 docs suggest an error is emitted when calling Readable.prototype.destroy; however, pump doesn't propagate the error to the callback, e.g.:

const { Readable, Writable } = require('stream');
const pump = require('pump');

console.log(process.version);

const readable = new Readable({
    read() {
        process.nextTick(() => this.destroy(new Error('This *async* error *is not* caught by `pump` in Node < 10')))
        // this.destroy(new Error('This *sync* error *is* caught by `pump` in Node < 10'))
    },
});

const writable = new Writable({
    write(chunk, encoding, done) {
        console.log(chunk);
        done();
    }
})

pump(readable, writable, (ex) => {
    console.log('Finished');
    console.error(ex); // undefined in Node < 10
});

Surprisingly it can be caught by binding directly to the readable error event. e.g.

readable.on('error', (ex) => {
    console.log('Error handler', ex); // Error
});

Which is why I think it might be an issue with pump, rather than Node core?

"premature close" error when calling destroy on source stream

I am using Node.js 8.x, which has the new destroy method for Readable streams.

https://nodejs.org/docs/latest-v8.x/api/stream.html#stream_readable_destroy_error

The following generates a "premature close" error coming from the end-of-stream module:

var pump = require('pump')
var fs = require('fs')

var source = fs.createReadStream('/dev/random')
var dest = fs.createWriteStream('/dev/null')

pump(source, dest, function(err) {
  console.log('pipe finished', err)
})

setTimeout(function() {
  source.destroy();
}, 1000)

Any ideas?

Browser Support

I see #5, but browser support is not mentioned in the readme, and the module imports fs.

I can open a PR to run the tests in Sauce Labs like @feross mentioned, and add a badge to the readme if you want. I am not sure how to do it without a tool like mocha, but I am sure I could figure it out.

If you are bundling the package with a tool like browserify, you would have to exclude the fs module. There is a fairly simple way I have used to support smaller browser builds using a small DI wrapper. I can make a PR for this too if you are interested.

Thanks!!

Incrementally pipe additional streams & handle split streams

I'm blundering through stream error handling and found this great library, but unfortunately I think it really can't handle split streams (e.g. using readable-stream-clone).

For the most general solution, it seems we can't rely on having all the streams at once to pipe(), or that the pipe() topology is a linked list.

I'm imagining a StreamWrapper object that keeps track of all the piped streams, and does the pump eos destroy stuff, but allows you to incrementally pipe additional streams.

Any thoughts or existing options for this (a rough sketch follows below)? I'm reluctant to wade into these difficulties, but I think I need a more versatile solution. Thanks!
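
One rough direction (a sketch only, not an existing pump feature; all names here are hypothetical): track each stream with end-of-stream, the same primitive pump uses internally, and destroy every tracked stream once any of them errors or closes prematurely.

const eos = require('end-of-stream')

class StreamTracker {
  constructor () {
    this.streams = []
    this.destroyed = false
  }

  // Register a stream; if it errors or closes prematurely, tear down the group.
  track (stream) {
    if (this.streams.indexOf(stream) === -1) {
      this.streams.push(stream)
      eos(stream, (err) => {
        if (err) this.destroyAll(err)
      })
    }
    return stream
  }

  // Incrementally pipe two streams while tracking both, so the topology can be
  // a tree (e.g. a cloned/split readable) rather than a single linked list.
  pipe (source, dest) {
    this.track(source)
    this.track(dest)
    return source.pipe(dest)
  }

  destroyAll (err) {
    if (this.destroyed) return
    this.destroyed = true
    this.streams.forEach((stream) => {
      if (typeof stream.destroy === 'function') stream.destroy(err)
    })
  }
}

With something like this, tracker.pipe(source, clone1) and tracker.pipe(source, clone2) could share one cleanup path.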

Reduce the size of the npm package by limiting the included files

It looks like the files property (https://docs.npmjs.com/files/package.json#files) is not used in package.json to specify the included files, nor is the .npmignore file (https://docs.npmjs.com/misc/developers#keeping-files-out-of-your-package) being used to exclude unwanted files from the package published to npm.

Would you consider adding either the files property or the .npmignore file, so that the resulting package file would have a smaller size?

The current size can be seen when executing the command npm pack (https://docs.npmjs.com/cli/pack).

This issue was created via tawata
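
For reference, a minimal sketch of what a files whitelist could look like in package.json, assuming index.js is the only runtime file (npm adds package.json, the README and the license file automatically):

{
  "files": [
    "index.js"
  ]
}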

bring into node core

At the face to face in Berlin we brought up bringing this into core and exposing it as require('stream').pump. A couple questions:

  1. are you ok with this idea?
  2. do you want to open the pull or are you ok if I or @mcollina does so?
  3. How do you feel about waiving copyright on this and end-of-stream? It's OK if you don't want to do so, but if you do, it will make it significantly less complicated to bring it into node core.

callback in pump firing before _flush complete

This is probably a request for clarification rather than a bug report. I have a transform stream with an async _flush method. However, pump's callback fires before I call the callback in flush. As a result, my app terminates early. If instead of using pump, I just resolve a promise on the 'end' event, my app functions normally. Is there something about using async await inside _flush in a transform stream that breaks pump, or is it bad practice?
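
For illustration, a minimal sketch (hypothetical streams) of a transform whose _flush does asynchronous work; the flush callback is only invoked once that work completes, and pump's callback should not fire before then:

const { Readable, Writable, Transform } = require('stream')
const pump = require('pump')

const upper = new Transform({
  transform (chunk, enc, cb) {
    cb(null, chunk.toString().toUpperCase())
  },
  flush (cb) {
    // Simulated async finalization: cb runs only after the promise resolves.
    new Promise((resolve) => setTimeout(resolve, 100)).then(() => cb())
  }
})

const source = new Readable({
  read () {
    this.push('hello')
    this.push(null)
  }
})

const sink = new Writable({
  write (chunk, enc, cb) {
    console.log(chunk.toString())
    cb()
  }
})

pump(source, upper, sink, (err) => {
  console.log('pump finished', err) // expected only after flush's cb has run
})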

Question: "Error: task completion callback called too many times"

Hi, I'm getting this error:

Error: task completion callback called too many times

var gulp = require('gulp');
var sass = require('gulp-sass');
var uglify = require('gulp-uglify');
var autoprefixer = require('gulp-autoprefixer');
var cleanCSS = require('gulp-clean-css');
var rename = require('gulp-rename');
var runSequence = require('run-sequence');
var pump = require('pump');

var pumpCb = function (err) {
    if (err) {
        console.log('Error: ', err.toString());
    }
};

// Local

gulp.task('local-sass', function (pumpCb) {
    pump([
            gulp.src('public/assets/src/sass/**/*.scss'),
            sass(),
            autoprefixer(),
            gulp.dest('public/assets/dist/css/')
        ],
        pumpCb
    );
});

gulp.task('local-js', function (pumpCb) {
    pump([
            gulp.src('public/assets/src/js/**/*.js'),
            gulp.dest('public/assets/dist/js/')
        ],
        pumpCb
    );
});

// Production

gulp.task('prod-sass', function (pumpCb) {
    return pump([
            gulp.src('public/assets/src/sass/**/*.scss'),
            sass({outputStyle:'compact'}),
            autoprefixer(),
            gulp.dest('public/assets/dist/css/')
        ],
        pumpCb
    );
});

gulp.task('prod-css', function (pumpCb) {
    return pump([
            gulp.src(['./public/assets/dist/css/**/*.css','!./public/assets/dist/css/**/*.min*']),
            cleanCSS(),
            rename({ suffix: '.min' }),
            gulp.dest('public/assets/dist/css/')
        ],
        pumpCb
    );
});

gulp.task('prod-js', function (pumpCb) {
    return pump([
            gulp.src('public/assets/src/js/**/*.js'),
            uglify(),
            rename({ suffix: '.min' }),
            gulp.dest('public/assets/dist/js')
        ],
        pumpCb
    );
});

// Calls

gulp.task('default', function () {
    gulp.watch('public/assets/src/sass/**/*.scss',['local-sass']);
    gulp.watch('public/assets/src/js/**/*.js',['local-js']);
});

gulp.task('prod', function () {
    runSequence('prod-sass','prod-css','prod-js');
});

I'm not sure what's going wrong.
Searching for answers didn't yield anything that helped.

Pass error in stream.destroy()

Why is stream.destroy not called with the err argument?

In my stream object it would be nice to get the error.

class AskReply extends Transform {
  destroy(error) {  //error is always undefined
    return super.destroy(error);
  }
} 

stream remains active after pump callback when error does not terminate stream

If a readable stream in a pump chain emits an error but continues to produce data the pump callback will be invoked at the point of error and the stream will continue to run. The callback will not be re-invoked when/if the streams actually close. This appears to be due to internal use of end-of-stream which assumes "failure" and closure are coincident.

Sample code here.

A stream that continues after an error is certainly unusual, but it's legal, and pump does not fulfill its responsibility by invoking the callback and shutting down the remaining pipe stages when the stream actually closes or errors at dest in this situation. I'm not going to argue strongly that it should, but IMO saying it shouldn't isn't trivially justifiable either. So, your call.

Multiple pumps inside a single task

Is there a best-practice way of using multiple pumps inside a single task and calling the callback after all of them have finished?

Maybe a simplified demo clarifies the question:

gulp.task('compress', function (cb) {
  const folders = ['folder1', 'folder2'];

  folders.forEach((folder) => {
    pump([
        gulp.src(`lib/${folder}/*.js`),
        uglify(),
        gulp.dest(`${folder}/dist/`)
      ],
      cb // <= this is wrong of course
    );
  });  

  // cb(); // calling it here might be as wrong as well
});

This is, of course, not a real life code snippet. But it shows the need to run multiple pumps inside a single task. In my real case the source and destination folders are a little bit more complex.

Is there any way to solve that with pump?
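
One possible pattern (a sketch, not an official recommendation): count the pumps and call the task callback only once all of them have finished, bailing out on the first error.

gulp.task('compress', function (cb) {
  const folders = ['folder1', 'folder2'];
  let pending = folders.length;
  let failed = false;

  folders.forEach((folder) => {
    pump([
        gulp.src(`lib/${folder}/*.js`),
        uglify(),
        gulp.dest(`${folder}/dist/`)
      ],
      (err) => {
        if (failed) return;
        if (err) {
          failed = true;
          return cb(err); // fail the task on the first error
        }
        if (--pending === 0) cb(); // all pumps finished successfully
      }
    );
  });
});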

Errors are not triggered

Please see the 'end-of-stream' module's second parameter, options:

eos(stream, {readable: reading, writable: writing}, function (err) {
    if (err) return callback(err)
    closed = true
    callback()
  })

Can you change the code to:

eos(stream, {readable: reading, writable: writing, error: true}, function (err) {
    if (err) return callback(err)
    closed = true
    callback()
  })

Question: when not to use `pump`?

Consider this example.

As soon as the stream data has been collected, the HTTP connection is closed just before someAsyncStuff is invoked. Refactoring pump(req, busboy... to req.pipe(busboy) fixes the issue. It's also my understanding that using pump is a best practice for piping streams (for error handling and some clean-up), but in this case it introduces an issue. I would like to understand why this example is not working as I expected.

import * as Busboy from 'busboy'
import * as express from 'express'
import * as pump from 'pump'
import concat = require('concat-stream')

const app = express()

app.post('/', (req, res) => {
  const busboy = new Busboy({ headers: req.headers })

  busboy.on('file', (_, file) => {
    pump(
      file,
      concat(buffer =>
        someAsyncStuff(buffer.toString())
          .then(length => res.send({ length }))
          .catch(err => res.status(500).send(err.message))
      ),
      err => {
        if (err) res.status(500).send(err.message)
      }
    )
  })

  pump(req, busboy, err => {
    if (err) res.status(500).send(err.message)
  })
})

function someAsyncStuff(s: string): Promise<number> {
  return new Promise(resolve => setTimeout(() => resolve(s.length), 1))
}

app.listen('3000')

http2

http2 uses streamClosed instead of close which pump doesn't seem to take into account?

Pump closes the stream after just 16 iterations

running this script:

const fs = require('fs');
const split = require('split2');
const through2 = require('through2');
const pump = require('pump');

const Job = class {

    constructor(file) {
        this.file = file;
        this.counter = 0;
    }

    parseLine() {
        return through2.obj((data, enc, cb) => {
            this.counter++;
            const buffer = new Buffer(data);
            console.log(this.counter);
            return cb(null, buffer);
        });
    }

    stream() {
        return new Promise((resolve, reject) => {
            pump(
                fs.createReadStream(this.file, 'utf-8'),
                split(),
                this.parseLine(),
                err => {
                    if (err) {
                        console.log(err);
                        return reject(err);
                    } else {
                        console.log(this.counter);
                        return resolve('over');
                    }
                }
            );
        });
    }
}

const runJob = async (file) => {
    const job = new Job(file);
    const res = await job.stream();
    console.log(res);
}

runJob('/home/branko/Desktop/shards/xab_1.jsonl');

parses just 16 records, although there are like 2k in the file

if I add a write stream which I don't need, like this:

stream() {
    const writeStream = fs.createWriteStream('/home/branko/Desktop/file');
    return new Promise((resolve, reject) => {
        pump(
            fs.createReadStream(this.file, 'utf-8'),
            split(),
            this.parseLine(),
            writeStream,
            err => {
                if (err) {
                    console.log(err);
                    return reject(err);
                } else {
                    console.log(this.counter);
                    return resolve('over');
                }
            }
        );
    });
}

it parses through all records. I just need one stream

Use pump with a Transform Stream as final stream object

Using pump like this:

pump.apply null, [
  tcpSocket
  transformStream
  writableStream
  handleEndFn
]

will call the handleEndFn just fine when the TCP socket breaks / closes.

Now, due to the way the destroyer passes its arguments (based on the stream object position) to the end-of-stream module, the following

pump.apply null, [
  tcpSocket
  transformStream
  transformStream
  handleEndFn
]

will not trigger the handleEndFn function (allowHalfOpen defaults to true).

What would be the handiest way to work around this constraint of pump while still using a Transform stream as the last stream object in the pipeline?

pump swallowing errors when read stream is empty

pump's behavior is inconsistent with pipe: it swallows writable errors when the readable stream is empty. For example, this code will log undefined:

const {Readable} = require('stream');
const {createWriteStream} = require('fs');
const pump = require('pump');

const inStream = new Readable({
  read() {}
});

inStream.push(null);

pump(inStream, createWriteStream('/'), console.log);

And this one will show an error:

const {Readable} = require('stream');
const {createWriteStream} = require('fs');
const pump = require('pump');

const inStream = new Readable({
  read() {}
});

inStream.push(null);
inStream.pipe(createWriteStream('/'));

This relates to nodejs/node#24517

pump throttle

Could you verify that this is where a throttle would be placed?

buffered data in readable stream does not reach writable stream

If an error occurs all streams are destroyed. This prevents buffered data in the pipe from reaching the destination.

Here is a test case:

const EventEmitter = require('events');
const net = require('net');
const pump = require('pump');
const { Writable } = require('stream');

class FakeParser extends Writable {
  constructor(opts) {
    super(opts);
  }

  _write(chunk, enc, cb) {
    console.log(chunk);
    setTimeout(cb, 200);
  }
}

const ee = new EventEmitter();
const opts = { allowHalfOpen: true };
const server = net.createServer(opts);

server.on('connection', (socket) => {
  const parser = new FakeParser({ highWaterMark: 3 });
  pump(socket, parser, console.error);

  setTimeout(() => {
    ee.emit('trigger econnreset');
    socket.write('foo');
  }, 150);
});

server.listen(() => {
  const socket = net.createConnection(
    Object.assign({ port: server.address().port }, opts)
  );

  ee.on('trigger econnreset', () => socket.destroy());

  socket.on('connect', () => {
    socket.write('foo');
    setTimeout(() => socket.write('bar'), 50);
    setTimeout(() => socket.write('baz'), 100);
  });
});

Expected result:

All three chunks are processed by the writable stream.

Actual result:

Only the first is processed.


This is probably done by design but I'd like to confirm this.

pump + webpack-stream exits watch task

The combination of webpack-stream and pump exits the watch task on error during webpack execution. I used the same pattern as the uglify example here: https://github.com/gulpjs/gulp/tree/master/docs/why-use-pump

Is this most likely an issue within webpack-stream (maybe it doesn't exit correctly?), or is there an option in pump to not exit on error? I thought pump did exactly this: reliably catch errors without exiting the running watch process...

objectMode piping?

Does pump handle object streams correctly? I'm having trouble getting it to work, e.g.

const through = require('through2')
const ndjson = require('ndjson')
const pump = require('pump')

const rs = process.stdin
const ts = ndjson.parse()
const ws = through.obj(function (chunk, enc, cb) {
  console.log(typeof chunk)
})

// this works fine
rs.pipe(ts).pipe(ws)
// => 'object'

// this turns objects back into strings
pump(rs, ts, ws)
// => 'string'

Any idea what's going on?

returning last stream

Before version 2, pump() used to return the last stream, and apparently now it doesn't.
Even though this wasn't a documented feature before, I find it really useful to simplify the code.

pump@1:

return pump(s1, s2)

pump@2:

pump(s1, s2)
return s2

Do you think this could be a useful feature? If so, I'm willing to add a PR.

Callback not called with stdout/stderr

Probably because they don't emit the finish event, but it would be nice to handle them properly anyway.

It's unfortunate because in my CLI I often do:

pump(
  inputStream,
  output  === undefined || output === '-'
    ? process.stdout
    : createWriteStream(output),
  () => {
    // do some cleanup after transfer
  }
)

Pump synchronized?

I have an issue. I'm using pump in every task I have (there are 4):

  • compileTS
  • minify
  • concat
  • unitTests

I need all these tasks to be synchronous, so I use gulp-sync. But it still seems that the tasks are asynchronous. I'm wondering if it comes from pump?

Note: I can provide more details if needed.

Consider switching to promise based API

Since promises are being embraced more and more in the Node.js community, I would just like to see if there is any interest in releasing a new major version of this module which returns a Promise instead of taking a callback. I personally would find this very convenient since I'm currently "promisifying" this library before I use it. This would also solve #9 in the same go...

I'd be happy to submit a pull request, thanks for the consideration 👌
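
For reference, a sketch of the promisification mentioned above; since pump takes an error-first callback as its last argument, util.promisify can wrap it as-is:

const { promisify } = require('util')
const fs = require('fs')
const pump = require('pump')

const pumpAsync = promisify(pump)

async function copyFile (src, dest) {
  // Resolves when the pipeline finishes; rejects if any stream errors or closes early.
  await pumpAsync(fs.createReadStream(src), fs.createWriteStream(dest))
}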

Using `pump(zipStream, writeStream)`, the zip file on disk is nested one level deeper

For example: the zip file upload.zip includes a foo.json. I upload it using a stream, and when I get the stream at the server, I do this:

const ctx = this.ctx;
const stream = await ctx.getFileStream();
const target = path.join(this.app.baseDir, 'app/upload', 'demo.zip');
const writeStream = createWriteStream(target);
pump(stream, writeStream);

The file rendered on disk is actually demo.zip containing upload -> foo.json. If I do this instead, it works normally:

const ctx = this.ctx;
const stream = await ctx.getFileStream();
const target = path.join(this.app.baseDir, 'app/upload', 'demo.zip');
const writeStream = createWriteStream(target);
stream.pipe(writeStream);
stream.on('finish', () => {
  stream.close();
});

Arrays-in-arrays to create reusable functions

Using gulp, we often find ourselves creating pipelines that end with the same few transforms.

    pump([
        src("somefiles/**/*.js"),
        somtransformation(),
        rename(renameMin),
        uglify(),
        dest("/somedirectory")
    ], cb);

It would be nice if the input arguments could be flattened first, to allow for subarrays:

    pump([
        src("somefiles/**/*.js"),
        somtransformation(),
        [
            rename(renameMin),
            uglify(),
            dest("/somedirectory")
        ]
    ], cb);

This would enable us to write this code, where the function tail() returns a partial pipeline:

const tail = function(dir) {
    return [ 
        rename(renameMin),
        uglify(),
        dest(dir)
    ];
}

    pump([
        src("somefiles/**/*.js"),
        somtransformation(),
        tail("/somedirectory")
    ], cb);
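
Until something like this is supported, a userland sketch of the same idea: flatten the nested arrays before handing them to pump (this assumes pump accepts an array of streams, as in the gulp examples above).

    // Recursively flatten nested arrays of streams before calling pump.
    const flatten = (arr) => arr.reduce(
        (acc, item) => acc.concat(Array.isArray(item) ? flatten(item) : item),
        []
    );

    pump(flatten([
        src("somefiles/**/*.js"),
        somtransformation(),
        tail("/somedirectory")
    ]), cb);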

Question: use pump with substreams

Is this correct usage of pump? I would like to unzip a tar.gz file, which contains several files.

  var zlib = require('zlib');
  var tar = require('tar');
  var fs = require('fs');
  var path = require('path');
  var pump = require('pump');

  // 'extract' (the tar extract stream) and 'destFolder' are assumed to be
  // defined elsewhere; they are not shown in the original snippet.
  var source = fs.createReadStream('/dev/random');

  pump(
    source,
    zlib.Gunzip(),
    extract
      .on('entry', (entry) => {
        const entryDest = path.join(destFolder, entry.path);
        pump(entry, fs.createWriteStream(entryDest), (err) => {
          console.log(err);
        });
      }),
    (err) => {
      console.log(err);
    }
  );
