
audio-recorder-polyfill's Introduction

Audio Recorder Polyfill

MediaRecorder polyfill to record audio in Edge and Safari. Try it in the online demo and see the API.

  • Spec compatible. In the future, when all browsers support MediaRecorder, you will be able to remove the polyfill.
  • Small. 1.11 KB (minified and gzipped). No dependencies. It uses Size Limit to control its size.
  • One file. In contrast to other recorders, this polyfill uses an “inline worker” and does not need a separate file for the Web Worker.
  • MP3 and WAV encoder support.

navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
  recorder = new MediaRecorder(stream)
  recorder.addEventListener('dataavailable', e => {
    audio.src = URL.createObjectURL(e.data)
  })
  recorder.start()
})
Sponsored by Evil Martians

Install

Install package:

npm install --save audio-recorder-polyfill

We recommend creating a separate webpack/Parcel bundle with the polyfill. This way, the polyfill will be downloaded only by Edge and Safari, and browsers with native support will download less.

Files recorded without the polyfill will not be playable in Safari, so it is highly recommended to convert them to MP3 on the back end of your application. If that is not an option, you can use the polyfill in all browsers to force the audio into the right format, at the cost of client performance.

For webpack, add the polyfill as a separate entry point:

  entry: {
    app: './src/app.js',
+   polyfill: './src/polyfill.js'
  }

Install the polyfill as MediaRecorder in this new bundle, src/polyfill.js:

import AudioRecorder from 'audio-recorder-polyfill'
window.MediaRecorder = AudioRecorder

Add this code to your HTML to load this new bundle only for browsers without MediaRecorder support:

+   <script>
+     if (!window.MediaRecorder) {
+       document.write(
+         decodeURI('%3Cscript defer src="/polyfill.js">%3C/script>')
+       )
+     }
+   </script>
    <script src="/app.js" defer></script>

ES Modules

The polyfill supports ES modules. You do not need to do anything special for bundlers.

For quick hacks, you can load the polyfill from a CDN. Do not use this in production because of its low performance.

import AudioRecorder from 'https://cdn.jsdelivr.net/npm/audio-recorder-polyfill/index.js'
window.MediaRecorder = AudioRecorder
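
If your bundler supports code splitting, a dynamic import() is another way to load the polyfill only in browsers that need it. This is a minimal sketch, not part of the official setup; it assumes your application waits for the import to finish before creating a MediaRecorder:

// Load the polyfill on demand, only in browsers without MediaRecorder.
// Assumes the bundler can split the dynamic import into its own chunk.
if (!window.MediaRecorder) {
  import('audio-recorder-polyfill').then(module => {
    window.MediaRecorder = module.default
  })
}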

Usage

First, show a warning in browsers without Web Audio API support:

if (MediaRecorder.notSupported) {
  noSupport.style.display = 'block'
  dictaphone.style.display = 'none'
}

Then you can use standard MediaRecorder API:

let recorder

recordButton.addEventListener('click', () => {
  // Request permissions to record audio
  navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
    recorder = new MediaRecorder(stream)

    // Set the recording as the <audio> source when it is available
    recorder.addEventListener('dataavailable', e => {
      audio.src = URL.createObjectURL(e.data)
    })

    // Start recording
    recorder.start()
  })
})

stopButton.addEventListener('click', () => {
  // Stop recording
  recorder.stop()
  // Remove “recording” icon from browser tab
  recorder.stream.getTracks().forEach(i => i.stop())
})
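
If you need to know when the final file is ready (for example, to enable an upload button), you can also listen for the standard stop event. A small sketch, assuming no timeslice is used so the single dataavailable event carries the whole recording, and uploadButton is a hypothetical element:

let lastRecording

recorder.addEventListener('dataavailable', e => {
  lastRecording = e.data
})

recorder.addEventListener('stop', () => {
  // Without a timeslice, dataavailable fires once with the complete file,
  // so lastRecording now holds the finished recording
  uploadButton.disabled = !lastRecording
})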

If you need to upload the recording to a server, we recommend using a timeslice. MediaRecorder will then emit recorded data at the specified interval (in milliseconds), so you can start uploading before the recording has finished.

// Will be executed every second with the next part of the audio file
recorder.addEventListener('dataavailable', e => {
  sendNextPiece(e.data)
})
// Dump audio data every second
recorder.start(1000)
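
sendNextPiece() above is not part of the polyfill; here is one possible implementation using fetch(), where the /upload endpoint and the field name are assumptions for illustration:

// Hypothetical uploader for the snippet above: posts every recorded
// chunk to an assumed /upload endpoint as multipart form data
function sendNextPiece (blob) {
  let form = new FormData()
  form.append('chunk', blob, 'chunk.wav')
  fetch('/upload', { method: 'POST', body: form })
    .catch(error => console.error('Chunk upload failed', error))
}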

Audio Formats

Chrome natively records only to .webm files, Firefox only to .ogg.

You can read the resulting file format from e.data.type:

recorder.addEventListener('dataavailable', e => {
  e.data.type //=> 'audio/wav' with polyfill
              //   'audio/webm' in Chrome
              //   'audio/ogg' in Firefox
})
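
Because the format differs between browsers, it can help to pick the file extension from the reported MIME type before saving or uploading. A small sketch (saveRecording() is a hypothetical helper):

// Map the MIME types listed above to file extensions
let extensions = {
  'audio/wav': 'wav',   // polyfill
  'audio/webm': 'webm', // Chrome
  'audio/ogg': 'ogg',   // Firefox
  'audio/mpeg': 'mp3'   // polyfill with the MP3 encoder
}

recorder.addEventListener('dataavailable', e => {
  // Chrome may report a type like "audio/webm;codecs=opus"
  let type = e.data.type.split(';')[0]
  let extension = extensions[type] || 'bin'
  saveRecording(new File([e.data], 'record.' + extension, { type: e.data.type }))
})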

WAV

By default, this polyfill saves recordings as .wav files. The compression is not very good, but encoding is fast and simple.

MP3

For better compression, you can use the MP3 encoder:

import AudioRecorder from 'audio-recorder-polyfill'
import mpegEncoder from 'audio-recorder-polyfill/mpeg-encoder'

AudioRecorder.encoder = mpegEncoder
AudioRecorder.prototype.mimeType = 'audio/mpeg'
window.MediaRecorder = AudioRecorder
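
With the MP3 encoder enabled, recorded blobs report the new MIME type. A quick sanity check, assuming the setup above:

recorder.addEventListener('dataavailable', e => {
  e.data.type //=> 'audio/mpeg'
})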

Limitations

This polyfill tries to be compatible with the MediaRecorder API, but it still has small differences.

  • The WAV format stores the duration in the file header. As a result, with a timeslice or a requestData() call, dataavailable will receive a separate file with its own header on every call. In contrast, the native MediaRecorder sends the header only with the first dataavailable event; later events receive additional bytes of the same file. See the sketch after this list for one way to handle this.
  • Constructor options are not supported.
  • BlobEvent.timecode is not supported.
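
Because of the WAV header difference described above, do not simply concatenate polyfill chunks into one Blob; each dataavailable payload is already a complete file. A minimal sketch of handling chunks separately (uploadChunk() is a hypothetical helper):

let part = 0

recorder.addEventListener('dataavailable', e => {
  // With the polyfill and WAV, every chunk is a standalone file
  // with its own header, so store or upload it separately
  uploadChunk(e.data, 'record-part-' + part++ + '.wav')
})

recorder.start(1000)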

Custom Encoder

If you need an audio format with better compression, you can change the polyfill’s encoder:

  import AudioRecorder from 'audio-recorder-polyfill'
+ import customEncoder from './ogg-opus-encoder'
+
+ AudioRecorder.encoder = customEncoder
+ AudioRecorder.prototype.mimeType = 'audio/ogg'
  window.MediaRecorder = AudioRecorder

The encoder should be a function containing the Web Worker body. The polyfill converts this function to a string to create the Web Worker.

module.exports = () => {
  function init (sampleRate) {
    // Prepare the encoder for the given sample rate
  }

  function encode (input) {
    // Encode the next buffer of raw audio samples
  }

  function dump (sampleRate) {
    // Finish encoding and send the resulting file bytes back to the polyfill
    postMessage(output)
  }

  onmessage = e => {
    if (e.data[0] === 'init') {
      init(e.data[1])
    } else if (e.data[0] === 'encode') {
      encode(e.data[1])
    } else if (e.data[0] === 'dump') {
      dump(e.data[1])
    }
  }
}

audio-recorder-polyfill's People

Contributors

ai, dependabot[bot], fjwong, floydback, harwee, huigegegithub, jiyonghong, joekrill, karthikmuralidharan, mick88, progfay, serverwentdown, sseppola, tomquirk, wmattei, youkaclub


audio-recorder-polyfill's Issues

Conditionally importing for Safari and Edge

Hi, thanks for this very useful polyfill.

My question is: is there any way to avoid overriding the original MediaRecorder?

In my project it looks something like this:

import * as MediaRecorderPoly from 'audio-recorder-polyfill';

This still overrides it globally, and I only wanted that behaviour in Safari and Edge.

Can this library be used with React for a PWA?

Hi there,

I was researching libraries that handle audio recording for my React PWA on both iOS and Android. I was wondering if this library can be used together with React? If yes, are there any examples out there?

iOS supported?

The polyfill starts recording on iOS 13, but the recorded sound can't be played back after stopping. What's wrong?

iOS supported?

See the earlier “iOS supported?” issue #36.

It's working on iPhone XS, iPad, etc., but not on iPhone X.

After the recording is stopped there is no error, but the WAV file does not play and is invalid.

Built version including mp3 encoding

Hi, I develop in a buildless environment. How can I get a built version of the polyfill to include in my project? I tried to build the repo but it failed on node-env.
Thanks

onaudioprocess is not being fired in iOS Safari.

I'm looking forward to this polyfill :)

In the current version I have a problem with playback of the blob in iOS Safari 11 (the problem also occurs in the demo). Turning the blob into a URL with createObjectURL() and attaching it to a player doesn't work in iOS Safari 11.

Video support

Any chance it would also record video type streams? :)

Trouble with Safari playing sound

Hello,

I have some trouble playing the recorded sound on Safari 11.
It seems like the created blob doesn't load.

It used to work before, and now the Safari player is broken.
I first thought it had to do with my code, but then I went back to your demo and the same bug is there.

Are you aware of this problem, and do you have any clues on how to solve it?

Thank you in advance

44-byte WAV files on Safari 13.0.5

Hello, on a recent version of Safari some recordings are 44 bytes in size with no audio. Approximately 70% of the recordings are good, and 30% have no audio. There is no apparent reason I can think of. What's the problem?

this.requestData is not a function

I received this error when terminating the recording with a timer and absolutely cannot figure out why it appears.

The error references https://github.com/ai/audio-recorder-polyfill/blob/master/index.js#L133 as the source of the issue, but since the class clearly has a method of that name, I'm a bit lost on why the error gets thrown.

Sorry if I'm missing the obvious issue here.

I'm using the suggested polyfill implementation given in the README with Webpack.

Thank you for your help :)

Add documentation on how to build library and use it

During the development phase, I need to use a built version of the polyfill. I ran 'npm run start' and a /test/demo/build directory was created.
I added <script src="/audio-recorder-polyfill-master/test/demo/build/polyfill.js"></script> in index.html.

How do I access the library? In particular, I want to compress files to MP3. I am not sure how to use
https://github.com/ai/audio-recorder-polyfill#mp3 with the built library.

Glitchy audio recordings on low end/older mobile devices

Hi, we've created a website that lets people record 2-minute audio clips and then upload them. It looks like on lower-end mobile devices we get very choppy recordings, for example:

https://prod.roundware.com/rwmedia/20200611-144343-72862.mp3

I think it was recorded on an Android 8 device:

Mozilla/5.0 (Linux; Android 8.1.0; 6062W) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Mobile Safari/537.36

Our code is basically the example from your readme, but configured to create the blob as MP3. (Here's a snippet.)

// Record with mp3.
MediaRecorder.encoder = mpegEncoder;
MediaRecorder.prototype.mimeType = "audio/mpeg";

startRecordButton.addEventListener("click", () => {
  console.log("Start recording!");

  // Request permissions to record audio
  navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
    recorder = new MediaRecorder(stream);

    // Set record to <audio> when recording will be finished
    recorder.addEventListener('dataavailable', e => {
      console.log('data available: ' + e.data.size);
      saveBlob = e.data;
    })

    // Start recording
    recorder.start();
  })
})

Is there any way to improve the quality of recordings on these lower end devices? The glitching seems to happen even for short audio recordings, so it doesn't seem like it's the size of the blob that causes it.

Thanks.

MP3 encoder seems to record at ~90% normal speed?

I use the polyfill for explain.mit.edu, which syncs the audio to blackboard visuals.

However, I noticed that when I switched from the WAV encoder to the MP3 encoder, my voice becomes deeper and slower, and the visuals get more and more ahead of the audio (1 second ahead after 10 seconds, 2 seconds ahead after 20 seconds, etc.). My hypothesis is that the MP3 encoder records at maybe 90% of normal speed because it has a non-trivial encoding process.

A personal workaround is to slow down my app's internal timer to match the slowed MP3 audio.
However, I would prefer one of the following solutions (but don't know how):

  1. Find a way to read the MediaRecorder's current timestamp, and let the audio be the source of truth for timing. Is this possible?
  2. Verify that the MP3 encoder does record at a slowed rate, and wait for the fix. Is this on your roadmap?

Thanks for reading and appreciate any help.

How to know the AudioContext.currentTime when recording started

When mediaRecorder.start() is called, some time can pass until it really records the first sample coming from a media device. Is there a way to know this first sample's position in time? Maybe using the AudioContext clock (AudioContext.currentTime)?

I need this because I record my voice while playing back some music, and I need to sync them perfectly afterwards.

No sound on iOS Safari.

Hello
I am experiencing a weird issue on iOS Safari. Audio seems to be recorded - the duration even matches the recording length - but there is still no sound. Manipulating the native controls has no effect. I tried switching the package version to "audio-recorder-polyfill": "ai/audio-recorder-polyfill" as suggested in topic #24, but no luck with that either. Here is a short summary of what I am trying to achieve. Basically, a simple click on a button triggers MediaRecorder. It happens in parallel with Azure's Speech Recognizer SDK:

// first click
const startSpeechRecognition = ({ callback, key, token }) => {
  const speechConfig = SpeechSDK.SpeechConfig.fromAuthorizationToken(token, SPEECH_REGION)
  speechConfig.speechRecognitionLanguage = SPEECH_RECOGNITION_LANG
  const audioConfig = SpeechSDK.AudioConfig.fromDefaultMicrophoneInput()
  const recognizer = new SpeechSDK.SpeechRecognizer(speechConfig, audioConfig)
  const recorder = setupRecorder(callback)

  return new Promise((resolve, reject) => {
    recognizer.startContinuousRecognitionAsync(
      () => recorder.then((r) => {
        r.recorder.start()
        resolve({
          recognizer,
          recorder
        })
      }),
      err => reject(err)
    )
  })
}

The internal call setupRecorder(callback) is defined as below:

const setupRecorder = callback => navigator.mediaDevices
  .getUserMedia({ audio: true })
  .then((stream) => {
    const data = []
    const recorder = new MediaRecorder(stream)
    recorder.addEventListener(MEDIA_EVENT.START, (e) => {
      data.length = 0
    })
    recorder.addEventListener(MEDIA_EVENT.DATA_AVAILABLE, (e) => {
      data.push(e.data)
    })
    recorder.addEventListener(MEDIA_EVENT.STOP, () => {
      callback(data)
      recorder.stream.getTracks().forEach(track => track.stop())
    })

    return { recorder }
  })

A second click on the button stops the Speech Recognizer and triggers the MediaRecorder stop event:

// second click
const stopSpeechRecognition = ({ recognizer, recorder }) => new Promise((resolve, reject) => {
  recognizer.stopContinuousRecognitionAsync(
    () => resolve(recorder.then(r => r.recorder.stop())),
    err => reject(err)
  )
})

So the callback from the MediaRecorder "stop" event handler is invoked with the stream data, which is later parsed into a Blob.

// callback is defined inside Vue's component instance, so that's why you see "this" here
const callback = (data) => {
  const audio = speechService.getAudioData({
    data,
    format: this.audioData.format,
    userId: this.selectUserId()
  })
  this.audioData = {
    ...this.audioData,
    ...audio
  }
}

// and in speechService:
const getAudioData = ({ data, format }) => {
  if (!data) { return }

  const blob = new Blob(data, { type: format })
  const blobUrl = window.URL.createObjectURL(blob)

  return {
    blob,
    blobUrl,
    type: format
  }
}

For iOS Safari I am using the default WAV format, as pointed out in the docs.
Any tips on what could be wrong?

Cannot stop recorder on iOS

On iOS 12 calling recorder.stop() has no effect.

Here's my CodePen:

https://codepen.io/tonybogdanov/pen/bQXRBZ

As you can see, the code asks for a microphone, records for 2 seconds, then stops and reports to the console. On browsers with a native implementation the console looks like this:

"asking for mic"
"starting"
"event" "start"
"stopping"
"event" "dataavailable"
"event" "stop"

With the polyfilled MediaRecorder the output is like this:

"asking for mic"
"starting"
"event" "start"
"stopping"

As you can see, neither the stop nor the dataavailable events are fired. Either there's a problem with both, or the stop() call is not executing properly and the recorder keeps running.

How to properly concat WAV blobs?

I know that there is a problem with WAV, due to:

WAV format contains duration in the file header. As result, with timeslice or requestData() call, dataavailable will receive a separated file with header on every call. In contrast, MediaRecorder sends header only to first dataavailable. Other events receive addition bytes to the same file.

But do you have any idea how to properly concatenate WAV blobs? I'm following the docs:

const mediaRecorder = new MediaRecorder(stream, { mimeType: "audio/wav" });

mediaRecorder.addEventListener("dataavailable", (e) => {
    chunks.push(e.data);
})

then somewhere:

mediaRecorder.stop();
const blob = new Blob(chunks);
const url = URL.createObjectURL(blob);
audio.src = url;

and the duration of the recorded audio is always 1 second. Is it possible to solve this problem somehow?

Doesn't work in Safari

I tried Safari on OS X High Sierra and Mojave; the recorded sound is partial and metallic.
Any suggestions?

Add more information on how to install

Hi I'm sorry,

I don't have a lot of JS experience and don't find the documentation very clear on how to install.
I followed the instructions and installed with npm. I have a simple HTML site.

When trying to load the polyfill JS I get:
ReferenceError: require is not defined

Is just doing
npm install --save audio-recorder-polyfill
enough?

Also, where would this code go on a simple HTML site?

entry: {
  app: './src/app.js',
  polyfill: './src/polyfill.js'
}

This is the code I'm using for app.js:

navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
  recorder = new MediaRecorder(stream)
  recorder.addEventListener('dataavailable', e => {
    audio.src = URL.createObjectURL(e.data)
  })
  recorder.start()
})

I installed Parcel but now I'm even more confused.

Invalid blobs on 13.0.5

Spec:

  • Safari 13.0.5
  • macOS 10.15.3

When I try the default code from the readme, the data in the 'dataavailable' event is invalid. Looking at these blobs, the event only returns two distinct file sizes: 44 B and 4140 B. The 44 B file gives an error in the player, while the 4140 B file can be played but contains no sound.

In order to investigate the issue I looked at the https://ai.github.io/audio-recorder-polyfill/ example, but it gave exactly the same results (two different file sizes, both containing no useful information). I looked at the #4 discussion and tried rebooting my machine. This allowed me to get one recording, but all recordings after that failed again. I rebooted again, but then it immediately gave me a 44 B file.

If you need further investigation from my side, please let me know.

Uncaught SyntaxError: Unexpected end of input

I'm using this library with the latest version of create-react-app for audio recording on iOS Safari devices.
I created src/polyfills.js and imported it into my src/index.js.

the content of src/polyfills.js:

import AudioRecorder from 'audio-recorder-polyfill'
window.MediaRecorder = AudioRecorder

my src/index.js:

...
import './polyfills'
...

my recorder is the same as in your example:

navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
  recorder = new MediaRecorder(stream)
  recorder.addEventListener('dataavailable', e => {
    audio.src = URL.createObjectURL(e.data)
  })
  recorder.start()
})

but it gives me the error:
Uncaught SyntaxError: Unexpected end of input at line 74 (EOF).

Recreate MediaRecorder after every stop()

There is a difference between the native and polyfill MediaRecorder behavior.

With the native one I can do:

recorder = new MediaRecorder(stream, options)
recorder.start()
recorder.stop()
recorder.start() // start recording second time

With the polyfill this works incorrectly, so I must do:

recorder = new MediaRecorder(stream, options)
recorder.start()
recorder.stop()
recorder = new MediaRecorder(stream, options) // need re-create recorder
recorder.start()

Bug: when I start recording with the polyfill a second time (without re-creating it), the resulting audio stutters (and has an incorrect duration, roughly 2x). When I start recording a third time, this effect gets worse.

Maybe it would be better to make the polyfill behave like the native MediaRecorder?

Issue with createWorker function

When using the latest version 0.3.2 of this (great!) lib, I receive the following error:
Uncaught SyntaxError: Unexpected end of input

I narrowed it down to the createWorker function, which removes the trailing brace "}" for some reason:

let js = fn
  .toString()
  .replace(/^function\s*\(\)\s*\{/, '')
  .replace(/\}$/, '')

Even when adding back the brace, other issues pop up. Up to and including version 0.2.0 everything worked fine, so the issues must have been introduced with version 0.3.0.

Problem with Safari 11 Mac OSX High Sierra

Hi,

I have a problem with the demo.
When I record something, most of the time (unfortunately not always) the first one or two seconds are clipped - meaning that after some silence the audio starts to play with a click. So if I record 'one two three' with a slow voice, I only hear '___ (click) two three'.

I wonder why that is?

Thanks, Dirk

Firefox requesting permission

When you run the demo in Firefox, every time you click the record button you are asked for recording permission.

I tried loading the following:

navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
        recorder = new MediaRecorder(stream);

once at the beginning, using React's componentDidMount, but then the data won't be usable after the second recording.

Any workaround for this?

I understand that it needs to be invoked because getUserMedia() returns a stream object that has to be passed to .then(), but it doesn't feel right.
In this demo you are only prompted once (tested on Firefox mobile), but the audio isn't recorded:
https://danielstorey.github.io/WebAudioTrack/index.html

error on start

Hi,
I use audio-recorder-polyfill in an Angular application.
I have this routine:

    navigator.mediaDevices.getUserMedia({ audio: true }).then(stream => {
      this.recorder = new window.MediaRecorder(stream)
      this.recorder.addEventListener('dataavailable', e => {
        //audio.src = URL.createObjectURL(e.data)
      })
      this.recorder.start()
    })

But when it is executed I have this error:

TypeError: Argument 1 ('event') to EventTarget.dispatchEvent must be an instance of Event dispatchEvent

I don't understand what it expects.
I load the polyfill on the init event:

  public loadScript() {
    this.polyfill = true;
    let body = <HTMLDivElement>document.body;
    let script = document.createElement('script');
    script.innerHTML = '';
    script.src = '/assets/vendor/recorder_polyfill.js';
    script.async = true;
    script.defer = true;
    body.appendChild(script);
  }

  ngOnInit() {
    if (!window.MediaRecorder) {
      this.loadScript();
    }

Tnx

Embedding recorder

Hi,
I'm trying to embed a recorder in my Zend project. I compiled the project with npm run build and it seems that everything went fine.
I copied the compiled index.html "as is" into one of my views and edited the link to embed polyfill.js.
My project is hosted on a virtual machine (Ubuntu 18).
Chrome returns an error (getUserMedia is null). I can understand that, though I think it could be handled better.
But if I force the polyfill it returns "Your browser doesn’t support MediaRecorder or WebRTC to be able to polyfill MediaRecorder." That should be impossible.

I get the same answer from Safari (actually even without forcing) and Firefox.

What am I missing?

Thank you

MP3 Bitrate

Is there a way to change the bitrate for the mp3 encoder?

Recorded file from the library has no audio when displaying a video stream on iPhone and recording audio separately

I had a requirement to record video of the user. As video recording is not supported in iPhone browsers, I used this library to record audio only and took snapshots from the video stream. The issue I am currently facing is that the polyfill gives me a file after recording is completed, but with no audio in it. The size and duration of the file are as expected. All of this works fine on other devices, even in macOS Safari, but in iPhone Safari it doesn't work.

I think the problem is that the video is using both the video and audio streams, and when we access the audio stream via the polyfill it doesn't get any sound, or it gets a stream with no data.

recorded audio with polyfill has zero duration

I am currently unable to get this to work in Safari 12.1.2 on High Sierra. There are no console errors, but e.data is a 44-byte audio/wav file no matter how long you record. I have the code snippet below:

import MediaRecorder from 'audio-recorder-polyfill';

record = (stream) => {
  this.recorder = new MediaRecorder(stream);
  this.recorder.addEventListener('dataavailable', this.onRecordingReady);
}

onRecordingReady = (e) => {
  let $audioEl = this.refs.audio,
    blob = e.data, // this is a 44-byte audio/wav
    that = this;
  $audioEl.src = URL.createObjectURL(blob);
  $audioEl.play();
  this.saveAudioFile(blob);
}

What am I missing? Or is this something that is broken at the moment?

Support the MediaRecorder API feature request on bugs.webkit.org

This bug report on bugs.webkit.org tracks the MediaRecorder implementation in WebKit and thus Safari.

I encourage everyone to show their interest and add a comment describing how a MediaRecorder implementation would help their project/development. Creating an account takes a second.

Chrome implemented MediaRecorder in January 2016. By that time, the MediaRecorder feature request had gathered 2800+ stars on bugs.chromium.org.

Also, WebKit is an open source project so one can also contribute code towards implementing the MediaRecorder API.

sampleRate

Hi! Good evening.

How do I configure the sampleRate, or other options of the underlying MediaRecorder? I need to reduce the size of the audio files.

{
  audioBitsPerSecond: 8000,
  bitsPerSecond: 8000
}
Thanks!

How to enable SSL for deployment

Hi. I saw the demo. It is fantastic. Is there a way I can set up a self-signed SSL certificate and make it work on an intranet? Can you please explain what steps you followed to deploy over HTTPS? Thanks.

mp3 encoder?

First of all, thanks for building this great polyfill; it's a lifesaver on Safari.

I saw there's a WAV encoder built in, and I'm wondering if there is any chance of an MP3 encoder, maybe using lamejs, so that recorded audio files could be played across Chrome/Firefox/Safari?
