
sfmediastream's Introduction


SFMediaStream

An HTML5 media streaming library for playing music and video, or for live-streaming the microphone and camera to a Node.js server. The transmitted data is compressed (depending on the browser's media encoder) before being sent to the server, and the latency is configurable.

The default configuration is intended for newer browsers. If you want to build two-way communication between older and newer browsers, you must send the streamer's encoding information to the presenter before starting the communication, or use mp4 instead of opus.

Install with CDN link

You can download the minified JS from this repository or use this CDN link: <script type="text/javascript" src='https://cdn.jsdelivr.net/npm/sfmediastream@v1'></script>

If you want to use a different version, change the v1 to a specific version.

And include it in your project:

// Prefixed with Scarlets when imported with CDN link
var presenter = new ScarletsMediaPresenter(...);
var streamer = new ScarletsAudioStreamer(...);

Install with NPM

npm i sfmediastream

This is for web bundlers like Webpack or Browserify, and it can't be used as a library for Node.js. If you want to use this recorder/effect/plugin in Node.js, it may be possible with a headless browser like Puppeteer.

const {MediaPresenter, AudioStreamer, ...} = require('sfmediastream');
var presenter = new MediaPresenter(...);
var streamer = new AudioStreamer(...);

Adding retro-compatibility

If the browser doesn't support a codec like audio/wav, audio/webm, or audio/ogg, you can add opus-media-recorder before using this library.

Safari is partially supported through this polyfill: it is able to stream audio, but the stream is not playable by the browser itself.
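
For example, a minimal sketch of loading the polyfill before SFMediaStream (the file names and the OpusMediaRecorder global are assumptions based on the opus-media-recorder project; check its documentation for the exact paths and any worker/wasm options it needs):

<!-- Load the polyfill first (assumed paths; see the opus-media-recorder docs) -->
<script src="https://cdn.jsdelivr.net/npm/opus-media-recorder@latest/OpusMediaRecorder.umd.js"></script>
<script src="https://cdn.jsdelivr.net/npm/opus-media-recorder@latest/encoderWorker.umd.js"></script>
<script type="text/javascript" src='https://cdn.jsdelivr.net/npm/sfmediastream@v1'></script>
<script>
// Only override when the native MediaRecorder is missing or can't encode opus
if (!window.MediaRecorder || !MediaRecorder.isTypeSupported('audio/webm;codecs=opus')) {
    window.MediaRecorder = OpusMediaRecorder;
}
</script>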

How to use

ScarletsMediaPresenter

This class is used for streaming local media like camera or microphone to the server.

Properties

Property Details
debug Set to true to output messages to the browser console
mediaRecorder Returns the mediaRecorder currently being used
mediaStream Returns the mediaStream currently being used
mediaGranted Returns true if the user granted access to the recorder
recordingReady Returns true if the recording is ready
recording Returns true if currently recording
destination Used to connect an audio node to the recorder
options.mimeType Returns the mimeType being used
// Example for accessing the properties
presenterMedia.debug = true;

Method

Function Arguments Description
startRecording () Start recording the camera or microphone
stopRecording () Stop recording the camera or microphone
connect (AudioNode) Connect the presenter's stream to audio processing before it is recorded and disable direct output
disconnect (AudioNode) Disconnect the presenter's stream from audio processing
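
For example, a sketch (not from the official docs) that routes the microphone through an effect before it is recorded, using the pingPongDelay plugin from ScarletsMediaEffect described further below:

// Route the presenter's stream through an effect before recording
var ppDelay = ScarletsMediaEffect.pingPongDelay();

// mic -> effect (direct output to the recorder is disabled by connect)
presenterMedia.connect(ppDelay.input);

// effect -> recorder destination
ppDelay.output.connect(presenterMedia.destination);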

Event Listener

onRecordingReady

Callback when the library is ready for recording

presenterMedia.onRecordingReady = function(packet){
    console.log("Header size: " + packet.data.size);
    mySocket.emit('bufferHeader', packet);
};
onBufferProcess

Callback when data buffer is ready to be played

presenterMedia.onBufferProcess = function(packet){
    console.log("Data", packet);
    mySocket.emit('stream', packet);
};

Example

var presenterMedia = new ScarletsMediaPresenter({
    audio:{
        channelCount:1,
        echoCancellation: false
    },/* video:{
        frameRate:15,
        width: 1280,
        height: 720,
        facingMode: (frontCamera ? "user" : "environment")
    } */
}, 1000); // 1sec

presenterMedia.onRecordingReady = function(packet){
    console.log("Recording started!");
    console.log("Header size: " + packet.data.size + "bytes");

    // Every new streamer must receive this header packet
    mySocket.emit('bufferHeader', packet);
}

presenterMedia.onBufferProcess = function(packet){
    console.log("Buffer sent: " + packet[0].size + "bytes");
    mySocket.emit('stream', packet);
}

presenterMedia.startRecording();
presenterMedia.stopRecording();
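
The Node.js server itself is not part of this library. Below is a minimal relay sketch (an assumption, not an official example) using socket.io with the bufferHeader/stream event names used above; the header is cached and re-sent to late joiners because every new streamer must receive it:

// server.js -- relay sketch, assuming socket.io as the transport
const app = require('http').createServer();
const io = require('socket.io')(app);
app.listen(3000);

let bufferHeader = null; // cache the last header for clients that join later

io.on('connection', function(socket){
    // A presenter sends the header once recording starts
    socket.on('bufferHeader', function(packet){
        bufferHeader = packet;
        socket.broadcast.emit('bufferHeader', packet);
    });

    // Relay every media chunk to the other clients
    socket.on('stream', function(packet){
        socket.broadcast.emit('stream', packet);
    });

    // Give newly connected streamers the cached header
    if (bufferHeader !== null)
        socket.emit('bufferHeader', bufferHeader);
});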

ScarletsAudioStreamer

This class is used for buffering and playing a microphone stream from the server.

// The minimum duration for audio is ~100ms
var audioStreamer = new ScarletsAudioStreamer(1000); // 1sec

Properties

Property Details
debug Set to true to output messages to the browser console
playing Returns true if playing a stream
latency Returns the current latency
mimeType Returns the mimeType of the currently streamed media
outputNode Available after calling .connect(AudioNode)
// Example for accessing the properties
audioStreamer.debug = true;

Method

Function Arguments Description
playStream () Set the library to automatically play any received buffer
receiveBuffer (packetBuffer) Receive an arrayBuffer and play it when the last buffer has finished playing
realtimeBufferPlay (packetBuffer) Receive an arrayBuffer and play it immediately
stop () Stop playing any buffer
connect (AudioNode) Connect the streamer to another AudioNode
disconnect (AudioNode) Disconnect the streamer from any AudioNode
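
A short listening-client sketch (assuming a socket.io connection named mySocket; setBufferHeader is the method used in the examples below to apply the presenter's header packet):

var audioStreamer = new ScarletsAudioStreamer(1000); // 1sec

// Play buffers automatically as they arrive
audioStreamer.playStream();

// The header from the presenter's onRecordingReady must be received first
mySocket.on('bufferHeader', function(packet){
    audioStreamer.setBufferHeader(packet);
});

mySocket.on('stream', function(packet){
    audioStreamer.receiveBuffer(packet);
});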

ScarletsVideoStreamer

This class is used for buffering and playing a microphone & camera stream from the server.

It is still experimental and has some bugs.

// Usually the minimum duration for video is 1000ms
var videoStreamer = new ScarletsVideoStreamer(videoHTML, 1000); // 1sec

Properties

Property Details
debug Set to true to output messages to the browser console
playing Returns true if playing a stream
latency Returns the current latency
mimeType Returns the mimeType of the currently streamed media
outputNode Available after calling .connect(AudioNode)
// Example for accessing the properties
videoStreamer.debug = true;

Method

Function Arguments Description
playStream () Set the library to automatically play any received buffer
receiveBuffer (arrayBuffer) Receive an arrayBuffer and play it when the last buffer has finished playing
stop () Stop playing any buffer
audioConnect (AudioNode) Connect the streamer's audio to another AudioNode
audioDisconnect (AudioNode) Disconnect the streamer's audio from any AudioNode

Example

// var audioStreamer = new ScarletsAudioStreamer(1000); // 1sec
var videoStreamer = new ScarletsVideoStreamer(document.querySelector('video'), 1000); // video element, 1sec
videoStreamer.playStream();

// First thing that must be received
mySocket.on('bufferHeader', function(packet){
    videoStreamer.setBufferHeader(packet);
});

mySocket.on('stream', function(packet){
    console.log("Buffer received: " + packet[0].byteLength + "bytes");
    videoStreamer.receiveBuffer(packet);
});

// Add an effect
var ppDelay = ScarletsMediaEffect.pingPongDelay();

// Stream (source) -> Ping pong delay -> destination
videoStreamer.audioConnect(ppDelay.input);
ppDelay.output.connect(ScarletsMedia.audioContext.destination);

ScarletsMediaPlayer

This class is used for playing video or audio from a URL.

var mediaPlayer = new ScarletsMediaPlayer(document.querySelector('audio'));

Properties

Property Details
autoplay Sets or returns whether the audio/video should start playing as soon as it is loaded
preload Sets or returns whether the audio/video should be loaded when the page loads ("none", "metadata", "auto")
loop Sets or returns whether the audio/video should start over again when finished
buffered Returns a TimeRanges object representing the buffered parts of the audio/video
controller Returns the MediaController object representing the current media controller of the audio/video
currentTime Sets or returns the current playback position in the audio/video (in seconds)
currentSrc Returns the URL of the current audio/video
duration Returns the length of the current audio/video (in seconds)
ended Returns whether the playback of the audio/video has ended or not
error Returns a MediaError object representing the error state of the audio/video
readyState Returns the current ready state of the audio/video
networkState Returns the current network state of the audio/video
paused Returns whether the audio/video is paused or not
played Returns a TimeRanges object representing the played parts of the audio/video
seekable Returns a TimeRanges object representing the seekable parts of the audio/video
seeking Returns whether the user is currently seeking in the audio/video
audioOutput Returns the audioContext output from the media source
videoOutput Returns the videoContext from the media source
The videoContext is still experimental and hasn't been implemented yet.
// Example for accessing the properties
mediaPlayer.preload = "metadata";

Method

Function Arguments Description
load () Re-loads the audio/video element
canPlayType (str) Checks if the browser can play the specified audio/video type
speed (0 ~ 1) Sets or returns the speed of the audio/video playback
mute (boolean) Sets or returns whether the audio/video is muted or not
volume (0 ~ 1) Sets or returns the volume of the audio/video
play () Starts playing the audio/video
pause () Pauses the currently playing audio/video
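
A short usage sketch of the wrapper methods above (the values are only for illustration):

if (mediaPlayer.canPlayType('audio/mpeg')) {
    mediaPlayer.volume(0.8);  // 0 ~ 1
    mediaPlayer.mute(false);
    mediaPlayer.play();
}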

The methods below are also available.

prepare

Load media from a URL

// Pass a single URL, or an array with fallbacks: ['my.mp3', 'fallback.ogg']
mediaPlayer.prepare('my.mp3', function(){
    mediaPlayer.play();
});
on

Register an event callback

mediaPlayer.on('loadedmetadata', function(e){
    // See at the property above
    console.log(e.target.duration);
});
off

Unregister an event callback

mediaPlayer.off('abort');
once

Register an event callback and remove the listener after it is called

mediaPlayer.once('abort', function(e){
    alert('User aborted the buffer');
});

Available Events

Event Details
abort Fires when the loading of an audio/video is aborted
canplay Fires when the browser can start playing the audio/video
canplaythrough Fires when the browser can play through the audio/video without stopping for buffering
durationchange Fires when the duration of the audio/video is changed
emptied Fires when the current player is empty
ended Fires when the current player is ended
error Fires when an error occurred during the loading of an audio/video
loadeddata Fires when the browser has loaded the current frame of the audio/video
loadedmetadata Fires when the browser has loaded meta data for the audio/video
loadstart Fires when the browser starts looking for the audio/video
pause Fires when the audio/video has been paused
play Fires when the audio/video has been started or is no longer paused
playing Fires when the audio/video is playing after having been paused or stopped for buffering
progress Fires when the browser is downloading the audio/video
ratechange Fires when the playing speed of the audio/video is changed
seeked Fires when the user is finished moving/skipping to a new position in the audio/video
seeking Fires when the user starts moving/skipping to a new position in the audio/video
stalled Fires when the browser is trying to get media data, but data is not available
suspend Fires when the browser is intentionally not getting media data
timeupdate Fires when the current playback position has changed
volumechange Fires when the volume has been changed
waiting Fires when the video stops because it needs to buffer the next frame
playlistchange Fires when the player starts another playlist

For playlistchange, the callback function receives (player, playlist, index) as arguments.
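
For example, a sketch assuming playlistchange is registered through .on like the other events:

mediaPlayer.on('playlistchange', function(player, playlist, index){
    console.log('Playlist switched to index ' + index);
});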

Video Properties

Property Details
poster Specifies an image to be shown while the video is downloading, or until the user hits the play button
height Sets the height of the video player
width Sets the width of the video player
// Example for accessing the properties
mediaPlayer.poster = 'url.png';

Properties

audioFadeEffect

Enable fade effect when playing or pausing the sound

mediaPlayer.audioFadeEffect = true;
audioOutput

Can be used to connect the media to another effect or plugin, like the equalizer

// Create equalizer and pass audio output as equalizer input
var equalizer = ScarletsMediaEffect.equalizer(null, mediaPlayer.audioOutput);

// Connect to final destination
equalizer.output.connect(ScarletsMedia.audioContext.destination);

ScarletsMediaEffect

These effects can be used with any media as long as you have the media source node as the input. Make sure every node is eventually connected to AudioContext.destination, or it will not be playable.

The plugins have a function to destroy node connections that are no longer being used, so don't forget to destroy unused effects to free memory.

effect.destroy();

Available Plugin

Effect Details
Chorus An effect that makes a single voice sound like multiple voices
ConReverb A convolution reverb effect that simulates the reverb of another audio source
CutOff A cutoff filter with an adjustable width
Delay An effect that plays the audio back after a period of time
Distortion It's.. like.. distortion..
DubDelay Delay with feedback saturation and time/pitch modulation
Equalizer Adjustable frequency pass filter
Fade Volume fade-in and fade-out effect
Flanger An effect created by mixing two identical signals, one of them slightly delayed
Harmonizer A pitch-shift effect, like playing a harmony
Noise Noise generator, like a radio
PingPongDelay Stereo delay effect that alternates each delay between the left and right channels
Reverb Configurable reflection effect
StereoPanner Can be used to pan an audio stream left or right
Tremolo Modulation effect that creates a change in volume
// Directly connect audio output as an input for ping pong delay plugin
var ppDelay = ScarletsMediaEffect.pingPongDelay(mediaPlayer.audioOutput);

// Create StereoPanner handler
var panner = ScarletsMediaEffect.stereoPanner(/* input [optional] */);
// panner.input (will be available if no input passed on plugin)

// Connect ppDelay output to panner input
ppDelay.output.connect(panner.input);

// Modify the plugin (still needs to be documented)
panner.set(-1); // Left channel

// Connect to final destination
panner.connect(ScarletsMedia.audioContext.destination);

// Visualization
// player.audioOutput -> pingPongDelay -> Panner -> final destination

Playlist

The playlist is available on the current media player.

Properties

Property Details
currentIndex Returns the index of the currently playing media
list Returns the playlist array that is being used
original Returns the original playlist array
loop Set this to true if you want the playlist to start again from the beginning
shuffled Returns true if the list has been shuffled
// Example for accessing the properties
console.log('Current playlist count', mediaPlayer.playlist.original.length);

Method

reload

Replace old playlist data

mediaPlayer.playlist.reload([{
    yourProperty:'',
    stream:['main.mp3', 'fallback.ogg']
}, ...]);
add

Add new data to playlist

mediaPlayer.playlist.add({
    yourProperty:'',
    stream:['main.mp3', 'fallback.ogg']
});
remove

Remove original playlist data by index

// mediaPlayer.playlist.original[0]
mediaPlayer.playlist.remove(0);
next

Play the next track; this will also trigger the playlistchange event

mediaPlayer.playlist.next();
previous

Play the previous track; this will also trigger the playlistchange event

mediaPlayer.playlist.previous();
play

Play a track by index

// mediaPlayer.playlist.list[0]
mediaPlayer.playlist.play(0);
shuffle

Shuffle the playlist

// mediaPlayer.playlist.list
mediaPlayer.playlist.shuffle(true /* or false */);
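
Putting it together, a short sketch (file names are placeholders, and the manual advance on 'ended' is an assumption in case the player doesn't do it automatically):

// Load a small playlist
mediaPlayer.playlist.reload([
    { title: 'First',  stream: ['first.mp3'] },
    { title: 'Second', stream: ['second.mp3', 'second.ogg'] }
]);

mediaPlayer.playlist.play(0);

// Advance when the current track ends
mediaPlayer.on('ended', function(){
    mediaPlayer.playlist.next();
});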

Contribution

If you want to help with SFMediaStream, please fork this project, make your changes in your repository, and then open a pull request here. Otherwise, you can help with a donation via Patreon.

Compile from source

After downloading this repo, you need to install the devDependencies.

$ npm i
$ gulp watch

After you make changes in /src, they will automatically be compiled into /dist/SFMediaStream.js. Make sure you clear your cache when experimenting.

License

SFMediaStream is under the MIT license.

But don't forget to include a link to this repository.

sfmediastream's People

Contributors

orlandoaleman, orlandolaycos, stefansarya


sfmediastream's Issues

Failed to execute 'appendBuffer' on 'SourceBuffer': This SourceBuffer is still processing an 'appendBuffer' or 'remove' operation.

I'm using SFMediaStream for a proximity chat mod for Among Us. I ran into an issue when trying to use it. The issue is:

sfmediastream@latest:8 Uncaught DOMException: Failed to execute 'appendBuffer' on 'SourceBuffer': This SourceBuffer is still processing an 'appendBuffer' or 'remove' operation.
    at l.a.append (https://cdn.jsdelivr.net/npm/sfmediastream@latest:8:4518)
    at d.t.receiveBuffer (https://cdn.jsdelivr.net/npm/sfmediastream@latest:8:3385)
    at f.<anonymous> (https://amongus.derock.dev/play.html?u=UEQXAJ&p=Derock:263:39)
    at f.r.emit (https://cdn.socket.io/socket.io-3.0.1.min.js:6:2341)
    at f.value (https://cdn.socket.io/socket.io-3.0.1.min.js:6:30320)
    at f.value (https://cdn.socket.io/socket.io-3.0.1.min.js:6:30041)
    at f.value (https://cdn.socket.io/socket.io-3.0.1.min.js:6:29648)
    at b.<anonymous> (https://cdn.socket.io/socket.io-3.0.1.min.js:6:33736)
    at b.r.emit (https://cdn.socket.io/socket.io-3.0.1.min.js:6:2341)
    at b.value (https://cdn.socket.io/socket.io-3.0.1.min.js:6:17782)

Here's the code I use for the client:

        socket.on('newBufferHeader', (data) => {
            if(!otherStreams[data.id])
                otherStreams[data.id] = new ScarletsAudioStreamer(200);
            otherStreams[data.id].playStream();
            otherStreams[data.id].setBufferHeader(data.data);
            console.log('[WS] Got headers for ' + data.id)
            // otherStreams[data.id].audioConnect(ScarletsMedia.audioContext.destination);

            if(nextUp) //handleNextBuffer(nextUp, otherStreams[data.id], data.id)
                otherStreams[data.id].receiveBuffer(nextUp);
        })
    
        socket.on('data', (data) => {
            console.log('[AUDIO] Got audio for ' + data.id)
            if(otherStreams[data.id])
                //handleNextBuffer(data.data, otherStreams[data.id], data.id)
                otherStreams[data.id].receiveBuffer(data.data);
            else {
                console.error('[AUDIO] Missing headers for ' + data.id)
                console.log('[WS] Asked for headers')
                socket.emit('requestBuffer', {need: data.id});
                nextUp = data.data;
            }
        })

The client acts as both a presenter and a streamer; both delays are synced up (200ms).
For streamer I use the following:

        const presenter = new ScarletsMediaPresenter({
            audio: {
                channelCount: 1,
                echoCancellation: false
            }
        }, 200);

        presenter.onRecordingReady = (packet) => {
            console.log('[AUDIO] Ready to stream audio');
            headers = packet;
            socket.emit('bufferHeader', packet);
            console.log("[WS] Sent headers");
        }
        presenter.onBufferProcess = (packet) => {
            socket.emit('data', packet);
            console.log('[SEND] Audio Sent')
        }

Sometimes it'll work for a few seconds, other times it errors after receiving one chunk.

Any help would be appreciated.

How to stream audio data to the speakers on the node server?

Hi,
Your project is very interesting.
I wonder how to output the audio to the speakers on the Node.js server?
Example:

const Speaker = require('speaker');//https://www.npmjs.com/package/speaker
...
 socket.on('stream', function(packet){
	console.log(packet);
        packet.pipe(speaker);//this is only example...
        socket.broadcast.emit('stream', packet);
    });

Thanks

group-call help

Hi
Your project is very interesting, but I can't use it because of my limited knowledge.
I am a beginner in Node.js, but I need a tool for streaming a speaker's (browser/microphone) voice to other users (browsers) through a Node.js server.
I have just run your group-call example and got these errors:
The AudioContext was not allowed to start. It must be resumed (or created) after a user gesture on the page.
WebSocket connection to 'ws://omisoft.pl/socket.io/?EIO=3&transport=websocket' failed: Error during WebSocket handshake: Unexpected response code: 404
The first one is, I guess, the lack of an SSL connection, but the second error is too difficult for me to resolve.
I don't know how to use your code in SSL mode.
If you have advice and time to help, I am ready to pay for it.

Mariusz Urban

any chance for windows?

Hi,
I'm running the server on my Linux machine and accessing the example/streamingtest server from my Windows machine on the same network, but I'm getting:

MediaPresenter.js:118 Uncaught TypeError: Cannot read property 'getUserMedia' of undefined
    at l.t.startRecording (MediaPresenter.js:118)
    at asPresenter ((index):49)
    at HTMLButtonElement.onclick ((index):12)

after clicking on Presenter (using Firefox, by the way).

Any idea?

Can I have multiple transmitters and listeners at the same time?

Hello, I have been following your project for a long time and find it very interesting. I was using it to transmit live audio, but only one-way (one transmitter and multiple listeners).
Now I want to go to the next level and try to have multiple transmitters and listeners at the same time.

Each user would be an audio transmitter and receiver at the same time, and would also hear what other users are transmitting, something like Discord.

Is it possible to do this with your project?

And if so, how could I do it?

Thank you for giving us this project, which is really very useful.

Effect not working

I just tried:
presenterMedia.disconnect(presenterMedia.destination);
ppDelay = ScarletsMediaEffect.pingPongDelay();

ppDelay.mix(20);


audioStreamer.connect(ppDelay.input);
ppDelay.output.connect(presenterMedia.destination);

Am I wrong? It is not working.

Implement Volume

Hello!
Love this repo!
How can I set a volume slider on the audioStreamer?

Correct implementation to play audio received

I have done the following to get microphone audio and stream it.
I use socket.io to send the data (streamAudio) and then listen for streamedAudio, which is the received arrayBuffer data.

let presenterMedia = new MediaPresenter({
            audio:{
                channelCount:1,
                echoCancellation: false
            },
        }, 1000); // 1sec

        presenterMedia.onRecordingReady = (packet) => {
            console.log("Recording started!");
            console.log("Header size: " + packet.data.size + "bytes");
        };

        presenterMedia.onBufferProcess = (packet) => {
            console.log("Buffer sent: " + packet);
            this.socket.emit('streamAudio', packet);
        };

        const streamer = new AudioStreamer();
        this.socket.on('audioStreamed', (data) => {
            console.log('audioStreamed', data); // arrayBuffer
            streamer.realtimeBufferPlay(data);
        });

        presenterMedia.startRecording();

Now I would expect sounds from my microphone to be played back to me in realtime but I hear nothing. Does anyone know why?

Effect problem

I just tried putting an effect object on multiple ScarletsAudioStreamer instances to edit the audio. Sending the edited voice works well, and the first audio node worked perfectly with real-time updates. But it only works for one: on the second one, sending the audio works, but the real-time update does not.
example:
First: Audioeditnode[firstid] = Scarlets Effect
second: Audioeditnode[secondid] = Scarlets Effect

Event update(data) {
    foreach(audioeditnode) key => {
        audioeditnode[key].update(packet);
    }
}

Video not play

I am sending and receiving data but the video is not playing. Look at my code:

HTML SEND VIDEO:

HOLA

<script src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.2.0/socket.io.js"></script> <script type="text/javascript" src='assets/js/SFMediaStreamer.js?'></script> <script type="text/javascript" src="assets/js/admin/streamingVideo.js?"></script>

JS SEND VIDEO

(function(){
var socket;
$(async ()=>{
	await socketC();
	mediaRecord();
	btnAction();
});
function mediaRecord(){
	var presenterMedia = new ScarletsMediaPresenter({
	    audio:true, video: true
	}, 1000); // 1sec

	presenterMedia.debug = true;
	presenterMedia.onRecordingReady = function(packet){
	    console.log("Recording started!");
	    console.log("Header size: " + packet.data.size + "bytes");
	    console.log(packet.data);
	    // Every new streamer must receive this header packet
	    // console.log(socket);
	    let blob = new Blob([packet.data]
			, {type:'video/webm'}
		);
	    socket.emit('bufferHeader', blob);
	    // socket.emit('data', new Uint8Array(packet));
	    // socket.emit('prueba', {data: 888});
	}

	presenterMedia.onBufferProcess = function(packet){
	    console.log("Buffer sent: " + packet[0].size + "bytes");

	    // console.log(packet);
	    socket.emit('stream', {data:packet});
	}
	presenterMedia.startRecording();

	setTimeout(()=>{
		presenterMedia.stopRecording();

	}, 60000);

}
async function socketC(){
	socket = await  io.connect(
		"https://xxxx.xxxxx.xxxx:xxxx",
		{transports: [ 'websocket' ], upgrade:false}
	);
	socket.on('connect',()=>{
		console.log('socket conectado');
		mediaRecord();
	});
}
function btnAction(){
	$('#btn').click(()=>{
		socket.emit('test', {data: 666});
	})
}

})();

console: (screenshot)

JS BACKEND: Socket connection

'use strict'
let socket = require('socket.io');
let app = require('express')();
let logger = require('winston');
let http = require('http');
let https = require('https');
let fs = require('fs');
logger.remove(logger.transports.Console);
logger.add(logger.transports.Console, {colorize:true, timestamp:true});
logger.info('SocketIO > listening on port ');

let https_server = https.createServer({
    key: fs.readFileSync('../../ssl/mycpanel.pem'),
    cert: fs.readFileSync('../../ssl/mycpanel.pem')
}, app).listen(3010);

var io = socket.listen(https_server);

io.sockets.on('connection', function(socket){
    console.log('NUEVA CONEXION');
    socket.on('prueba', function(data){
        console.log('data prueba socjet', data);
    })
    socket.on('bufferHeader', function(data){
        console.log("bufferHeader", data);
        io.sockets.emit('bufferHead', data);
    });
    socket.on('stream', function(data){
        console.log("streamer", data);
        io.sockets.emit('stream', data);
    });
    socket.on('videoCam', (data)=>{
        io.sockets.emit('videoCam', data);
        console.log(data);
    });
    // socket.on('newOrderToBill', (data)=>{
    //     console.log(data);
    //     io.sockets.emit('newOrderToBill', data);
    // });
});

HTML RECEIVE VIDEO

HOLA

PLAY

<script src="https://cdnjs.cloudflare.com/ajax/libs/socket.io/2.2.0/socket.io.js"></script> <script type="text/javascript" src="assets/js/SFMediaStreamer.js?"></script> <script type="text/javascript" src="assets/js/admin/recibeVideo.js?"></script>

JS RECEIVE VIDEO

var video = document.querySelector('#video1');
var video2 = document.querySelector('#video2');
var scarlet = false;
var socket;
var chunks = [];
var ms = new MediaSource();
var uno = false;
$(()=>{

    socketC();
    setTimeout(()=>{
        socket.disconnect();
    }, 50000);
    // receiveVideo();
});
function socketC(){
    socket = io.connect(
        "https://host3.virtualsoccergroups.com:3010",
        {transports: [ 'websocket' ], upgrade:false}
    );
    socket.on('connect',function(){
        if(!scarlet){
            playVideo();
            //for prevent propagation if socket reconnect
            scarlet = true;
        }
        console.log('socket on')
    });
}

 function playVideo(){
    var videoStreamer = new ScarletsVideoStreamer(video, 1000); // 1sec
    videoStreamer.playStream();

    // First thing that must be received
    socket.on('bufferHead', function(packet){
        console.log('bufferHeader',packet);
        videoStreamer.setBufferHeader(packet);
    });

    socket.on('stream', function(packet){
        console.log("Buffer received: " , packet);
        videoStreamer.receiveBuffer(packet);
    });

    // Add an effect
    var ppDelay = ScarletsMediaEffect.pingPongDelay();

    // Stream (source) -> Ping pong delay -> destination
    videoStreamer.audioConnect(ppDelay.input);
    ppDelay.output.connect(ScarletsMedia.audioContext.destination);

 }



function download(blob) {
    var link = document.createElement('a');
    link.setAttribute('download', 'video.webm');
    link.setAttribute('href', URL.createObjectURL(blob));
    link.style.display = "none";
    // NOTE: We need to add temporarily the link to the DOM so
    //       we can trigger a 'click' on it.
    document.body.appendChild(link);
    link.click();
    document.body.removeChild(link);
}

function btnAction(){
    $('#btnPlay').click(function(){
        // playVideo();
        video.play();
    })
}

console: (screenshot)

Stream Delay

I used the example code:
startbtn.addEventListener('click', function () {

var presenterMedia = new ScarletsMediaPresenter({
    audio:{
			channelCount:1,
			echoCancellation: false
    }
}, 100);

presenterMedia.onRecordingReady = function(packet){
    console.log("Recording started!");
    console.log("Header size: " + packet.data.size + "bytes");

    // Every new streamer must receive this header packet
    socket.emit('bufferHeader', packet);
}

presenterMedia.onBufferProcess = function(packet){
    //console.log("Buffer sent: " + packet[0].size + "bytes");
    socket.emit('bufferStream', packet);
}

presenterMedia.startRecording();

});
videoStreamer = new ScarletsAudioStreamer(null, 100);
videoStreamer.playStream();

// Buffer header must be received first
socket.on('bufferHeader', function(packet,socketid){
videoStreamer.setBufferHeader(packet);
});

// Receive buffer and play it
socket.on('bufferstream', function(packet,socketid){
videoStreamer.receiveBuffer(packet);
});
like this, but the delay keeps increasing the longer I keep using it.

Audio Recording on Firefox does not work

I am using this library to broadcast audio to different subscribers in a prototype application, but recordings from Firefox cannot be broadcast to clients. Meanwhile, it works fine on Chrome.

mimeType: audio/webm;codecs="opus"

scope.audioContext.createMediaElementSource is not a function

I am getting the following error. I downloaded dist/SFMediaStream.js and changed the name to SFMediaStreamer.js.
Error console: (screenshots)

[Code for copy or edit]
var video = document.querySelector('#video1');
var video2 = document.querySelector('#video2');
var socket;
var chunks = [];
var ms = new MediaSource();
var uno = false;
$(()=>{
    socket = io.connect(
        "https://host3.virtualsoccergroups.com:3010",
        {transports: [ 'websocket' ], upgrade:false}
    );
    socketC();
    setTimeout(()=>{
        socket.disconnect();
    }, 50000);
    // receiveVideo();
});
function socketC(){
    socket.on('connect',function(){
        console.log('socket on')
    });
}

var videoStreamer = new ScarletsVideoStreamer(video, 1000); // 1sec
// videoStreamer.playStream();

// First thing that must be received
socket.on('bufferHeader', function(packet){
    console.log('bufferHeader',packet);
    videoStreamer.setBufferHeader(packet);
});

socket.on('stream', function(packet){
    console.log("Buffer received: " , packet);
    videoStreamer.receiveBuffer(packet);
});

// Add an effect
var ppDelay = ScarletsMediaEffect.pingPongDelay();

// Stream (source) -> Ping pong delay -> destination
videoStreamer.audioConnect(ppDelay.input);
ppDelay.output.connect(ScarletsMedia.audioContext.destination);


function download(blob) {
    var link = document.createElement('a');
    link.setAttribute('download', 'video.webm');
    link.setAttribute('href', URL.createObjectURL(blob));
    link.style.display = "none";
    // NOTE: We need to add temporarily the link to the DOM so
    //       we can trigger a 'click' on it.
    document.body.appendChild(link);
    link.click();
    document.body.removeChild(link);
}

function btnAction(){
    $('#btnPlay').click(function(){
        video.play();
    })
}

Use node to relay RTSP stream and decode it with SFMediaStream?

Hello,

is there any way to use the node server to "proxy" an RTSP stream to the browser through a websocket and then use SFMediaStream to decode and display it?

I saw yeellowstone, which might be used to connect to the RTSP server, but I'm not sure how to transform the data into something that SFMediaStream would be able to understand.

Thank you in advance

Rico

Some precisions

Hey, this project looks awesome. I have some questions:

  1. I cannot find the Node.js server part in the source code.

  2. Is there an online demo so I can test it?

  3. I guess it works fine with Chrome and Firefox, but did you try it with Safari? (It probably has a different MediaRecorder API, as usual.)

Regards

p2p-video-stream bug on FireFox

It's not working on Firefox, and there are a lot of errors.

Can you please tell me how to stream video via the Node.js server?

Edit: how do I use this lib to stream the video across?
