h264-live-player's Introduction

Motivation

This is a very simple h264 video player (that can run on a live stream) for your browser. You might use this with a raspicam raw h264 stream. It is a player built around the Broadway decoder, with a very simple API. NAL units (h264 frames) are split on the server side, transported over websocket, and sent to the decoder (with frame dropping, if necessary).
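The server-side step described above (splitting the raw h264 into NAL units before pushing them over the websocket) can be sketched in a few lines. This is a minimal sketch, assuming an Annex B byte stream where NAL units are delimited by the 00 00 00 01 start code; `splitNalUnits` is a hypothetical helper name (the project itself relies on the stream-split package for this step):

```javascript
// Sketch: split a raw Annex B h264 buffer into NAL unit payloads.
const START_CODE = Buffer.from([0, 0, 0, 1]);

function splitNalUnits(chunk) {
  const units = [];
  let offset = chunk.indexOf(START_CODE);
  while (offset !== -1) {
    const next = chunk.indexOf(START_CODE, offset + START_CODE.length);
    const end = next === -1 ? chunk.length : next;
    // One NAL unit payload per websocket message, start code stripped
    units.push(chunk.slice(offset + START_CODE.length, end));
    offset = next;
  }
  return units;
}
```

Each returned unit can then be sent as a single binary websocket message; a lagging client can simply skip intermediate units instead of buffering them.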

History

  • I was targeting real-time camera video feedback in the browser (no audio; think surveillance cam)
  • There is no solution for "real time" mp4 video creation/playback (with ffmpeg, mp4box.js or mp4parser, boxing takes time)
  • Media Source Extensions are a dead end (mp4 boxing is far too hard to re-create on the client side)
  • Broadway provides the crazy emscripten/asm.js build of an h264 decoder, accelerated by a webGL canvas
  • Here is all the glue we need, enjoy ;-)

Installation/demo

git clone https://github.com/131/h264-live-player.git player
cd player
npm install

node server-rpi.js    # run on a rpi for a webcam demo
node server-static.js # for sample video (static) file delivery
node server-tcp.js    # for a remote tcp (rpi video feed) sample
node server-ffmpeg.js # useful on win32 to debug the live feed (uses ffmpeg & your directshow device / webcam)

# browse to http://127.0.0.1:8080/ for a demo player

Recommendations

  • The Broadway h264 decoder can only work with the h264 baseline profile
  • Use a SANE bitrate
  • Browserify FTW
  • Once you understand how to integrate the server side, feel free to use the h264-live-player npm package in your client-side app (see vendor/)
  • Use uws (instead of ws) as the websocket server

Credits

Keywords / shout box

raspberry, mp4box, h264, nal, raspivid, mse, media source extension, iso, raspicam, bitrate, realtime, video, mp4, ffmpeg, websocket, ws, socket.io "Let's have a beer and talk in Paris"

h264-live-player's People

Contributors

131, bhlowe, buzztiaan, dependabot[bot], emersion, jan4984, jesusmoraleda, kevinstubbs

h264-live-player's Issues

Size is not drawn correctly

I set the size to 640x480; in the browser the canvas was resized, but the video is still shown at a very small size.

no such file or directory, stat 'samples/test.h264'

I run node server-static.js and open http://127.0.0.1:8080/ in Chrome; however, when I click the start button, I get an error like this:
Error: ENOENT: no such file or directory, stat 'samples/test.h264' at Error (native) at Object.fs.statSync (fs.js:849:18) at Class.get_feed (/Applications/MAMP/htdocs/player/lib/static.js:22:44) at Class.start_feed (/Applications/MAMP/htdocs/player/lib/_server.js:28:27) at WebSocket.<anonymous> (/Applications/MAMP/htdocs/player/lib/_server.js:71:14) at emitTwo (events.js:87:13) at WebSocket.emit (events.js:172:7) at Receiver.ontext (/Applications/MAMP/htdocs/player/node_modules/ws/lib/WebSocket.js:816:10) at /Applications/MAMP/htdocs/player/node_modules/ws/lib/Receiver.js:477:18 at /Applications/MAMP/htdocs/player/node_modules/ws/lib/Receiver.js:361:7
What should I do?

Handle Multiple Remote Feed

Hi Support,
How can I handle multiple remote feeds connecting to the server, and serve the desired stream based on the requesting client?
For example:
Client1 and Client2 want to see the remote feed of Device1
Client3 and Client4 want to see the remote feed of Device2
Can you guide me on how to achieve this?
Right now, one remote feed viewed by multiple clients works fine,
but how do I serve Device2 to the clients that want it?
Regards,
Waqar

Amazing lib

This library is... amazing! Thank you for creating it. Three things:

Publish it?

Edit: I see it is published already; maybe orient the README to give clues about how to use it in the developer's code?

For you and the library, it might be worth it to publish it officially on NPM instead of requiring cloning it. I just copy/pasted the back-end and the distributed js as:

const server = require('server');
const stream = require('./lib/raspivid');

server().then(ctx => {
  new stream(ctx.server);
});

Then in the front-end you could add a /client.js and /client.min.js file, so you could use http://unpkg.com/ for instance as http://unpkg.com/LIBRARYNAME/client.min.js.

If this sounds good, I have published a ton of libraries in the past and might be able to help you.

Error handling

I would highly recommend doing something like this:

const { exec } = require('child_process');

process.on('uncaughtException', err => {
  if (err.code === 'ENOTCONN') {
    exec('pkill raspivid');
  }
});

In this way the video is stopped when a client disconnects.

Question

How is server-tcp.js supposed to be used? As I understand it, I run raspivid -t 0 -o - -w 1280 -h 720 -fps 25 | nc -k -l 5001 on the Raspi. Then on my laptop I change the data inside server-tcp.js to match the Raspi IP:PORT and just run it with node server-tcp.js. I open it and it displays the controls properly, but the buttons do nothing. It doesn't really matter now, but at some point I'll want a server somewhere acting as a proxy to avoid stressing the Raspberry Pi.

Error: shutdown ENOTCONN

I'm trying to pinpoint an error when stopping the stream and/or reloading the page. What happens is that if you refresh the page while watching the stream, the server crashes and leaves the raspivid command running.

This is the event log :
New guy
Incomming action 'REQUESTSTREAM'
raspivid -t 0 -o - -w 960 -h 540 -fps 30
stopping client interval
events.js:160
throw er; // Unhandled 'error' event
^

Error: shutdown ENOTCONN
at exports._errnoException (util.js:1022:11)
at Socket.onSocketFinish (net.js:237:26)
at emitNone (events.js:86:13)
at Socket.emit (events.js:185:7)
at finishMaybe (_stream_writable.js:513:14)
at endWritable (_stream_writable.js:523:3)
at Socket.Writable.end (_stream_writable.js:488:5)
at Socket.end (net.js:431:31)
at WebSocket. (/home/pi/player/lib/_server.js:81:25)
at emitTwo (events.js:111:20)

got a noisy stream image

Thank you for the nice project!
I want to use my own WebSocket server (IIS) to send the H264 to the client over websocket.
Hence, I use your live-player only for displaying the frames in a Chrome browser.
With the player I can get streams, and the player can draw them on the canvas;
everything works very well.
But sometimes the image drawn on the canvas gets very noisy.
My ffmpeg command:
ffmpeg -re -rtbufsize 1500M -f dshow -i video="AVerMedia HD Capture C985 Bus 2" -framerate 30 -pix_fmt yuv420p -c:v libx264 -profile:v baseline -coder 0 -wpredp 0 -tune zerolatency -s 320x240 -f rawvideo

normal image:
_000

noisy image:
_001

test page :
https://blive2.crazybet.win/h264liveshow.html
All the buttons on this test page are disabled, and the player will auto-play.

I split streams on the 00 00 00 00 NAL in my IIS server.
Please give me some advice; thank you for your help.

Install fails on Raspbian

I have a current and updated RasPi with Raspbian GNU/Linux 8.0 (jessie).

After I run

apt -y install apache2 git npm
git clone https://github.com/131/h264-live-player.git player
cd player
npm install

I get these errors:

npm WARN deprecated [email protected]: connect 2.x series is deprecated

[email protected] install /var/www/html/player/node_modules/ws/node_modules/utf-8-validate
node-gyp rebuild

/bin/sh: 1: node: not found
gyp: Call to 'node -e "require('nan')"' returned exit status 127. while trying to load binding.gyp
gyp ERR! configure error
gyp ERR! stack Error: gyp failed with exit code: 1
gyp ERR! stack at ChildProcess.onCpExit (/usr/share/node-gyp/lib/configure.js:344:16)
gyp ERR! stack at ChildProcess.emit (events.js:98:17)
gyp ERR! stack at Process.ChildProcess._handle.onexit (child_process.js:809:12)
gyp ERR! System Linux 4.9.35-v7+
gyp ERR! command "nodejs" "/usr/bin/node-gyp" "rebuild"
gyp ERR! cwd /var/www/html/player/node_modules/ws/node_modules/utf-8-validate
gyp ERR! node -v v0.10.29
gyp ERR! node-gyp -v v0.12.2
gyp ERR! not ok
npm WARN This failure might be due to the use of legacy binary "node"
npm WARN For further explanations, please read
/usr/share/doc/nodejs/README.Debian

[email protected] install /var/www/html/player/node_modules/ws/node_modules/bufferutil
node-gyp rebuild

/bin/sh: 1: node: not found
gyp: Call to 'node -e "require('nan')"' returned exit status 127. while trying to load binding.gyp
gyp ERR! configure error
gyp ERR! stack Error: gyp failed with exit code: 1
gyp ERR! stack at ChildProcess.onCpExit (/usr/share/node-gyp/lib/configure.js:344:16)
gyp ERR! stack at ChildProcess.emit (events.js:98:17)
gyp ERR! stack at Process.ChildProcess._handle.onexit (child_process.js:809:12)
gyp ERR! System Linux 4.9.35-v7+
gyp ERR! command "nodejs" "/usr/bin/node-gyp" "rebuild"
gyp ERR! cwd /var/www/html/player/node_modules/ws/node_modules/bufferutil
gyp ERR! node -v v0.10.29
gyp ERR! node-gyp -v v0.12.2
gyp ERR! not ok
npm WARN optional dep failed, continuing [email protected]
npm WARN This failure might be due to the use of legacy binary "node"
npm WARN For further explanations, please read
/usr/share/doc/nodejs/README.Debian

npm WARN optional dep failed, continuing [email protected]
npm WARN cannot run in wd [email protected] cd vendor && npm install (wd=/var/www/html/player)
[email protected] node_modules/stream-split

[email protected] node_modules/stream-throttle
├── [email protected]
└── [email protected]

[email protected] node_modules/express
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected]
├── [email protected] ([email protected])
├── [email protected] ([email protected], [email protected])
├── [email protected] ([email protected], [email protected], [email protected], [email protected], [email protected], [email protected])
├── [email protected] ([email protected])
└── [email protected] ([email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected], [email protected])

[email protected] node_modules/mout

[email protected] node_modules/ws
├── [email protected]
└── [email protected]

I am not familiar with NodeJS, so can anybody tell me what I am doing wrong or whether I am missing some dependencies?

error when node server-rpi.js

throw er; // Unhandled 'error' event
^

Error: listen EADDRINUSE :::8080
at Server.setupListenHandle [as _listen2] (net.js:1345:14)
at listenInCluster (net.js:1386:12)
at Server.listen (net.js:1474:7)

Query On How To Live Stream

Hi There
I was given this file to pipe into the VLC media player.
However, is there any way I can use your live player to pipe .h264 files into my browser,
as the VLC media player has a lot of latency? Thanks for your help.

c=IN IP4 127.0.0.1
m=video 55004 RTP/AVP 96
a=rtpmap:96 H264/90000

Some question about YUVWebGLCanvas

Thanks for your great job!
I'm working on some video decoding work.
I get streams of YUV image data from the iOS side and use YUVWebGLCanvas to draw the image onto the canvas,
but there are some weird things:

  1. the image position is not quite right: it cuts about 20px off the right edge and moves it to the left edge
  2. the image color is weird

image

I know it's hard to understand without the pictures.
Could you give me some help with the YUVWebGLCanvas part?

each packet "Passed invalid frame to decoder"

Hi,
I tried on a fresh raspbian wheezy and get no video, just an empty frame in the browser, with server-rpi.js.
Attached are the browser js debug output, npm and nodejs versions, as well as the output on the pi. Thanks for all your effort.

s1

browser: each packet "Passed invalid frame to decoder"
stderr: increasing TOTAL_MEMORY to 67108864 to be compliant with the asm.js spec
stderr: pre-main prep time: 2 ms
WSAvcPlayer: Connected to ws://10.42.0.128:8080
Incoming request Object {action: "init", width: 960, height: 540}
creatingTextures: size: (960, 540)
WSAvcPlayer: Sent REQUESTSTREAM
WSAvcPlayer: [Pkt 1 (190922 bytes)]
WSAvcPlayer: Passed invalid frame to decoder
WSAvcPlayer: [Pkt 2 (55292 bytes)]
WSAvcPlayer: Passed invalid frame to decoder
WSAvcPlayer: [Pkt 3 (50451 bytes)]
WSAvcPlayer: Passed invalid frame to decoder
WSAvcPlayer: [Pkt 4 (45751 bytes)]
WSAvcPlayer: Passed invalid frame to decoder
WSAvcPlayer: [Pkt 5 (20568 bytes)]
WSAvcPlayer: Passed invalid frame to decoder
WSAvcPlayer: [Pkt 6 (17367 bytes)]
WSAvcPlayer: Passed invalid frame to decoder
WSAvcPlayer: [Pkt 7 (14129 bytes)]
WSAvcPlayer: Passed invalid frame to decoder

PI versions
pi@raspberrypi:~/Downloads/h264-live-player$ node -v
v0.10.28
pi@raspberrypi:~/Downloads/h264-live-player$ npm -v
1.4.9
pi@raspberrypi:~/Downloads/h264-live-player$ uname -a
Linux raspberrypi 4.1.7-v7+ #817 SMP PREEMPT Sat Sep 19 15:32:00 BST 2015 armv7l GNU/Linux

PI output
pi@raspberrypi:~/Downloads/h264-live-player$ node server-rpi.js
New guy
Incomming action 'REQUESTSTREAM'
Incomming action 'REQUESTSTREAM'
Failure 70
Failure null
stopping client interval
New guy
Incomming action 'REQUESTSTREAM'
Failure 70

Timestamp Metadata

Looking for a way to decode timestamps to sync metadata on the client end maybe through a websocket, or pipe on the encoder end. Is it possible to make this happen or decode the epoch timestamps on the client end?

grunt pack not found, and also can't play video in browser

Hi,
when I run the command grunt pack,
it says task "pack" not found.

Then I added the following code to Gruntfile.js myself:

 grunt.registerTask('pack', 'pack task.', function() { 
         grunt.file.expand({filter:'isDirectory'}, 'grunt/**').forEach(grunt.loadTasks);
         ...
  });

grunt then runs successfully, but the server still doesn't work correctly:
I open http://127.0.0.1:8080/ in Firefox (ubuntu 16.04 x64) or http://server_ip:8080 in IE (Windows 7 x64).
I just get 3 buttons in my browser and can't play video. (I have tried rpi, static and tcp; none of the 3 work.)
I think if it worked correctly, it should at least run static (a local file).
I don't know what the problem is...

Decode Constrained Baseline H264 Stream no picture results,could you give some help?

Our HW does not support the “strict” baseline profile, but the profile is exposed via the MSDK API with the following features not supported:

  1. ASO – arbitrary slice ordering
  2. FMO – flexible macroblock ordering
  3. RS – redundant slices
    1 and 3 can quite easily be achieved manually after encoding a frame(s), with minimal performance impact.
    Note that since we do not support ASO/FMO/RS, the baseline profile closely represents CBP (Constrained Baseline Profile).

Chrome console print correctly as follows:

_main.js:4062 creatingTextures: size: (1920, 1080)
_main.js:4438 WSAvcPlayer: Sent REQUESTSTREAM
_main.js:4397 WSAvcPlayer: [Pkt 1 (12779 bytes)]
_main.js:4359 103
_main.js:4374 WSAvcPlayer: Passed SPS to decoder
_main.js:4397 WSAvcPlayer: [Pkt 2 (12535 bytes)]
_main.js:4359 65
_main.js:4374 WSAvcPlayer: Passed P frame to decoder
_main.js:4397 WSAvcPlayer: [Pkt 3 (20754 bytes)]
_main.js:4359 65
_main.js:4374 WSAvcPlayer: Passed P frame to decoder
_main.js:4397 WSAvcPlayer: [Pkt 4 (27743 bytes)]
_main.js:4359 65
_main.js:4374 WSAvcPlayer: Passed P frame to decoder
_main.js:4397 WSAvcPlayer: [Pkt 5 (18234 bytes)]
_main.js:4359 65
_main.js:4374 WSAvcPlayer: Passed P frame to decoder
_main.js:4397 WSAvcPlayer: [Pkt 6 (7004 bytes)]
_main.js:4359 65
_main.js:4374 WSAvcPlayer: Passed P frame to decoder
_main.js:4397 WSAvcPlayer: [Pkt 7 (18056 bytes)]
_main.js:4359 65
_main.js:4374 WSAvcPlayer: Passed P frame to decoder
_main.js:4397 WSAvcPlayer: [Pkt 8 (20673 bytes)]
_main.js:4359 65
_main.js:4374 WSAvcPlayer: Passed P frame to decoder
_main.js:4397 WSAvcPlayer: [Pkt 9 (10179 bytes)]
_main.js:4359 65

But no picture is decoded.

Please help.
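For reference, the bare numbers in the log above (103, 65) are the first byte of each NAL unit, and the low 5 bits of that byte are the nal_unit_type. A small sketch to decode them (`nalType` is a hypothetical helper, not part of the player):

```javascript
// Map a NAL header byte, as printed by the player, to a readable type name.
// nal_unit_type is the low 5 bits of the header byte.
const NAL_TYPES = {
  1: 'non-IDR slice (P frame)',
  5: 'IDR slice (keyframe)',
  7: 'SPS',
  8: 'PPS',
};

function nalType(headerByte) {
  return NAL_TYPES[headerByte & 0x1f] || 'other';
}
```

Note that the log shows an SPS (103 → type 7) followed only by non-IDR slices (65 → type 1): no PPS (type 8) and no IDR keyframe (type 5). A decoder generally cannot output a picture until it has seen SPS, PPS, and an IDR frame, which could explain the blank canvas here.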

Possible to run ffmpeg with codec copy instead of libx264?

I got the code to run with ffmpeg settings like this:

var args = [
        '-re',
        "-i", "rtsp://@192.168.1.63:554/axis-media/media.amp?videocodec=h264&resolution=1280x720&h264profile=baseline",
        // "-framerate", this.options.fps,
        "-video_size", this.options.width + 'x' + this.options.height,
        '-pix_fmt',  'yuv420p',
        // '-c:v',  'copy',
        '-c:v',  'libx264',
        '-b:v', '600k',
        '-bufsize', '600k',
        '-vprofile', 'baseline',
        '-tune', 'zerolatency',
        '-f' ,'h264',
        '-'
    ];

The ffmpeg process on such a command takes up 100% of one of my cores and I would like to lower that. I see that the video codec output is set to libx264. My input stream is already h264. Am I correct in assuming that this command is converting h264 into h264? In my case I am wondering if that is necessary.

When I comment out '-c:v', 'libx264', and uncomment '-c:v', 'copy', the ffmpeg CPU usage drops to 1% but the canvas player is blank.

The canvas is also blank if I keep '-c:v', 'libx264', but comment out '-vprofile', 'baseline',

I know that broadway.js expects the h264 profile to be baseline. My guess is that '-vprofile', 'baseline', converts the output to the baseline profile. Is it possible for me to have the input be baseline and just use the copy codec?

Must the canvas size equal the encoded image size?

I encode a device screen image at 960x540. It decodes fine to a canvas of the same size, but if I change the canvas size it can't decode to the right image.

Must the canvas size equal the encoded size?
Could I scale the image while Broadway decodes the h264 stream?

Thank You.

No Audio

Hi,

I have tested this code, and it works great!
I found it very, very good! Thank you so much for the POC.

Just wonder if it can support audio as well?

Thank You
Sassy

How to run with static mp4 input?

Should I convert to h264 stream or should I load it as is or chunk it to NAL units (as with getSampleNALUnits() from the Broadway project)?

there is a question

macdeMacBook-Pro:h264-live-player-master mac$ node server-static.js
New guy
Incomming action 'REQUESTSTREAM'
fs.js:0
(function (exports, require, module, __filename, __dirname) { // Copyright Joy

Error: ENOENT, no such file or directory 'samples/test.h264'
at Error (native)
at Object.fs.statSync (fs.js:797:18)
at Class.get_feed (/Users/mac/Downloads/h264-live-player-master/lib/static.js:22:44)
at Class.start_feed (/Users/mac/Downloads/h264-live-player-master/lib/_server.js:28:27)
at WebSocket. (/Users/mac/Downloads/h264-live-player-master/lib/_server.js:71:14)
at WebSocket.emit (events.js:110:17)
at Receiver.ontext (/Users/mac/Downloads/h264-live-player-master/node_modules/ws/lib/WebSocket.js:816:10)
at /Users/mac/Downloads/h264-live-player-master/node_modules/ws/lib/Receiver.js:477:18
at /Users/mac/Downloads/h264-live-player-master/node_modules/ws/lib/Receiver.js:361:7
at /Users/mac/Downloads/h264-live-player-master/node_modules/ws/lib/PerMessageDeflate.js:238:5
macdeMacBook-Pro:h264-live-player-master mac$

1920x1080@30hz Video delay too much

hi, I tested on a Raspberry Pi 3 Model B using node server-rpi.js.
I set the raspivid input to 1920x1080@30hz; when I view the video in the browser, there is a very big delay.
Any advice for this?

Thank you so much.

cameraIP decoder

I want to use the following pattern for IP cameras. The pattern has 2 camera IPs. I use ffmpeg to put the data stream onto a tcp port as below: ffmpeg -i " rtsp://10.10.14.1:554/user=&password=&channel=1&stream=0.sdp?" -f h264 -vcodec libx264 -profile:v baseline -level 3.0 tcp://10.10.10.159:4434?listen. When I tried it, the result returned was "Passed P frame to decoder". Images are still displayed, but too blurry to see. Could you help me? Thank you

stops sending data after about 4 minutes

hi, I'm using server-ffmpeg.js and everything works fine. The problem is that after about 4 minutes it stops sending data to the browser. I logged the broadcast method in lib/_server.js and noticed that broadcast is no longer called. I also noticed that the ffmpeg process is still alive but using 0% CPU. In lib/ffmpeg.js I logged the streamer's events (exit, close, disconnect and error), but none of them fires. When reloading the index.html page the websocket connects fine (it logs the action: "init" message coming from the new_client method in lib/_server.js). At the same time the same camera is used elsewhere and it doesn't stop streaming.

What's the problem?

How to use uws as websocket server

I am trying to use uws as my websocket server. I can compile and run the chat demo application with VS C++ 2015 without node.js, but I don't really know how the ws client sends binary data to uws, and how to broadcast to all ws clients. Would you give me some hints? Thank you.

document

Where can I find documentation for this project? I want to learn this project, but I don't know where to start! So I want to find a document.

some problem

I run the application, but when I click "Start Video" there is a problem. I want to know the meaning of the following code:
const feed = new RemoteTCPFeedRelay(server, {
feed_ip : "172.19.20.165",
feed_port : 5001,
});
This code is in server-tcp.js; what is the meaning of feed_ip? Similarly, there is the same kind of code in remotetcpfeed.js:
feed_ip : '127.0.0.1',
feed_port : 5001,
Here feed_ip is 127.0.0.1. What is the meaning, and what is the difference between the two?
Thank you!

ffmpeg stops abruptly with some cameras

hi, I'm running server-ffmpeg.js, but as soon as the ffmpeg process starts it stops, while the equivalent command writes fine to out-rawvideo-file:
ffmpeg -rtsp_transport tcp -re -i 'rtsp://172.20.19.114:554/user=admin_password=6QNMIQGe_channel=1_stream=0.sdp?real_stream' -an -pix_fmt yuv420p -c:v copy -f rawvideo out-rawvideo-file

One of the configuration I tested is:

var args = [
    "-rtsp_transport", "tcp",
    "-re",
    "-i", "'rtsp://172.20.19.114:554/user=admin_password=6QNMIQGe_channel=1_stream=0.sdp?real_stream'",
    "-an",
    '-pix_fmt', 'yuv420p',
    '-c:v', 'copy',
    '-f' ,'rawvideo',
    '-'
];

The same happens when using the encoding video parameters specified by https://github.com/mbebenita/Broadway.

Based on the logging below I conclude that ffmpeg fails somehow, but I don't understand why it doesn't fail when the same command is used to write into out-rawvideo-file.

While debugging I noticed that _server.js -> broadcast is never called. I've logged some key execution points:

New guy

ffmpeg -rtsp_transport tcp -re -i 'rtsp://172.20.19.114:554/user=admin_password=6QNMIQGe_channel=1_stream=0.sdp?real_stream' -an -pix_fmt yuv420p -c:v copy -f rawvideo -

ffmpeg.js ffmpeg spawned

ffmpeg.js streamer.stderr.on data:
<Buffer 66 66 6d 70 65 67 20 76 65 72 73 69 6f 6e 20 32 2e 38 2e 31 31 2d 30 75 62 75 6e 74 75 30 2e 31 36 2e 30 34 2e 31 20 43 6f 70 79 72 69 67 68 74 20 28 ... >

ffmpeg.js streamer.stderr.on data:
<Buffer 20 20 6c 69 62 61 76 75 74 69 6c 20 20 20 20 20 20 35 34 2e 20 33 31 2e 31 30 30 20 2f 20 35 34 2e 20 33 31 2e 31 30 30 0a 20 20 6c 69 62 61 76 63 6f ... >

ffmpeg.js streamer.stderr.on close
ffmpeg.js streamer.on exit code = 1, signal = null
ffmpeg.js streamer.on close

With another camera the same config/code works fine.
What could be the problem?

server-tcp.js doesn't show any video

server-rpi.js works fine, when run on the RasPi. So I know my browser (latest Firefox) and camera are fine.

I've amended the IP address in server-tcp.js but it doesn't show any video.

On the desktop I get:

chris@grey:~/h264-live-player$ node server-tcp.js 
New guy
Incomming action 'REQUESTSTREAM'
remote stream ready

and on the Pi, just:

pi@raspberrypi:~ $ raspivid -t 0 -o - -w 1280 -h 720 -fps 25 | nc -k -l 5001

I've also tried the following on the Pi, which confirms the desktop has connected to the Pi:

pi@raspberrypi:~ $ raspivid -t 0 -w 1920 -h 1080 -fps 30 -l -o tcp://0.0.0.0:5001
Waiting for a TCP connection on 0.0.0.0:5001...Client connected from 10.0.0.12:39705

I have used Wireshark, and that shows traffic flowing from pi:5001 to the server and server:8080 traffic to the desktop. So the data is getting to the browser, but it is not being displayed.

This is odd, because the same browser happily shows video from server-rpi.js, running on the Pi.

What have I failed to understand?

Question: Client Connecting to Stream In Progress

Please forgive me if this isn't the best place to ask:

I'd like to have a single stream from a pi that many clients can connect to without having to restart the stream. I've been using the server-rpi.js as a starting point.

For more context: I have the pi connect to a WebSocket server which just broadcasts the data to all connected clients as a sort of proxy. My goal is to have 1 connection from the pi to the proxy, and many connections to the proxy. If someone knows of an existing implementation like this, that'd be super helpful. :)

I'm only able to render video on the client if it is connected to the WebSocket proxy prior to the pi connecting. If it connects after the pi has begun streaming data, I only get a blank canvas.

I am still working my way through https://www.ietf.org/rfc/rfc3984.txt to understand what is happening under the hood. I'm thinking this may have to do with the Packetization mode and/or the headers, but I have yet to fully understand the document.

My questions:

  • What prevents a client from being able to render a stream in progress?
  • Is what I described above even possible with this technology?

Thanks for the awesome lib!

How to play sample admiral.264?

I used server-static.js to deliver the samples.
The sample out.h264 plays well, but admiral.264 cannot be played.
I tried ffplay on admiral.264, and it worked fine.

My client player setting was copied from your h264-live-player/public/index.html.

<script type="text/javascript">
var canvas = document.createElement("canvas");
document.body.appendChild(canvas);
var uri = "ws://" + document.location.host;
var wsavc = new WSAvcPlayer(canvas, "webgl", 1, 35);
wsavc.connect(uri);
window.wsavc = wsavc;
</script>

Should I change any setting?

Because I have an h264 file that looks like the sample admiral.264 and has the same problem:
it cannot be played either.
That is why I ask this question.

Texture size mismatch

When I changed the out.h264, I encountered a problem: Texture size mismatch, data: 101376, texture: 518400.
What's the matter?

Cannot play a stream

I'm using my raspi to send a stream to the server-tcp.js app.

I can view the server-tcp app from browser and click the start video / stop video buttons.

The server responds with a text message, but a video stream will not show up.

Have tried all diagnostic steps that I can think of.

Anything I should know about or try?

Thanks,

-tom

about ipcam

Excuse me, I want to use my IP cam with h264-live-player, but I don't know how to use your project to do it.
Thank you!

How to set this for rpi camera broadcast?

I have a brand new raspberry installation, with an official raspberry camera plugged to it. How can I make this work with it?

I would love to know how to set the stream and broadcast from scratch.

Thank you.

Latency

Hi, this looks very interesting!
What kind of latency are you getting, from the frame arriving over websocket to it is displayed on the canvas?

how to play an h264 video stream instead of an h264 file

Now I can play the h264 sample file, but I want to implement another function: instead of the h264 sample file, I want to play an h264 video stream. I can get the h264 video stream over websocket, but how can I play the stream? Which part should I use?

Readme typo

'Recommandation' should be --> 'Recommendation'

Communication Protocol between Server and Client

Nice project! I want to use my own WebSocket server to send the H264 frames to the client. Hence, I want to use your live-player only for displaying the frames in the browser. When I click "Start Video" in your index.html sample, I receive a REQUESTSTREAM WebSocket message on the server side and start sending the frames, but nothing is displayed in the browser. I send a baseline-encoded h264 stream with a normal bitrate (I get the frames from an RTSP camera). The messages are sent as a bytestream (opcode 130). Do I have to consider something else? I also tested an mp4 muxer (muxing the h264 frames on the client side to an mp4 file which is then displayed). This works, so I assume the frames are sent correctly.
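From the messages visible in the logs on this page, the wire protocol appears to be: text frames for control (the client sends "REQUESTSTREAM", the server sends JSON such as {action: "init", width, height}) and binary frames carrying raw NAL units. A sketch of that convention (inferred from the logs, not an official spec; the helper names are made up):

```javascript
// Hypothetical helpers sketching the observed convention.
function initMessage(width, height) {
  // Control messages are JSON text, like the {action:"init",...} seen in the logs
  return JSON.stringify({ action: 'init', width: width, height: height });
}

function handleMessage(data) {
  if (typeof data === 'string') {
    // Text frame: a control message such as "REQUESTSTREAM", or JSON
    try { return { kind: 'control', msg: JSON.parse(data) }; }
    catch (e) { return { kind: 'control', msg: data }; }
  }
  // Binary frame: raw h264 NAL data to hand to the Broadway decoder
  return { kind: 'frame', bytes: data };
}
```

If frames are being sent but nothing renders, one thing to check is that the first binary payloads include the SPS and PPS, since a decoder usually cannot configure itself without them.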

the video from h264 is blurry

Now I can play my own h264 file, but the video is very blurry. Is there a solution? The blurry video is as follows:

1
2

I need a solution to this problem!


Using GStreamer with h264-live-player on Raspberry Pi

Hello, has anyone been able to get GStreamer to work with this? I would like to request an official example of how to do it, if possible.

Here's what I've tried:

  1. Modified lib/raspivid.js
    This pipeline should send an h264 camera stream (the raspicam) to STDOUT with a clock overlay added:

    gst-launch-1.0 rpicamsrc bitrate=950000 ! video/x-h264,width=800,height=480,framerate=10/1 ! \
    h264parse ! decodebin ! clockoverlay shaded-background=true draw-shadow=true font-desc="Nimbus Mono" ! \
    omxh264enc ! video/x-h264,width=800,height=480,framerate=10/1,profile=high ! h264parse ! \
    queue ! fdsink
    

    This works as expected from the command line. However, when modifying raspivid.js, it does not seem to work. It does activate the camera (the red LED comes on), and if I change the fdsink to a tcpserversink I can connect to it fine, so gst-launch-1.0 IS working; it's just that h264-live-player isn't picking up the data from STDOUT (or isn't recognizing it):

    get_feed() {
        //var msk = "raspivid -t 0 -o - -w %d -h %d -fps %d";
        //var cmd = util.format(msk, this.options.width, this.options.height, this.options.fps);
        var msk = "gst-launch-1.0 rpicamsrc bitrate=950000 ! video/x-h264,width=800,height=480,framerate=10/1 ! h264parse ! decodebin ! clockoverlay shaded-background=true draw-shadow=true font-desc=\"Nimbus Mono\" ! omxh264enc ! video/x-h264,width=800,height=480,framerate=10/1,profile=high ! h264parse ! queue ! fdsink";
        var cmd = msk;
    
        console.log(cmd);
    
        //    var streamer = spawn('/opt/gst/bin/gst-launch-1.0', ['', '0', '-o', '-', '-w', this.options.width, '-h', this.options.height, '-fps', this.options.fps, '-pf', 'baseline']);
        var streamer = spawn('/opt/gst/bin/gst-launch-1.0', ['rpicamsrc', 'bitrate=950000', '!', 'video/x-h264,width=800,height=480,framerate=10/1', '!', 'h264parse', '!', 'decodebin', '!', 'clockoverlay', 'shaded-background=true', 'draw-shadow=true', 'font-desc="Nimbus Mono"', '!', 'omxh264enc', '!', 'video/x-h264,width=800,height=480,framerate=10/1,profile=high', '!', 'h264parse', '!', 'queue', '!', 'fdsink']);
        streamer.on("exit", function(code){
          console.log("Failure", code);
        });
    
        return streamer.stdout;
      }
    
  2. Using the server-tcp.js example

    I have an almost identical gst pipeline, only this time it outputs the h264 stream to any clients connected to a tcp port:

    gst-launch-1.0 rpicamsrc ! video/x-h264,width=800,height=480,framerate=10/1 ! h264parse ! \
    decodebin ! omxh264enc ! tcpserversink port=5001 host=0.0.0.0
    

    If I connect to this TCP stream with VLC or another instance of GStreamer (locally or remotely) it works well; however, I also don't get video from h264-live-player when I try that.

In example 1, it's using the STDOUT file-descriptor to display an h264 stream, and in the second it's using a TCP h264 stream. I feel like this should be working just fine, but it's not. I was able to get both examples working just fine with raspivid (as your standard code includes), but I need to be able to use GStreamer to add overlays and custom processing plugins.

Is there any apparent reason these commands aren't working with h264-live-player?

Thanks!
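One thing worth noting: the README says the Broadway decoder only handles the h264 baseline profile, while both pipelines above request `profile=high` after re-encoding; the server's NAL splitter also expects a raw Annex-B byte stream on the file descriptor. A pipeline sketch under those assumptions (untested config; element availability and accepted caps vary by GStreamer build, and omxh264enc may not negotiate every profile):

```shell
# Force a baseline-profile, byte-stream-format h264 stream on stdout
# so the NAL splitter and the Broadway decoder can consume it.
gst-launch-1.0 rpicamsrc bitrate=950000 \
  ! video/x-h264,width=800,height=480,framerate=10/1 \
  ! h264parse ! decodebin \
  ! clockoverlay shaded-background=true draw-shadow=true font-desc="Nimbus Mono" \
  ! omxh264enc \
  ! "video/x-h264,width=800,height=480,framerate=10/1,profile=baseline,stream-format=byte-stream" \
  ! h264parse ! queue ! fdsink fd=1
```

The same profile/stream-format caps would apply to the tcpserversink variant in example 2.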

Receive no video - just see start stop disconnect buttons

  • I execute node server-rpi.js
  • The process is running
  • I can connect to port 8080 from a browser on the Pi
  • The page comes up but only shows the three buttons
  • Clicking the buttons doesn't do anything

Is there any logging info? Are there any diagnostic procedures I could try?

Thanks,

-tom

Multiple clients on server-rpi??

Subject says it all... I've tried to wrap the start_feed function with a bool so as not to start the spawned process a second time, but I don't get any output on the second client; the first continues to work fine.

Any ideas?

start_feed() {
        // guard so the camera process is only spawned once
        if (this.isFeedStarted) {
            return;
        }

        var readStream = this.get_feed();
        this.readStream = readStream;

        // split the raw h264 stream into NAL units and fan them out
        readStream = readStream.pipe(new Splitter(NALseparator));
        readStream.on("data", this.broadcast);
        this.isFeedStarted = true;
    }
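One way to make a single spawned feed serve several clients is to keep the splitter stream alive and have the data handler write each unit to every currently open socket, rather than tying the stream to the first client. A minimal sketch, independent of the repo's actual _server.js internals (the `startFeed` callback and the socket objects here are stand-ins for the real Splitter pipeline and WebSocket connections):

```javascript
// Minimal single-feed broadcaster: the camera process is started once,
// and every NAL unit is fanned out to all connected clients.
class FeedBroadcaster {
  constructor(startFeed) {
    this.startFeed = startFeed; // () => stream emitting one NAL unit per 'data'
    this.clients = new Set();
    this.feed = null;
  }

  addClient(socket) {
    this.clients.add(socket);
    // stop broadcasting to sockets that go away
    socket.on('close', () => this.clients.delete(socket));
    if (!this.feed) {
      this.feed = this.startFeed(); // spawn the feed exactly once
      this.feed.on('data', (unit) => this.broadcast(unit));
    }
  }

  broadcast(unit) {
    for (const socket of this.clients) {
      socket.send(unit, { binary: true });
    }
  }
}
```

The key difference from the guarded start_feed above is that the `data` handler closes over the client set instead of a single response stream, so clients that connect after the feed has started still receive units.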
