
Homebridge Camera FFmpeg


Homebridge Plugin Providing FFmpeg-based Camera Support

Installation

This plugin is supported under both Homebridge and HOOBS. It is highly recommended that you use either Homebridge Config UI X or the HOOBS UI to install and configure this plugin.

Manual Installation

  1. Install this plugin using: sudo npm install -g homebridge-camera-ffmpeg --unsafe-perm.
  2. Edit config.json manually to add your cameras. See below for instructions on that.
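Before adding cameras, it can save time to confirm an FFmpeg binary is actually reachable, since a missing binary surfaces later as a cryptic spawn error. A minimal sketch, assuming a POSIX shell:

```shell
# Check that an FFmpeg binary is on the PATH; the plugin needs one
# (or an explicit path set via the videoProcessor option).
if command -v ffmpeg >/dev/null 2>&1; then
  echo "ffmpeg found: $(command -v ffmpeg)"
else
  echo "ffmpeg missing: install it or set videoProcessor"
fi
```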

Tested configurations

Other users have shared configurations that work for them on our GitHub site. Check there to see whether anyone has already gotten your model of camera working, or share a configuration that works for you.

Manual Configuration

Most Important Parameters

  • platform: (Required) Must always be set to Camera-ffmpeg.
  • name: (Required) Set the camera name for display in the Home app.
  • source: (Required) FFmpeg options on where to find and how to decode your camera's video stream. The most basic form is -i followed by your camera's URL.
  • stillImageSource: If your camera also provides a URL for a still image, that can be defined here with the same syntax as source. If not set, the plugin will grab one frame from source.

Config Example

{
  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "Camera Name",
      "videoConfig": {
        "source": "-i rtsp://username:[email protected]:554",
        "stillImageSource": "-i http://example.com/still_image.jpg",
        "maxStreams": 2,
        "maxWidth": 1280,
        "maxHeight": 720,
        "maxFPS": 30
      }
    }
  ]
}

Optional Parameters

  • motion: Exposes the motion sensor for this camera. This can be triggered with dummy switches, MQTT messages, or via HTTP, depending on what features are enabled in the config. (Default: false)
  • doorbell: Exposes the doorbell device for this camera. This can be triggered with dummy switches, MQTT messages, or via HTTP, depending on what features are enabled in the config. (Default: false)
  • switches: Enables dummy switches to trigger motion and/or doorbell, if either of those are enabled. When enabled there will be an additional switch that triggers the motion or doorbell event. See the project site for more detailed instructions. (Default: false)
  • motionTimeout: The number of seconds after triggering to reset the motion sensor. Set to 0 to disable resetting of motion trigger for MQTT or HTTP. (Default: 1)
  • motionDoorbell: Rings the doorbell when motion is activated. This allows for motion alerts to appear on Apple TVs. (Default: false)
  • manufacturer: Set the manufacturer name for display in the Home app. (Default: Homebridge)
  • model: Set the model for display in the Home app. (Default: Camera FFmpeg)
  • serialNumber: Set the serial number for display in the Home app. (Default: SerialNumber)
  • firmwareRevision: Set the firmware revision for display in the Home app. (Default: current plugin version)
  • unbridge: Bridged cameras can cause slowdowns of the entire Homebridge instance. If unbridged, the camera will need to be added to HomeKit manually. (Default: false)
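The motion and doorbell options above can be combined per camera; a minimal sketch (the camera name and timeout are illustrative):

```json
{
  "name": "Front Door",
  "motion": true,
  "doorbell": true,
  "switches": true,
  "motionTimeout": 30,
  "videoConfig": {
    "source": "-i rtsp://myfancy_rtsp_stream"
  }
}
```

With switches enabled, toggling the extra switch in the Home app fires the corresponding motion or doorbell event.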

Config Example with Manufacturer and Model Set

{
  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "Camera Name",
      "manufacturer": "ACME, Inc.",
      "model": "ABC-123",
      "serialNumber": "1234567890",
      "firmwareRevision": "1.0",
      "videoConfig": {
        "source": "-i rtsp://username:[email protected]:554",
        "stillImageSource": "-i http://example.com/still_image.jpg",
        "maxStreams": 2,
        "maxWidth": 1280,
        "maxHeight": 720,
        "maxFPS": 30
      }
    }
  ]
}

Optional videoConfig Parameters

  • returnAudioTarget: (EXPERIMENTAL - WIP) The FFmpeg output command for directing audio back to a two-way capable camera. This feature is still in development and a configuration that works today may not work in the future.
  • maxStreams: The maximum number of streams that will be allowed at once to this camera. (Default: 2)
  • maxWidth: The maximum width used for video streamed to HomeKit. If set to 0, the resolution of the source is used. If not set, will use any size HomeKit requests.
  • maxHeight: The maximum height used for video streamed to HomeKit. If set to 0, the resolution of the source is used. If not set, will use any size HomeKit requests.
  • maxFPS: The maximum frame rate used for video streamed to HomeKit. If set to 0, the framerate of the source is used. If not set, will use any frame rate HomeKit requests.
  • maxBitrate: The maximum bitrate used for video streamed to HomeKit, in kbit/s. If not set, will use any bitrate HomeKit requests.
  • forceMax: If set, the settings requested by HomeKit will be overridden with any 'maximum' values defined in this config. (Default: false)
  • vcodec: Set the codec used for encoding video sent to HomeKit, must be H.264-based. You can change to a hardware accelerated video codec with this option, if one is available. (Default: libx264)
  • audio: Enables audio streaming from camera. (Default: false)
  • packetSize: If audio or video is choppy, try a smaller value; it should be a multiple of 188. (Default: 1316)
  • mapvideo: Selects the stream used for video. (Default: FFmpeg automatically selects a video stream)
  • mapaudio: Selects the stream used for audio. (Default: FFmpeg automatically selects an audio stream)
  • videoFilter: Comma-delimited list of additional video filters for FFmpeg to run on the video. If 'none' is included, the default video filters are disabled.
  • encoderOptions: Options to be passed to the video encoder. (Default: -preset ultrafast -tune zerolatency if using libx264)
  • debug: Includes debugging output from the main FFmpeg process in the Homebridge log. (Default: false)
  • debugReturn: Includes debugging output from the FFmpeg used for return audio in the Homebridge log. (Default: false)

More Complicated Example

{
  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "Camera Name",
      "videoConfig": {
        "source": "-i rtsp://myfancy_rtsp_stream",
        "stillImageSource": "-i http://faster_still_image_grab_url/this_is_optional.jpg",
        "maxStreams": 2,
        "maxWidth": 1280,
        "maxHeight": 720,
        "maxFPS": 30,
        "maxBitrate": 200,
        "vcodec": "h264_omx",
        "audio": false,
        "packetSize": 188,
        "hflip": true,
        "additionalCommandline": "-x264-params intra-refresh=1:bframes=0",
        "debug": true
      }
    }
  ]
}

Camera MQTT Parameters

  • motionTopic: The MQTT topic to watch for motion alerts.
  • motionMessage: The message to watch for to trigger motion alerts. Will use the name of the camera if blank.
  • motionResetTopic: The MQTT topic to watch for motion resets.
  • motionResetMessage: The message to watch for to trigger motion resets. Will use the name of the camera if blank.
  • doorbellTopic: The MQTT topic to watch for doorbell alerts.
  • doorbellMessage: The message to watch for to trigger doorbell alerts. Will use the name of the camera if blank.

Camera MQTT Example

{
  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "Camera Name",
      "videoConfig": {
        "source": "-i rtsp://myfancy_rtsp_stream"
      },
      "mqtt": {
        "motionTopic": "home/camera",
        "motionMessage": "ON",
        "motionResetTopic": "home/camera",
        "motionResetMessage": "OFF",
        "doorbellTopic": "home/doobell",
        "doorbellMessage": "ON"
      }
    }
  ]
}
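Given the example above, a motion event can be fired from the command line with any MQTT client. A sketch that builds the publish commands (mosquitto_pub is an assumption; the commands are echoed here so they can be inspected before running):

```shell
# Topics and messages match the MQTT example config above; the broker
# address is whatever the top-level "mqtt" option points at.
BROKER="127.0.0.1"
TOPIC="home/camera"
echo "mosquitto_pub -h ${BROKER} -t ${TOPIC} -m ON"   # trigger motion
echo "mosquitto_pub -h ${BROKER} -t ${TOPIC} -m OFF"  # reset motion
# Drop the echoes to actually publish.
```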

Automation Parameters

  • mqtt: Defines the hostname or IP of the MQTT broker to connect to for MQTT-based automation. If not set, MQTT support is not started. See the project site for more information on using MQTT.
  • portmqtt: The port of the MQTT broker. (Default: 1883)
  • tlsmqtt: Use TLS to connect to the MQTT broker. (Default: false)
  • usermqtt: The username used to connect to your MQTT broker. If not set, no authentication is used.
  • passmqtt: The password used to connect to your MQTT broker. If not set, no authentication is used.
  • porthttp: The port to listen on for HTTP-based automation. If not set, HTTP support is not started. See the project site for more information on using HTTP.
  • localhttp: Only allow HTTP calls from localhost. Useful if using helper plugins that translate to HTTP. (Default: false)

Automation Example

{
  "platform": "Camera-ffmpeg",
  "mqtt": "127.0.0.1",
  "porthttp": "8080",
  "cameras": []
}
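With porthttp set as above, events are triggered by simple GET requests. A sketch: the /motion and /doorbell paths follow the scheme described on the project site, and the camera name ("Camera Name" here) is an assumption that must match your config, URL-encoded.

```shell
# Build the trigger URLs for the HTTP automation server.
PORT=8080
CAMERA="Camera%20Name"   # URL-encoded camera name from config.json
echo "http://127.0.0.1:${PORT}/motion?${CAMERA}"
echo "http://127.0.0.1:${PORT}/doorbell?${CAMERA}"
# Fire an event with e.g.: curl "http://127.0.0.1:${PORT}/motion?${CAMERA}"
```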

Rarely Needed Parameters

  • videoProcessor: Defines which video processor is used to decode and encode videos, must take the same parameters as FFmpeg. Common uses would be avconv or the path to a custom-compiled version of FFmpeg. If not set, will use the included version of FFmpeg, or the version of FFmpeg installed on the system if no included version is available.

Rare Option Example

{
  "platform": "Camera-ffmpeg",
  "videoProcessor": "/usr/bin/ffmpeg",
  "cameras": []
}

Credit

Homebridge Camera FFmpeg is based on code originally written by Khaos Tian.

Contributors

aaronpearce, agentmcbride, aremishevsky, ay, campbellbs, cflurin, cooperd, dannyvancura, dependabot-preview[bot], dependabot[bot], dewgew, donavanbecker, francesco-kriegel, georgo, khaost, krebbi, longzheng, malhal, milmber, nebzhb, normen, northernman54, oznu, patelhiren, rotx, sphtkr, sunoo, tgerring


homebridge-camera-ffmpeg's Issues

No Camera in Home app

Hi,

Installed a clean Homebridge on OSX and added homebridge-camera-ffmpeg.
Configured my camera in the config.json and started the Homebridge without errors.
Add the homebridge to the Home app, but I don't see any camera in the Home App.

What am I missing here?

Config.json:

{
  "bridge": {
    "name": "Homebridge",
    "username": "CC:22:3D:E3:CE:30",
    "port": 51826,
    "pin": "031-45-154"
  },

  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "Webcam 3",
      "videoConfig": {
        "source": "-re -i rtsp://192.168.0.197:554/12",
        "maxStreams": 2,
        "maxWidth": 1280,
        "maxHeight": 720,
        "maxFPS": 30
      }
    }
  ]
}

And the startup output of Homebridge:

[2016-09-29 21:41:04] Loaded plugin: homebridge-camera-ffmpeg
[2016-09-29 21:41:04] Registering platform 'homebridge-camera-ffmpeg.Camera-ffmpeg'
[2016-09-29 21:41:04] ---
[2016-09-29 21:41:04] Loaded config.json with 0 accessories and 0 platforms.
[2016-09-29 21:41:04] ---
Load homebridge-camera-ffmpeg.Camera-ffmpeg
Scan this code with your HomeKit App on your iOS device to pair with Homebridge:

┌────────────┐
│ 031-45-154 │
└────────────┘

[2016-09-29 21:41:04] Homebridge is running on port 51826.

Mac USB Camera

Hello!

I have a USB camera plugged into my Mac Mini which is recognised in PhotoBooth. I have run the command below, but I am not sure what I should put into the config file.

Server:~ server$ ffmpeg -f avfoundation -list_devices true -i ""

ffmpeg version 3.2.2 Copyright (c) 2000-2016 the FFmpeg developers
  built with Apple LLVM version 8.0.0 (clang-800.0.42.1)
  configuration: --prefix=/usr/local/Cellar/ffmpeg/3.2.2 --enable-shared --enable-pthreads --enable-gpl --enable-version3 --enable-hardcoded-tables --enable-avresample --cc=clang --host-cflags= --host-ldflags= --enable-ffplay --enable-frei0r --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopencore-amrnb --enable-libopencore-amrwb --enable-libopus --enable-librtmp --enable-libschroedinger --enable-libspeex --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-libxvid --enable-opencl --disable-lzma --enable-libopenjpeg --disable-decoder=jpeg2000 --extra-cflags=-I/usr/local/Cellar/openjpeg/2.1.2/include/openjpeg-2.1 --enable-nonfree --enable-vda
  libavutil      55. 34.100 / 55. 34.100
  libavcodec     57. 64.101 / 57. 64.101
  libavformat    57. 56.100 / 57. 56.100
  libavdevice    57.  1.100 / 57.  1.100
  libavfilter     6. 65.100 /  6. 65.100
  libavresample   3.  1.  0 /  3.  1.  0
  libswscale      4.  2.100 /  4.  2.100
  libswresample   2.  3.100 /  2.  3.100
  libpostproc    54.  1.100 / 54.  1.100
[AVFoundation input device @ 0x7fdd5bd003c0] AVFoundation video devices:
[AVFoundation input device @ 0x7fdd5bd003c0] [0] Microsoft® LifeCam Cinema(TM)
[AVFoundation input device @ 0x7fdd5bd003c0] [1] Capture screen 0
[AVFoundation input device @ 0x7fdd5bd003c0] AVFoundation audio devices:
[AVFoundation input device @ 0x7fdd5bd003c0] [0] Microsoft® LifeCam Cinema(TM)
[AVFoundation input device @ 0x7fdd5bd003c0] [1] Built-in Input
: Input/output error

Help flipping stream

Hey,

I have this all set up and working (thank you!), but my camera is actually mounted on the roof upside down. I have added -vf "transpose=2,transpose=2" to the source URL in the config, and that has flipped the image in the capture preview, but when I actually view the stream it goes back to being upside down.

Is there a way to get this to flip properly?

Thanks
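For anyone hitting this today: the videoFilter option (see the optional videoConfig parameters above) applies FFmpeg filters to the stream sent to HomeKit, not just the snapshot. A sketch for a camera mounted upside down (the name and source are illustrative):

```json
{
  "name": "Roof Camera",
  "videoConfig": {
    "source": "-i rtsp://myfancy_rtsp_stream",
    "videoFilter": "hflip,vflip"
  }
}
```

hflip,vflip together rotate the image 180°, the same result as transpose=2,transpose=2.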

Suddenly doesn't connect

Hi,
I had this running for about a month and recently it stopped working (had a power failure in my house).

Checked using VLC, and I can see the stream.
Here is what I have in the config file (username and password hidden):
{
  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "IP Cam",
      "videoConfig": {
        "source": "-re -i rtsp://192.168.1.2/live/ch0",
        "maxStreams": 2,
        "maxWidth": 640,
        "maxHeight": 360,
        "maxFPS": 1
      }
    }
  ]
}
And this is the output I see in the terminal (username and password hidden):
-re -i rtsp://192.168.1.2/live/ch0 -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 1 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 299k -bufsize 299k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params ZzwYueqo5yUfHHEmp79smYRrlLxR9+bhDDCSW509 srtp://192.168.1.9:64414?rtcpport=64414&localrtcpport=64414&pkt_size=1378

In the app I still see a still image from two weeks ago, but when I click the image it writes loading and nothing happens.

Any ideas?

No audio

There appears to be no audio sent to HomeKit.

My setup (on OSX) creates a stream with these parameters: -re -f avfoundation -video_size 1280x720 -framerate 30 -i 0:3 -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 299k -bufsize 299k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params XCRzu2fUEo57aBQkoylhjm413JUwnrUP47BQU7yD srtp://192.168.2.57:55484?rtcpport=55484&localrtcpport=55484&pkt_size=1378

Pushing to an mp4 file does have audio, yet the Homebridge stream does not.

IP Cam isn't going live

Hi, I'm trying to get my Dahua security camera into Homebridge.
The snapshot works great, but the live view won't; I see in the log that it starts the stream, but it just never goes live :/

Anybody have some ideas why?

Getting error when viewing the camera

Oct 12 00:03:32 homekit homebridge[10546]: events.js:141
Oct 12 00:03:32 homekit homebridge[10546]: throw er; // Unhandled 'error' event
Oct 12 00:03:32 homekit homebridge[10546]: ^
Oct 12 00:03:32 homekit homebridge[10546]: Error: spawn ffmpeg ENOENT
Oct 12 00:03:32 homekit homebridge[10546]: at exports._errnoException (util.js:907:11)
Oct 12 00:03:32 homekit homebridge[10546]: at Process.ChildProcess._handle.onexit (internal/child_process.js:178:32)
Oct 12 00:03:32 homekit homebridge[10546]: at onErrorNT (internal/child_process.js:344:16)
Oct 12 00:03:32 homekit homebridge[10546]: at nextTickCallbackWith2Args (node.js:442:9)
Oct 12 00:03:32 homekit homebridge[10546]: at process._tickCallback (node.js:356:17)

connecting a pi-camera v2.1

I'm running homebridge-camera-ffmpeg on a Rpi 3.

The best solution I've found so far is Point-to point streaming (https://trac.ffmpeg.org/wiki/StreamingGuide):

raspivid -n -w 480 -h 320 -b 300000 -fps 15 -t 0 -o - | ffmpeg -i - -f mpegts udp://192.168.0.34:8090

note: 192.168.0.34 is the Rpi IP.

raspivid captures the video and sends the data through the pipe to ffmpeg.

pi@rpi4:~ $ raspivid -n -w 480 -h 320 -b 300000 -fps 15 -t 0 -o - | ffmpeg -i - -f mpegts udp://192.168.0.34:8090
ffmpeg version N-82436-g0613627 Copyright (c) 2000-2016 the FFmpeg developers
  built with gcc 4.9.2 (Raspbian 4.9.2-10)
  configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree
  libavutil      55. 40.100 / 55. 40.100
  libavcodec     57. 66.103 / 57. 66.103
  libavformat    57. 57.100 / 57. 57.100
  libavdevice    57.  2.100 / 57.  2.100
  libavfilter     6. 67.100 /  6. 67.100
  libswscale      4.  3.101 /  4.  3.101
  libswresample   2.  4.100 /  2.  4.100
  libpostproc    54.  2.100 / 54.  2.100
Input #0, h264, from 'pipe:':
  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: h264 (High), yuv420p(progressive), 480x320, 25 fps, 25 tbr, 1200k tbn, 50 tbc
Output #0, mpegts, to 'udp://192.168.0.34:8090':
  Metadata:
    encoder         : Lavf57.57.100
    Stream #0:0: Video: mpeg2video (Main), yuv420p, 480x320, q=2-31, 200 kb/s, 25 fps, 90k tbn, 25 tbc
    Metadata:
      encoder         : Lavc57.66.103 mpeg2video
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
Stream mapping:
  Stream #0:0 -> #0:0 (h264 (native) -> mpeg2video (native))
frame= 1381 fps= 16 q=14.2 size=    1950kB time=00:00:55.12 bitrate= 289.8kbits/s speed=0.657x 

config.json:

{
"platform": "Camera-ffmpeg",
	"cameras": [{
		"name": "pi camera",
		"videoConfig": {
			"source": "-re -i udp://192.168.0.34:8090",
			"maxStreams": 2,
			"maxWidth": 480,
			"maxHeight": 320,
			"maxFPS": 15
		}
	}]
}

The next step will be to reduce the stream delay, any idea?

JSON Error

Hello,

I have installed the plugin and FFMPEG but I am getting the following error when I run home bridge:

crypto.js:74
this._handle.update(data, encoding);
^

TypeError: Data must be a string or a buffer
at Hash.update (crypto.js:74:16)
at Object.generate (/usr/local/lib/node_modules/homebridge/node_modules/hap-nodejs/lib/util/uuid.js:14:11)
at cmdSwitchPlatform.addAccessory (/usr/local/lib/node_modules/homebridge-cmdswitch2/index.js:66:24)
at cmdSwitchPlatform.didFinishLaunching (/usr/local/lib/node_modules/homebridge-cmdswitch2/index.js:40:10)
at emitNone (events.js:91:20)
at API.emit (events.js:185:7)
at Server.run (/usr/local/lib/node_modules/homebridge/lib/server.js:93:13)
at module.exports (/usr/local/lib/node_modules/homebridge/lib/cli.js:40:10)
at Object. (/usr/local/lib/node_modules/homebridge/bin/homebridge:17:22)
at Module._compile (module.js:571:32)
Church-Server:~ churchserver$

I have attached my JSON code, I'm not sure if I am formatting this incorrectly.

Thanks,

Sam

Errors on Linux (Raspbian jessie) with ffmpeg arguments/spawn

I was working on something very similar to the folks in #10--I hit an unexpected blip. Whether it's the still image ffmpeg call or the video stream ffmpeg call, something happens with the arguments that causes ffmpeg to not start. I added console.log calls to spit out stdout and stderr and this is what I saw:

-f v4l2 -ss 00:00:01 -i /dev/video1 -vframes 1  -t 1 -s 640x480 -f image2 -
ffmpeg version git-2016-11-09-ab6ffc2
Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.9.2 (Raspbian 4.9.2-10)
configuration: --enable-gpl --enable-nonfree --enable-mmal --enable-omx --enable-omx-rpi --enable-libx264
libavutil      55. 35.100 / 55. 35.100
libavcodec     57. 66.101 / 57. 66.101
libavformat    57. 57.100 / 57. 57.100
libavdevice    57.  2.100 / 57.  2.100
libavfilter     6. 66.100 /  6. 66.100
libswscale      4.  3.100 /  4.  3.100
libswresample   2.  4.100 /  2.  4.100
libpostproc    54.  2.100 / 54.  2.100
/dev/video1: could not seek to position 83848.268
Input #0, video4linux2,v4l2, from '/dev/video1':
Duration: N/A, start: 83847.267873, bitrate: 147456 kb/s
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 640x480, 147456 kb/s,
30 fps,
30 tbr, 1000k tbn, 1000k tbc
[NULL @ 0x28e61c0] Unable to find a suitable output format for ''
: Invalid argument

The Unable to find a suitable output format for '' was the same no matter what, and appeared both for the still and video feed.

I managed to workaround the issue by adding the shell parameter to the spawn options, that is, the options in my modified copy are now {shell: '/bin/bash', env: process.env}. With that option, it works correctly. But I'm not sure why this was necessary or how to better fix it. I'm also running the plugin on macOS and don't have this issue. The default shell for the user that this is running under is already /bin/bash. (or at least, I think so:

root@hostname:/opt# su - homebridge
homebridge@hostname:~$ echo $SHELL
/bin/bash

). I'm inferring that something about the command parameters is being interpreted differently by spawn or whatever command interpreter it's using under the hood, but I'm not sure why. Any ideas?

Obviously my "fix" requires a mod that isn't necessary on macOS and probably incurs some performance penalty. Also, the video stream process seems to never terminate with this fix in place...but I have no way of knowing if the shell option causes this, because without it it won't start at all.

VideoConfig help

I'm still really green in this whole thing (my background is more in electronics hardware design than anything software), though I have managed to get ubuntu installed as well as getting homebridge running stably with a number of plug-ins, including harmonyhub, nest, and wemo.

Where I'm at now, I feel a little out of my element trying to get the camera-ffmpeg configured. I'm pretty sure I have all the required software installed, and I was able to add the camera within the Home app, but I am completely certain that I have gone astray in my configuration.

The specific area of concern for me is the "source" field in the VideoConfig. I'm assuming since I'm using a webcam plugged into the same machine that is running homebridge (the same machine as has ffmpeg), I can use the IP address of that machine. I also understand that I may need to have my username and password ahead of that as well. Beyond that, I'm completely stumped as to what to enter. I'm pretty sure it's the port followed by something else. How/Where do I find/set these parameters? (I feel pretty silly asking since it seems pretty common knowledge around here, it's just something I've never previously been exposed to)

Thanks for any help you guys can toss my way.

Fix the camera tcp listening port

How can I fix the TCP port of the camera-ffmpeg platform in Homebridge? I can fix the Homebridge port, but I haven't found how to fix the camera port.

Here is my config file, where I can specify port 51828 for Homebridge, but not for the camera-ffmpeg platform:

{
        "bridge": {
                "name": "Bridge webcams couloir",
                "username": "CC:55:77:88:55:02",
                "port": 51828,
                "pin": "123-45-678"
        },
        "platforms": [{
                "platform": "Camera-ffmpeg",
                "name": "Webcam couloir",
                "cameras": [{
                        "name": "Webcam couloir",
                        "videoConfig": {
                                "source": "-re -i http://webcam/stream.mjpg",
                                "maxStreams": 2,
                                "maxWidth": 1280,
                                "maxHeight": 720,
                                "maxFPS": 30
                        }
                }]
        }]
}

And in the log, when starting, the camera takes a random port. Here, TCP port 54608:

[12/5/2016, 4:41:17 PM] Homebridge is running on port 51827.
[12/5/2016, 4:41:17 PM] Webcam salon is running on port 54608.

I'm running Homebridge in a Docker container, and it would be much better if I could fix that port in the configuration file.

After a few days stillimage is just black in the home app and won't update

I'm using the homebridge-camera-ffmpeg-omx plugin.
Everything works quite well for 3 or 4 days. The still image updates every 10 seconds on the local home network and every 1 minute when remote. Video plays just fine.
Except after 3 or 4 days my still image is just black and won't update. When I click on it in the Home app I can see the video stream just fine, and when I manually go to my still image URL as configured in the config.json, that URL works just fine.
The only fix is to remove the camera from the Home app, delete the persist files for the camera, then re-pair the camera with the same exact config.json setup. Then everything works again for a few days.
Any ideas what might be happening?

my config below:

{
  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "FrontDoorIPCam",
      "videoConfig": {
        "source": "-rtsp_transport udp -re -i rtsp://admin:pass@ip:554/profile2/media.smp",
        "stillImageSource": "-f mjpeg -i http://ip:81/image/FrontDoor",
        "maxStreams": 6,
        "maxWidth": 640,
        "maxHeight": 360,
        "maxFPS": 30
      }
    }
  ]
}

Support for viewing remotely

Hi!

So the server that Homebridge is running on is sitting in my DMZ (I like to live dangerously...) - is there any way that I can view my camera streams remotely?

Thanks to my Apple TV, other kit works fine (like an official Schlage lock and a WeMo Switch) over 3G when I'm not at home, but I can't view the camera. I assume this is because the phone is being fed an internal address to connect to, not my external address?

Question about special character

Hi,
first, thanks for your project. I'm trying to stream from ZoneMinder; this is the form of the address:

http://10.0.0.1/zm/cgi-bin/nph-zms?mode=jpeg&maxfps=3&buffer=1000&scale=35&monitor=1&user=user&pass=pass

So I configured config.json like this:

{
  "name": "1-porte",
  "videoConfig": {
    "source": "-re -i http://10.0.0.1/zm/cgi-bin/nph-zms?mode=jpeg&maxfps=3&buffer=1000&scale=35&monitor=5&user=user&pass=pass"
  }
},

The snapshot is OK on the home page, but the live video is not.

ffmpeg doesn't like the & character...
So I tried adding \ before each &:
"source": "-re -i http://10.0.0.1/zm/cgi-bin/nph-zms?mode=jpeg\&maxfps=3\&buffer=1000\&scale=35\&monitor=5\&user=user\&pass=pass"

But now the JSON is not valid...

Do you have an idea to help me?

Thanks a lot

Raspberry Pi (Jessie) and Vivotek IP Camera (CC-8130)

I encountered a minor issue when attempting to set this up on my Raspberry Pi with Jessie. The package ffmpeg has been changed to libav-tools, and the ffmpeg command was changed to avconv. After installing libav-tools and doing a quick edit of ffmpeg.js to change the ffmpeg command to avconv, everything worked fine.
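Rather than editing ffmpeg.js, the videoProcessor option described earlier can point the plugin at avconv directly; a sketch:

```json
{
  "platform": "Camera-ffmpeg",
  "videoProcessor": "avconv",
  "cameras": []
}
```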

Using VLC to provide HTTP video stream does not know how to work

The use of VLC to provide HTTP video stream in iPad input ULR can see the screen, this plugin in homebridge is correctly loaded, but I config to load the video stream, can give some advice?

The following is the prompt to start the frame
…………………………………………
[2017-02-10 13:04:01] Homebridge is running on port 50397.
[2017-02-10 13:04:01] isightFW is running on port 59765.
http://10.10.10.162:8080 -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency -vf scale=640:360 -b:v 132k -bufsize 132k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params fpErzZcaUfswUUfgUlq3Uwk2tt3P+ZByCpBJ3JB/ srtp://10.10.10.113:54949?rtcpport=54949&localrtcpport=54949&pkt_size=1378

Error ffmpeg on MAC

Hello,

I get an error when I open the video stream in the Home app.
Homebridge is installed on my Mac.
The plugin is installed correctly...
The video stream works in VLC.
I don't understand!

The error log:

events.js:160
throw er; // Unhandled 'error' event
^

Error: spawn ffmpeg ENOENT
at exports._errnoException (util.js:1022:11)
at Process.ChildProcess._handle.onexit (internal/child_process.js:193:32)
at onErrorNT (internal/child_process.js:359:16)
at _combinedTickCallback (internal/process/next_tick.js:74:11)
at process._tickCallback (internal/process/next_tick.js:98:9)

Any tweak to get a smoother video

The video I get has a delay of about 3 seconds and is just not running smoothly.

The output line is:

-re -i rtsp://USER:[email protected]:xxxx/ipcam_h264.sdp -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency -vf scale=640:360 -b:v 132k -bufsize 132k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params Avm+tJYsEIRaWFMSQdSgYTo5oHzjkUpOd+oRMi6q srtp://x.x.x.x:49546?rtcpport=49546&localrtcpport=49546&pkt_size=1378

If I run rtsp://USER:[email protected]:xxxx/ipcam_h264.sdp in VLC, the video has only a second of delay, if any.

So is there anything I can use to tweak it? Even cloning the repo is an option for me to change the source; I just wouldn't know where to start.

live stream not working, home overview updates fine

I'm having difficulties with the live stream not working. The overview will show me stills of the camera, but the live stream will not load; it just sits spinning and spinning. I've confirmed I can play my stream URL with ffplay at 1280x960 @ 25fps. I'm running a Mac VM with 4 GB of memory.

Any input would be much appreciated.

Any success with Amcrest/Foscam cameras?

I'm having a difficult time getting my Amcrest and Foscam cameras working with this. Has anyone had any luck? If so, would you mind sharing your configuration?

Error: spawn ffmpeg ENOENT

I get this error message after starting homebridge.

events.js:160
throw er; // Unhandled 'error' event
^

Error: spawn ffmpeg ENOENT
at exports._errnoException (util.js:1026:11)
at Process.ChildProcess._handle.onexit (internal/child_process.js:193:32)
at onErrorNT (internal/child_process.js:359:16)
at _combinedTickCallback (internal/process/next_tick.js:74:11)
at process._tickCallback (internal/process/next_tick.js:98:9)

I can pair my camera in the Home app. But no stream is showing. What should I do?

Can't load the camera

I tried to follow the example from the README file for the camera config, but I had no success loading the camera accessory. Can anyone share their copy of the config file? Thanks. I tried the source using VLC player and it works fine, so that's probably not the issue.

What to use for stream settings when using ffmpeg on macbook isight camera?

I've got it running and I was able to add the camera to my iOS device, but when I touch the black window to see the camera it crashes, I assume because I don't have the proper stream in the config.
"source": "-re -i rtsp://myfancy_rtsp_stream", <----- WHAT SHOULD I USE HERE?

I already installed FFmpeg on my MacBook.

Crash when trying to view the camera live

Hello!
On the home screen I see a screenshot from the camera, but when I try to view it live, HomeKit crashes with this error:

/usr/local/lib/node_modules/homebridge/node_modules/hap-nodejs/lib/StreamController.js:445
var data = Buffer.from(value, 'base64');
^

TypeError: base64 is not a function
at Function.from (native)

I use an iMac; FFmpeg is also installed on the system.

ffmpeg version 3.0.2 Copyright (c) 2000-2016 the FFmpeg developers
built with Apple LLVM version 7.3.0 (clang-703.0.31)

What could the problem be?
Thank you!

Very low frame rate with high CPU usage

It's possible the box I'm using is too underpowered, but I cannot get ffmpeg to use less than about 300% CPU on my machine, and it cannot keep up with real time.

I have configured my cameras to only do 1 FPS, but I am not sure they respect that setting entirely. However, I cannot get any video reliably no matter what settings I use.

These are two D-Link 2330L cameras.

Streaming not working

Hi,

I can't figure out why the live stream isn't working. I've tried the command from the log files on the command line:

ffmpeg -re -i 'rtsp://192.168.178.210:554/mcast/11' -threads 0 -vcodec 'h264_omx' -an -r 15 -f rawvideo -tune zerolatency -vf 'scale=1280:720' -b:v 299k -bufsize 299k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params wtr+AvPulFvzgHjcVfhj96z5guybAH175gTVwy/P 'srtp://192.168.178.36:57758?rtcpport=57758&localrtcpport=57758&pkt_size=1378'

Output:

ffmpeg version N-81800-gf013ba4 Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.9.2 (Raspbian 4.9.2-10)
configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree --enable-omx --enable-omx-rpi
libavutil 55. 30.100 / 55. 30.100
libavcodec 57. 58.103 / 57. 58.103
libavformat 57. 51.100 / 57. 51.100
libavdevice 57. 0.102 / 57. 0.102
libavfilter 6. 63.100 / 6. 63.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 2.100 / 2. 2.100
libpostproc 54. 0.100 / 54. 0.100
Guessed Channel Layout for Input Stream #0.1 : mono
Input #0, rtsp, from 'rtsp://192.168.178.210:554/mcast/11':
Metadata:
title : Media Presentation
comment : mpeg4
Duration: N/A, start: 0.000000, bitrate: N/A
Stream #0:0: Video: h264 (Main), yuv420p, 1280x720, 25 fps, 25 tbr, 90k tbn, 180k tbc
Stream #0:1: Audio: pcm_alaw, 8000 Hz, mono, s16, 64 kb/s
Codec AVOption tune (Tune the encoding params (cf. x264 --fullhelp)) specified for output file #0 (srtp://192.168.178.36:57758?rtcpport=57758&localrtcpport=57758&pkt_size=1378) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
[h264_omx @ 0x2abf680] Using OMX.broadcom.video_encode
[h264_omx @ 0x2abf680] OMX_GetHandle(OMX.broadcom.video_encode) failed: 80001005
Output #0, rtp, to 'srtp://192.168.178.36:57758?rtcpport=57758&localrtcpport=57758&pkt_size=1378':
Metadata:
title : Media Presentation
comment : mpeg4
Stream #0:0: Video: h264, none, q=2-31, 15 fps
Metadata:
encoder : Lavc57.58.103 h264_omx
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> h264 (h264_omx))
Error while opening encoder for output stream #0:0 - maybe incorrect parameters such as bit_rate, rate, width or height

homebridge-part config is:

    "platform": "Camera-ffmpeg",
    "cameras":
    [{
        "name": "Kamera Garten",
        "videoConfig": {
            "source": "-re -i rtsp://192.168.178.210:554/mcast/11",
            "stillImageSource": "-i http://192.168.178.4/snapshots/210/image.jpg",
            "maxStreams": 3,
            "maxWidth": 1280,
            "maxHeight": 720,
            "maxFPS": 25
        }
    }]

Please help me :)
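The `OMX_GetHandle(OMX.broadcom.video_encode) failed: 80001005` line above means the Broadcom hardware encoder could not be opened, so the stream fails before encoding starts. One hedged way to isolate the problem is to temporarily fall back to software encoding via the plugin's `vcodec` option; this is a diagnostic sketch, not a fix for the OMX issue itself:

```json
{
  "videoConfig": {
    "source": "-rtsp_transport tcp -re -i rtsp://192.168.178.210:554/mcast/11",
    "vcodec": "libx264",
    "maxWidth": 1280,
    "maxHeight": 720,
    "maxFPS": 15
  }
}
```

If streaming then works (even slowly), the camera side is fine and the problem is the OMX encoder itself; on a Raspberry Pi, too little reserved GPU memory is a common culprit.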

Crash on start

Hi!

Thank you for the great job you're doing.
However, I get a crash when starting homebridge; can you help me?

homebridge-plugins/homebridge-camera-ffmpeg/ffmpeg.js:253
  var controlService = new Service.CameraControl();
                       ^

TypeError: Service.CameraControl is not a function
    at FFMPEG.createCameraControlService (/var/www/xxxx/plugins/Siri/homebridge-plugins/homebridge-camera-ffmpeg/ffmpeg.js:253:24)
    at new FFMPEG (/var/www/xxxx/plugins/Siri/homebridge-plugins/homebridge-camera-ffmpeg/ffmpeg.js:113:8)
    at /var/www/xxxx/plugins/Siri/homebridge-plugins/homebridge-camera-ffmpeg/index.js:52:26
    at Array.forEach (native)
    at ffmpegPlatform.didFinishLaunching (/var/www/xxxx/plugins/Siri/homebridge-plugins/homebridge-camera-ffmpeg/index.js:41:13)
    at emitNone (events.js:67:13)
    at API.emit (events.js:166:7)
    at Server.run (/var/www/xxxx/plugins/Siri/homebridge/lib/server.js:90:13)
    at module.exports (/var/www/xxxx/plugins/Siri/homebridge/lib/cli.js:40:10)
    at Object.<anonymous> (/var/www/xxxx/plugins/Siri/homebridge/bin/homebridge:17:22)

handleStreamRequest and prepareStream not called. The snapshot works

First of all, thanks @KhaosT for the plugin! Really cool.
I have been trying to get my cameras working for the past week, but so far I haven't been able to (I am running this on a Raspberry Pi).

I am sure of a couple of things:

  1. The streams work in VLC
  2. I have installed FFmpeg and it works
  3. The snapshot function works; I can see the snapshot being updated in the Home app about every 10 seconds.
  4. The rest of homebridge works
  5. The same thing happens when running it on my Mac

When I add the cameras to homebridge everything succeeds, and I can successfully view the snapshot being updated. But when I tap on a camera, it starts loading for about 4-5 seconds and the app then says that the camera doesn't respond.
I have tried running the FFmpeg command manually, and I see the following output:

  Duration: N/A, bitrate: N/A
    Stream #0:0: Video: mjpeg, yuvj422p(pc, bt470bg/unknown/unknown), 640x480, 25 tbr, 25 tbn, 25 tbc
Codec AVOption tune (Tune the encoding params (cf. x264 --fullhelp)) specified for output file #0 (rtp://127.0.0.1:51293) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
[swscaler @ 0x2ba7350] deprecated pixel format used, make sure you did set range correctly
[h264_omx @ 0x2b68fc0] Using OMX.broadcom.video_encode
Output #0, rtp, to 'rtp://127.0.0.1:51293':
  Metadata:
    encoder         : Lavf57.51.100
    Stream #0:0: Video: h264 (h264_omx), yuv420p, 1280x720, q=2-31, 300 kb/s, 30 fps, 90k tbn, 30 tbc
    Metadata:
      encoder         : Lavc57.58.103 h264_omx
Stream mapping:
  Stream #0:0 -> #0:0 (mjpeg (native) -> h264 (h264_omx))
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=No Name
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 57.51.100
m=video 51293 RTP/AVP 96
b=AS:300
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1

Press [q] to stop, [?] for help
frame=   37 fps=2.6 q=-0.0 Lsize=      43kB time=00:00:01.16 bitrate= 303.7kbits/s dup=6 drop=0 speed=0.0833x    
video:43kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.361587%
Exiting normally, received signal 2.

And this is the last output I see from homebridge:

[12/11/2016, 12:06:31 PM] Homebridge is running on port 51826.
[12/11/2016, 12:06:31 PM] IPCam04 is running on port 45767.
[12/11/2016, 12:06:31 PM] IPCam04 is running on port 36020.
HANDLE SNAPSHOT
HANDLE SNAPSHOT
HANDLE SNAPSHOT

I have added a console.log with the function name to every function in ffmpeg.js, which is why you see the HANDLE SNAPSHOT lines.
But I don't see the FFmpeg command being printed to the console when I open the stream on any of my devices, which should happen. While debugging on my Mac using console.log, I found that neither FFMPEG.prototype.prepareStream nor FFMPEG.prototype.handleStreamRequest is being called.
I think this is as far as I can get with debugging, and I don't have any ideas left on what it could be.

If you have any ideas or suggestions it is greatly appreciated. Thanks in advance.

max delay reached

Hey,

First of all, thanks for the work on the plug-in.

I am experiencing an issue with it though: the stream says it's live, but if there is any movement on the camera, it freezes until it eventually disconnects.

I tried running

ffmpeg -re -i rtsp://xx.x.x.xx:554/stream1 -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 300k -bufsize 300k -f rtp rtp://127.0.0.1:5129

and got this output:

`pi@raspberrypi:~ $ ffmpeg -re -i rtsp://xx.x.x.xx:554/stream1 -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 300k -bufsize 300k -f rtp rtp://127.0.0.1:5129
ffmpeg version git-2017-02-04-b1e2192 Copyright (c) 2000-2017 the FFmpeg developers
built with gcc 4.9.2 (Raspbian 4.9.2-10)
configuration: --arch=armel --target-os=linux --enable-gpl --enable-libx264 --enable-nonfree
libavutil 55. 46.100 / 55. 46.100
libavcodec 57. 75.100 / 57. 75.100
libavformat 57. 66.101 / 57. 66.101
libavdevice 57. 2.100 / 57. 2.100
libavfilter 6. 72.100 / 6. 72.100
libswscale 4. 3.101 / 4. 3.101
libswresample 2. 4.100 / 2. 4.100
libpostproc 54. 2.100 / 54. 2.100
Input #0, rtsp, from 'rtsp://10.0.1.11:554/stream1':
Metadata:
title : Ambarella streaming
comment : Ambarella streaming
Duration: N/A, start: 0.000000, bitrate: N/A
Stream #0:0: Video: h264 (Main), yuvj420p(pc, progressive), 848x480 [SAR 1:1 DAR 53:30], 14.99 fps, 14.99 tbr, 90k tbn, 29.97 tbc
Stream #0:1: Audio: aac (LC), 48000 Hz, mono, fltp
[swscaler @ 0x33c3b20] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0x333bdc0] VBV maxrate unspecified, assuming CBR
[libx264 @ 0x333bdc0] using SAR=159/160
[libx264 @ 0x333bdc0] using cpu capabilities: ARMv6 NEON
[libx264 @ 0x333bdc0] profile High, level 3.1
Output #0, rtp, to 'rtp://127.0.0.1:5129':
Metadata:
title : Ambarella streaming
comment : Ambarella streaming
encoder : Lavf57.66.101
Stream #0:0: Video: h264 (libx264), yuv420p, 1280x720 [SAR 159:160 DAR 53:30], q=-1--1, 300 kb/s, 30 fps, 90k tbn, 30 tbc
Metadata:
encoder : Lavc57.75.100 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/300000 buffer size: 300000 vbv_delay: -1
SDP:
v=0
o=- 0 0 IN IP4 127.0.0.1
s=Ambarella streaming
c=IN IP4 127.0.0.1
t=0 0
a=tool:libavformat 57.66.101
m=video 5129 RTP/AVP 96
b=AS:300
a=rtpmap:96 H264/90000
a=fmtp:96 packetization-mode=1

Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
Past duration 0.929985 too large 66kB time=00:00:02.33 bitrate= 233.1kbits/s dup=70 drop=0 speed=0.357x
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 223.6kbits/s dup=88 drop=5 speed=0.374x
[rtsp @ 0x32ce310] RTP: missed 705 packets
[h264 @ 0x3352050] error while decoding MB 17 22, bytestream -57
[h264 @ 0x3352050] concealing 456 DC, 456 AC, 456 MV errors in I frame
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 3 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 222.0kbits/s dup=90 drop=5 speed=0.282x
[rtsp @ 0x32ce310] RTP: missed 412 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 5 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 232.8kbits/s dup=376 drop=5 speed=0.429x
[rtsp @ 0x32ce310] RTP: missed 245 packets
[h264 @ 0x34e6c90] error while decoding MB 43 20, bytestream -15
[h264 @ 0x34e6c90] concealing 536 DC, 536 AC, 536 MV errors in I frame
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 8 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 6 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 948 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 233.3kbits/s dup=384 drop=5 speed=0.429x
[rtsp @ 0x32ce310] RTP: missed 1365 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 252.3kbits/s dup=504 drop=5 speed=0.412x
[rtsp @ 0x32ce310] RTP: missed 256 packets
[h264 @ 0x34e6c90] error while decoding MB 41 9, bytestream -43
[h264 @ 0x34e6c90] concealing 1121 DC, 1121 AC, 1121 MV errors in I frame
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 4 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 582 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 518 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 41 packets
More than 1000 frames duplicated
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 260.2kbits/s dup=1388 drop=5 speed=0.458x
[rtsp @ 0x32ce310] RTP: missed 2786 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 534 packets
[h264 @ 0x34d7540] error while decoding MB 35 16, bytestream -23
[h264 @ 0x34d7540] concealing 756 DC, 756 AC, 756 MV errors in I frame
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 6 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 261.0kbits/s dup=1391 drop=5 speed=0.458x
[rtsp @ 0x32ce310] RTP: missed 1271 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 262.5kbits/s dup=1393 drop=5 speed=0.448x
[rtsp @ 0x32ce310] RTP: missed 5 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 3 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packet
[rtsp @ 0x32ce310] RTP: missed 2771 packets
[rtsp @ 0x32ce310] max delay reached. need to consume packetbitrate= 263.6kbits/s dup=2672 drop=5 speed=0.44x
[rtsp @ 0x32ce310] RTP: missed 9 packets
[h264 @ 0x34d7540] error while decoding MB 39 23, bytestream -39ate= 264.8kbits/s dup=3168 drop=5 speed=0.433x
[h264 @ 0x34d7540] concealing 381 DC, 381 AC, 381 MV errors in I frame
frame= 3243 fps= 13 q=24.0 Lsize= 3514kB time=00:01:48.06 bitrate= 266.4kbits/s dup=3186 drop=5 speed=0.431x
video:3479kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 1.016440%
[libx264 @ 0x333bdc0] frame I:13 Avg QP:25.20 size: 29016
[libx264 @ 0x333bdc0] frame P:3230 Avg QP:15.00 size: 986
[libx264 @ 0x333bdc0] mb I I16..4: 19.8% 65.7% 14.5%
[libx264 @ 0x333bdc0] mb P I16..4: 0.0% 0.0% 0.0% P16..4: 5.3% 0.3% 0.4% 0.0% 0.0% skip:93.9%
[libx264 @ 0x333bdc0] 8x8 transform intra:64.3% inter:78.9%
[libx264 @ 0x333bdc0] coded y,uvDC,uvAC intra: 51.4% 42.2% 6.2% inter: 1.2% 3.1% 0.1%
[libx264 @ 0x333bdc0] i16 v,h,dc,p: 25% 32% 8% 34%
[libx264 @ 0x333bdc0] i8 v,h,dc,ddl,ddr,vr,hd,vl,hu: 32% 19% 26% 3% 3% 3% 5% 3% 5%
[libx264 @ 0x333bdc0] i4 v,h,dc,ddl,ddr,vr,hd,vl,hu: 38% 26% 11% 3% 4% 4% 8% 3% 4%
[libx264 @ 0x333bdc0] i8c dc,h,v,p: 62% 16% 21% 1%
[libx264 @ 0x333bdc0] Weighted P-Frames: Y:0.0% UV:0.0%
[libx264 @ 0x333bdc0] ref P L0: 94.2% 1.0% 3.9% 1.0%
[libx264 @ 0x333bdc0] kb/s:263.63`

This is a copy of the plug-in section in my config.json:

"platforms": [
    {
        "platform": "Camera-ffmpeg",
        "cameras": [{
            "name": "Sala Cam",
            "videoConfig": {
                "source": "-rtsp_transport tcp -re -i rtsp://xx.x.x.xx:554/stream1",
                "maxStreams": 2,
                "maxWidth": 640,
                "maxHeight": 480,
                "maxFPS": 15
            }
        }]
    }
]

I have tried:

  1. Adding -rtsp_transport tcp before -re -i
  2. sudo usermod -aG video homebridge
  3. apt-get install libomxil-bellagio-bin
  4. apt-get install libomxil-bellagio-dev

And none have improved the issue. Does anyone have any other suggestions?

Appreciate it
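The `speed=0.357x` and growing `dup=` counters in the log above suggest libx264 on the Pi cannot encode 1280x720 in real time, which would explain the freezes. One hedged thing to try is passing the camera's native H.264 stream through without re-encoding, using the plugin's `vcodec` option; a sketch (the source URL is the same placeholder as in the config above, and `copy` means the scaling options no longer apply):

```json
{
  "videoConfig": {
    "source": "-rtsp_transport tcp -re -i rtsp://xx.x.x.xx:554/stream1",
    "vcodec": "copy",
    "maxStreams": 2,
    "maxWidth": 640,
    "maxHeight": 480,
    "maxFPS": 15
  }
}
```

This sidesteps the CPU bottleneck entirely, at the cost of serving whatever resolution and bitrate the camera produces.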

Using the wiki to document working configs

Does it make sense to use the wiki to document known working configs? There is so much variability in cameras; would it make it easier for everyone if there were one place with working configs?

Config for my DCS-930L

I've successfully set up Homebridge with my camera and I can see updated still images in HomeKit. How cool is that! Many thanks.
But I am not able to make the video work. Could you kindly help?

My config is the following:

"videoConfig": {
"source": "-re -i rtsp:://xxxxxxxx:[email protected]/mjpeg.cgi",
"stillImageSource": "-i http://xxxxxxxx:[email protected]/image.jpg",
"maxStreams": 1,
"maxWidth": 640,
"maxHeight": 480,
"maxFPS": 5
}

I can see the camera from a browser with this:
http://xxxxxxxx:[email protected]/mjpeg.cgi

Thank you!

Homebridge not working with camera enabled

When I add a webcam to the config.json file, I get this when I start homebridge:
`*** WARNING *** The program 'node' uses the Apple Bonjour compatibility layer of Avahi.
*** WARNING *** Please fix your application to use the native API of Avahi!
*** WARNING *** For more information see http://0pointer.de/avahi-compat?s=libdns_sd&e=node
*** WARNING *** The program 'node' called 'DNSServiceRegister()' which is not supported (or only supported partially) in the Apple Bonjour compatibility layer of Avahi.
*** WARNING *** Please fix your application to use the native API of Avahi!
*** WARNING *** For more information see http://0pointer.de/avahi-compat?s=libdns_sd&e=node&f=DNSServiceRegister
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Loaded plugin: homebridge-camera-ffmpeg
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Registering platform 'homebridge-camera-ffmpeg.Camera-ffmpeg'
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] ---
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Loaded plugin: homebridge-gpio-wpi
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Registering accessory 'homebridge-gpio-wpi.GPIO'
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] ---
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Loaded plugin: homebridge-people
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Registering accessory 'homebridge-people.people'
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] ---
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Loaded plugin: homebridge-pi
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Registering accessory 'homebridge-pi.PiTemperature'
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] ---
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Loaded config.json with 4 accessories and 0 platforms.
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] ---
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] Loading 4 accessories...
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] [GPIO23] Initializing GPIO accessory...
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] [Temperatura Raspberry Pi] Initializing PiTemperature accessory...
[Sun Oct 02 2016 14:28:51 GMT+0200 (CEST)] [Temperatura Raspberry Pi] Model BCM2709/a02082 Serial 000000003d014e6a
/usr/local/lib/node_modules/homebridge/lib/api.js:46
if (name.indexOf('.') == -1) {
^

TypeError: Cannot read property 'indexOf' of undefined
at API.accessory (/usr/local/lib/node_modules/homebridge/lib/api.js:46:11)
at Server._loadAccessories (/usr/local/lib/node_modules/homebridge/lib/server.js:251:42)
at Server.run (/usr/local/lib/node_modules/homebridge/lib/server.js:78:38)
at module.exports (/usr/local/lib/node_modules/homebridge/lib/cli.js:40:10)
at Object. (/usr/local/lib/node_modules/homebridge/bin/homebridge:17:22)
at Module._compile (module.js:541:32)
at Object.Module._extensions..js (module.js:550:10)
at Module.load (module.js:456:32)
at tryModuleLoad (module.js:415:12)
at Function.Module._load (module.js:407:3)`

My config.json file:
{
  "bridge": {
    "name": "Homebridge",
    "username": "CA:21:3F:E2:DE:24",
    "port": 51826,
    "pin": "031-45-154"
  },
  "accessories": [
    {
      "accessory": "GPIO",
      "name": "GPIO23",
      "pin": 23
    },
    {
      "accessory": "PiTemperature",
      "name": "Raspberry Pi Temperature"
    },
    {
      "platform": "Camera-ffmpeg",
      "cameras": [
        {
          "name": "Camera1",
          "videoConfig": {
            "source": "-re -i rtsp://192.168.1.21:555/webcam0.sdp",
            "maxStreams": 1,
            "maxWidth": 640,
            "maxHeight": 480,
            "maxFPS": 10
          }
        }
      ]
    },
    {
      "accessory": "people",
      "name": "People",
      "people": [
        { "name": "My PC", "target": "192.168.1.5" },
        { "name": "WiFi Extender", "target": "192.168.1.2" }
      ],
      "threshold": 15
    }
  ]
}

Only the first image is shown, using v4l2rtspserver and a Pi camera

Hi,

I see the stream "working": the camera displays the first image (including a counter showing when it was taken), but no video.


I run homebridge on a Pi; my ffmpeg version is:
ffmpeg version 0.8.17-6:0.8.17-2+rpi1+deb7u2, Copyright (c) 2000-2014 the Libav developers built on Jun 18 2016 00:05:51 with gcc 4.6.3

A secondary Pi is providing the stream using v4l2rtspserver, which works in VLC except for some grey picture at the beginning of the stream.

I run v4l2rtspserver -F 30 -W 640 -H 480 -c /dev/video0 and my homebridge config basically looks like the example provided:

{
    "platform": "Camera-ffmpeg",
    "cameras": [{
        "name": "Basement Cam",
        "videoConfig": {
            "source": "-re -i rtsp://192.168.178.47:8554/unicast",
            "maxStreams": 2,
            "maxWidth": 640,
            "maxHeight": 480,
            "maxFPS": 30
        }
    }]
}

Any idea how to get the video output working? Is it a problem with my ffmpeg version?

Do you have any other recommendations for streaming video from a /dev/video0 device using v4l2 drivers on a Pi?

ffmpeg not called?

Hi,
the still picture is working, but the streaming is not.
It looks like FFmpeg is not called at all; I also don't see a command in the log.

What I found is:
Accessory [Garage] Getting value for Characteristic "Streaming Status" +0ms
Accessory [Garage] Got Characteristic "Streaming Status" value: AQEA +0ms
EventedHTTPServer [::ffff:192.168.0.62] HTTP Response is finished +0ms
EventedHTTPServer [::ffff:192.168.0.62] HTTP request: /resource +2s
HAPServer [B3:E9:3D:77:63:E4] HAP Request: POST /resource +0ms

I don't find any FFmpeg call when I open the stream. In fact, I don't see anything on the DEBUG console when I start the stream.
My config is this:
"name": "Garage",
"videoConfig": {
"source": "-re -i rtsp://192.168.0.105/proxyStream-1",
"stillImageSource": "-i http://user:[email protected]/Streaming/channels/1/picture?snapShotImageType=JPEG",
"maxStreams": 2,
"maxWidth": 1920,
"maxHeigh": 1080,
"maxFPS": 25
}

The source is a LIVE555 Proxy Server. If I open rtsp://192.168.0.105/proxyStream-1 with VLC, it works perfectly. I can also see on the LIVE555 side that this stream is connected. If I try to connect with homebridge, it doesn't show any client connected...
So it looks like no connection to the stream is attempted at all. :-(
Any idea how to fix or troubleshoot this?

Regards,
Thorsten

Can't see camera in Home App

The plugin initializes and the log shows that the camera is assigned a port.

Any idea?

Running on a Synology DiskStation in a Docker container.

[ffmpeg error on RPi2 Raspbian Jessie] -bash: -threads: command not found

I'm running into an issue when adding a NEO Coolcam to HomeKit using homebridge on my RPi2. The accessory is added, but I only see the grey video recorder icon indicating there's no image available.

The IP camera URL works in VLC (on a Mac), so I ran the command below in the RPi2 terminal (this is the command, including parameters, printed to the console by homebridge when trying to open the camera accessory in HomeKit):

pi@homesystem:~ $ ffmpeg -re -i http://192.168.1.251:8001/videostream.cgi?user=xx&pwd=xx -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 25 -f rawvideo -tune zerolatency -vf scale=640:360 -b:v 132k -bufsize 132k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params WKD4mXCDsn4/5lYxUq250YSg4Yq9jdhwPxkwIqOP srtp://192.168.1.112:64874?rtcpport=64874&localrtcpport=64874&pkt_size=1378

This outputs:

[2] 1396
[3] 1397
-bash: -threads: command not found
[4] 1398
[3] Exit 127 pwd=xx -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 25 -f rawvideo -tune zerolatency -vf scale=640:360 -b:v 132k -bufsize 132k -payload_type 99 -ssrc 1 -f rtp -srtp_out_suite AES_CM_128_HMAC_SHA1_80 -srtp_out_params WKD4mXCDsn4/5lYxUq250YSg4Yq9jdhwPxkwIqOP srtp://192.168.1.112:64874?rtcpport=64874
[4]- Done localrtcpport=64874

Any thoughts are much appreciated :) thanks!
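The `-bash: -threads: command not found` error comes from the shell, not from FFmpeg: the URL was pasted unquoted, so bash treats each `&` as "run the preceding command in the background" and then tries to execute `-threads` as a new command. Quoting the URL keeps it as one argument; a small sketch:

```shell
# Unquoted, bash splits this command at each '&'; quoted, it stays one argument.
url='http://192.168.1.251:8001/videostream.cgi?user=xx&pwd=xx'
# Then run the test as: ffmpeg -re -i "$url" -threads 0 ... (rest as before)
echo "$url"
```

This only affects manual testing in a terminal; the plugin passes `source` to FFmpeg directly, without a shell.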

Homebridge crashes

Hi, I got this...

events.js:154
      throw er; // Unhandled 'error' event
      ^

Error: spawn ffmpeg ENOENT
    at exports._errnoException (util.js:890:11)
    at Process.ChildProcess._handle.onexit (internal/child_process.js:182:32)
    at onErrorNT (internal/child_process.js:348:16)
    at _combinedTickCallback (node.js:377:13)
    at process._tickCallback (node.js:401:11)
pi@raspberrypi ~ $ events.js:154
      throw er; // Unhandled 'error' event
      ^

Error: spawn ffmpeg ENOENT

Any idea?

Possible to add a Doorbell service to ffmpeg config?

I'm using ffmpeg for my DoorBird camera stream 👍 and tried to use the Doorbell service as a standalone service, but this didn't work for long. I wanted to see if it's possible to combine the two and get a video doorbell working.

Could this be done inside ffmpeg as an optional accessory?
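For reference, the plugin now exposes a `doorbell` option (listed under Optional Parameters above), so a camera and doorbell can live in one config entry. A minimal sketch, with a placeholder name and source URL:

```json
{
  "name": "DoorBird",
  "doorbell": true,
  "videoConfig": {
    "source": "-i rtsp://USER:PASS@x.x.x.x/stream"
  }
}
```

The doorbell can then be triggered via dummy switches, MQTT, or HTTP, depending on which of those features are enabled in the config.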

Errors when connecting to Camera stream on Mac

I'm trying to install camera-ffmpeg as a platform, but it's not showing up in the Home app.

I've tried running

ffmpeg -re -i rtsp://[user]:[password]@192.168.1.4/play1.sdp -threads 0 -vcodec libx264 -an -pix_fmt yuv420p -r 30 -f rawvideo -tune zerolatency -vf scale=1280:720 -b:v 299k -bufsize 299k - | /Applications/VLC.app/Contents/MacOS/VLC

It opens VLC, but there are errors showing in the terminal. Full output:

ffmpeg version 3.1.4 Copyright (c) 2000-2016 the FFmpeg developers
built with llvm-gcc 4.2.1 (LLVM build 2336.11.00)
configuration: --prefix=/Volumes/Ramdisk/sw --enable-gpl --enable-pthreads --enable-version3 --enable-libspeex --enable-libvpx --disable-decoder=libvpx --enable-libmp3lame --enable-libtheora --enable-libvorbis --enable-libx264 --enable-avfilter --enable-libopencore_amrwb --enable-libopencore_amrnb --enable-filters --enable-libgsm --enable-libvidstab --enable-libx265 --disable-doc --arch=x86_64 --enable-runtime-cpudetect
libavutil 55. 28.100 / 55. 28.100
libavcodec 57. 48.101 / 57. 48.101
libavformat 57. 41.100 / 57. 41.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 47.100 / 6. 47.100
libswscale 4. 1.100 / 4. 1.100
libswresample 2. 1.100 / 2. 1.100
libpostproc 54. 0.100 / 54. 0.100
VLC media player 2.1.5 Rincewind (revision 2.1.4-59-g5f258d5)
Cannot connect to server socket err = No such file or directory
Cannot connect to server socket
jack server is not running or cannot be started
[0x100434660] main libvlc: Running vlc with the default interface. Use 'cvlc' to use vlc without interface.
Input #0, rtsp, from 'rtsp://[user]:[password]@192.168.1.4/play1.sdp':
Metadata:
title : Thomas
Duration: N/A, start: 0.000000, bitrate: N/A
Stream #0:0: Video: h264 (Baseline), yuvj420p(pc), 1280x720, 15 fps, 30 tbr, 90k tbn, 30 tbc
Stream #0:1: Audio: aac (LC), 8000 Hz, mono, fltp
[swscaler @ 0x7febd5049800] deprecated pixel format used, make sure you did set range correctly
[libx264 @ 0x7febd4805e00] VBV maxrate unspecified, assuming CBR
[libx264 @ 0x7febd4805e00] using cpu capabilities: MMX2 SSE2Fast SSSE3 SSE4.2 AVX
[libx264 @ 0x7febd4805e00] profile High, level 3.1
[rawvideo @ 0x7febd4804c00] Using AVStream.codec to pass codec parameters to muxers is deprecated, use AVStream.codecpar instead.
Output #0, rawvideo, to 'pipe:':
Metadata:
title : Thomas
encoder : Lavf57.41.100
Stream #0:0: Video: h264 (libx264), yuv420p, 1280x720, q=-1--1, 299 kb/s, 30 fps, 30 tbn, 30 tbc
Metadata:
encoder : Lavc57.48.101 libx264
Side data:
cpb: bitrate max/min/avg: 0/0/299000 buffer size: 299000 vbv_delay: -1
Stream mapping:
Stream #0:0 -> #0:0 (h264 (native) -> h264 (libx264))
Press [q] to stop, [?] for help
Past duration 0.999992 too large
Last message repeated 11 times
frame= 14 fps=0.0 q=38.0 size= 13kB time=00:00:00.46 bitrate= 230.5kbits/Past duration 0.999992 too large
Last message repeated 13 times
[rtsp @ 0x7febd4800000] max delay reached. need to consume packet
[rtsp @ 0x7febd4800000] RTP: missed 10 packets
[rtsp @ 0x7febd4800000] max delay reached. need to consume packet
[rtsp @ 0x7febd4800000] RTP: missed 1 packets
[rtsp @ 0x7febd4800000] max delay reached. need to consume packet
[rtsp @ 0x7febd4800000] RTP: missed 20 packets
[h264 @ 0x7febd402da00] concealing 1142 DC, 1142 AC, 1142 MV errors in P frame
Past duration 0.999992 too large
frame= 29 fps= 29 q=33.0 size= 25kB time=00:00:00.96 bitrate= 214.5kbits/Past duration 0.999992 too large
Last message repeated 1 times
[rtsp @ 0x7febd4800000] max delay reached. need to consume packet
[rtsp @ 0x7febd4800000] RTP: missed 1 packets
[h264 @ 0x7febd4807400] concealing 452 DC, 452 AC, 452 MV errors in P frame
Past duration 0.999992 too large
Last message repeated 1 times
frame= 33 fps= 22 q=33.0 size= 29kB time=00:00:01.10 bitrate= 216.4kbits/[rtsp @ 0x7febd4800000] max delay reached. need to consume packet
[rtsp @ 0x7febd4800000] RTP: missed 1 packets
[h264 @ 0x7febd402da00] corrupted macroblock 31 42 (total_coeff=-1)
[h264 @ 0x7febd402da00] error while decoding MB 31 42
[h264 @ 0x7febd402da00] concealing 258 DC, 258 AC, 258 MV errors in P frame
Past duration 0.999992 too large
Last message repeated 11 times
frame= 60 fps= 30 q=29.0 size= 52kB time=00:00:02.00 bitrate= 211.3kbits/Past duration 0.999992 too large
Last message repeated 3 times
[rtsp @ 0x7febd4800000] max delay reached. need to consume packet
[rtsp @ 0x7febd4800000] RTP: missed 2 packets
[rtsp @ 0x7febd4800000] max delay reached. need to consume packet
[rtsp @ 0x7febd4800000] RTP: missed 29 packets
Past duration 0.999992 too large
Last message repeated 2 times
frame= 67 fps= 26 q=28.0 size= 58kB time=00:00:02.23 bitrate= 211.2kbits/Past duration 0.999992 too large
Last message repeated 1 times

FWIW: config.json

{
  "platform": "Camera-ffmpeg",
  "cameras": [
    {
      "name": "Thomas BabyCam",
      "videoConfig": {
        "source": "-re -i rtsp://[user]:[pass]@192.168.1.4/play1.sdp",
        "maxStreams": 2,
        "maxWidth": 1280,
        "maxHeight": 720,
        "maxFPS": 30
      }
    }
  ]
}

Video and Image don't load. Camera password protected

Is it possible to use the plugin with password-protected cameras? If so, where should I provide the username and password (I tried passing them in the URL)? Everything else seems to work fine, but the video and the still image don't load.
Thank you!
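Credentials usually do go in the URL itself (`rtsp://USER:PASS@host/path`, as in the config example at the top), but reserved characters in the password must be percent-encoded or FFmpeg will mis-parse the URL. A sketch; `admin`/`p@ss` and the host are made-up placeholders:

```shell
# '@' inside the password would end the userinfo part of the URL early,
# so it has to be encoded as %40 before going into the source string.
password='p@ss'
encoded=$(printf '%s' "$password" | sed 's/@/%40/g')
# A hypothetical resulting source URL:
echo "rtsp://admin:${encoded}@192.168.1.50:554/stream"
```

Other reserved characters (`:`, `/`, `#`, `?`) need the same treatment.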

Synology

Hello

Do you think it's possible to install it with Docker on a Synology?
