Comments (36)
Looking at that - the issue is likely that framerate support in gstreamer still isn't merged, I'm afraid. There are patches available that need testing, review and integration.
If you can rebuild libcamera on your platform, try adding these two patches and retest.
https://patchwork.libcamera.org/project/libcamera/list/?series=3487
from libcamera.
Thanks for the info, that is good to know. It doesn't explain why libcamera-hello selects an inappropriate sensor format for these framerates / resolutions though, right?
It should at least select 1332x990 for 720p and 640x480, right?
https://www.raspberrypi.com/documentation/accessories/camera.html
(And ideally, 1332x990 is binned instead of cropped. Not sure if anyone bothered to bring those modes out of the Broadcom blob into the Linux OS sensor driver(s), though.)
I'll leave the libcamera-hello parts to RPi - I only know the gstreamer side, I'm afraid. I pinged the developers working on framerate support in gstreamer earlier today, so hopefully it will progress. I've wanted it in for a while too.
(Which, actually, if you could test and report back, would help progress that.)
I did look into the libcamera rpi code.
Looks like the framerate is not even accounted for when selecting the "best" sensor format? It only takes resolution and bit depth into account.
But on the other hand, even just using the resolution, I'd think for 720p and 640x480 the algorithm should already return 1332x990.
Do you know where the "format selection" code for gst libcamerasrc can be found? Or is it just the same as here in libcamera? In that case, I don't think the patches would change anything on an rpi with the HQ camera, since the HQ 2028x1520 mode maxes out at 40fps (at least according to the doc here: https://www.raspberrypi.com/documentation/accessories/camera.html ).
Do not use sudo for libcamera commands - it really shouldn't be needed and is a very bad habit to get into.
Yes the mode selection algorithm is always going to have some conditions that are sub-optimal. I thought there had been discussions over including framerate and it had been rejected, but I'm not directly involved.
libcamera-vid --width 1280 --height 720 --framerate 60 --mode 1332:990:10:P --save-pts foo.txt
will save the timestamps of the captured frames in foo.txt. I've just done that and got deltas of 16.66ms, or 60fps.
libcamera-vid --width 1280 --height 720 --framerate 50 --mode 2028:1080:12:P --save-pts foo.txt
has given me deltas of 19.99ms, so almost spot on 50fps.
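The delta check above can be scripted offline. A minimal sketch, assuming foo.txt holds one timestamp in milliseconds per line (any header lines would need skipping; file layout is an assumption, not from the libcamera-apps docs):

```python
# Compute deltas between consecutive presentation timestamps.
# Assumes one timestamp in milliseconds per line (hypothetical file layout).
def frame_deltas(lines):
    ts = [float(l) for l in lines if l.strip()]
    return [round(b - a, 2) for a, b in zip(ts, ts[1:])]

# A 60fps capture should show ~16.67ms between frames:
sample = ["0.000", "16.667", "33.333", "50.000"]
print(frame_deltas(sample))  # → [16.67, 16.67, 16.67]
```

Steady deltas near 1000/fps confirm the requested framerate actually took effect.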
libcamera-hello --list-cameras
will tell you the limits for each mode as advertised by the kernel driver.
1332x990 - max 120.05fps
2028x1080 - max 50.03fps
2028x1520 - max 40.01fps
4056x3040 - max 10fps.
Thanks for these helpful tips; while I cannot get rid of the sudo yet, both tools you posted really help.
I can confirm that manually specifying the sensor resolution "works", i.e. I get 16.6ms frame deltas.
I can also confirm that not manually specifying the sensor resolution (but setting 720p60fps) gives frame deltas of 19.99ms.
I understand that the mode selection is not easy, but silently ignoring the fps doesn't sound like a good idea to me ;)
This is probably a gst-rpicamsrc issue, but nonetheless important here: for some reason, with gst-rpicamsrc I am getting about 15fps (according to my gstreamer fpsdisplaysink running on another x86 PC).
Tx pipeline:
sudo gst-launch-1.0 libcamerasrc camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps=video/x-raw,width=1280,height=720,format=NV12,framerate=60/1 ! v4l2convert ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! "video/x-h264,level=(string)4" ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=10.42.0.1 port=5600
Rx pipeline:
gst-launch-1.0 -v udpsrc port=5600 caps = "application/x-rtp, media=(string)video, encoding-name=(string)H264, payload=(int)96" ! rtph264depay ! decodebin ! videoconvert ! fpsdisplaysink
~15fps (my decoder is well capable of 60fps and also gst doesn't report dropped frames)
Quite far away from either 30fps, 50fps or the actually wanted 60fps ;)
Add "interlace-mode=(string)progressive" to your capsfilter after libcamerasrc, and remove the v4l2convert. You don't want to give gstreamer the opportunity to do any software processing of the image.
Otherwise, if you haven't applied the patches I mentioned above, you simply can't specify the framerate through the gstlibcamerasrc yet (you can put the filter on, but it won't take effect). So I could guess that the stream is running at 15fps perhaps because you're in a dark room, and the AGC is choosing to extend the exposure time to get a brighter image.
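The AGC explanation can be sanity-checked with simple arithmetic: exposure time is bounded by the frame interval, so the only way for the AGC to expose longer than ~16.7ms is to drop below 60fps. A back-of-the-envelope sketch (not libcamera code):

```python
# Max exposure is bounded by the frame interval (1000/fps milliseconds),
# so an AGC that wants a long exposure must lower the framerate.
def max_exposure_ms(fps):
    return round(1000.0 / fps, 2)

for fps in (60, 30, 15):
    print(f"{fps}fps -> up to {max_exposure_ms(fps)}ms exposure")
# At 15fps the AGC gets ~66.67ms per frame, four times what 60fps allows.
```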
Thanks for the info, I can confirm I was in a dark room, and when I re-run the pipeline above in a brighter environment I get the (expected) 30fps (I was just surprised by the 15fps, since I expected at least 30fps, knowing that setting the fps won't work but that libcamera defaults to 30fps). But your explanation makes total sense ;)
Yeah we've been wanting to get rid of the v4l2convert for ages, but it just doesn't work. E.g. the following pipeline
sudo gst-launch-1.0 libcamerasrc -vvv camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps="video/x-raw,width=1280,height=720,format=NV12,framerate=60/1,interlace-mode=(string)progressive" ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! "video/x-h264,level=(string)4" ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=10.42.0.1 port=5600
shows:
Setting pipeline to PAUSED ...
[0:20:45.230789618] [2355] INFO Camera camera_manager.cpp:293 libcamera v0.0.1+21-7c855784
[0:20:45.258419463] [2357] INFO RPI raspberrypi.cpp:1414 Registered camera /base/soc/i2c0mux/i2c@1/imx477@1a to Unicam device /dev/media4 and ISP device /dev/media1
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0.GstLibcameraPad:src: caps = video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720, colorimetry=(string)2:4:5:4
[0:20:45.263414841] [2361] INFO Camera camera.cpp:1026 configuring streams: (0) 1280x720-NV12
[0:20:45.263834800] [2357] INFO RPI raspberrypi.cpp:800 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1080-SBGGR12_1X12 - Selected unicam format: 2028x1080-pBCC
ERROR: from element /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: Internal data stream error.
Additional debug info:
../src/gstreamer/gstlibcamerasrc.cpp(311): processRequest (): /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: streaming stopped, reason not-negotiated (-4)
Execution ended after 0:00:00.835618054
Setting pipeline to NULL ...
Freeing pipeline ...
Since we already build gstreamer ourselves to keep gst-rpicamsrc as an option for the users (switching between the non-legacy and legacy camera stacks), I've asked the responsible dev to apply the patches. I'll let you know if they work.
-mode 1332:990:10:P
Do you know if it is already possible to set the sensor mode (as shown by 6by9 with libcamera-vid) with gst libcamerasrc? Because without that, I am afraid we won't be able to do 720p60fps with gstreamer libcamerasrc anyway.
I'm not sure; I suspect that the feature for explicitly setting a raw sensor mode is not yet implemented in the gstreamer gstlibcamerasrc, and will need explicit investigation. If you are willing to do this, let us know and we'll help.
from libcamera.
Yeah we've been wanting to get rid of the v4l2convert for ages, but it just doesn't work.
Note that currently the gstreamer element for libcamera might need explicit specification for the following properties:
- colorimetry
- framerate
- interlace-mode
Try adding the colorimetry in there too.
capsfilter caps="video/x-raw,width=1280,height=720,format=NV12,framerate=60/1,interlace-mode=(string)progressive,colorimetry=bt709"
I believe I have tested the example at:
https://github.com/kbingham/linux-cameras/wiki/GStreamer-use-cases-with-libcamera#rtp-streaming
I can confirm that setting both the interlace mode and colorimetry allows you to get rid of v4l2convert. Thanks!
Quick question: is it safe to install libcamera from https://git.libcamera.org/libcamera/libcamera.git and use it on an rpi? Or should we build and install from here?
I see RPi have a couple of patches on top of libcamera at the moment. To maintain existing behaviour, I think you should probably continue to use this tree for the time being.
To test the linked patches, I performed the following steps:
- clone and install libcamera from raspberrypi/libcamera (main)
- run
gst-launch-1.0 libcamerasrc camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps="video/x-raw,width=1280,height=720,format=NV12,framerate=30/1,interlace-mode=(string)progressive,colorimetry=bt709" ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! "video/x-h264,level=(string)4" ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=10.42.0.1 port=5600
and measure the fps at the receiver: ~15fps => expected
- apply the 2 linked patches
- build and install again
- execute the above pipeline
Expected result: measure 30fps at the receiver
Actual result: measured ~15fps
So I can't validate the patches work, at least not yet (will double check).
Fresh & latest Raspbian; removed libcamera-dev and libcamera-apps, installed gstreamer to compile.
Linux raspberrypi 5.15.61-v7l+
Full log (forgive me the sudo ;) )
openhd@raspberrypi:~/libcamera $ sudo gst-launch-1.0 libcamerasrc -vvv camera-name=/base/soc/i2c0mux/i2c@1/imx477@1a ! capsfilter caps="video/x-raw,width=1280,height=720,format=NV12,framerate=30/1,interlace-mode=(string)progressive,colorimetry=bt709" ! v4l2h264enc extra-controls="controls,repeat_sequence_header=1,h264_profile=1,h264_level=11,video_bitrate=5000000,h264_i_frame_period=30,h264_minimum_qp_value=10" ! "video/x-h264,level=(string)4" ! queue ! h264parse config-interval=-1 ! rtph264pay mtu=1024 ! udpsink host=10.42.0.1 port=5600
Setting pipeline to PAUSED ...
[0:42:00.594449767] [8722] INFO Camera camera_manager.cpp:293 libcamera v0.0.0+3866-0c55e522
[0:42:00.623239339] [8723] INFO RPI raspberrypi.cpp:1374 Registered camera /base/soc/i2c0mux/i2c@1/imx477@1a to Unicam device /dev/media4 and ISP device /dev/media1
Pipeline is live and does not need PREROLL ...
Pipeline is PREROLLED ...
Setting pipeline to PLAYING ...
New clock: GstSystemClock
/GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0.GstLibcameraPad:src: caps = video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:src: caps = video/x-raw, width=(int)1280, height=(int)720, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstCapsFilter:capsfilter1.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstQueue:queue0.GstPad:src: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:sink: caps = video/x-h264, stream-format=(string)byte-stream, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
Redistribute latency...
/GstPipeline:pipeline0/v4l2h264enc:v4l2h264enc0.GstPad:sink: caps = video/x-raw, width=(int)1280, height=(int)720, format=(string)NV12, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709
[0:42:00.680455141] [8727] INFO Camera camera.cpp:1035 configuring streams: (0) 1280x720-NV12
/GstPipeline:pipeline0/GstCapsFilter:capsfilter0.GstPad:sink: caps = video/x-raw, format=(string)NV12, width=(int)1280, height=(int)720
[0:42:00.680932168] [8723] INFO RPI raspberrypi.cpp:761 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1080-SBGGR12_1X12 - Selected unicam format: 2028x1080-pBCC
/GstPipeline:pipeline0/GstH264Parse:h264parse0.GstPad:src: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, codec_data=(buffer)01428028ffe100232742802895a014016e84000003000400000300f38a8000989600017d79bdee01e244d401000528ce025c80
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:src: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)428028, sprop-parameter-sets=(string)"J0KAKJWgFAFuhAAAAwAEAAADAPOKgACYlgABfXm97gHiRNQ\=\,KM4CXIA\=", payload=(int)96, ssrc=(uint)1540806756, timestamp-offset=(uint)2327953635, seqnum-offset=(uint)29751, a-framerate=(string)30
/GstPipeline:pipeline0/GstUDPSink:udpsink0.GstPad:sink: caps = application/x-rtp, media=(string)video, clock-rate=(int)90000, encoding-name=(string)H264, packetization-mode=(string)1, profile-level-id=(string)428028, sprop-parameter-sets=(string)"J0KAKJWgFAFuhAAAAwAEAAADAPOKgACYlgABfXm97gHiRNQ\=\,KM4CXIA\=", payload=(int)96, ssrc=(uint)1540806756, timestamp-offset=(uint)2327953635, seqnum-offset=(uint)29751, a-framerate=(string)30
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0.GstPad:sink: caps = video/x-h264, stream-format=(string)avc, alignment=(string)au, level=(string)4, profile=(string)baseline, width=(int)1280, height=(int)720, pixel-aspect-ratio=(fraction)1/1, framerate=(fraction)30/1, interlace-mode=(string)progressive, colorimetry=(string)bt709, chroma-format=(string)4:2:0, bit-depth-luma=(uint)8, bit-depth-chroma=(uint)8, parsed=(boolean)true, codec_data=(buffer)01428028ffe100232742802895a014016e84000003000400000300f38a8000989600017d79bdee01e244d401000528ce025c80
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: timestamp = 2328035608
/GstPipeline:pipeline0/GstRtpH264Pay:rtph264pay0: seqnum = 29751
^Chandling interrupt. Interrupt: Stopping pipeline ...
Execution ended after 0:00:11.775799274
Setting pipeline to NULL ...
Freeing pipeline ...
I've also tried setting 50fps in a well-lit environment - getting 30fps in that case.
Actually, I am not sure if I did everything correctly - I've uninstalled the (patched) libcamera build using sudo ninja -C build uninstall, but gst-inspect-1.0 still shows me libcamerasrc.
Any ideas?
Yeah, okay, my mistake - apparently you have to set export GST_PLUGIN_PATH=$(pwd)/build/src/gstreamer so that gstreamer actually uses the newly built plugin (I suspect that libcamerasrc also comes with the default rpi os gstreamer installation).
Now I can report:
- Setting 720p50fps -> measured 50fps output
- Setting 720p30fps -> measured 30fps output (even in low light)
- Setting 720p60fps -> ERROR: from element /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: Internal data stream error. Additional debug info: ../src/gstreamer/gstlibcamerasrc.cpp(312): processRequest (): /GstPipeline:pipeline0/GstLibcameraSrc:libcamerasrc0: streaming stopped, reason not-negotiated (-4)
So 1) and 2) pretty much validate that setting the framerate from the gstreamer side now works; 3) is just the result of what we found out about the sensor mode selection.
In my humble opinion, the current raspberry pi sensor mode selection code is suboptimal. After all, hello_video/mmal always had the "feature" of selecting a sensor mode that can fulfill the given framerate request, if there is one. The current behaviour is much less intuitive than previous common mmal solutions, and doesn't even log a warning - an inexperienced user can set 720p60 (for example), and libcamera will silently output 50fps instead.
It is also not compatible with libcamerasrc (unless setting the sensor mode is added there).
There is no direct API in libcamera to allow sensor mode selection. The system selects a sensor mode based on what output resolution was requested. Again, there is no way to factor framerate into this selection routine with the libcamera API. As an alternative, libcamera-apps have added the --mode and --viewfinder-mode command line arguments to manually override the sensor mode selection with a user choice. This is equivalent to the --md argument in the legacy mmal camera stack.
Since a resolution is always tied to a max (and sometimes also a min) framerate, this does not sound like an ideal design to me. Either libcamera (as a library) should expose the functionality to query all sensor modes and then select a specific sensor mode, offloading the selection completely to the (library) user, or it should allow setting a specific resolution@framerate (for video) and then figure out the right sensor mode for this request.
I think it would be possible to work around this issue specifically here (with the HQ camera) though. What about modifying the (rpi) sensor mode selection code to select 1332x990 instead of 2028x1080 when the user selects 720p? I.e. implement the following algorithm: take the "smallest" sensor mode that is equal to or greater than the requested resolution. I can only see benefits from this default behaviour, no downsides.
For example, less load on the CSI, perhaps less load on the ISP / memory (depending on where cropping happens, if there is any), and most importantly: in the ideal scenario, lower resolution modes are not cropped but binned on the CMOS, resulting in better image quality.
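The proposed rule can be sketched in a few lines (mode list from the HQ camera table quoted earlier; this is an illustration of the idea, not the actual pipeline-handler code):

```python
# IMX477 (HQ camera) modes: (width, height, max_fps), per the RPi docs.
MODES = [(1332, 990, 120), (2028, 1080, 50), (2028, 1520, 40), (4056, 3040, 10)]

def pick_mode(width, height, fps=None):
    """Smallest sensor mode that covers the request (and its framerate, if given)."""
    fits = [m for m in MODES
            if m[0] >= width and m[1] >= height and (fps is None or m[2] >= fps)]
    # "smallest" = fewest pixels to read out
    return min(fits, key=lambda m: m[0] * m[1], default=None)

print(pick_mode(1280, 720, 60))  # → (1332, 990, 120)
print(pick_mode(1280, 720))      # still 1332x990, even ignoring framerate
```

With this rule, 720p60 lands on 1332x990 instead of the 50fps-limited 2028x1080; a request no mode can satisfy (e.g. 4K at 30fps) returns None instead of silently under-delivering.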
I am wondering - should merge requests concerning code in libcamera/src/libcamera/pipeline/raspberrypi/raspberrypi.cpp be done here (since it is rpi-specific)?
Some logs on this topic: requesting 720p without specifying the sensor mode, I get:
[0:03:01.857414251] [880] INFO Camera camera.cpp:1026 configuring streams: (0) 1280x720-YUV420
...
[0:03:01.857923710] [881] DEBUG RPI raspberrypi.cpp:167 Format: 1332x990 fmt SRGGB10 Score: 2377.47 (best 2377.47)
[0:03:01.857977636] [881] DEBUG RPI raspberrypi.cpp:167 Format: 2028x1080 fmt SRGGB12 Score: 314.5 (best 314.5)
[0:03:01.858014802] [881] DEBUG RPI raspberrypi.cpp:167 Format: 2028x1520 fmt SRGGB12 Score: 1717.7 (best 314.5)
[0:03:01.858050691] [881] DEBUG RPI raspberrypi.cpp:167 Format: 4056x3040 fmt SRGGB12 Score: 2604.7 (best 314.5)
[0:03:01.858220930] [881] INFO RPI raspberrypi.cpp:800 Sensor: /base/soc/i2c0mux/i2c@1/imx477@1a - Selected sensor format: 2028x1080-SBGGR12_1X12 - Selected unicam format: 2028x1080-pBCC
It is not documented in the code whether a higher or lower score is better - I'd assume lower? But how can 2028x1080 achieve a better score than 1332x990 for a 720p request?
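One way the observed ordering could arise (purely a hypothetical illustration, not the real raspberrypi.cpp formula): if a lower score wins and the score rewards higher bit depth, a large enough bit-depth weight lets the worse-fitting 12-bit 2028x1080 mode beat the better-fitting 10-bit 1332x990 mode:

```python
# Hypothetical scoring, NOT the real pipeline-handler formula: lower wins,
# and a bit-depth reward can outweigh a worse resolution fit.
def score(mode_w, mode_h, mode_bits, req_w, req_h):
    resolution_penalty = abs(mode_w - req_w) + abs(mode_h - req_h)
    bit_depth_reward = 500 * mode_bits  # invented weight, for illustration only
    return resolution_penalty - bit_depth_reward

s_1332 = score(1332, 990, 10, 1280, 720)   # fits better, but only 10-bit
s_2028 = score(2028, 1080, 12, 1280, 720)  # fits worse, but 12-bit
print(s_1332, s_2028)  # the 12-bit mode gets the lower (winning) score
```

Whether the real code weighs bit depth this way is exactly the open question here; the debug output only shows that 2028x1080's 314.5 was kept as "best" over 1332x990's 2377.47, so lower apparently wins.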
Merge requests for any code in the libcamera tree (e.g. the Raspberry Pi pipeline handler in this case) should be done through the libcamera dev mailing list. You can find the instructions here.
> I think it would be possible to work around this issue specifically here (with the HQ camera) though. [...]
Unfortunately, this will cause a regression. The 1332x990 mode does not actually use binning; rather it scales in the Bayer domain. This causes a significant loss of image quality when compared with direct binning. The reason for using scaling over binning is to achieve a faster framerate readout because of limitations in the sensor electronics.
One approach to make things easier would be for libcamera-apps to essentially duplicate what the pipeline handler does for mode selection, but have framerate accounted for as well, assuming it was provided on the command line. This way, you only need to specify --framerate 120 on the command line, and libcamera-apps will choose the mode that matches the requested 120fps, regardless of output resolution.
> Merge requests for any code in the libcamera tree should be done through the libcamera dev mailing list. [...]
Thanks for letting me know. I don't think I'll invest the time to properly go through this hassle - in my opinion, the sensor mode selection code is flawed, but we now have a simple workaround:
https://github.com/OpenHD/libcamera/pull/2
> Unfortunately, this will cause a regression. The 1332x990 mode does not actually use binning, rather it scales in the Bayer domain. [...]
I don't think that's true. The data needs to be cropped anyway before going through the ISP, and eliminating those wasted pixels as early as possible should be the standard approach. The only point one could perhaps make in this specific case is that 2028x1080 provides 12bpp instead of the 10bpp of 1332x990. But from my testing, this 12bpp is just the default parameter of libcamera for some reason. Not sure if the ISP actually makes use of 12bpp in video and/or if that makes a difference in quality.
Also, sad that there is no pixel binning in the OS imx477 driver for 720p. Quite sure mmal had it. But I know how stubborn those vendors can be in regard to IP.
Actually, I just had a look at the source code - 1332x990 is binned and cropped:
https://github.com/raspberrypi/linux/blob/rpi-5.15.y/drivers/media/i2c/imx477.c#L999
That comment is wrong, the fast fps mode definitely uses scaling together with cropping.
Seeing that you have an acceptable solution in your fork, I'll close this issue down now.
The change at raspberrypi/rpicam-apps#403 will also work by providing a similar mode selection directly in libcamera-apps.