Comments (33)

seajayshore commented on May 30, 2024

Hey @aditdoshi333, for extrinsic (spatial) calibration between LiDAR & camera I used the same tool you linked above. I think you've had feedback on other issues now, so maybe you solved it already, but in case you're still struggling here's my current status:

Before anything else I tried to make sure I had a good intrinsic calibration of the camera & lens. I have only used the basic ROS checkerboard-based tool so far, but I recently got Kalibr working, so I will hopefully have an even better calibration soon.

Then, what worked best for me with the livox_camera_calib (lidar extrinsics) tool was to record 4 different static scenes with good light and pretty clear geometric structure (e.g. the corner of a simple building).

I don't get very good results with this tool using single scenes, but feeding this set of 3-4 varied scenes to the "multi_calib.launch" process produced pretty impressive results.

As noted in another issue you opened (this one), to use this extrinsic in R3Live you have to invert/transpose the rotation (3x3) part of the matrix.

I'm honestly still confused about how to transform the translation extrinsic correctly, but for my data in R3LIVE it works well to just leave the translation extrinsic as [0,0,0] and enable online extrinsic estimation in the config file.
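
For what it's worth, once the extrinsic is treated as a 4x4 homogeneous transform the rotation and translation invert together; a minimal numpy sketch (the identity below is a placeholder for your actual calibration result):

    import numpy as np

    # Placeholder: substitute the 4x4 [R|t] extrinsic from livox_camera_calib.
    T_cam_lidar = np.eye(4)

    def invert_se3(T):
        # Rigid-body inverse: R' = R^T, t' = -R^T @ t.
        R, t = T[:3, :3], T[:3, 3]
        T_inv = np.eye(4)
        T_inv[:3, :3] = R.T
        T_inv[:3, 3] = -R.T @ t
        return T_inv

    T_lidar_cam = invert_se3(T_cam_lidar)

So if you transpose the rotation, the matching translation is -R^T @ t rather than the raw t, which may explain the confusion above.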

If you are still having issues and you're sure your camera intrinsics and lidar/camera extrinsics are all correct, then maybe you have issues with one of the following:

  • Bad time synchronisation between sensors: My setup is hardware-synchronised across all sensors.
  • Incorrect units used by the IMU data: For a while I was accidentally using IMU data in units of g instead of m/s^2. Somehow R3Live still worked, but the pose was shaky and the lidar/camera projection was never as good as it should have been. I fixed that and finally I have very good output from R3LIVE (see the sanity-check sketch below).
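
A quick way to catch the unit mistake in the last bullet (a sketch, not part of R3LIVE): with the sensor at rest, the mean accelerometer norm should sit near 9.81; if it sits near 1.0 the data is in g.

    import numpy as np

    def check_accel_units(accel_samples):
        # accel_samples: (N, 3) array logged while the IMU is at rest.
        norm = np.linalg.norm(accel_samples, axis=1).mean()
        if abs(norm - 9.81) < 1.0:
            return "looks like m/s^2"
        if abs(norm - 1.0) < 0.1:
            return "looks like g; scale by 9.81 before use"
        return f"unexpected mean norm {norm:.3f}; check the sensor config"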

seajayshore commented on May 30, 2024

@redheli Sorry for the slow reply. I use an Edmund Optics Cr-series 3.5mm lens with f/4 aperture (exact model here).

Edmund Optics have various similar lenses with a wide variety of focal lengths. Some have an adjustable aperture and others have a fixed aperture. Some lenses have special properties (e.g. waterproof, more ruggedised, etc.). It's easy to be overwhelmed with choices!

Whatever lens you choose, also make sure its "image circle" is larger than the sensor, or at least larger than the section of the sensor that you actually want to receive light through the lens.
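
As a worked example with assumed (but typical) numbers: a 1920 x 1200 sensor at a 4.8 um pixel pitch is about 9.2 mm x 5.8 mm, so the lens's image circle needs to cover its roughly 10.9 mm diagonal:

    import math

    # Assumed example sensor: 1920 x 1200 pixels at a 4.8 um pixel pitch.
    w_mm = 1920 * 4.8e-3   # 9.216 mm
    h_mm = 1200 * 4.8e-3   # 5.76 mm
    print(f"required image circle >= {math.hypot(w_mm, h_mm):.1f} mm")  # ~10.9 mm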

I use the Cr-series as they are supposed to be extremely stable & ruggedised, meaning they can survive vibrations & remain calibrated after a long time in use. (I haven't confirmed this myself, but their testing looks good to me.)

Alternatively, the C-Series may be a better, more "basic" option where you can also adjust the aperture.


@farhad-dalirani Sorry no-one managed to reply to you, it seems. But I checked your issue and it looks like you solved it with better calibration! Well done - hopefully it's all good for you now!

farhad-dalirani commented on May 30, 2024

Radar? Do you mean LiDAR?

camera_ext_R and camera_ext_t are the extrinsic parameters between the camera and the LiDAR.
However, pay attention: camera_ext_R is from the camera to the LiDAR. If your calibration software gives you a rotation from the LiDAR to the camera, you need to use its inverse.

seajayshore commented on May 30, 2024

@aditdoshi333 In case it helps, I use a Basler a2a1920-160uc camera, which is similar to the one you use, and I get very good results so far.

Both my camera & your camera have a USB interface, and so they don't have PTP timestamp capability like Ethernet cameras. You have to use the trigger & internal timestamp counters carefully to obtain good timestamping & synchronisation with these USB cameras. (You can't trust the timestamp ROS assigns when it receives the image over USB; you have to use the camera's internal timestamp.)
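
One common way to use the internal timestamp (a sketch; the helper names are hypothetical, not Basler's or ROS's API): estimate a constant offset between the camera clock and the host clock, then restamp each image. Taking the minimum of (receive time - camera time) approximates the fixed transport delay, since USB jitter only ever adds delay:

    import numpy as np

    def estimate_clock_offset(host_receive_times, camera_timestamps):
        # host_time ~= camera_time + offset; both arrays in seconds.
        return np.min(np.asarray(host_receive_times) - np.asarray(camera_timestamps))

    def restamp(camera_timestamp, offset):
        # Host-clock timestamp to publish with the image.
        return camera_timestamp + offset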

I know that R3LIVE/R2LIVE tries to estimate the time-sync online, but I wonder if your drift is caused by either poor time synchronisation or poor calibration. Most of the issues I find trying to make R3LIVE & similar packages work properly come from these two factors...

ziv-lin commented on May 30, 2024

Have you configured the hardware correctly (e.g., could a bad calibration result be the cause)? Also, is there anything wrong with the input camera data (e.g., underexposure or overexposure, or dramatic jumps in the image timestamps)?

aditdoshi333 commented on May 30, 2024

I think it is an exposure issue. What do you suggest: auto-exposure or a manual setting?

ziv-lin commented on May 30, 2024

For my case, I prefer to set the exposure mode to auto-exposure.

aditdoshi333 commented on May 30, 2024

Yeah, thanks @ziv-lin, I think I need to try a better camera.

I wanted to try the same FLIR camera as yours but it is not in stock, so I ordered this one ( https://www.edmundoptics.com/p/basler-ace-aca1300-200uc-color-usb-30-camera/3419/ ). Do you have any comments on camera selection? Anything to keep in mind for the best r3live performance?

Thanks

ziv-lin commented on May 30, 2024

No, I have no suggestions on the selection of the camera.

aditdoshi333 commented on May 30, 2024

Okay thanks

aditdoshi333 commented on May 30, 2024

Hello @seajayshore,

I got my new Basler camera and I am trying to calibrate it with the lidar using (https://github.com/hku-mars/livox_camera_calib), but I am struggling a lot. Can you please shed some light on how you calibrated your camera with the lidar?

Camilochiang commented on May 30, 2024

I tried Kalibr a couple of months ago @seajayshore, but the distortion model that r2live and r3live use is not available in Kalibr, so I'm not sure it will work better; at least it didn't in my case.
Also, Kalibr calibrates between camera and IMU, while r2live and r3live use a calibration between camera and LiDAR (the correction to the IMU is done inside the code, I think).

Let us know how it goes in your case!

seajayshore commented on May 30, 2024

@Camilochiang thanks for the comments - and your other issues/questions on R3Live & others! They have helped me!

For the distortion model:
It seems this is a problem of people using different names for the same thing!

  • Brown-Conrady Model (in research papers) = Plumb Bob Model (ROS & some calibration apps) = Radial + Tangential Model (Kalibr, OpenCV & others)

These are all just different names for the same thing (some sources here, here, here + more if you google these keywords)...
So if you use the "pinhole-radtan" model in Kalibr it gives the k1, k2, p1, p2 parameters needed by ROS/R3Live/VINS, etc. (But no k3 parameter... Reading the OpenVINS GitHub it seems this doesn't matter for them, so who knows...)
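
In practice only the coefficient order matters: OpenCV (and the r3live yaml, per the "#k1, k2, p1, p2, k3" comment in its config) expects [k1, k2, p1, p2, k3], so a Kalibr radtan result can simply be padded with k3 = 0. A sketch with placeholder values:

    import numpy as np
    import cv2

    # Placeholder values; substitute your Kalibr "pinhole-radtan" output.
    K = np.array([[800.0, 0.0, 640.0],
                  [0.0, 800.0, 360.0],
                  [0.0, 0.0, 1.0]])
    k1, k2, p1, p2 = -0.02, 0.01, 0.001, -0.0005
    dist = np.array([k1, k2, p1, p2, 0.0])   # pad the missing k3 with 0

    img = cv2.imread("frame.png")
    undistorted = cv2.undistort(img, K, dist)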

Anyway, I computed the new intrinsics in Kalibr yesterday and will use them to update all my dependent calibrations ASAP. Will let you know if the results are better/worse.

As for camera / IMU calibration:
Yes, I have seen the R2/R3LIVE extrinsics are a bit ambiguous... I read that they "treat the LiDAR frame and IMU (the LiDAR built-in IMU) as the same" (comment here), so that suggests really using the lidar-camera extrinsic as stated in the VIO config file.

I am doing Camera/IMU calibration because I have an external IMU and want to try to correct for the Lidar-to-IMU extrinsics in the Lidar front-end. Currently I can't find a good Lidar-to-IMU calibration tool (until this is released), so I am going to try to use my Camera-to-Lidar + Camera-to-IMU extrinsics to calculate one (as suggested here).
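
The composition itself is just matrix multiplication, though the direction each tool reports must be double-checked; a sketch with placeholder identities (chaining imu to cam to lidar):

    import numpy as np

    def invert_se3(T):
        # Rigid-body inverse of a 4x4 transform: R' = R^T, t' = -R^T @ t.
        T_inv = np.eye(4)
        T_inv[:3, :3] = T[:3, :3].T
        T_inv[:3, 3] = -T[:3, :3].T @ T[:3, 3]
        return T_inv

    # Placeholders: T_cam_lidar from livox_camera_calib, T_cam_imu from Kalibr.
    T_cam_lidar = np.eye(4)
    T_cam_imu = np.eye(4)

    T_imu_lidar = invert_se3(T_cam_imu) @ T_cam_lidar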

I'll update when I manage to complete it.

Camilochiang commented on May 30, 2024

Hei @seajayshore, thanks for your post. I will recheck the models, as now I'm curious and I don't remember the details! Thanks!

I have a similar situation, using an external BMI088 with a Livox Mid-70 (we are interested in being able to use the LiDAR at really short distances, so any other LiDAR is not really useful).

Some comments about an external IMU that may help you:

  • If I remember properly, R2Live and R3Live expect gravity (z axis) as 1 g (9.8 m/s^2). Please recheck previous issues.
  • At some point I corrected the factory offset of the IMU; this didn't help and, if I remember properly, made the behaviour of R2Live worse. I'm not sure how the authors or Livox handle the IMU output before sending it...
  • When I tried R3live correcting with the extrinsic matrix (which is really similar to the datasheet values, as I have the IMU in the same position and orientation as in the Livox LiDARs that include a built-in IMU), I actually got worse results. What I find interesting is that the authors actually have that line commented out in their yaml file, and I quote: # Rough extrinsic value, form CAD model, is not correct enough, but can be online calibrated in our datasets.. As you say, they expect the Lidar and IMU to be basically in the same place, which is actually not true, especially when using an external IMU that is not close to your LiDAR. I would recommend keeping the IMU as close as possible to the LiDAR (I actually screwed mine to the top of the LiDAR).

aditdoshi333 commented on May 30, 2024

Hello @Camilochiang @seajayshore,

Thanks a lot for the inputs. I have one more doubt: what kind of image input do we need to give r3live, rectified or distorted? My issue is that I am able to calibrate the extrinsics pretty well using (https://github.com/hku-mars/livox_camera_calib), as every RGB edge matches the corresponding lidar edge, so that looks good. But even after inverting the matrix I am not able to get a similar result in r3live. And I think syncing is not an issue, as I am testing on a static scene.

Calibration output:
Screenshot 2022-03-18 at 12 24 01 PM

r3live output (same scene as the calibration)
Screenshot 2022-03-08 at 7 21 02 PM

I tried improving the intrinsics, but they come out similar every time and the projection error is quite low. I am really confused about where I am going wrong. Any insight is appreciated.

Thank you guys

Camilochiang commented on May 30, 2024

Hei @aditdoshi333, my apologies, but what is your problem? I have seen that image of yours before and I don't understand where your issue is. To me it looks quite good.

aditdoshi333 commented on May 30, 2024

Hello @Camilochiang,

The problem is that while calibrating I get an exact edge-to-edge match between RGB and lidar. But after inverting the matrix, when I run r3live there is an offset in the colour mapping. For example, in the r3live output image above you can see colours from an edge landing in the centre, and the whole colour mapping is wrong. But in the calibration output image all the edges are properly aligned.

Thank you

Camilochiang commented on May 30, 2024

Sorry, what do you mean by the edge of the pixel? Do you mean the edge of the point cloud? The colours look correct to me. Could you share a picture so we can see the real colours?

I think I see what you mean. Can you share the matrix that you got before and after "inverting"?
Thanks

EDIT: Now I understand what you mean, I think. It's hard to see the mismatching in 3D.

aditdoshi333 commented on May 30, 2024

Hello @Camilochiang,

Sorry for the late reply. Yeah, sure, I can share the intrinsics and extrinsics with you.

Camera intrinsics:
ost.txt

Camera extrinsic:
Raw:
[[0.00811628,-0.999494,-0.0307492,-0.0686258],
[-0.0043803,0.0307144,-0.999519,0.0145342],
[0.999957,0.00824706,-0.0041288,-0.0395998],
[0,0,0,1]]

Inverted:
[ 0.00811628, -0.0043803 , 0.999957 , 0.04021875],
[-0.999494 , 0.0307144 , 0.00824706, -0.0687109 ],
[-0.0307492 , -0.999519 , -0.0041288 , 0.01225352],
[ 0. , 0. , 0. , 1. ]
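
For reference, a quick numpy check that the inverted matrix is numerically consistent with the raw one (the rotation block is the transpose, and the translation is -R^T @ t):

    import numpy as np

    raw = np.array([[ 0.00811628, -0.999494,   -0.0307492, -0.0686258],
                    [-0.0043803,   0.0307144,  -0.999519,   0.0145342],
                    [ 0.999957,    0.00824706, -0.0041288, -0.0395998],
                    [ 0, 0, 0, 1]])
    print(np.round(np.linalg.inv(raw), 6))   # matches the "Inverted" matrix above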

I am highly thankful to you for your support.

Camilochiang commented on May 30, 2024

Everything looks coherent to me @aditdoshi333.
Do you also give the translation values (the last column of your inverted matrix) to R3Live?
R3live uses the rotation block of the inverted matrix (without the last row and column, so a 3x3 matrix) as camera_ext_R, plus an additional camera_ext_t array holding the top three entries of the last column of the inverted matrix. What I have observed is that R3Live works better with the camera_ext_t array set to 0, even if this is not physically true, but as I mentioned above that is because in my case the Lidar and IMU are really close.
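
Splitting the inverted 4x4 into those two yaml fields is then mechanical (a sketch):

    import numpy as np

    T = np.eye(4)   # placeholder: your inverted 4x4 extrinsic

    camera_ext_R = T[:3, :3]           # 3x3 rotation block for the yaml
    camera_ext_t = T[:3, 3].tolist()   # 3-vector translation (or try [0,0,0])
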
Maybe give it a try like that?

aditdoshi333 commented on May 30, 2024

Hello @Camilochiang,

I tried both ways: giving the last column, and setting the translation to 0. But nothing seems to work. In my case too the lidar and imu are really close, as I am using a Livox Avia. Do you have any comments on the camera intrinsics? I am really blank on how to debug this issue further.

Thanks a lot for the help.

Camilochiang commented on May 30, 2024

mmmm

This is my configuration. Give it a try, just to check how it looks:

r3live_vio:
   image_width: 1920
   image_height: 1080
   camera_intrinsic:
      [1344.726575185332, 0.0, 939.2360719294088,
      0.0,  1336.144803263191, 536.1585308487656,
      0.0, 0.0, 1.0 ] 
   camera_dist_coeffs: [-0.01841394411245675, 1.403767536853783, -0.004610713008191326, -0.005306367151226178, -3.079483857895877]  #k1, k2, p1, p2, k3
   # Fine extrinsic value, from camera-LiDAR calibration.
   camera_ext_R:
         [ 0.0129924, -0.0115002,  0.999849,
          -0.999817,  -0.0141637,  0.0128291,
           0.014014,  -0.999834,  -0.0116821]
   # camera_ext notes: 
   #camera_ext_t: [-0.0139635, 0.0542981, -0.0104072] 
   camera_ext_t: [0,0,0] 

aditdoshi333 commented on May 30, 2024

Umm, no luck...

Do you think it could be anything related to FOV? In this case the FOV of the camera is larger than that of the lidar.

Camilochiang commented on May 30, 2024

Mmm, it could be, but I'm not sure. You could try limiting the FOV of your camera, no? You would need to crop the image in software (without reducing the quality), recalibrate with these modified images, and check if it works!
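
Note that a pure software crop leaves fx/fy unchanged and shifts the principal point by the crop offset, which is why recalibrating on the cropped images is the safer route; a minimal sketch:

    import numpy as np

    def crop_image_and_intrinsics(img, K, x0, y0, w, h):
        # Crop to a (w x h) window at (x0, y0). fx and fy are unchanged
        # by a pure crop; cx and cy shift by the crop offset.
        cropped = img[y0:y0 + h, x0:x0 + w]
        K_new = K.copy()
        K_new[0, 2] -= x0   # cx
        K_new[1, 2] -= y0   # cy
        return cropped, K_new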

ly-uuu commented on May 30, 2024

> Hello @Camilochiang,
>
> Sorry for the late reply. Yeah, sure, I can share the intrinsics and extrinsics with you.
>
> Camera intrinsics: ost.txt
>
> Camera extrinsic: Raw: [[0.00811628,-0.999494,-0.0307492,-0.0686258], [-0.0043803,0.0307144,-0.999519,0.0145342], [0.999957,0.00824706,-0.0041288,-0.0395998], [0,0,0,1]]
>
> Inverted: [[0.00811628, -0.0043803, 0.999957, 0.04021875], [-0.999494, 0.0307144, 0.00824706, -0.0687109], [-0.0307492, -0.999519, -0.0041288, 0.01225352], [0, 0, 0, 1]]
>
> I am highly thankful to you for your support.

Why change the last column?

farhad-dalirani commented on May 30, 2024

Hi @aditdoshi333 @Camilochiang @seajayshore,

I think my question in issue #157 is related to what you discussed. It would be great if you could look at it.

redheli commented on May 30, 2024

> @aditdoshi333 In case it helps, I use a Basler a2a1920-160uc camera, which is similar to the one you use, and I get very good results so far.
>
> Both my camera & your camera have a USB interface, and so they don't have PTP timestamp capability like Ethernet cameras. You have to use the trigger & internal timestamp counters carefully to obtain good timestamping & synchronisation with these USB cameras. (You can't trust the timestamp ROS assigns when it receives the image over USB; you have to use the camera's internal timestamp.)
>
> I know that R3LIVE/R2LIVE tries to estimate the time-sync online, but I wonder if your drift is caused by either poor time synchronisation or poor calibration. Most of the issues I find trying to make R3LIVE & similar packages work properly come from these two factors...

Hi @seajayshore, could I ask what lens you use with the Basler a2a1920-160uc camera?
I am using a Livox HAP and a ZED 2i camera and the results seem OK, but the ZED is rolling shutter, so I want to try your camera setup.

fanshixiong commented on May 30, 2024

@seajayshore How did you solve the drifting problem? I have a drifting problem with a Livox Mid-70 lidar + MYNT IMU + MYNT camera.
I explained it in detail; it would be great if you could take a look:
#173

farhad-dalirani commented on May 30, 2024

@fanshixiong
Images at 30 frames per second.
For super-accurate camera-lidar calibration: https://github.com/AFEICHINA/extended_lidar_camera_calib

fanshixiong commented on May 30, 2024

@farhad-dalirani thanks.
The IMU and lidar of my device are not mounted together, so I need to calibrate lidar-to-IMU and IMU-to-camera; the lidar and camera would not be calibrated directly. Is there a more accurate IMU-camera calibration method?

farhad-dalirani commented on May 30, 2024

@fanshixiong
1- R3Live depends heavily on the camera-LiDAR calibration! A huge drift happens if the calibration between those two is bad.

2- I used a Velodyne VLP-16, which does not have an internal IMU. I used an external IMU and mounted it under the LiDAR so that the axes of the LiDAR and IMU are aligned. This approximation is sufficient.

fanshixiong commented on May 30, 2024

@farhad-dalirani Thank you for your reply.
In my device I use a MYNT camera with its own IMU. The camera and IMU are placed under the lidar.
I calibrated the camera and IMU with Kalibr, and calibrated the lidar and IMU using their project hku-mars/LiDAR_IMU_Init.
Which parameter in the config reflects the calibration between the lidar and the camera that you mentioned?
These are the calibration parameters of the lidar and IMU:

Lidar_front_end:
   lidar_type: 1   # 1 for Livox-avia, 3 for Ouster-OS1-64
   N_SCANS: 6
   using_raw_point: 1
   point_step: 1
   lidar_imu_rotm:
      # LiDAR is mounted rotated by 90 deg
      #[1, 0, 0,
      # 0, 0, 1,
      # 0, -1, 0]
      [ 0.016511, -0.999700,  0.018083,
       0.057071,  0.018999,  0.998189,
       -0.998234, -0.015449,  0.057368]
   lidar_imu_tranm: 
      [0.039342, 0.077608, 0.037443]

Here are the calibration parameters of the camera and imu:

r3live_vio:
   image_width: 1280
   image_height: 720
   camera_intrinsic:
       [655.005, 0, 679.029,
        0, 656.097, 358.596,
        0, 0, 1]
   camera_dist_coeffs: [-0.238605, 0.0435143, 0.000366211, -0.00272751, 0]  #k1, k2, p1, p2, k3
   
   # Fine extrinsic value, from IMU-to-camera calibration.
   camera_ext_R:
         [0.999998,  0.00183758, 0.000849753,
         0.00184018,   -0.999994, -0.00307635,
         0.000844095,  0.00307791,   -0.999995]
   camera_ext_t: [0.0993128, 0.0117891, -0.176605] 

Do you mean that I only need to calibrate the lidar and camera and put that directly into the vio parameters, and don't need to calibrate the IMU and camera?
Thanks.
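
P.S. In case it helps while waiting for an answer: the two calibrations above can in principle be chained into a camera-lidar extrinsic (a sketch; whether each matrix maps IMU to camera or camera to IMU, and likewise for the lidar, is an assumption that must be verified against each tool's convention, inverting with R^T / -R^T @ t where needed):

    import numpy as np

    def se3(R, t):
        # Assemble a 4x4 homogeneous transform from a 3x3 R and a 3-vector t.
        T = np.eye(4)
        T[:3, :3] = np.asarray(R).reshape(3, 3)
        T[:3, 3] = t
        return T

    # Values copied from the two config blocks above (frame directions assumed).
    T_imu_lidar = se3([ 0.016511, -0.999700,  0.018083,
                        0.057071,  0.018999,  0.998189,
                       -0.998234, -0.015449,  0.057368],
                      [0.039342, 0.077608, 0.037443])
    T_cam_imu = se3([0.999998,    0.00183758,  0.000849753,
                     0.00184018, -0.999994,   -0.00307635,
                     0.000844095, 0.00307791, -0.999995],
                    [0.0993128, 0.0117891, -0.176605])

    # Candidate camera-lidar extrinsic by chaining camera -> imu -> lidar.
    T_cam_lidar = T_cam_imu @ T_imu_lidar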

fanshixiong commented on May 30, 2024

@farhad-dalirani Can you provide your contact information so we can discuss specific questions? My email is: [email protected].
