
Comments (20)

tlin442 commented on May 30, 2024

@jorgef1299, I am getting extremely good results (position drift of <1m over ~200-300m translation and rotation).

You need to patch the VIO rectification to use the fisheye model; then you can use the Kannala-Brandt (KB4) parameters from the RealSense factory calibration to undistort and crop. My patches to get it working on my setup are on my fork of R3LIVE.

Camera->LiDAR calibration is done via https://github.com/hku-mars/livox_camera_calib, but I directly feed the undistorted image as the input and use no distortion model during the calibration.
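
For reference, here is a minimal sketch of that undistort-and-crop step using OpenCV's fisheye (Kannala-Brandt) model. The intrinsics and distortion coefficients below are placeholders, not the actual T265 factory calibration; substitute the values reported by the RealSense driver for fisheye2:

    #include <opencv2/core.hpp>
    #include <opencv2/calib3d.hpp>
    #include <opencv2/imgcodecs.hpp>

    int main() {
        // Placeholder KB4 calibration -- replace with the T265 factory values.
        cv::Matx33d K(285.0, 0.0, 424.0,
                      0.0, 285.0, 400.0,
                      0.0, 0.0, 1.0);
        cv::Vec4d D(-0.006, 0.04, -0.038, 0.006);  // k1..k4

        cv::Mat fisheye = cv::imread("fisheye2.png", cv::IMREAD_GRAYSCALE);

        // Compute a pinhole camera matrix for the rectified output;
        // balance = 0 crops to the largest distortion-free region.
        cv::Mat Knew;
        cv::fisheye::estimateNewCameraMatrixForUndistortRectify(
            K, D, fisheye.size(), cv::Matx33d::eye(), Knew, 0.0);

        cv::Mat rectified;
        cv::fisheye::undistortImage(fisheye, rectified, K, D, Knew);
        cv::imwrite("rectified.png", rectified);
        return 0;
    }

The rectified image then follows a plain pinhole model with Knew as the camera matrix and zero distortion, which is what gets fed to the calibration.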

tlin442 commented on May 30, 2024

@seajayshore I did a LiDAR-to-camera calibration and then used the RealSense's factory extrinsics for the rest. You can use Kalibr to get camera-to-IMU extrinsics if you don't have a factory IMU-to-camera calibration.

ziv-lin commented on May 30, 2024

I think the key point in your problem is to calibrate both the temporal and spatial extrinsics among the IMU, LiDAR, and camera sensors, which will ultimately determine the overall performance of R3LIVE with your sensor setup.

Camilochiang commented on May 30, 2024

I have a similar setup, and as ziv-lin commented, it is much easier if you get an I2C IMU. You can even get the same one that is in the Livox AVIA (BMI088), for example here: https://wiki.seeedstudio.com/Grove-6-Axis_Accelerometer%26Gyroscope%28BMI088%29/

tlin442 commented on May 30, 2024

I think temporal and spatial extrinsics shouldn't be too difficult. Camera/IMU input is timestamped by the host driver, and the Livox is synced with system time via PTP. I also have camera-to-IMU extrinsics from the factory and can perform camera-to-LiDAR calibration at a slightly later date. FAST-LIO2 works with the current setup and approximate IMU-to-LiDAR extrinsics.

My main issue is that r3live immediately diverges on my setup. I'm not sure why, but I think it could be one of the following:

  • IMU->LiDAR extrinsics being wrong. I am using imu_transformer to get the IMU samples in the LiDAR frame (see the sketch after this list)
  • The cameras on the T265 being mono fisheye
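
On the first point, a minimal sketch of what rotating IMU samples into the LiDAR frame amounts to (this mirrors what imu_transformer does for the vector fields; R_li is the IMU->LiDAR rotation and is an input here, not a calibrated value from this thread):

    #include <Eigen/Geometry>

    struct ImuSample { Eigen::Vector3d gyro, accel; };

    // Rotate the angular velocity and linear acceleration of one IMU
    // sample into the LiDAR frame.
    ImuSample imu_to_lidar_frame(const ImuSample& s, const Eigen::Matrix3d& R_li)
    {
        return ImuSample{ R_li * s.gyro, R_li * s.accel };
    }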

tlin442 commented on May 30, 2024

I've fixed my problem by calibrating system extrinsics and transforming the incoming point cloud by the LiDAR->IMU extrinsics in the LiDAR front end. Thanks for the help!
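
For anyone replicating this fix, here is a minimal sketch of that front-end transform (assuming PCL; the rotation and translation below are placeholders standing in for the calibrated LiDAR->IMU extrinsics):

    #include <cmath>
    #include <pcl/point_cloud.h>
    #include <pcl/point_types.h>
    #include <pcl/common/transforms.h>
    #include <Eigen/Geometry>

    // Transform an incoming LiDAR scan into the IMU frame before the
    // LIO front end consumes it.
    pcl::PointCloud<pcl::PointXYZI>::Ptr
    lidar_to_imu_frame(const pcl::PointCloud<pcl::PointXYZI>::ConstPtr& scan)
    {
        Eigen::Matrix4f T = Eigen::Matrix4f::Identity();
        // Placeholder extrinsics -- substitute your calibrated values.
        T.block<3, 3>(0, 0) =
            Eigen::AngleAxisf(float(M_PI) / 2.0f, Eigen::Vector3f::UnitX())
                .toRotationMatrix();
        T.block<3, 1>(0, 3) = Eigen::Vector3f(0.04f, 0.08f, 0.04f);

        pcl::PointCloud<pcl::PointXYZI>::Ptr out(new pcl::PointCloud<pcl::PointXYZI>());
        pcl::transformPointCloud(*scan, *out, T);
        return out;
    }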

seajayshore commented on May 30, 2024

Hi, can I ask how you calibrated the LiDAR-IMU extrinsics? Are you using a particular tool for this?

jorgef1299 commented on May 30, 2024

Hi @tlin442! I have the same configuration as you (Livox Mid-70 + Intel T265). I would like to know if you were able to get good results with this algorithm, and how you performed the LiDAR-camera calibration (since the T265 has a fisheye lens).
Thanks

jorgef1299 commented on May 30, 2024

Thank you @tlin442!

jorgef1299 commented on May 30, 2024

Hi @tlin442! I was trying your fork of R3LIVE and I am facing some issues with the fisheye image of the RealSense T265. The camera-frame is always 2, but that only happens with this camera. I think it may be related to the image encoding being "mono8". Have you ever faced this issue? Do you do any preprocessing of the images coming from the camera?

Also, I'm using the original image size (848x800 px) instead of 1280x1024; is that OK?

Thanks

[screenshot]

tlin442 commented on May 30, 2024

@jorgef1299 sounds like your LiDAR isn't timestamping its messages properly. Are you using PPS or PTP with the Livox?

jorgef1299 commented on May 30, 2024

@tlin442 I'm using PTP with the Livox. That part is working because I can see the LiDAR mapping on the right of the R3LIVE window.
Can I directly use the mono8 fisheye image of the T265 camera, or does it require some processing?

tlin442 commented on May 30, 2024

@jorgef1299 I directly use the mono8 image via /fisheye2/image_raw. Are you transforming the IMU frame correctly?
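
For context, consuming that stream in ROS1 typically looks like the sketch below (hedged: the topic prefix depends on your realsense2_camera launch configuration, and on_image is a placeholder for handing frames to the VIO front end):

    #include <ros/ros.h>
    #include <image_transport/image_transport.h>
    #include <cv_bridge/cv_bridge.h>
    #include <sensor_msgs/image_encodings.h>

    // Receive each mono8 fisheye frame as a CV_8UC1 cv::Mat.
    void on_image(const sensor_msgs::ImageConstPtr& msg)
    {
        cv_bridge::CvImageConstPtr cv_ptr =
            cv_bridge::toCvShare(msg, sensor_msgs::image_encodings::MONO8);
        // cv_ptr->image is the grayscale frame -- feed it to the pipeline here.
    }

    int main(int argc, char** argv)
    {
        ros::init(argc, argv, "fisheye_listener");
        ros::NodeHandle nh;
        image_transport::ImageTransport it(nh);
        // Topic name assumes the realsense2_camera default namespace.
        image_transport::Subscriber sub =
            it.subscribe("/camera/fisheye2/image_raw", 1, on_image);
        ros::spin();
        return 0;
    }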

kakghiroshi commented on May 30, 2024

> @jorgef1299, I am getting extremely good results (position drift of <1m over ~200-300m translation and rotation).
>
> You need to patch the VIO rectification to use the fisheye model; then you can use the Kannala-Brandt (KB4) parameters from the RealSense factory calibration to undistort and crop. My patches to get it working on my setup are on my fork of R3LIVE.
>
> Camera->LiDAR calibration is done via https://github.com/hku-mars/livox_camera_calib, but I directly feed the undistorted image as the input and use no distortion model during the calibration.

Hi, I am planning to use https://github.com/hku-mars/livox_camera_calib to calibrate the extrinsics between the LiDAR and a fisheye camera with a 190° FOV whose distortion model is KB4. Could you please specify how you patch the VIO rectification and process the images so that they can be fed into the algorithm with all distortion set to 0? If any of the code you used is open source, please post the link in your reply. :)

tlin442 commented on May 30, 2024

@gara-9527

Please see my fork at https://github.com/tlin442/r3live

kakghiroshi commented on May 30, 2024

@tlin442 Thanks for your quick reply!
Actually, I'm confused about the LiDAR-camera calibration part. I want to know how you managed to use https://github.com/hku-mars/livox_camera_calib to calibrate the sensors. You mentioned that the image needs to be undistorted; which function do you use to achieve that? What I use is cv::fisheye::undistortImage. After undistorting and setting the distortion parameters to 0, can this algorithm work? Or is there anything else that needs to be changed? Please let me know. Many thanks again!

tlin442 commented on May 30, 2024

> @tlin442 Thanks for your quick reply! Actually, I'm confused about the LiDAR-camera calibration part. I want to know how you managed to use https://github.com/hku-mars/livox_camera_calib to calibrate the sensors. You mentioned that the image needs to be undistorted; which function do you use to achieve that? What I use is cv::fisheye::undistortImage. After undistorting and setting the distortion parameters to 0, can this algorithm work? Or is there anything else that needs to be changed? Please let me know. Many thanks again!

Yes. I directly used the undistorted fisheye image with zero lens distortion as an input.
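
For a live stream it is usually cheaper to build the undistortion maps once and remap every frame, rather than calling cv::fisheye::undistortImage per image. A small sketch of that variant (K and D stand for the KB4 calibration, as in the earlier example; this is not code from the r3live fork):

    #include <opencv2/core.hpp>
    #include <opencv2/calib3d.hpp>
    #include <opencv2/imgproc.hpp>

    // Build the rectification maps once at startup...
    void build_maps(const cv::Matx33d& K, const cv::Vec4d& D,
                    const cv::Size& size, cv::Mat& map1, cv::Mat& map2,
                    cv::Mat& Knew)
    {
        cv::fisheye::estimateNewCameraMatrixForUndistortRectify(
            K, D, size, cv::Matx33d::eye(), Knew, 0.0);
        cv::fisheye::initUndistortRectifyMap(
            K, D, cv::Matx33d::eye(), Knew, size, CV_16SC2, map1, map2);
    }

    // ...then, per frame:
    //   cv::remap(raw, rectified, map1, map2, cv::INTER_LINEAR);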

hr2894235132 commented on May 30, 2024

> I've fixed my problem by calibrating system extrinsics and transforming the incoming point cloud by the LiDAR->IMU extrinsics in the LiDAR front end. Thanks for the help!
>
> Hi
>
> First I'd like to thank you for the source code release; it appears to be running well with the provided datasets. I'm now trying to run r3live using a Livox Mid-70 and RealSense T265 (and its internal IMU).
>
> Could you please let me know if the following is possible:
>
>   • Custom IMU->LiDAR extrinsics. Due to packaging constraints, my LiDAR is not mounted aligned with the IMU.
>   • Fisheye correction on input images from the T265.
>
> Also, is it possible to use this in pure localization mode - i.e. disable RGB map generation for real-time operation?
>
> Many thanks!

Hello, I would like to ask if you have succeeded with the Mid-70?

farhad-dalirani commented on May 30, 2024

@tlin442 Hi,
I have a drifting problem; I explained my problem and setup in detail in this issue:
#157
It would be great if you could take a look at it. I found your answers related to my problem. 👍

fanshixiong commented on May 30, 2024

Hello, thank you very much for your work. When I use R3LIVE there is a lot of drift, especially at corners; straight segments are not bad, but the same parameter configuration achieved good results in FAST-LIO2.
My R3LIVE version is the one with the external IMU rotation added; the repository is here: @tlin442 https://github.com/tlin442/r3live
The LiDAR I use is a Livox Mid-70 with an external IMU, a MYNT EYE D1000 camera and IMU, and the modified Livox driver.
My computer runs Ubuntu 20.04.
Here is my config file:

Lidar_front_end:
   lidar_type: 1   # 1 for Livox-avia, 3 for Ouster-OS1-64
   N_SCANS: 6
   using_raw_point: 1
   point_step: 1
   lidar_imu_rotm:
      # LiDAR is mounted rotated by 90 deg
      #[1, 0, 0,
      # 0, 0, 1,
      # 0, -1, 0]
      [ 0.016511, -0.999700,  0.018083,
        0.057071,  0.018999,  0.998189,
       -0.998234, -0.015449,  0.057368]
   lidar_imu_tranm: 
      [0.039342, 0.077608, 0.037443]

r3live_common:
   if_dump_log: 0                   # If recording ESIKF update log. [default = 0]
   record_offline_map: 1            # If recording offline map. [default = 1]
   pub_pt_minimum_views: 3          # Publish points which have been rendered at least "pub_pt_minimum_views" times. [default = 3]
   minimum_pts_size: 0.01           # The minimum distance for every two points in Global map (unit in meter). [default = 0.01] 
   image_downsample_ratio: 1        # The downsample ratio of the input image. [default = 1]
   estimate_i2c_extrinsic: 1        # If enable estimate the extrinsic between camera and IMU. [default = 1] 
   estimate_intrinsic: 1            # If enable estimate the online intrinsic calibration of the camera lens. [default = 1] 
   maximum_vio_tracked_pts: 600     # The maximum points for tracking. [default = 600]
   append_global_map_point_step: 4  # The point step of append point to global map. [default = 4]

   res_path: "/home/frans/code/r3live_proj/catkin_ws_r3live/src/r3live_/res"

r3live_vio:
   image_width: 1280
   image_height: 720
   camera_intrinsic:
      [655.005, 0, 679.029,
       0, 656.097, 358.596,
       0, 0, 1]
   camera_dist_coeffs: [-0.238605, 0.0435143, 0.000366211, -0.00272751, 0]  #k1, k2, p1, p2, k3
   
   # Fine extrinsic values, from IMU-to-camera calibration.
   camera_ext_R:
      [ 0.999998,    0.00183758,  0.000849753,
        0.00184018, -0.999994,   -0.00307635,
        0.000844095, 0.00307791, -0.999995]
   camera_ext_t: [0.0993128, 0.0117891, -0.176605]

   
r3live_lio:        
   lio_update_point_step: 4   # Point step used for LIO update.  
   max_iteration: 2           # Maximum times of LIO esikf.
   lidar_time_delay: -0.092132       # The time-offset between LiDAR and IMU, provided by user. 
   filter_size_corner: 0.30   
   filter_size_surf: 0.30
   filter_size_surf_z: 0.30
   filter_size_map: 0.30
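
Since wrong extrinsics are the usual suspect for divergence, one cheap sanity check on a matrix like camera_ext_R above is to confirm it is a proper rotation (a standalone sketch, not part of r3live):

    #include <Eigen/Dense>
    #include <iostream>

    int main() {
        // camera_ext_R from the config above.
        Eigen::Matrix3d R;
        R << 0.999998,    0.00183758,  0.000849753,
             0.00184018, -0.999994,   -0.00307635,
             0.000844095, 0.00307791, -0.999995;

        // A proper rotation satisfies R * R^T = I and det(R) = +1.
        double orth_err =
            (R * R.transpose() - Eigen::Matrix3d::Identity()).norm();
        std::cout << "orthonormality error: " << orth_err << "\n"
                  << "determinant: " << R.determinant() << "\n";
        return 0;
    }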

The IMU intrinsics were calibrated with imu_tools; the LiDAR-IMU calibration uses your work, hku-mars/LiDAR_IMU_Init; and the camera and IMU were calibrated with Kalibr, with a reprojection error of about 1.5 pixels.
[screenshot, 2023-03-30 16:56:58]

The test results in FAST-LIO2 are better, and the test results in VINS are average, but there is obvious drift in R3LIVE.
Is there any good solution to the drift? Thanks for the reply.
[screenshots, 2023-03-30 16:47:49 and 16:48:04]
There is a large drift when facing the wall, but there is no drift in FAST-LIO2.
Thanks.
