Comments (57)
@ziv-lin I am using the same repo. But I think it calibrates LiDAR and RGB, not IMU and RGB. Can I use the matrix from this repo directly in the config file?
In our work, we treat the LiDAR frame and the IMU (the LiDAR's built-in IMU) as the same frame in our implementation. Because of this, the extrinsic between the camera and IMU is the same as the camera-LiDAR extrinsic.
Okay, great. Then I can use the matrix directly. I am attaching an image from FAST-LIO coloring just to make sure the calibration matrix is correct.
from r3live.
Hi! @aditdoshi333 May I ask how you add the camera RGB information to the LiDAR points in FAST-LIO? I am running into some problems.
Hello,
You can change the point cloud struct to add RGB fields. I color the incoming point cloud before mapping, using the Livox calibration code to color every frame.
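The coloring step described above can be sketched as follows. This is a minimal, hypothetical illustration, not the actual FAST-LIO or Livox code: each LiDAR point is transformed into the camera frame with the extrinsic (R, t) and projected with pinhole intrinsics to look up a pixel color. The intrinsic values and the `image` callback are placeholders.

```python
# Hypothetical pinhole intrinsics (fx, fy, cx, cy) -- not values from this thread.
FX, FY, CX, CY = 800.0, 800.0, 640.0, 360.0

def color_point(p_lidar, R, t, image, width, height):
    """Project a LiDAR point into the image and return its RGB color.

    R, t: camera-from-LiDAR extrinsic (3x3 rotation as nested lists, 3-vector).
    image: callable (u, v) -> (r, g, b); stands in for the pixel lookup.
    Returns None if the point is behind the camera or outside the image.
    """
    # p_cam = R * p_lidar + t
    p_cam = [sum(R[i][j] * p_lidar[j] for j in range(3)) + t[i] for i in range(3)]
    x, y, z = p_cam
    if z <= 0.0:          # behind the camera plane: no valid projection
        return None
    u = FX * x / z + CX   # pinhole projection
    v = FY * y / z + CY
    if 0 <= u < width and 0 <= v < height:
        return image(int(u), int(v))
    return None

# With an identity extrinsic, a point 2 m straight ahead lands at the principal point.
R_id = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
print(color_point([0.0, 0.0, 2.0], R_id, [0, 0, 0],
                  lambda u, v: (u, v, 0), 1280, 720))  # -> (640, 360, 0)
```

The same projection is then repeated for every point of the incoming scan before it is inserted into the map.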
You can get the details about the relations among the sensors by referring to our paper, which gives detailed definitions of these sensor frames.
Yes @ziv-lin, I read the details regarding the hardware setup in your paper. But I am not able to find which transformation matrix is being used:
camera-to-LiDAR or LiDAR-to-camera?
Thank you
See R2LIVE Section III.B; our extrinsics denote the sensor frames w.r.t. the IMU frame.
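Combining this convention with the earlier point that the LiDAR frame is treated as identical to the IMU frame, the camera-to-IMU extrinsic reduces to the camera-to-LiDAR one. A hedged sketch with illustrative 4x4 homogeneous transforms (the camera-to-LiDAR numbers below are placeholders, not a real calibration):

```python
def matmul4(A, B):
    """Multiply two 4x4 homogeneous transforms given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# T_imu_lidar is the identity because the LiDAR and IMU frames are treated as the same.
T_imu_lidar = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]]

# Illustrative camera-to-LiDAR extrinsic (placeholder numbers).
T_lidar_cam = [[0, -1, 0, 0.05],
               [0, 0, -1, 0.02],
               [1, 0, 0, 0.06],
               [0, 0, 0, 1]]

# Chain the transforms: T_imu_cam = T_imu_lidar * T_lidar_cam.
T_imu_cam = matmul4(T_imu_lidar, T_lidar_cam)
assert T_imu_cam == T_lidar_cam  # identical, since the LiDAR-IMU transform is identity
```

So a camera-LiDAR calibration matrix can be entered directly as the camera-IMU extrinsic in this setup.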
Okay, thanks for the reply. But I am getting unexpected output.
The yellow color belongs to the walls and pillars, but on the map it is showing up on the ceiling. I am using a transformation matrix w.r.t. the IMU frame. Any clue what is happening here?
Thank you
Hello,
I read your paper; every transformation is w.r.t. the IMU frame. I have a calibration matrix between the LiDAR and the RGB camera, but not with the IMU. Are you using some static matrix between the LiDAR and IMU? There is no such information in the config file. I am using a Livox Avia with your custom driver.
If you are using such a matrix, I can use it to get the transformation between the RGB camera and the IMU.
Thank you
As suggested, the following is my hardware and software configuration:
It seems that the calibration between the camera and IMU is quite bad. Do you have a good calibration of your camera intrinsics, and of the extrinsic between the IMU and camera?
Hello @ziv-lin,
I think the calibration is good enough, because I am using the same matrix in FAST-LIO2 (modified) for coloring the point cloud and it does a decent job. But I think there is something wrong with the way I am entering the matrix. Is there any repo or tool you can suggest for IMU and RGB camera calibration?
For calibrating the intrinsics of the camera, you can use the tools provided by OpenCV or MATLAB. And to perform the LiDAR-camera calibration, I recommend this repo: https://github.com/hku-mars/livox_camera_calib
How about the color of the point cloud in a static environment (i.e., without moving the sensors)? Is it correct?
@ziv-lin I tried it in a static environment and it looks better, but there is still some offset. The following is the output.
The objects are painted onto the wall, but in reality they are on a table. It looks like there is some static offset.
Is this the output of r3live, or of something else?
It's from r3live.
@ziv-lin Sorry to bother you, but there is one more issue. Sometimes I am getting the following error (same config file and same bag file):
It seems that your configuration is correct. What is the problem with R3LIVE?
The color is not correct. There is an offset; in the above sample it looks as if the objects are pasted onto a wall, but that is not the case. And such coloring only appears if the scene is static; if I move the setup, the color gets fully messed up.
Oh, sorry, my mistake... What is your image resolution?
It's 1920 x 1080.
I might have found the problem you are encountering: the default image resolution I used is 1280 x 1024, and I should expose this configuration for you to set.
Okay thanks a lot @ziv-lin
Can you replace all occurrences of 1280 in the code with 1920 to see if that works? I will commit a hotfix tonight or tomorrow.
Okay sure, I will try and update.
r3live/r3live/src/r3live_vio.cpp: lines 381, 387, 391, 397, and 1069 (commit a5a4d84)
Thank you for reporting this issue~ This is actually a bug.
Hey @ziv-lin, I updated the image resolution. Thanks a lot for the quick fix. It improved the coloring, but there is still significant bleeding. It looks like a calibration issue, but I am not sure: is it related to image size or aspect ratio?
Can you try moving the sensors and check the mapping results? Our algorithm can calibrate the extrinsic online to make it more accurate.
Sorry, my bad; you should also change the image height from 1024 to 1080, e.g., in the following code:
r3live/r3live/src/r3live_vio.cpp: lines 387, 397, and 1069 (commit a5a4d84)
@ziv-lin I changed the height and width as you suggested, but the issue is still the same: the image color is bleeding near the edges. And one more thing: I am still sometimes getting the following error.
Whenever I get this error, I need to rerun the r3live node.
Can you share your data with me, including both your rosbag files and your configuration?
Ya sure.
link : https://drive.google.com/drive/folders/1IPIvW-gzIYbW7z8jygCI1WhXZv1lE7YO?usp=sharing
I cannot open the folder; is there a problem?
@ziv-lin I am able to open it, even in incognito mode. Should I upload it somewhere other than Google Drive?
I can download the config files but can't get the rosbag file. Can you put them together in the same directory?
Done. Please check and let me know.
That's right now; I will try your bag tonight if possible.
Hi ziv-lin,
I am sharing a short rosbag for calibration, with a high image resolution and a Livox Avia point cloud.
To try r3live, which config file should I modify?
Link: https://pan.baidu.com/s/1y0xd2kICGSgKEhat-Bt3hQ Password: kroh
height: 2048, width: 3072
camera matrix: 1745.304795 0.000000 1519.690772 0.000000 1749.029333 1067.018842 0.000000 0.000000 1.000000
distortion: -0.102255 0.116367 -0.000031 0.003545 0.000000
extrinsic.txt: -0.016054,-0.999561,-0.0248822,-0.0664396 -0.00713346,0.0249993,-0.999662,0.0192496 0.999846,-0.0158711,-0.00753167,0.0634879 0,0,0,1
At a glance, it seems that your result is correct?
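As a quick sanity check on extrinsics like the one posted above: the rotation block (the upper-left 3x3) should be orthonormal with determinant +1. A small pure-Python check, using the numbers from the extrinsic.txt above:

```python
# Rotation block of the extrinsic posted above (upper-left 3x3 of the 4x4 matrix).
R = [[-0.016054,   -0.999561,  -0.0248822],
     [-0.00713346,  0.0249993, -0.999662],
     [ 0.999846,   -0.0158711, -0.00753167]]

def det3(m):
    """Determinant of a 3x3 matrix."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

# R * R^T should be close to the identity for a valid rotation matrix.
RRt = [[sum(R[i][k] * R[j][k] for k in range(3)) for j in range(3)]
       for i in range(3)]
err = max(abs(RRt[i][j] - (1.0 if i == j else 0.0))
          for i in range(3) for j in range(3))

print(f"det(R) = {det3(R):.6f}, max |R R^T - I| = {err:.2e}")
assert abs(det3(R) - 1.0) < 1e-3 and err < 1e-3  # a proper rotation, up to rounding
```

This catches row/column-order mistakes or accidentally inverted matrices before they show up as color offsets in the map.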
Sorry, I uploaded the rosbag and camera parameters (intrinsic and extrinsic) again in one link.
Link: https://pan.baidu.com/s/1IMkYRubNFjIX-z5UOR5mhA Password: deja
I noticed that the resolution in your work is 1280 x 1024; however, my camera in this test is 2048 x 3072. Which config file should I modify in R3LIVE? Should I change all occurrences of 1280 x 1024 to 2048 x 3072?
Yes, you should change all 1280 -> 2048 and 1024 -> 3072 (why is your height larger than your width? Is something wrong?)
The fix for this bug will be pushed to this repo within these two days.
Sorry, my bad; the resolution is actually 2048 x 3072 (height x width).
Thanks very much for your reply; looking forward to the next update.
@ziv-lin I think the issue is that this:
r3live/r3live/src/r3live_vio.cpp: lines 381 to 384 (commit 28f5365)
malforms the camera calibration matrix here when the camera width isn't 1280 pixels:
r3live/r3live/src/r3live_vio.cpp: lines 195 to 196 (commit 28f5365)
I had the same issue undistorting my input with an 848 x 800 camera.
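The class of bug described above, rescaling intrinsics with a hard-coded resolution, is easy to illustrate: fx and cx must scale with the width ratio, and fy and cy with the height ratio. A hedged sketch of the correct rescaling (illustrative values, not r3live's actual code):

```python
def scale_intrinsics(K, old_wh, new_wh):
    """Rescale a 3x3 pinhole intrinsic matrix from one image size to another.

    fx and cx scale with the width ratio; fy and cy scale with the height ratio.
    Scaling everything by a single hard-coded ratio (e.g. width / 1280) is
    exactly the kind of bug that malforms K at non-default resolutions.
    """
    sx = new_wh[0] / old_wh[0]
    sy = new_wh[1] / old_wh[1]
    return [[K[0][0] * sx, 0.0,          K[0][2] * sx],
            [0.0,          K[1][1] * sy, K[1][2] * sy],
            [0.0,          0.0,          1.0]]

# Illustrative intrinsics calibrated at 1280 x 1024 (placeholder numbers).
K = [[900.0, 0.0, 640.0],
     [0.0, 900.0, 512.0],
     [0.0, 0.0, 1.0]]

# Rescale to 1920 x 1080: cx -> 960.0 (ratio 1.5), cy -> 540.0 (ratio 1080/1024).
K2 = scale_intrinsics(K, (1280, 1024), (1920, 1080))
print(K2)
```

With anisotropic resizes (different width and height ratios) the principal point drifts off-center if a single scale factor is applied, which matches the color bleeding near the edges reported earlier.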
I am now fixing and testing this problem; please wait patiently.
@aditdoshi333 I ran your data carefully; it seems that the problem is that your calibration is not accurate enough, which makes the color rendering inaccurate.
In addition, the "****** Remove_outlier_using_ransac_pnp error *****" print is caused by delayed incoming image messages (see the following figure). You can ignore this print, since it has little effect once R3LIVE has received enough image frames.
If you play the bag with '-s 5', this warning disappears:
rosbag play YOUR.bag -s 5
Hi~ @aditdoshi333 @jxx315, I have just pushed a commit that fixes this bug; R3LIVE now allows you to set the image resolution correctly. Can you try this version? Please let me know if you find any bugs or problems.
You can now set your own image resolution by modifying these two configs:
r3live/config/r3live_config.yaml: lines 19 and 20 (commit 4d386cc)
@ziv-lin Thanks for the quick fix. I will run the new commit and update you in the next two days, and I will also check the calibration on my end. Thanks for all the efforts.
One more doubt: is the calibration between the camera and IMU only used for texture mapping, or is it also used in mapping? I mean, is the preciseness of the calibration proportional to the accuracy of the mapping?
Thank you
Hey @aditdoshi333. In my experience a more precise calibration will give you a better mapping. See here for an example (both pictures are a top view):
Not-so-good calibration:
A better calibration:
You can clearly see that the better the calibration, the better the alignment of the incoming scans (white points). This is of course because the intrinsic and extrinsic parameters play an important role in determining where the points come from. I have to do a better calibration in any case; you can see that my "better calibration" is not perfect.
great!
Hello @ziv-lin,
Sorry for the delay. I have yet to calibrate the camera and test it. I am sick, so I will try it out as soon as possible.
Thanks very much! I will try!
@aditdoshi333 I am so sorry to hear that. Please take good care of yourself. I hope everything goes well.
Hello @ziv-lin,
I can confirm that the color offset is because of poor calibration. Sorry for the trouble.
Thank you for all your efforts.
Hi, I did the same work as you did. Did you do time synchronization between the LiDAR and the camera? I mean hardware sync, because my result is not good; I don't get accurate color on the point cloud.
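Regarding the hardware-sync question above: without hardware triggering, a common soft-sync fallback is to pair each LiDAR scan with the image whose timestamp is nearest, rejecting pairs whose offset exceeds a threshold. A minimal sketch, with made-up timestamps (this is a generic technique, not r3live's actual synchronization code):

```python
import bisect

def nearest_image(image_stamps, scan_stamp, max_offset=0.05):
    """Return the image timestamp closest to scan_stamp, or None if the
    best match is further away than max_offset seconds.

    image_stamps must be sorted ascending (ROS messages usually arrive in order).
    """
    i = bisect.bisect_left(image_stamps, scan_stamp)
    # Only the neighbors around the insertion point can be closest.
    candidates = image_stamps[max(0, i - 1):i + 1]
    best = min(candidates, key=lambda t: abs(t - scan_stamp), default=None)
    if best is None or abs(best - scan_stamp) > max_offset:
        return None
    return best

stamps = [0.000, 0.033, 0.066, 0.100]        # 30 Hz camera, made-up values
print(nearest_image(stamps, 0.050))          # -> 0.066
print(nearest_image(stamps, 0.300))          # -> None (too far from any image)
```

Soft sync like this still leaves per-pair jitter of up to half a frame period, which blurs colors during fast motion; hardware triggering removes that residual offset.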