hku-mars / r3live
A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package
License: GNU General Public License v2.0
Hello @ziv-lin,
I tried r3live to build large-area meshes. In most cases it does a pretty good job, but in some I ran into the following problem.
I am getting such holes all over the surface. It is quite strange, as the point cloud is uniform and has no holes. I am running the default config and launch file. Let me know if I can improve this by any means.
Thanks
Hello @ziv-lin ,
Thanks for sharing the r3live paper. For a college project I am currently using r2live, but I am not able to generate a good map when the environment has limited features. From the example video it looks like r3live solves this problem. I am eagerly waiting for the code release; if possible, could you please share the tentative release date? I know you are waiting for the reviewer comments, but if possible please share the timeline.
Thank you
Hi
Thanks for your outstanding work and for open-sourcing it.
I modified the project and adapted it for a spinning lidar. The results are shown in the figure:
. I found that the result is misaligned. The extrinsic from my camera to the IMU is set to a rough value and estimate_Intrinsic is set to 1. Do you have any suggestions besides recalibrating for an accurate initial value? Thank you very much!
Hello @ziv-lin,
I had an issue while mapping: sometimes I get large drift using r3live, but if I use vanilla FAST-LIO2 there is no visible drift. I was wondering how that is possible, since r3live also uses FAST-LIO for its LIO. Do I have to make any config changes? In some cases r3live performs exceptionally well, and I don't think there is any compute limitation, as I am using a 64-core machine with 512 GB RAM.
Thank you
Hello @ziv-lin ,
Thanks for sharing your great work. The accuracy and detail of R3LIVE real-time mapping are impressive, but there are still some problems. I built R3LIVE according to the instructions and ran "hkust_campus_seq_00.bag"; the results I got are not consistent with the PCD file published on GitHub.
As shown in the following screenshot, the picture above is my result and the picture below is the PCD published on GitHub. There is an obvious height error in my test. The same height error also appears in my test of FAST-LIO2.
Did I miss something?
Best regards
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.hpp:19,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp:1:
/usr/include/flann/ext/lz4.h:196:57: error: conflicting declaration ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.hpp:12,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp:1:
/usr/include/lz4.h:196:57: note: previous declaration as ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.hpp:19,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp:1:
/usr/include/flann/ext/lz4.h:249:72: error: conflicting declaration ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.hpp:12,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp:1:
/usr/include/lz4.h:249:72: note: previous declaration as ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
[The same pair of conflicting-declaration errors for ‘LZ4_stream_t’ (/usr/include/flann/ext/lz4.h:196 vs. /usr/include/lz4.h:196) and ‘LZ4_streamDecode_t’ (line 249 in both headers) is then repeated verbatim for r3live.cpp, r3live_vio.cpp, and r3live_lio.cpp, via the same FLANN and rosbag include chains.]
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp: In member function ‘void R3LIVE::load_vio_parameters()’:
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 3 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
printf( "Load camera data size = %d, %d, %d, %d\n", ( int ) camera_intrinsic_data.size(), camera_dist_coeffs_data.size(),
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
camera_ext_R_data.size(), camera_ext_t_data.size() );
^
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 4 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 5 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:158: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/loam/IMU_Processing.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/src/loam/IMU_Processing.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:62: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live.cpp.o] Error 1
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:86: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_lio.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_lio.cpp.o] Error 1
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:206: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_vio.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_vio.cpp.o] Error 1
CMakeFiles/Makefile2:2731: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/all' failed
make[1]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/all] Error 2
Makefile:140: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j8 -l8" failed
In the terminal I entered:
dji@dji-NUC11PAHi7:~/catkin_ws$ sudo mv /usr/include/flann/ext/lz4.h /usr/include/flann/ext/lz4.h.bak
dji@dji-NUC11PAHi7:~/catkin_ws$ sudo mv /usr/include/flann/ext/lz4hc.h /usr/include/flann/ext/lz4hc.h.bak
dji@dji-NUC11PAHi7:~/catkin_ws$ sudo ln -s /usr/include/lz4.h /usr/include/flann/ext/lz4.h
dji@dji-NUC11PAHi7:~/catkin_ws$ sudo ln -s /usr/include/lz4hc.h /usr/include/flann/ext/lz4hc.h
dji@dji-NUC11PAHi7:~/catkin_ws$ catkin_make
Base path: /home/dji/catkin_ws
Source space: /home/dji/catkin_ws/src
Build space: /home/dji/catkin_ws/build
Devel space: /home/dji/catkin_ws/devel
Install space: /home/dji/catkin_ws/install
[ 0%] Built target geometry_msgs_generate_messages_eus
[ 0%] Built target geometry_msgs_generate_messages_cpp
[ 0%] Built target geometry_msgs_generate_messages_py
[ 0%] Built target geometry_msgs_generate_messages_lisp
[ 0%] Built target geometry_msgs_generate_messages_nodejs
[ 4%] Built target test_timer
[ 9%] Built target r3live_LiDAR_front_end
[ 9%] Built target r3live_generate_messages_py
[ 9%] Built target r3live_generate_messages_cpp
[ 9%] Built target r3live_generate_messages_lisp
[ 18%] Built target fastlio_mapping
[ 18%] Built target r3live_generate_messages_nodejs
[ 18%] Built target _fast_lio_generate_messages_check_deps_Pose6D
[ 20%] Built target r3live_generate_messages_eus
[ 22%] Building CXX object r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_lio.cpp.o
[ 25%] Building CXX object r3live/r3live/CMakeFiles/r3live_mapping.dir/src/loam/IMU_Processing.cpp.o
[ 27%] Building CXX object r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live.cpp.o
[ 29%] Building CXX object r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_vio.cpp.o
[ 34%] Built target fast_lio_generate_messages_cpp
[ 36%] Built target fast_lio_generate_messages_py
[ 40%] Built target fast_lio_generate_messages_eus
[ 77%] Built target r3live_meshing
[ 79%] Built target fast_lio_generate_messages_nodejs
[ 79%] Built target r3live_generate_messages
[ 81%] Built target fast_lio_generate_messages_lisp
[ 81%] Built target fast_lio_generate_messages
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp: In member function ‘void R3LIVE::load_vio_parameters()’:
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 3 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
printf( "Load camera data size = %d, %d, %d, %d\n", ( int ) camera_intrinsic_data.size(), camera_dist_coeffs_data.size(),
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
camera_ext_R_data.size(), camera_ext_t_data.size() );
^
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 4 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 5 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
[ 84%] Linking CXX executable /home/dji/catkin_ws/devel/lib/r3live/r3live_mapping
/usr/bin/ld: warning: libopencv_imgcodecs.so.3.2, needed by /opt/ros/melodic/lib/libcv_bridge.so, may conflict with libopencv_imgcodecs.so.4.5
[100%] Built target r3live_mapping
dji@dji-NUC11PAHi7:~/catkin_ws$ source ~/catkin_ws/devel/setup.bash
Hello,
Thanks for sharing the code. I want to run r3live on my own collected data using a Livox Avia. In r2live you suggested using your custom Livox driver (https://github.com/ziv-lin/livox_ros_driver_for_R2LIVE). Is it the same for r3live?
Thank you
The provided launch file looks like it supports Ouster, but without image input. Is it possible to map with both an Ouster 64 and images?
Hello,
Great work team! Any idea when the code will be released ? Thanks
Hi,
I have re-read the code and paper, and I believe there is no communication from the VIO to the LIO. Correct me if I'm wrong.
Why is this preferable to R2LIVE, which was a factor graph combining both sensors? If we found ourselves in a highly degraded environment (such as a tunnel) and the LIO lost itself, what would happen to R3LIVE?
Hei. I'm trying to get some information from some topics. Could you please confirm the following:
Also, at specific times I would like to publish all points that are visible in the camera field of view, or for now simply all the points. Do you know which topic I should use for that? There are multiple RGB_map topics, so one option would be to hijack your cv_keyboard_callback and trigger it from another topic/signal. What do you think?
REQUIRED process [r3live_mapping-3] has died!
process has died [pid 4660, exit code -11, cmd /home/dji/catkin_ws/devel/lib/r3live/r3live_mapping __name:=r3live_mapping __log:=/home/dji/.ros/log/94ab4d12-6ea4-11ec-b07f-1c697aacadba/r3live_mapping-3.log].
log file: /home/dji/.ros/log/94ab4d12-6ea4-11ec-b07f-1c697aacadba/r3live_mapping-3*.log
Initiating shutdown!
Hello @ziv-lin,
I am using r3live and I just want to thank you for such amazing work. R3live is quite robust compared to other SLAM systems. I want to know whether it is possible to tune r3live for even more robustness. I have enough computing capacity and I don't need real-time operation.
Is it possible to change a few params in the config file so that r3live works better in low-feature areas? And is there any effect of lighting conditions on mapping performance? I understand that the texture would be dark, but is there any other effect on map generation?
Thank you
The following error happens when playing a bag file from the dataset. RViz is shut down together with all the other nodes. It seems to happen when the first image is received, but not much else is shown, even in the log files. Sorry for my lack of expertise in further means of investigation.
I am using ROS Melodic, and I compiled the code with OpenCV 4.5.3. Compilation went well.
Hello, I am a newcomer to SLAM. After running your dataset, I wanted to record a dataset of my own. I recorded with a Livox Mid-40 and a ZED 2i, calibrated the camera-lidar extrinsics, and modified the parameters in the roslaunch file:
<param name="LiDAR_pointcloud_topic" type="string" value= "laser_cloud_flat" />
<param name="IMU_topic" type="string" value= "/zed2i/zed_node/imu" />
<param name="Image_topic" type="string" value= "/zed2i/zed_node/left/image_rect_color" />
Everything has been set up properly on our end. We are using a Livox Avia and a RealSense D455 RGB stream. Your custom Livox driver has been implemented. Unfortunately, while frames are captured and points are added to the map, no RGB points are displayed in the RViz panel.
| System-time | LiDAR-frame | Camera-frame | Pts in maps | Memory used (Mb) |
| 18:22:26 | 691 | 1000 | 1647769 | 994 |
I capture the keyboard input!!!
Further, when attempting to save the map, the following is output in the terminal:
Export the whole offline map to file: /home/vmind/r3live_output/test.r3live
Saving globale map...
Saving global map: 100 %
Save globale map cost 270.114 ms
Saving views and camera poses 100 % ...
Save views and camera poses cost 72.0792 ms
Save Rgb points to /home/visionarymind/r3live_output/rgb_pt
Saving offline map 100% ...
Total have 0 points.
Now write to: /home/vmind/r3live_output/rgb_pt
The r3live_bag.launch file has been modified accordingly to accept the RealSense image stream:
<param name="Image_topic" type="string" value= "/camera/color/image_raw" />
Would you have any advice on what the issue could be or how to troubleshoot further?
I hope you can provide the camera configuration information and the driver! I currently cannot get an Avia in mainland China and only have a Horizon in hand, but I was still delighted to see HKU share advanced technology again! I have already done some project-related testing with FAST-LIO, and after seeing the true-color output I can't resist trying it! Thanks!
Hello, thank you for open-sourcing the paper and code.
Regarding the photometric-error update in the VIO, I have some questions about the data association. According to the paper, do you take the visual map points, find the already color-rendered LiDAR map points via a kd-tree, and build the photometric residual from those? If so, the error term does not include any geometric error.
Thanks!
Hei zin!
Again, amazing work. I have the impression that camera_ext_R in your config.yaml is rotated compared with the output of the recommended calibration method. When I try just replacing the matrix directly from the Livox tool, r3live does not crash but is not able to map.
Please correct me if I'm wrong. If not, it would be good to mention this somewhere.
Greetings!
Will R3live support an external IMU?
error: invalid initialization of reference of type ‘const cv::ParallelLoopBody&’ from expression of type ‘render_pts_in_voxels_mp(std::shared_ptr<Image_frame>&, std::unordered_set<std::shared_ptr<RGB_Voxel> >*, const double&)::<lambda(const cv::Range&)>’
{ thread_render_pts_in_voxel(r.start, r.end, img_ptr, &g_voxel_for_render, obs_time); });
I am currently trying a Mid-70 + external IMU configuration. The code runs, but because the IMU-LiDAR extrinsics are not synchronized, the result is not good.
How can I configure the IMU extrinsics?
Hei guys. A couple of questions.
What is the effect of time sync in your code? I have been able to run FAST-LIO2 and R2LIVE without time sync and was asking myself what the effect would be. Would this mean a less sharp point cloud?
Additionally, when first published, FAST-LIO could compute the state estimate at 10-50 Hz; FAST-LIO2 can now estimate the state at 10-100 Hz, but at the moment R2LIVE does not allow changing the publishing frequency.
Thanks in advance!
Will this build on something other than Intel hardware?
================================================================================
REQUIRED process [r3live_mapping-3] has died!
process has died [pid 7193, exit code -11, cmd /home/crz/SLAM/R3Live/devel/lib/r3live/r3live_mapping __name:=r3live_mapping __log:=/home/crz/.ros/log/b600fbac-6ad9-11ec-b8ae-2cf05d2b732e/r3live_mapping-3.log].
log file: /home/crz/.ros/log/b600fbac-6ad9-11ec-b8ae-2cf05d2b732e/r3live_mapping-3*.log
Initiating shutdown!
================================================================================
After compiling, I run r3live_bag.launch with the bag you provided and this error appears. My ROS is Melodic.
Dr. Lin, can R3LIVE run the VIO and LIO independently? The VIO system in R3LIVE seems lighter-weight than R2LIVE's; how does it achieve better accuracy than R2LIVE? Is it mainly the contribution of the LIO part?
Hi
First I'd like to thank you for the source code release, it appears to be running well with the provided datasets.
I'm now trying to run r3live using a Livox MID-70 and Realsense T265 (and its internal IMU).
Could you please let me know if the following is possible:
Also, is it possible to use this in pure localization mode - i.e. disable RGB map generation for real-time operation?
Many thanks!
Hi, Lin. It's really a great job, especially the reconstruction module.
However, I encountered some weird problems.
=============================================================
App name : R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package
Build date : Jan 7 2022 14:23:18
CPU infos : 11th Gen Intel(R) Core(TM) i7-11800H @ 2.30GHz
RAM infos : 31.21GB Physical Memory 24.81GB Virtual Memory
OS infos : Linux 5.4.0-92-generic (x86_64)
=============================================================
+++++++++++++++++++++++++++++++++++++++++++++++
Here is the your software environments:
GCC version : 8.4.0
Boost version : 1.65.1
Eigen version : 3.3.4
OpenCV version : 3.4.16
+++++++++++++++++++++++++++++++++++++++++++++++
I can't reconstruct and texture the mesh when I use hkust_campus_seq_01.bag, and sometimes I get the error
catkin_r3live_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp, 132, check_state fail !!!! -8.28157 -10.0624 -6.91589
It seems that there is only the RGB_map_0 point cloud.
After saving the map, the point cloud is very sparse.
Hello everyone,
I just faced a compilation failure on ROS Melodic caused by a conflict between system libraries.
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/vinicius/uwo_ws/src/r3live/r3live/src/r3live.cpp:69:
/usr/include/flann/ext/lz4.h:196:57: error: conflicting declaration ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/vinicius/uwo_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/vinicius/uwo_ws/src/r3live/r3live/src/r3live.cpp:59:
/usr/include/lz4.h:196:57: note: previous declaration as ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/vinicius/uwo_ws/src/r3live/r3live/src/r3live.cpp:69:
/usr/include/flann/ext/lz4.h:249:72: error: conflicting declaration ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/vinicius/uwo_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/vinicius/uwo_ws/src/r3live/r3live/src/r3live.cpp:59:
/usr/include/lz4.h:249:72: note: previous declaration as ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:62: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live.cpp.o' failed
A solution can be found in the following link: ethz-asl/lidar_align#16
In the r3live_vio.cpp file, the image_callback function at line 339 needs an added push_back call: g_received_img_msg.push_back( msg ); should be inserted right after sub_image_typed = 1; so the buffer actually receives the message. With the fix applied, the function reads:

void R3LIVE::image_callback( const sensor_msgs::ImageConstPtr &msg )
{
    std::unique_lock< std::mutex > lock( mutex_image_callback );
    if ( sub_image_typed == 2 )
    {
        return; // Avoid subscribing the same image twice.
    }
    sub_image_typed = 1;
    g_received_img_msg.push_back( msg ); // suggested fix: buffer the incoming message

    if ( g_flag_if_first_rec_img )
    {
        g_flag_if_first_rec_img = 0;
        m_thread_pool_ptr->commit_task( &R3LIVE::service_process_img_buffer, this );
    }

    cv::Mat temp_img = cv_bridge::toCvCopy( msg, sensor_msgs::image_encodings::BGR8 )->image.clone();
    process_image( temp_img, msg->header.stamp.toSec() );
}
The LiDAR reflectance/intensity of each point can help indicate which structure/object these points belong to.
At the moment you are using a Bayesian update for color assignment, but wouldn't it be better to also use the reflectance/intensity information when assigning a color to each point? This could help especially at edges, where multiple points tend to get the color of the background structure.
At the moment the output point cloud of R3LIVE does not contain LiDAR reflectance/intensity values. Having these values would be useful to test the above hypothesis :D
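The per-point color update discussed above can be pictured as a running weighted mean. The sketch below is a simplified stand-in for r3live's Bayesian update, not the actual implementation; the struct, field, and function names are hypothetical, with an intensity slot added to illustrate the suggestion of keeping reflectance alongside color:

```cpp
#include <cassert>

// Hypothetical per-point record: an RGB estimate, an accumulated observation
// weight, and the LiDAR reflectance slot the issue proposes to preserve.
struct RgbPoint
{
    float rgb[3]   = {0.f, 0.f, 0.f};
    float obs_count = 0.f;
    float intensity = 0.f; // raw LiDAR reflectance, kept for later filtering
};

// Fold a new color observation into the running estimate as a weighted mean.
// A real Bayesian update would also track per-channel variance; this sketch
// keeps only the mean.
void update_color(RgbPoint &pt, const float new_rgb[3], float obs_weight = 1.0f)
{
    for (int i = 0; i < 3; ++i)
        pt.rgb[i] = (pt.rgb[i] * pt.obs_count + new_rgb[i] * obs_weight)
                    / (pt.obs_count + obs_weight);
    pt.obs_count += obs_weight;
}
```

With such a record, an edge point whose intensity differs sharply from its neighbors could be down-weighted or excluded before it inherits the background's color.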
Hello,
I am really excited to try out the new R3LIVE pipeline with a Livox Avia; it looks really robust. I need one favour from your end. Currently, for a college project, I am using a Livox Avia with FAST-LIO2, but I need an RGB-colored point cloud or mesh. I tried modifying the FAST-LIO code to color every Livox point cloud msg with the RGB image in a pre-processing stage. For calibration between the lidar and camera I am using (https://github.com/hku-mars/livox_camera_calib), but the result is very noisy: besides the RGB points there are many white and black points. I have already made sure that the camera and Livox are in sync. Can you shed some light on your approach to coloring points and texturing the mesh? I know you are going to release the code by 31 Dec, but I have to submit early; if possible, please shed some light on this.
Thank you
Within the CAD files of your handheld device, could you also add a schematic and explanations on connecting the different sensors together: time synchronization, hardware (lidar, camera, battery, laptop) and software? You don't have to go into too much detail, but a little more than explained in the article. It seems to me that it would help me a lot.
Thank you in advance,
Best regards,
Antun Jakopec
Great paper and videos 😍
Can't wait to try out the pipeline. Especially to build textured meshes, loving it!
I was wondering if R3LIVE will support systems with a camera, IMU and spinning LiDARs (Ouster, Velodyne).
I didn't find any clue about that in the paper...
The only clue I found was an R2LIVE thread where the author said that spinning LiDARs would be supported in the next iteration of the pipeline, which I assume is R3LIVE...
So, my questions are:
Once again, thank you so much for the contribution, great job 👏
Thank you for open-sourcing the r3live code.
The build is crashing due to a dependency on cpuid.h, which doesn't exist on Arm (see r3live/r3live/src/loam/IMU_Processing.cpp, line 415 at 28f5365).
In file included from /home/dchen/catkin_ws/src/r3live/r3live/src/tools/test_timer.cpp:49:0:
/home/dchen/catkin_ws/src/r3live/r3live/src/tools/tools_logger.hpp:1323:10: fatal error: cpuid.h: No such file or directory
#include <cpuid.h>
^~~~~~~~~
compilation terminated.
r3live/r3live/CMakeFiles/test_timer.dir/build.make:62: recipe for target 'r3live/r3live/CMakeFiles/test_timer.dir/src/tools/test_timer.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/test_timer.dir/src/tools/test_timer.cpp.o] Error 1
CMakeFiles/Makefile2:4821: recipe for target 'r3live/r3live/CMakeFiles/test_timer.dir/all' failed
make[1]: *** [r3live/r3live/CMakeFiles/test_timer.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
In file included from /home/dchen/catkin_ws/src/r3live/r3live/src/loam/LiDAR_front_end.cpp:5:0:
/home/dchen/catkin_ws/src/r3live/r3live/src/loam/../tools/tools_logger.hpp:1323:10: fatal error: cpuid.h: No such file or directory
#include <cpuid.h>
^~~~~~~~~
compilation terminated.
r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/build.make:62: recipe for target 'r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/src/loam/LiDAR_front_end.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/src/loam/LiDAR_front_end.cpp.o] Error 1
CMakeFiles/Makefile2:4954: recipe for target 'r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/all' failed
make[1]: *** [r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/all] Error 2
Makefile:140: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j2 -l2" failed
Hello Dr. Lin, can the results be reproduced using the built-in IMU of a ZED camera together with an RS16 LiDAR?
In FAST-LIO there is a software-sync option between an external IMU and the Livox LiDAR.
Is that possible in R3LIVE too?
I tried to add the software sync between the external IMU and the Livox LiDAR.
The LIO part already works, but the camera frames almost never update.
I cannot see any camera information in RViz.
Do you have any suggestions?
Is it correct that r3live cannot run in real time? While running it, I found that it gets slower and slower over time and eventually runs out of memory.
r3live/r3live/src/r3live_vio.cpp
Line 701 in 4d386cc
I think it's a mistake.
It should be
mat_pre << fx / pt_3d_cam( 2 ), 0, -fx * pt_3d_cam( 0 ) / pt_3d_cam( 2 ) / pt_3d_cam( 2 ), 0, fy / pt_3d_cam( 2 ), -fy * pt_3d_cam( 1 ) / pt_3d_cam( 2 ) / pt_3d_cam( 2 );
right?
Hello,
Thanks for sharing the code. In the config YAML file there is an extrinsic between the camera and the LiDAR; does that matrix transform camera-frame points into the LiDAR frame, or LiDAR-frame points into the camera frame?
And is there any reference for fine calibration between the camera and the LiDAR?
Thank you
I was trying to git clone the project onto my disk to read the source code; however, there seems to be a file-name problem under Windows. The full output is given below:
D:\MyFolder>git clone --recursive https://github.com/hku-mars/r3live.git
Cloning into 'r3live'...
remote: Enumerating objects: 662, done.
remote: Counting objects: 100% (662/662), done.
remote: Compressing objects: 100% (555/555), done.
remote: Total 662 (delta 114), reused 649 (delta 101), pack-reused 0
Receiving objects: 100% (662/662), 17.12 MiB | 6.33 MiB/s, done.
Resolving deltas: 100% (114/114), done.
error: invalid path 'papers/R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and?
mapping.pdf'
fatal: unable to checkout working tree
warning: Clone succeeded, but checkout failed.
You can inspect what was checked out with 'git status'
and retry with 'git restore --source=HEAD :/'
Hi, Lin. R3LIVE is really amazing work.
I ran into a bug when testing "hkust_campus_seq_00". Do you have any ideas? It seems to appear around the 15881st image and the 10587th LiDAR frame.
| System-time | LiDAR-frame | Camera-frame | Pts in maps | Memory used (Mb) |
| 23:38:24 | 10587 | 15881 | 54814318 | 26043 |
Catching a cv exception: OpenCV(3.4.10) /home/qzj/Downloads/tool/opencv-3.4.10/modules/calib3d/src/calibration.cpp:1099: error: (-2:Unspecified error) in function 'void cvFindExtrinsicCameraParams2(const CvMat*, const CvMat*, const CvMat*, const CvMat*, CvMat*, CvMat*, int)'
DLT algorithm needs at least 6 points for pose estimation from 3D-2D point correspondences. (expected: 'count >= 6'), where
'count' is 5
must be greater than or equal to
'6' is 6
My environment:
I tried Eigen 3.3.4 and 3.4 but got errors; one of the offending lines is:
‘ScalarBinaryOpTraits’ in namespace ‘Eigen’ does not name a template type
using ReturnScalar = typename Eigen::ScalarBinaryOpTraits<
Does anyone have an idea about this?