
hku-mars / r3live

1.9K stars · 49 watchers · 407 forks · 22.56 MB

A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package

License: GNU General Public License v2.0

CMake 0.21% C++ 99.35% C 0.25% GLSL 0.19%
slam lidar-slam 3d-reconstruction mesh-reconstruction sensor-fusion lidar-inertial-odometry lidar-camera-fusion 3d-mapping lidar-odometry

r3live's People

Contributors: ziv-lin

r3live's Issues

Holes in reconstructed mesh

Hello @ziv-lin,

I tried r3live to build large-area meshes. In most cases, it is doing a pretty good job but in some, I got the following problem.
(three screenshots of the reconstructed mesh showing the holes)

I am getting such holes all over the surface, which is quite strange because the point cloud itself is uniform and has no holes. I am running the default config and launch file. Let me know if I can improve this by any means.

Thanks

Code release

Hello @ziv-lin ,

Thanks for sharing the r3live paper. For a college project I am currently using r2live, but I am not able to generate a good map when the environment has limited features. From the example video it looks like r3live solves this problem. I am eagerly awaiting the code release; if possible, could you please share a tentative date? I know you are waiting for the reviewers' comments, but any timeline you could share would help.

Thank you

Spinning LIDAR texture misalignment

Hi,
Thanks for your outstanding work and for open-sourcing it.
I modified the project and adapted it for a spinning LiDAR. The results are shown in the figure:
(screenshot). I found that the texture is misaligned. The extrinsic from my camera to the IMU is set to a rough value and estimate_Intrinsic is set to 1. Do you have any suggestions other than recalibrating for an accurate initial value? Thank you very much!

Large drift issue!

Hello @ziv-lin,

While mapping I sometimes get large drift with r3live, but if I use vanilla FAST-LIO2 there is no visible drift. I was wondering how that is possible, since r3live also uses FAST-LIO for its LIO, right? Do I need to change anything in the config? In some cases r3live performs exceptionally well, and I don't think there is any compute limitation, since I am running on 64 cores with 512 GB of RAM.

Thank you

The height error in mapping

Hello @ziv-lin ,

Thanks for sharing your great work. The accuracy and detail of R3LIVE real-time mapping are impressive, but there are still some problems. I built R3LIVE according to the instructions and ran "hkust_campus_seq_00.bag"; the result I got is not consistent with the PCD file published on GitHub.
As shown in the screenshots below, the upper picture is my result and the lower picture is the PCD published on GitHub. There is an obvious height error in my test, and the same height error also appears in my tests of FAST-LIO2.
Did I miss something?

(two screenshots: my result above, the published PCD below)

Best regards

LZ4_STREAMSIZE_U64

In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.hpp:19,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp:1:
/usr/include/flann/ext/lz4.h:196:57: error: conflicting declaration ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.hpp:12,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp:1:
/usr/include/lz4.h:196:57: note: previous declaration as ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.hpp:19,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp:1:
/usr/include/flann/ext/lz4.h:249:72: error: conflicting declaration ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.hpp:12,
from /home/dji/catkin_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp:1:
/usr/include/lz4.h:249:72: note: previous declaration as ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.cpp:69:
/usr/include/flann/ext/lz4.h:196:57: error: conflicting declaration ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.cpp:59:
/usr/include/lz4.h:196:57: note: previous declaration as ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.cpp:69:
/usr/include/flann/ext/lz4.h:249:72: error: conflicting declaration ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.cpp:59:
/usr/include/lz4.h:249:72: note: previous declaration as ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.hpp:70,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:48:
/usr/include/flann/ext/lz4.h:196:57: error: conflicting declaration ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.hpp:60,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:48:
/usr/include/lz4.h:196:57: note: previous declaration as ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.hpp:70,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:48:
/usr/include/flann/ext/lz4.h:249:72: error: conflicting declaration ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.hpp:60,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:48:
/usr/include/lz4.h:249:72: note: previous declaration as ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.hpp:70,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live_lio.cpp:48:
/usr/include/flann/ext/lz4.h:196:57: error: conflicting declaration ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.hpp:60,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live_lio.cpp:48:
/usr/include/lz4.h:196:57: note: previous declaration as ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.hpp:70,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live_lio.cpp:48:
/usr/include/flann/ext/lz4.h:249:72: error: conflicting declaration ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/dji/catkin_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live.hpp:60,
from /home/dji/catkin_ws/src/r3live/r3live/src/r3live_lio.cpp:48:
/usr/include/lz4.h:249:72: note: previous declaration as ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp: In member function ‘void R3LIVE::load_vio_parameters()’:
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 3 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
printf( "Load camera data size = %d, %d, %d, %d\n", ( int ) camera_intrinsic_data.size(), camera_dist_coeffs_data.size(),
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
camera_ext_R_data.size(), camera_ext_t_data.size() );
^
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 4 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 5 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:158: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/loam/IMU_Processing.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/src/loam/IMU_Processing.cpp.o] Error 1
make[2]: *** Waiting for unfinished jobs....
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:62: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live.cpp.o] Error 1
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:86: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_lio.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_lio.cpp.o] Error 1
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:206: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_vio.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_vio.cpp.o] Error 1
CMakeFiles/Makefile2:2731: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/all' failed
make[1]: *** [r3live/r3live/CMakeFiles/r3live_mapping.dir/all] Error 2
Makefile:140: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j8 -l8" failed
Terminal input:
dji@dji-NUC11PAHi7:~/catkin_ws$ sudo mv /usr/include/flann/ext/lz4.h /usr/include/flann/ext/lz4.h.bak
dji@dji-NUC11PAHi7:~/catkin_ws$ sudo mv /usr/include/flann/ext/lz4hc.h /usr/include/flann/ext/lz4hc.h.bak
dji@dji-NUC11PAHi7:~/catkin_ws$ sudo ln -s /usr/include/lz4.h /usr/include/flann/ext/lz4.h
dji@dji-NUC11PAHi7:~/catkin_ws$ sudo ln -s /usr/include/lz4hc.h /usr/include/flann/ext/lz4hc.h
dji@dji-NUC11PAHi7:~/catkin_ws$ catkin_make
Base path: /home/dji/catkin_ws
Source space: /home/dji/catkin_ws/src
Build space: /home/dji/catkin_ws/build
Devel space: /home/dji/catkin_ws/devel
Install space: /home/dji/catkin_ws/install

Running command: "make cmake_check_build_system" in "/home/dji/catkin_ws/build"

Running command: "make -j8 -l8" in "/home/dji/catkin_ws/build"

[ 0%] Built target geometry_msgs_generate_messages_eus
[ 0%] Built target geometry_msgs_generate_messages_cpp
[ 0%] Built target geometry_msgs_generate_messages_py
[ 0%] Built target geometry_msgs_generate_messages_lisp
[ 0%] Built target geometry_msgs_generate_messages_nodejs
[ 4%] Built target test_timer
[ 9%] Built target r3live_LiDAR_front_end
[ 9%] Built target r3live_generate_messages_py
[ 9%] Built target r3live_generate_messages_cpp
[ 9%] Built target r3live_generate_messages_lisp
[ 18%] Built target fastlio_mapping
[ 18%] Built target r3live_generate_messages_nodejs
[ 18%] Built target _fast_lio_generate_messages_check_deps_Pose6D
[ 20%] Built target r3live_generate_messages_eus
[ 22%] Building CXX object r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_lio.cpp.o
[ 25%] Building CXX object r3live/r3live/CMakeFiles/r3live_mapping.dir/src/loam/IMU_Processing.cpp.o
[ 27%] Building CXX object r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live.cpp.o
[ 29%] Building CXX object r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live_vio.cpp.o
[ 34%] Built target fast_lio_generate_messages_cpp
[ 36%] Built target fast_lio_generate_messages_py
[ 40%] Built target fast_lio_generate_messages_eus
[ 77%] Built target r3live_meshing
[ 79%] Built target fast_lio_generate_messages_nodejs
[ 79%] Built target r3live_generate_messages
[ 81%] Built target fast_lio_generate_messages_lisp
[ 81%] Built target fast_lio_generate_messages
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp: In member function ‘void R3LIVE::load_vio_parameters()’:
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 3 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
printf( "Load camera data size = %d, %d, %d, %d\n", ( int ) camera_intrinsic_data.size(), camera_dist_coeffs_data.size(),
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
camera_ext_R_data.size(), camera_ext_t_data.size() );
^
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 4 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
/home/dji/catkin_ws/src/r3live/r3live/src/r3live_vio.cpp:441:68: warning: format ‘%d’ expects argument of type ‘int’, but argument 5 has type ‘std::vector<double, std::allocator >::size_type {aka long unsigned int}’ [-Wformat=]
[ 84%] Linking CXX executable /home/dji/catkin_ws/devel/lib/r3live/r3live_mapping
/usr/bin/ld: warning: libopencv_imgcodecs.so.3.2, needed by /opt/ros/melodic/lib/libcv_bridge.so, may conflict with libopencv_imgcodecs.so.4.5
[100%] Built target r3live_mapping
dji@dji-NUC11PAHi7:~/catkin_ws$ source ~/catkin_ws/devel/setup.bash
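
As an aside, the remaining -Wformat warnings only complain that std::vector::size() returns a size_t while %d expects an int; a minimal sketch of one fix (the other option being the %zu format specifier) is to cast the last three arguments the same way the first one already is:

printf( "Load camera data size = %d, %d, %d, %d\n", ( int ) camera_intrinsic_data.size(), ( int ) camera_dist_coeffs_data.size(),
        ( int ) camera_ext_R_data.size(), ( int ) camera_ext_t_data.size() );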

Code release

Hello,

Great work, team! Any idea when the code will be released? Thanks.

[Question] Is there communication from VIO to LIO?

Hi,
I have re-read the code and the paper, and I believe there is no communication from the VIO to the LIO. Correct me if I'm wrong.

Why is this preferable to R2LIVE, which used a factor graph coupling both sensors? If we found ourselves in a highly degraded environment (such as a tunnel) and the LIO got lost, what would happen to R3LIVE?

Documentation: topic details + get full map at specific times

Hi. I'm trying to get some information about a few topics. Could you please confirm the following:

  1. /path: corresponds to the IMU path at recording time, using the LiDAR time
  2. /camera_path: corresponds to the camera path at recording time, using the camera time
  3. /camera_odom: same as /camera_path but using the running time

Also, at specific times I would like to publish all points that are visible in the camera's field of view, or for now simply all the points. Do you know which topic I should use for that? There are multiple RGB_map topics, so one option would be to hijack your cv_keyboard_callback and trigger it from another topic/signal. What do you think?

REQUIRED process [r3live_mapping-3] has died!

REQUIRED process [r3live_mapping-3] has died!
process has died [pid 4660, exit code -11, cmd /home/dji/catkin_ws/devel/lib/r3live/r3live_mapping __name:=r3live_mapping __log:=/home/dji/.ros/log/94ab4d12-6ea4-11ec-b07f-1c697aacadba/r3live_mapping-3.log].
log file: /home/dji/.ros/log/94ab4d12-6ea4-11ec-b07f-1c697aacadba/r3live_mapping-3*.log
Initiating shutdown!

Tuning R3live for more robustness

Hello @ziv-lin,

I am using r3live and I just want to thank you for such amazing work. R3live is quite robust compared to other SLAM packages. I would like to know whether it is possible to tune r3live for even more robustness. I have plenty of computing capacity and I do not need real-time operation either.

Is it possible to change a few parameters in the config file so that r3live works better in low-feature areas? Also, do lighting conditions have any effect on mapping performance? I understand the texture would be dark, but other than that, is there any effect on the map-generation part?

Thank you

Runtime error

The following error happens when playing a bag file from the dataset. RViz shuts down along with all the other nodes. It seems to happen when the first image is received, but not much else is shown, even in the log files. Sorry for my lack of expertise in further means of investigation.

I am using ROS Melodic, and I compiled the code with OpenCV 4.5.3. Compilation went well.

(screenshot of the error output)

How can I record my own dataset?

Hello, I am a newcomer to SLAM. After running your dataset, I wanted to record a dataset of my own. I recorded with a Livox Mid-40 and a ZED 2i, calibrated the camera-LiDAR extrinsics, and modified the parameters in the roslaunch file:

    <param name="LiDAR_pointcloud_topic" type="string" value= "laser_cloud_flat" />
    <param name="IMU_topic" type="string" value= "/zed2i/zed_node/imu" />
    <param name="Image_topic" type="string" value= "/zed2i/zed_node/left/image_rect_color" />

But after opening RViz there is no output at all. My understanding is that even without calibrating the IMU I should still get a rough (if not very good) result. Could you tell me where the problem might be?
(screenshot)

LiDAR / camera frames and points captured but no RGB points

Everything has been set up properly on our end. We are using a Livox Avia and a RealSense D455 RGB stream. Your custom Livox driver has been implemented. Unfortunately, frames are captured and points are added to the map, but no RGB points are displayed in the RViz panel.

| System-time | LiDAR-frame | Camera-frame | Pts in maps | Memory used (Mb) |
|   18:22:26  |     691     |     1000     |   1647769   |        994       |
I capture the keyboard input!!!

Further, when attempting to save the map, the following is output in the terminal:

Export the whole offline map to file: /home/vmind/r3live_output/test.r3live
Saving globale map...
Saving global map: 100 %
Save globale map cost 270.114 ms
Saving views and camera poses 100 % ...
Save views and camera poses cost 72.0792 ms
Save Rgb points to /home/visionarymind/r3live_output/rgb_pt
Saving offline map 100% ...
Total have 0 points.
Now write to: /home/vmind/r3live_output/rgb_pt

The r3live_bag.launch file has been modified accordingly to accept the RealSense image stream:

<param name="Image_topic" type="string" value= "/camera/color/image_raw" />

Would you have any advice on what the issue could be, or how to troubleshoot further?

Choice of a global-shutter camera

I hope you can share the camera configuration and its driver! I currently cannot get an Avia in mainland China and only have a Horizon on hand, but it is still a pleasant surprise to see HKU sharing advanced work again! I have already started some project-related tests with FAST-LIO, and after seeing the true-color results I can't wait to try this. Thank you!

Frame-to-map Visual-Inertial odometry

Hello, thank you for open-sourcing the paper and the code.
Regarding the photometric-error update in the VIO, I have some questions about the data association. In the paper, are the visual map points associated, through a kd-tree, with LiDAR map points whose color has already been rendered, and is the photometric residual built from those? If so, the error term does not contain any geometric error.
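
For context, a generic frame-to-map photometric residual of the kind described above can be written as (this restates the idea in the question, not necessarily the exact expression used in the code):

r_i = I_k( \pi( R * p_i + t ) ) - c_i

where p_i is a colored map point, c_i its stored RGB value, \pi the pinhole projection into image I_k, and (R, t) the camera pose. As the question notes, such a term only penalizes the color difference and carries no explicit geometric (reprojection) error.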
Thanks!

Documentation - camera_ext_R is rotated in config.yaml

Hi Lin!
Again, amazing work. I have the impression that camera_ext_R in your config.yaml is rotated compared with the output of the recommended calibration method. When I simply replace the matrix with the one from the Livox calibration tool, r3live does not crash but is not able to map.

Please correct me if I'm wrong. If not, it would be good to mention this somewhere.

Greetings!

Still the same problem when running the test

After making sure that both the OpenCV version used at runtime and the one used to compile are 3.4.16, it still keeps crashing. The machine has 16 GB of RAM.
(three screenshots)
Also, I can still see images with the rqt_image_view command. How can I solve this?

catkin_make error

error: invalid initialization of reference of type ‘const cv::ParallelLoopBody&’ from expression of type ‘render_pts_in_voxels_mp(std::shared_ptr<Image_frame>&, std::unordered_set<std::shared_ptr<RGB_Voxel> >*, const double&)::<lambda(const cv::Range&)>’
{ thread_render_pts_in_voxel(r.start, r.end, img_ptr, &g_voxel_for_render, obs_time); });
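
This error usually means the installed OpenCV's cv::parallel_for_ has no overload that accepts a lambda directly. A minimal sketch of the classic functor workaround, reusing the call from the snippet above (the exact types of img_ptr, g_voxel_for_render and obs_time are assumed here, not checked against the source):

// Hedged sketch: wrap the lambda body in a cv::ParallelLoopBody functor for older OpenCV builds.
struct Render_pts_body : public cv::ParallelLoopBody
{
    std::shared_ptr< Image_frame > &m_img_ptr;
    std::unordered_set< std::shared_ptr< RGB_Voxel > > *m_voxels;
    const double &m_obs_time;
    Render_pts_body( std::shared_ptr< Image_frame > &img_ptr,
                     std::unordered_set< std::shared_ptr< RGB_Voxel > > *voxels,
                     const double &obs_time )
        : m_img_ptr( img_ptr ), m_voxels( voxels ), m_obs_time( obs_time ) {}
    void operator()( const cv::Range &r ) const override
    {
        thread_render_pts_in_voxel( r.start, r.end, m_img_ptr, m_voxels, m_obs_time );
    }
};
// Call site: cv::parallel_for_( cv::Range( 0, num_voxels ), Render_pts_body( img_ptr, &g_voxel_for_render, obs_time ) );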

Is an external IMU supported?

I am currently trying a Mid-70 with an external IMU. The code runs, but because the IMU and LiDAR extrinsics are out of sync, the results are not good.
How can I configure the IMU extrinsics?

Effect of time sync and publishing frequency

Hi guys, a couple of questions.
What is the effect of time sync in your code? I have been able to run FAST-LIO2 and R2LIVE without time sync, and I was asking myself what the effect would be. Would it mean a less sharp point cloud?

Additionally, when first published, FAST-LIO could compute the state estimate at 10-50 Hz; later, FAST-LIO2 could estimate the state at 10-100 Hz, but at the moment R2LIVE does not allow changing the publishing frequency.

  • Will R3LIVE allow changing this?
  • If yes, what will produce a better result (a sharper point cloud / better position estimation)? Having fewer LiDAR points per update allows faster calculations but is also more prone to failure, no? Which should not happen here, since the camera is also present...

Thanks in advance!

Hello, have you encountered this problem during testing?

================================================================================
REQUIRED process [r3live_mapping-3] has died!
process has died [pid 7193, exit code -11, cmd /home/crz/SLAM/R3Live/devel/lib/r3live/r3live_mapping __name:=r3live_mapping __log:=/home/crz/.ros/log/b600fbac-6ad9-11ec-b8ae-2cf05d2b732e/r3live_mapping-3.log].
log file: /home/crz/.ros/log/b600fbac-6ad9-11ec-b8ae-2cf05d2b732e/r3live_mapping-3*.log
Initiating shutdown!
================================================================================

After compiling, running r3live_bag.launch with the bag you provided gives this error. My ROS version is Melodic.

Running LIO and VIO separately

Dr. Lin, can R3LIVE run the VIO and the LIO independently? The VIO system used in R3LIVE looks much lighter than the one in R2LIVE; how does it achieve better accuracy than R2LIVE? Is it mainly due to the LIO part?

Support for external IMU and Fisheye cameras

Hi

First, I'd like to thank you for the source-code release; it appears to be running well with the provided datasets.
I'm now trying to run r3live using a Livox MID-70 and Realsense T265 (and its internal IMU).

Could you please let me know if the following is possible:

  • Custom IMU->LiDAR extrinsics. Due to packaging constraints, my LiDAR is not mounted aligned with the IMU.
  • Fisheye correction on input images from the T265.

Also, is it possible to use this in pure localization mode - i.e. disable RGB map generation for real-time operation?

Many thanks!

Reconstruct and texture mesh failed!

Hi, Lin. It's really a great job, especially the reconstruction module.
However, I ran into some weird problems.

=============================================================
App name : R3LIVE: A Robust, Real-time, RGB-colored, LiDAR-Inertial-Visual tightly-coupled state Estimation and mapping package
Build date : Jan 7 2022 14:23:18
CPU infos : 11th Gen Intel(R) Core(TM) i7-11800H @ 2.30GHz
RAM infos : 31.21GB Physical Memory 24.81GB Virtual Memory
OS infos : Linux 5.4.0-92-generic (x86_64)
=============================================================
+++++++++++++++++++++++++++++++++++++++++++++++
Here is the your software environments:
GCC version : 8.4.0
Boost version : 1.65.1
Eigen version : 3.3.4
OpenCV version : 3.4.16
+++++++++++++++++++++++++++++++++++++++++++++++

I can't reconstruct and texture the mesh when I use hkust_campus_seq_01.bag, and sometimes I get the error:
catkin_r3live_ws/src/r3live/r3live/src/loam/IMU_Processing.cpp, 132, check_state fail !!!! -8.28157 -10.0624 -6.91589

It seems that there is only the RGB_map_0 point cloud.
(screenshot)

After saving the map, the point cloud is very sparse.

(screenshot)

The final result is as follows (including rgb_pt.pcd):
(screenshot)

Compilation failed in ROS Melodic

Hello everyone,

I just faced a compilation failure in ROS Melodic caused by a conflict between system libraries.

In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/vinicius/uwo_ws/src/r3live/r3live/src/r3live.cpp:69:
/usr/include/flann/ext/lz4.h:196:57: error: conflicting declaration ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/vinicius/uwo_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/vinicius/uwo_ws/src/r3live/r3live/src/r3live.cpp:59:
/usr/include/lz4.h:196:57: note: previous declaration as ‘typedef struct LZ4_stream_t LZ4_stream_t’
typedef struct { long long table[LZ4_STREAMSIZE_U64]; } LZ4_stream_t;
^~~~~~~~~~~~
In file included from /usr/include/flann/util/serialization.h:9:0,
from /usr/include/flann/util/matrix.h:35,
from /usr/include/flann/flann.hpp:41,
from /usr/include/pcl-1.8/pcl/kdtree/flann.h:50,
from /usr/include/pcl-1.8/pcl/kdtree/kdtree_flann.h:45,
from /home/vinicius/uwo_ws/src/r3live/r3live/src/r3live.cpp:69:
/usr/include/flann/ext/lz4.h:249:72: error: conflicting declaration ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
In file included from /opt/ros/melodic/include/roslz4/lz4s.h:38:0,
from /opt/ros/melodic/include/rosbag/stream.h:46,
from /opt/ros/melodic/include/rosbag/chunked_file.h:46,
from /opt/ros/melodic/include/rosbag/bag.h:41,
from /home/vinicius/uwo_ws/src/r3live/r3live/./src/loam/include/common_lib.h:13,
from /home/vinicius/uwo_ws/src/r3live/r3live/src/r3live.cpp:59:
/usr/include/lz4.h:249:72: note: previous declaration as ‘typedef struct LZ4_streamDecode_t LZ4_streamDecode_t’
typedef struct { unsigned long long table[LZ4_STREAMDECODESIZE_U64]; } LZ4_streamDecode_t;
^~~~~~~~~~~~~~~~~~
r3live/r3live/CMakeFiles/r3live_mapping.dir/build.make:62: recipe for target 'r3live/r3live/CMakeFiles/r3live_mapping.dir/src/r3live.cpp.o' failed

A solution can be found in the following link: ethz-asl/lidar_align#16

g_received_img_msg needs to receive the msg

In the r3live_vio.cpp file, the image_callback function (around line 339) needs an additional push_back call:

void R3LIVE::image_callback( const sensor_msgs::ImageConstPtr &msg )
{
    std::unique_lock< std::mutex > lock( mutex_image_callback );
    if ( sub_image_typed == 2 )
    {
        return; // Avoid subscribing to the same image twice.
    }
    sub_image_typed = 1;
    g_received_img_msg.push_back( msg ); // <-- proposed addition: buffer the incoming message

    if ( g_flag_if_first_rec_img )
    {
        g_flag_if_first_rec_img = 0;
        m_thread_pool_ptr->commit_task( &R3LIVE::service_process_img_buffer, this );
    }

    cv::Mat temp_img = cv_bridge::toCvCopy( msg, sensor_msgs::image_encodings::BGR8 )->image.clone();
    process_image( temp_img, msg->header.stamp.toSec() );
}

The g_received_img_msg.push_back( msg ); line marked above, right after sub_image_typed = 1;, is the addition needed so that the buffer actually receives the message.

[Improvement] Save and use reflectance/intensity values

The LiDAR reflectance/intensity of each point can help indicate what structure/object the point belongs to.
At the moment you are using a Bayesian update for the color assignment, but wouldn't it be better to also use the reflectance/intensity information when assigning a color to each point? This could help, especially at edges, where multiple points tend to get the color of the background structure.

At the moment the output point cloud of R3LIVE does not contain the LiDAR reflectance/intensity values. Having these values would be useful for testing the above hypothesis :D

Point cloud coloring

Hello,

I am really excited to try out the new R3LIVE pipeline with a Livox Avia. It looks really robust. I need one favour from your end. For a college project I am currently using a Livox Avia with FAST-LIO2, but I need an RGB-colored point cloud or mesh. I tried modifying the FAST-LIO code to color every Livox point-cloud message with the RGB image in a pre-processing stage. For the calibration between the LiDAR and the camera I am using https://github.com/hku-mars/livox_camera_calib. But the result is very noisy: besides the RGB points there are many white and black points, even though I have made sure the camera and the Livox are in sync. Can you shed some light on your approach to coloring points and texturing the mesh? I know you are going to release the code by 31 Dec, but I have to submit earlier; if possible, please share some details.
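
For reference, the generic projection-and-sample approach looks roughly like the sketch below (the intrinsics fx, fy, cx, cy and the LiDAR-to-camera extrinsic R_cl, t_cl are assumed inputs; this is only a sketch, not necessarily what R3LIVE does internally):

#include <Eigen/Core>
#include <opencv2/opencv.hpp>

// Hedged sketch: colour one LiDAR point by projecting it into an undistorted RGB image.
bool colorize_point( const Eigen::Vector3d &pt_lidar, const cv::Mat &rgb_image,
                     const Eigen::Matrix3d &R_cl, const Eigen::Vector3d &t_cl,
                     double fx, double fy, double cx, double cy, cv::Vec3b &color_out )
{
    Eigen::Vector3d pt_cam = R_cl * pt_lidar + t_cl;                 // LiDAR frame -> camera frame
    if ( pt_cam.z() <= 0.1 )                                         // behind or too close to the camera
        return false;
    int u = static_cast< int >( fx * pt_cam.x() / pt_cam.z() + cx ); // pinhole projection
    int v = static_cast< int >( fy * pt_cam.y() / pt_cam.z() + cy );
    if ( u < 0 || v < 0 || u >= rgb_image.cols || v >= rgb_image.rows )
        return false;
    color_out = rgb_image.at< cv::Vec3b >( v, u );                   // sample BGR at the projected pixel
    return true;
}

The white and black fringe points you describe usually come from occlusion and LiDAR-camera parallax at depth edges; depth-testing the projected points against their neighbours before sampling tends to reduce them.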

Thank you

r3live_handheld

Within the CAD files of your handheld device, could you also add a schematic and some explanation of how the different sensors are connected and embedded together: time synchronization, hardware (LiDAR, camera, battery, laptop) and software? You don't have to go into too much detail, but a little more than what is explained in the article would help me a lot.

Thank you in advance,
Best regards,
Antun Jakopec

Support of spinning LIDARs?

Great paper and videos 😍
Can't wait to try out the pipeline, especially to build textured meshes; loving it!

I was wondering if R3LIVE will support systems with a camera, an IMU and a spinning LiDAR (Ouster, Velodyne).
I didn't find any clue about that in the paper...

The only clue I found is an R2LIVE thread where the author said that spinning LiDARs would be supported in the next iteration of the pipeline, which I assume is R3LIVE...

So, my questions are:

  • does R3LIVE support spinning LiDARs?
  • if not, is it because the architecture of the system doesn't allow it, or because it is just not in your plans?

Once again, thank you so much for the contribution, great job 👏

question about lic_point_cloud_undistort

Thank you for open-sourcing the r3live code.

Eigen::Vector3d T_ei( pos_imu + vel_imu * dt + 0.5 * acc_imu * dt * dt + R_i * Lidar_offset_to_IMU - pos_liD_e );

The way point-cloud undistortion is done in r2live and r3live is different from FAST-LIO2; can you please explain why?
pos_liD_e is the LiDAR position at timestamp e in the global frame, and
R_i * Lidar_offset_to_IMU is the compensation for the LiDAR translation caused by the angular velocity, expressed in the global frame,
but state_inout.rot_end.transpose() is the rotation from global to IMU at timestamp e. Why not use the rotation from global to LiDAR at timestamp e, "R_liD_e.transpose()"?
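
For reference, writing the quoted line together with the compensation step it feeds into as math (this only restates the code mentioned in the question; it does not settle which rotation is correct):

T_ei = p_{I_i} + v_{I_i} * dt + 0.5 * a_{I_i} * dt^2 + R_{GI_i} * t_{IL} - p_{L_e}
p_compensated = R_{GI_e}^T * ( R_{GI_i} * p_i + T_ei )

where R_{GI_i} is the IMU attitude at the point's timestamp i, R_{GI_e} corresponds to state_inout.rot_end, t_{IL} to Lidar_offset_to_IMU and p_{L_e} to pos_liD_e. The question is then whether the leading inverse rotation should be the IMU attitude at time e or the LiDAR attitude at time e.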

Build fail on Arm

The build crashes due to a dependency on cpuid.h, which doesn't exist on Arm.

In file included from /home/dchen/catkin_ws/src/r3live/r3live/src/tools/test_timer.cpp:49:0:
/home/dchen/catkin_ws/src/r3live/r3live/src/tools/tools_logger.hpp:1323:10: fatal error: cpuid.h: No such file or directory
 #include <cpuid.h>
          ^~~~~~~~~
compilation terminated.
r3live/r3live/CMakeFiles/test_timer.dir/build.make:62: recipe for target 'r3live/r3live/CMakeFiles/test_timer.dir/src/tools/test_timer.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/test_timer.dir/src/tools/test_timer.cpp.o] Error 1
CMakeFiles/Makefile2:4821: recipe for target 'r3live/r3live/CMakeFiles/test_timer.dir/all' failed
make[1]: *** [r3live/r3live/CMakeFiles/test_timer.dir/all] Error 2
make[1]: *** Waiting for unfinished jobs....
In file included from /home/dchen/catkin_ws/src/r3live/r3live/src/loam/LiDAR_front_end.cpp:5:0:
/home/dchen/catkin_ws/src/r3live/r3live/src/loam/../tools/tools_logger.hpp:1323:10: fatal error: cpuid.h: No such file or directory
 #include <cpuid.h>
          ^~~~~~~~~
compilation terminated.
r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/build.make:62: recipe for target 'r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/src/loam/LiDAR_front_end.cpp.o' failed
make[2]: *** [r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/src/loam/LiDAR_front_end.cpp.o] Error 1
CMakeFiles/Makefile2:4954: recipe for target 'r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/all' failed
make[1]: *** [r3live/r3live/CMakeFiles/r3live_LiDAR_front_end.dir/all] Error 2
Makefile:140: recipe for target 'all' failed
make: *** [all] Error 2
Invoking "make -j2 -l2" failed

About the hardware

Hello, Dr. Lin. Is it possible to reproduce the results using the built-in IMU of a ZED camera together with an RS16 LiDAR?

Is there any way to add a software sync between external imu and Livox lidar?

In FAST-LIO there is a software time-sync option between an external IMU and the Livox LiDAR.
Is that possible in R3LIVE too?
I tried to add the software sync between the external IMU and the Livox LiDAR.
The LIO part already works, but the camera frame almost never updates,
and I cannot see any camera information in RViz.
Do you have any suggestions?

Are the camera and LiDAR in the dataset bags time-synchronized?

Hello HKU team,
I downloaded hku_park_01.bag from the dataset and found that the camera actually runs at 15 Hz and the LiDAR at 10 Hz, so the timestamps cannot be exactly aligned. Does r3live have no hard requirement on multi-sensor time synchronization? Is the synchronization error dt also estimated as part of the state? (screenshot)

About real-time performance

Is it true that r3live cannot run in real time? When I run it, it gets slower and slower the longer it runs, and eventually it runs out of memory.

a mistake in vio_esikf?

mat_pre << fx / pt_3d_cam( 2 ), 0, -fx * pt_3d_cam( 0 ) / pt_3d_cam( 2 ), 0, fy / pt_3d_cam( 2 ), -fy * pt_3d_cam( 1 ) / pt_3d_cam( 2 );

I think it's a mistake. It should be
mat_pre << fx / pt_3d_cam( 2 ), 0, -fx * pt_3d_cam( 0 ) / pt_3d_cam( 2 ) / pt_3d_cam( 2 ), 0, fy / pt_3d_cam( 2 ), -fy * pt_3d_cam( 1 ) / pt_3d_cam( 2 ) / pt_3d_cam( 2 );
right?
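
For reference, differentiating the pinhole projection u = f_x * X / Z + c_x, v = f_y * Y / Z + c_y with respect to the camera-frame point (X, Y, Z) gives

d(u, v) / d(X, Y, Z) = [ f_x / Z,    0,         -f_x * X / Z^2 ]
                       [ 0,          f_y / Z,   -f_y * Y / Z^2 ]

so, assuming mat_pre is meant to be this projection Jacobian, the third column does carry a 1/Z^2 factor, which matches the correction proposed above.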

Transformation matrix

Hello,

Thanks for sharing the code. In the config YAML file there is an extrinsic between the camera and the LiDAR; I would like to know whether the matrix maps camera to LiDAR or LiDAR to camera.

And is there any reference for fine calibration between camera and lidar?
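
For reference, the two conventions are related as follows (a generic reminder, not a statement of which one the config expects): if (R, t) maps LiDAR-frame points into the camera frame, then

p_cam = R * p_lidar + t,        p_lidar = R^T * ( p_cam - t )

are the forward mapping and its inverse, so the two candidate matrices differ by exactly a transpose of R and a corresponding change of t.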

Thank you

git clone failed -- file name problem on windows platform

I was trying to git clone the project onto my disk for source-code reading; however, there seems to be a file-name problem under Windows. The full output is given below:

D:\MyFolder>git clone --recursive https://github.com/hku-mars/r3live.git
Cloning into 'r3live'...
remote: Enumerating objects: 662, done.
remote: Counting objects: 100% (662/662), done.
remote: Compressing objects: 100% (555/555), done.
remote: Total 662 (delta 114), reused 649 (delta 101), pack-reused 0
Receiving objects: 100% (662/662), 17.12 MiB | 6.33 MiB/s, done.
Resolving deltas: 100% (114/114), done.
error: invalid path 'papers/R2LIVE: A Robust, Real-time, LiDAR-Inertial-Visual tightly-coupled state Estimator and?
mapping.pdf'
fatal: unable to checkout working tree
warning: Clone succeeded, but checkout failed.
You can inspect what was checked out with 'git status'
and retry with 'git restore --source=HEAD :/'

Fixme: Catching a cv exception, DLT algorithm needs at least 6 points for pose estimation from 3D-2D point correspondences.

Hi, Lin. R3live is really an amazing job.

I ran into a bug when testing "hkust_campus_seq_00". Do you have any ideas? It seems to appear around the 15881st image and the 10587th LiDAR frame.

| System-time | LiDAR-frame | Camera-frame | Pts in maps | Memory used (Mb) |
| 23:38:24 | 10587 | 15881 | 54814318 | 26043
Catching a cv exception: OpenCV(3.4.10) /home/qzj/Downloads/tool/opencv-3.4.10/modules/calib3d/src/calibration.cpp:1099: error: (-2:Unspecified error) in function 'void cvFindExtrinsicCameraParams2(const CvMat*, const CvMat*, const CvMat*, const CvMat*, CvMat*, CvMat*, int)'

DLT algorithm needs at least 6 points for pose estimation from 3D-2D point correspondences. (expected: 'count >= 6'), where
'count' is 5
must be greater than or equal to
'6' is 6
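
One defensive workaround (a sketch, not the project's fix; the container and variable names below are assumptions): skip the PnP step whenever fewer than six 3D-2D correspondences survive, instead of letting OpenCV throw.

// Hedged sketch: guard the solvePnP call that triggers the DLT assertion.
if ( pts_3d_vec.size() < 6 || pts_2d_vec.size() < 6 )
{
    // Too few tracked correspondences in this frame; skip the vision update.
    return false;
}
cv::solvePnP( pts_3d_vec, pts_2d_vec, camera_matrix, dist_coeffs, rvec, tvec, false, cv::SOLVEPNP_ITERATIVE );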

My environment:

(screenshot of the build environment)

seems Eigen version problem

I tried Eigen 3.3.4 and 3.4 but I get errors; one of the lines is:
‘ScalarBinaryOpTraits’ in namespace ‘Eigen’ does not name a template type
using ReturnScalar = typename Eigen::ScalarBinaryOpTraits<

Does anyone have an idea about this?
