
mulls's Introduction

MULLS: Versatile LiDAR SLAM via Multi-metric Linear Least Square

This repository implements MULLS, an efficient, low-drift, and versatile LiDAR-only SLAM system with both the front-end and back-end. It's an overall updated version of LLS-LOAM.

Version 1.1

(tested on Ubuntu 16.04 / 18.04 / 20.04)

The code is currently being refactored for better readability and performance.


MULLS SLAM demo



MULLS Registration demo



Instruction

1. Install dependent third-party libraries

A compiler that supports OpenMP is required.

Compulsory:

Optional:

  • For *.las data IO: libLAS
  • For *.h5 data IO: HDF5
  • For geo-coordinate projection for global mapping: Proj4
  • For Lie-group and Lie-algebra related functions (only used in the baseline registration method vgicp): Sophus
  • For 2D map, scan range image, and BEV image generation: OpenCV
  • For pose graph optimization: g2o (<= 2016 version), ceres, or gtsam
  • For efficient global registration using truncated least squares: TEASER++

You may run the following shell script to install all the dependencies (tested on Ubuntu 16.04):

bash script/tools/install_dep_lib.sh

Note: ceres, g2o and gtsam are all used for pose graph optimization. You only need to install one of them (ceres is recommended).

2. Compile

# cd to the base folder of MULLS
mkdir build
cd build
cmake ..
make 
cd ..

If you'd like to configure the optional dependencies needed by your task, you can switch the corresponding options in CMakeLists.txt and then rebuild, or use ccmake .. in the build folder instead. (If this does not work, you need to delete the build folder and repeat step 2.)

3. Minimum example

By using the example data (16 adjacent scans) in ./demo_data, you can have a quick test of MULLS-SLAM and MULLS-Registration.

Without editing anything, directly run

sh script/run_mulls_slam.sh

and

sh script/run_mulls_reg.sh

to check the results (in the real-time viewer and the result folder in demo_data).

4. Prepare data

The input of MULLS is a sequence of point clouds; each point cloud corresponds to one frame (scan) of the run.

Test on KITTI

Download the KITTI Odometry Dataset to test the project.

To test the semantic mask aided solution, please download the Semantic KITTI Odometry Dataset.

The complete data folder structure should be as follows:

Base Folder
_____00
     |___velodyne [raw data *.bin]
     |___pcd [*.pcd] (can be generated from *.bin by run_kittibin2pcd.sh)
     |___labels [raw semantic label *.label] (optional for semantic aided lidar odometry) 
     |___label_pcd [*.pcd] (optional for semantic aided lidar odometry, can be generated from *.label and *.bin by run_semantic_kitti_labelbin2pcd.sh) 
     |___00.txt [ground truth (gnssins) pose] (optional for evaluation)
     |___calib.txt [extrinsic transformation matrix (from body to lidar coordinate system)] (optional for evaluation)
     |___result [output of MULLS: generated map, poses, and evaluation] (generated automatically after the run)
_____01
     |___velodyne
     |___pcd
     |___labels
     |...
_____...
   

Scripts for converting the data format are available in the ./script/tools/ folder.

You can use script/tools/run_kittibin2pcd.sh to convert *.bin to *.pcd to get the pcd folder. Similarly, you can use script/tools/run_semantic_kitti_labelbin2pcd.sh to convert *.label to *.pcd to get the label_pcd folder.
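
For reference, each point in a KITTI *.bin file is stored as four 32-bit floats (x, y, z, intensity). The shipped bin2pcd tool does the conversion for you; the following is only a minimal, hypothetical C++/PCL sketch of the same idea, not the project's actual code:

#include <fstream>
#include <pcl/io/pcd_io.h>
#include <pcl/point_types.h>

// Read one KITTI *.bin scan (x, y, z, intensity as float32) and save it as a binary *.pcd.
int main(int argc, char **argv)
{
    if (argc < 3) return 1;                              // usage: bin2pcd_sketch in.bin out.pcd
    std::ifstream in(argv[1], std::ios::binary);
    pcl::PointCloud<pcl::PointXYZI> cloud;
    float buf[4];
    while (in.read(reinterpret_cast<char *>(buf), 4 * sizeof(float)))
    {
        pcl::PointXYZI pt;
        pt.x = buf[0]; pt.y = buf[1]; pt.z = buf[2];
        pt.intensity = buf[3];
        cloud.push_back(pt);                             // keeps width/height consistent
    }
    return pcl::io::savePCDFileBinary(argv[2], cloud) == 0 ? 0 : 1;
}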

Test on your own data

If you'd like to use your own data, the data format should be one of the following: *.pcd, *.ply, *.txt, *.las, *.csv, *.h5, *.bin. You can simply specify the data format in script/run_mulls_slam.sh.

The data folder structure can be as simple as follows:

Base Folder
      |___dummy_framewise_point_cloud
      .    |___00001.pcd (las,txt,ply,h5,csv,bin...)
      .    |___00002.pcd (las,txt,ply,h5,csv,bin...)
      .    |___...
      |___dummy_ground_truth_trajectory.txt (optional)   
      |___dummy_calibration_file.txt (optional)  

To feed your data into MULLS in order, your point cloud filenames should also sort in the same order. You can use script/tools/batch_rename.sh to batch-rename files from names like 1.pcd to 00001.pcd.
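
Purely as an illustration of the zero-padding idea (a hypothetical C++17 sketch that assumes the frame indices are plain integers; it is not the shipped batch_rename.sh):

#include <cstdio>
#include <filesystem>
#include <string>

namespace fs = std::filesystem;

// Rename numeric file names such as 1.pcd to zero-padded names such as 00001.pcd,
// so that lexicographic order matches the frame order expected by MULLS.
int main(int argc, char **argv)
{
    if (argc < 2) return 1;                                         // usage: pad_names <folder>
    for (const auto &entry : fs::directory_iterator(argv[1]))
    {
        const std::string stem = entry.path().stem().string();      // e.g. "1"
        const std::string ext = entry.path().extension().string();  // e.g. ".pcd"
        if (stem.empty() || stem.find_first_not_of("0123456789") != std::string::npos)
            continue;                                               // skip non-numeric names
        char padded[16];
        std::snprintf(padded, sizeof(padded), "%05d", std::stoi(stem));
        fs::rename(entry.path(), entry.path().parent_path() / (padded + ext));
    }
    return 0;
}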

Links to more open datasets are available here.

5. Run

MULLS-SLAM

If you'd like to test the LiDAR SLAM module (MULLS-SLAM), please edit the script/run_mulls_slam.sh file, specify the data path and then run:

sh script/run_mulls_slam.sh

If the visualization is enabled, then you can configure the visualization GUI by following the instructions here.

For better performance on a specific dataset, we suggest tuning the parameters in script/config/lo_gflag_list_[xxx].txt (tips on parameter tuning are available here); you then need to change the config file path in script/run_mulls_slam.sh as follows:

config_file=./script/config/lo_gflag_list_[xxx].txt

To disable or enable the back-end (loop closure detection and pose graph optimization), set --loop_closure_detection_on=true or false in the config file.

After the run, you will find the results (plots, poses, evaluation results, the generated 3D and 2D maps, ...) in the result folder under the data path. You can configure the output preferences as described here.

MULLS-Registration

You can use script/run_mulls_reg.sh to test pairwise point cloud registration using MULLS-ICP with TEASER++, simply by configuring the data path in it. Then run:

sh script/run_mulls_reg.sh

An example on TLS point cloud registration can be found here.


More demos


On KITTI dataset (seq. 00 & 01)



Citation

If you find this code useful for your work or use it in your project, please consider citing the paper:

@inproceedings{pan2021mulls,
  title={MULLS: Versatile LiDAR SLAM via Multi-metric Linear Least Square},
  author={Yue Pan and Pengchuan Xiao and Yujie He and Zhenlei Shao and Zesong Li},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  year={2021},
  organization={IEEE}
}

Contact

If you have any questions, please let me know:


Acknowledgments

We thank the authors of TEASER and NDT_OMP for making their work public.

Thanks to Martin Valgur (@valgur) for fixing the compatibility issues.


TODO List

  • Add preprint paper
  • Add Wiki
  • Add demo examples
  • Update camera-ready paper
  • Code refactoring
  • Add ROS support
  • Add cross-platform support (run on Windows)
  • Add sensor fusion module
  • Add a localization module using the built map

mulls's People

Contributors

lexanagibator228, valgur, yuepanedward


mulls's Issues

Build on Ubuntu 20.04

Hi, I am opening an issue to document how to build MULLS on Ubuntu 20.04 with GCC 11.1.0 and CMake 3.26.1.

If you plan on running install_dep_lib.sh, change the ceres version from 2.0.0 to 2.1.0 in lines 66 and 73.
If, like me, you receive the error

function: not found

Run the shell script using bash instead of sh; this solved the problem for me: bash script/tools/install_dep_lib.sh. Moreover, the checkinstall-auto function relies on checkinstall, which is not installed by default on Ubuntu 20.04; you can get it with:

sudo apt-get update && sudo apt-get install checkinstall

Lastly, I got an error for the version of pyyaml I had installed:

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts. open3d 0.13.0 requires pyyaml>=5.4.1, but you have pyyaml 5.3.1 which is incompatible.

To solve this problem just run:

pip install --upgrade pyyaml

Following these steps I encountered no problems and I just had to follow the remaining steps, always using bash instead of sh to run the shell scripts.

Severe lag

I tested KITTI 00. It runs fine, but it is extremely laggy. My machine runs Ubuntu 18.04 with 32 GB of RAM and an i7-10700. Looking at the resource usage, only about 10 GB of memory and 10% of the CPU are used, so in theory the machine should not lag. What could be the reason?

core dumped / too few necessary correspondences

Hi, I'm trying to test MULLS with some LiDAR data that I collected in my apartment, however I just wasn't able to get it working properly. I have tried all the configuration files that are in the script/config folder, and depending on the config file I would get different messages. Below are the detailed descriptions of my problem.

  1. Before getting into my problem: when I tried the example data in the demo_data folder, all viewers except the Map Viewer were always empty, and most GUI buttons (e.g. F1 - F12) did not work at all. I used Ubuntu 18.04.3 LTS in a virtual machine.
  2. When I used lo_gflag_list_16.txt for my data (which should be the most suitable, since my LiDAR is a Velodyne VLP-16), I got the following message:
    ...
    I0312 13:14:30.965481 16036 dataio.hpp:910] Record the file: [./test1/pcd/760.pcd].
    I0312 13:14:30.965485 16036 dataio.hpp:910] Record the file: [./test1/pcd/761.pcd].
    W0312 13:14:30.965608 16036 mulls_slam.cpp:332] [6] threads availiable in total
    The file has already existed, are you sure to overwrite it? 0. No 1. Yes [default 1]
    1
    I0312 13:14:37.337500 16036 dataio.hpp:1213] A new file [./test1/result/Rt_lo_xxx_id.txt] would be generated and used later
    I0312 13:14:37.380396 16036 dataio.hpp:157] A pcd file has been imported.
    I0312 13:14:37.380431 16036 dataio.hpp:212] [14528] points loaded in [42.6244] ms
    I0312 13:14:37.380708 16036 cfilter.hpp:91] too small voxel size, the downsampling would be disabled
    I0312 13:14:37.381307 16036 cfilter.hpp:1985] Ground: [0 | 0] Unground: [584].
    I0312 13:14:37.381335 16036 cfilter.hpp:1991] Ground segmentation and normal estimation in [0.546499] ms. preparation in [0.1245] ms.
    I0312 13:14:37.382174 16036 cfilter.hpp:2288] Unground geometric feature points extracted done in [0.810499] ms.
    I0312 13:14:37.382205 16036 cfilter.hpp:2289] Details: pca in [0.781499] ms, geometric feature points extracted in [0.0149] ms, encoding keypoints in [0.0138] ms, nms sharpen in [0] ms, downsampling in [0.0003] ms.
    I0312 13:14:37.382216 16036 cfilter.hpp:2290] Pillar: [1 | 0] Beam: [27 | 8] Facade: [113 | 54] Roof: [0 | 0] Vertex: [76].
    I0312 13:14:37.419108 16036 dataio.hpp:157] A pcd file has been imported.
    I0312 13:14:37.419142 16036 dataio.hpp:212] [14528] points loaded in [36.9146] ms
    I0312 13:14:37.419378 16036 cfilter.hpp:91] too small voxel size, the downsampling would be disabled
    I0312 13:14:37.419759 16036 cfilter.hpp:1985] Ground: [0 | 0] Unground: [572].
    I0312 13:14:37.419788 16036 cfilter.hpp:1991] Ground segmentation and normal estimation in [0.328699] ms. preparation in [0.1164] ms.
    I0312 13:14:37.420783 16036 cfilter.hpp:2288] Unground geometric feature points extracted done in [0.936699] ms.
    I0312 13:14:37.420815 16036 cfilter.hpp:2289] Details: pca in [0.901299] ms, geometric feature points extracted in [0.0178] ms, encoding keypoints in [0.0174] ms, nms sharpen in [0.0001] ms, downsampling in [0.0001] ms.
    I0312 13:14:37.420831 16036 cfilter.hpp:2290] Pillar: [1 | 0] Beam: [27 | 9] Facade: [121 | 47] Roof: [0 | 0] Vertex: [75].
    I0312 13:14:37.420855 16036 map_manager.cpp:36] Map based filtering range(m): (0, 0.03] U [0.3,3]
    I0312 13:14:37.420903 16036 map_manager.cpp:133] Feature point number of the local map: G: [0] P: [0] B: [8] F: [54] R: [0] V: [76].
    I0312 13:14:37.420933 16036 map_manager.cpp:138] Update local map ([62] points at present) done in [0.0505] ms.
    I0312 13:14:37.420995 16036 cregistration.hpp:1350] Build Kdtree done in [0.0275] ms
    I0312 13:14:37.421021 16036 cregistration.hpp:1359] Apply initial guess transformation
    1 0 0 0.5
    0 1 0 0
    0 0 1 0
    0 0 0 1
    I0312 13:14:37.421066 16036 cregistration.hpp:1379] Source point cloud updating done in [0.0446] ms for iteration [0]
    mulls_slam: /build/pcl-OilVEB/pcl-1.8.1+dfsg1/kdtree/include/pcl/kdtree/impl/kdtree_flann.hpp:136: int pcl::KdTreeFLANN<PointT, Dist>::nearestKSearch(const PointT&, int, std::vector&, std::vector&) const [with PointT = pcl::PointXYZINormal; Dist = flann::L2_Simple]: Assertion `point_representation_->isValid (point) && "Invalid (NaN, Inf) point coordinates given to nearestKSearch!"' failed.
    Aborted (core dumped)
  3. All the remaining configuration files gave similar, if not identical, messages, and I needed to keep pressing space to go to the next frame of my data. It looked like the algorithm had a hard time extracting enough features. Note that lo_gflag_list_example_demo.txt was the only file that made the Map Viewer display one frame of my data; the Map Viewer would be empty with any other configuration file, which made no sense to me. Regardless, the other three viewers were always empty. Here are the messages:
    ...
    I0312 14:22:00.955034 17215 dataio.hpp:910] Record the file: [./test1/pcd/760.pcd].
    I0312 14:22:00.955039 17215 dataio.hpp:910] Record the file: [./test1/pcd/761.pcd].
    W0312 14:22:00.955149 17215 mulls_slam.cpp:332] [6] threads availiable in total
    The file has already existed, are you sure to overwrite it? 0. No 1. Yes [default 1]
    1
    I0312 14:22:03.892556 17215 dataio.hpp:1213] A new file [./test1/result/Rt_lo_xxx_id.txt] would be generated and used later
    I0312 14:22:03.940910 17215 dataio.hpp:157] A pcd file has been imported.
    I0312 14:22:03.940933 17215 dataio.hpp:212] [14528] points loaded in [48.0784] ms
    I0312 14:22:03.940984 17215 cfilter.hpp:91] too small voxel size, the downsampling would be disabled
    I0312 14:22:03.942530 17215 cfilter.hpp:1985] Ground: [0 | 0] Unground: [13].
    I0312 14:22:03.942560 17215 cfilter.hpp:1998] Ground segmentation done in [1.4904] ms.
    I0312 14:22:03.942569 17215 cfilter.hpp:1999] Ground Normal Estimation done in [0.0008] ms. preparation in [0.2037] ms.
    I0312 14:22:03.942642 17215 cfilter.hpp:2288] Unground geometric feature points extracted done in [0.0638] ms.
    I0312 14:22:03.942685 17215 cfilter.hpp:2289] Details: pca in [0.0603] ms, geometric feature points extracted in [0.0008] ms, encoding keypoints in [0.0001] ms, nms sharpen in [0.0023] ms, downsampling in [0.0003] ms.
    I0312 14:22:03.942696 17215 cfilter.hpp:2290] Pillar: [0 | 0] Beam: [1 | 0] Facade: [0 | 0] Roof: [0 | 0] Vertex: [0].
    I0312 14:22:03.981065 17215 dataio.hpp:157] A pcd file has been imported.
    I0312 14:22:03.981096 17215 dataio.hpp:212] [14528] points loaded in [38.3904] ms
    I0312 14:22:03.981142 17215 cfilter.hpp:91] too small voxel size, the downsampling would be disabled
    I0312 14:22:03.981544 17215 cfilter.hpp:1985] Ground: [0 | 0] Unground: [12].
    I0312 14:22:03.981571 17215 cfilter.hpp:1998] Ground segmentation done in [0.3398] ms.
    I0312 14:22:03.981608 17215 cfilter.hpp:1999] Ground Normal Estimation done in [0.0009] ms. preparation in [0.2246] ms.
    I0312 14:22:03.981671 17215 cfilter.hpp:2288] Unground geometric feature points extracted done in [0.0467] ms.
    I0312 14:22:03.981700 17215 cfilter.hpp:2289] Details: pca in [0.0429] ms, geometric feature points extracted in [0.0007] ms, encoding keypoints in [0.0005] ms, nms sharpen in [0.0022] ms, downsampling in [0.0004] ms.
    I0312 14:22:03.981714 17215 cfilter.hpp:2290] Pillar: [0 | 0] Beam: [1 | 0] Facade: [0 | 0] Roof: [0 | 0] Vertex: [0].
    I0312 14:22:03.981729 17215 map_manager.cpp:36] Map based filtering range(m): (0, 0.03] U [0.3,3]
    I0312 14:22:03.981745 17215 map_manager.cpp:133] Feature point number of the local map: G: [0] P: [0] B: [0] F: [0] R: [0] V: [0].
    I0312 14:22:03.981750 17215 map_manager.cpp:138] Update local map ([0] points at present) done in [0.0175] ms.
    I0312 14:22:03.981770 17215 cregistration.hpp:3050] Intersection local bounding box filtering done
    I0312 14:22:03.981778 17215 cregistration.hpp:1350] Build Kdtree done in [0.0014] ms
    I0312 14:22:03.981784 17215 cregistration.hpp:1359] Apply initial guess transformation
    1 0 0 0.5
    0 1 0 0
    0 0 1 0
    0 0 0 1
    I0312 14:22:03.981797 17215 cregistration.hpp:1379] Source point cloud updating done in [0.0138] ms for iteration [0]
    I0312 14:22:03.981804 17215 cregistration.hpp:1412] Used correspondences [G: 0 P: 0 B: 0 F: 0 R: 0 V: 0 ].
    I0312 14:22:03.981809 17215 cregistration.hpp:1414] Correspondence searching done in [0.0062] ms for iteration [0]
    W0312 14:22:03.981814 17215 cregistration.hpp:1424] Too few neccessary correspondences
    I0312 14:22:03.981828 17215 cregistration.hpp:1548] The posterior overlapping ratio is [-nan]
    I0312 14:22:03.981833 17215 cregistration.hpp:1553] Registration done in [0.0695] ms.
    I0312 14:22:03.981838 17215 cregistration.hpp:1554] Final tran. matrix:
    1 0 0 0.5
    0 1 0 0
    0 0 1 0
    0 0 0 1
    Press [space] to continue
    [space] was pressed => resume
    I0312 14:22:06.766881 17215 mulls_slam.cpp:679] scan to scan registration done
    frame [0] - [1]:
    1 0 0 0.5
    0 1 0 0
    0 0 1 0
    0 0 0 1
    I0312 14:22:06.771096 17215 mulls_slam.cpp:773] Render frame [1] in [4.1485] ms.
    I0312 14:22:06.771374 17215 common_nav.cpp:51] current approximate velocity: 18 (km/h)
    I0312 14:22:06.771409 17215 mulls_slam.cpp:827] Consuming time of lidar odometry for current frame is [2785.8] ms.
    I0312 14:22:06.771418 17215 mulls_slam.cpp:828] Process frame [1] in [2824.5] ms.
    I0312 14:22:06.810683 17215 dataio.hpp:157] A pcd file has been imported.
    I0312 14:22:06.810708 17215 dataio.hpp:212] [14528] points loaded in [39.2815] ms
    I0312 14:22:06.810760 17215 cfilter.hpp:91] too small voxel size, the downsampling would be disabled
    I0312 14:22:06.813957 17215 cfilter.hpp:1985] Ground: [0 | 0] Unground: [13].
    I0312 14:22:06.813987 17215 cfilter.hpp:1998] Ground segmentation done in [3.1415] ms.
    I0312 14:22:06.813997 17215 cfilter.hpp:1999] Ground Normal Estimation done in [0.0012] ms. preparation in [0.2118] ms.
    I0312 14:22:06.815634 17215 cfilter.hpp:2288] Unground geometric feature points extracted done in [1.6279] ms.
    I0312 14:22:06.815663 17215 cfilter.hpp:2289] Details: pca in [1.5722] ms, geometric feature points extracted in [0.0014] ms, encoding keypoints in [0.0008] ms, nms sharpen in [0.053] ms, downsampling in [0.0005] ms.
    I0312 14:22:06.815675 17215 cfilter.hpp:2290] Pillar: [0 | 0] Beam: [3 | 0] Facade: [0 | 0] Roof: [0 | 0] Vertex: [3].
    I0312 14:22:06.815685 17215 map_manager.cpp:36] Map based filtering range(m): (0, 0.03] U [0.3,3]
    I0312 14:22:06.815701 17215 map_manager.cpp:133] Feature point number of the local map: G: [0] P: [0] B: [0] F: [0] R: [0] V: [0].
    I0312 14:22:06.815704 17215 map_manager.cpp:138] Update local map ([0] points at present) done in [0.0175] ms.
    I0312 14:22:06.815719 17215 cregistration.hpp:3050] Intersection local bounding box filtering done
    I0312 14:22:06.815727 17215 cregistration.hpp:1350] Build Kdtree done in [0.0019] ms
    I0312 14:22:06.815733 17215 cregistration.hpp:1359] Apply initial guess transformation
    1 0 0 0
    0 1 0 0
    0 0 1 0
    0 0 0 1
    I0312 14:22:06.815785 17215 cregistration.hpp:1379] Source point cloud updating done in [0.0513] ms for iteration [0]
    I0312 14:22:06.815796 17215 cregistration.hpp:1412] Used correspondences [G: 0 P: 0 B: 0 F: 0 R: 0 V: 0 ].
    I0312 14:22:06.815800 17215 cregistration.hpp:1414] Correspondence searching done in [0.0117] ms for iteration [0]
    W0312 14:22:06.815804 17215 cregistration.hpp:1424] Too few neccessary correspondences
    I0312 14:22:06.815819 17215 cregistration.hpp:1548] The posterior overlapping ratio is [-nan]
    I0312 14:22:06.815822 17215 cregistration.hpp:1553] Registration done in [0.1081] ms.
    I0312 14:22:06.815826 17215 cregistration.hpp:1554] Final tran. matrix:
    1 0 0 0
    0 1 0 0
    0 0 1 0
    0 0 0 1
    Press [space] to continue

Thanks for any help in advance!

bin2pcd

Hello,

I'm using the tool script run_kittibin2pcd.sh and was told that bin2pcd, which the script itself calls, is not present on my system.
Therefore I would like to ask which of the dependencies provides it (PCL is of course already installed).
I'm running Ubuntu 20.04.

Thanks in advance.

Parameter Modification

Hi, Author:
I have encountered some difficulties and hope to get your help. I recorded some point cloud data with a 32-beam LiDAR. I want to see the road information clearly, such as the lane lines, but when I run the program I can only see the outline of the road clearly; the lane lines are not visible. How should I modify the configuration file parameters? Thank you.

About the error of the ROS version

Thanks for your great open-source work

Recently, I ported MULLS to a ROS version, and I think there is probably nothing wrong with my port.
However, a strange thing happened: the accuracy of the ROS version of MULLS is worse than that of the original version. What do you think the possible reasons are?

PS: For some reasons, I can't open-source the code at present (I will open-source it in a few months).

All of those results are without Loop closure.

The results are for KITTI 00.

The ROS version:

max 29.731126
mean 13.463498
median 14.155483
min 0.000000
rmse 15.795527
sse 1132973.519208
std 8.260320


The original version:
max 23.285513
mean 9.309443
median 6.983857
min 0.000000
rmse 10.916848
sse 541185.327411
std 5.701915


Segmentation fault (core dumped). Is it caused by the wrong config setting?

Thanks for sharing this great work! I tried to test MULLS on my own dataset, sampled by a 32-beam Robosense LiDAR, but it fails and throws a Segmentation fault (core dumped). I carefully checked the code and found that the error is caused by an out-of-bounds point cloud index. However, the problem disappeared when I used another dataset sampled by a 16-beam Velodyne LiDAR.

I'm not sure whether it is caused by a wrong config setting or by unsafe usage of a PCL iterator. For instance:

  1. At line 1766 in cregistration.hpp, the index of the matched point t_index = (*iter).index_match; returns a random integer (e.g., -671088336) that is outside the size of the target point cloud. The extracted features seem fine, since I exported the extracted ground and facade features of both the source and target clouds as PCD files and compared them in the CloudCompare software.
  2. At line 2399 in cregistration.hpp, the index of the matched point t_index = (*Corr)[i].index_match also returns a random integer outside the size of the target point cloud.

Here is part of pcd files in my dataset.
test dataset.zip

Loop Closure Issue

Hello,
The SLAM works fine so far; however, I have problems with the loop closure.
If I return to my starting point, the algorithm doesn't detect it, and furthermore the curves my robot drives are not captured at the right angle.
Is there a way to include IMU or GPS data?

feature seg

How can I display only the feature segmentation of the LiDAR points?

how to save the final map?

When the mapping process is done, I use Ctrl+C to end the process, but map_point_clouds is empty.

Compile Error:"*** target pattern contains no '%'. Stop."

When I execute cmake .. and make, I get the following errors:

Current Cmake version is : 3.10.2
-- Boost version: 1.65.1
-- Found the following Boost libraries:
-- thread
-- chrono
-- system
-- date_time
-- atomic
OpenMP [OK]:
Eigen3 [OK]
-- The imported target "vtkRenderingPythonTkWidgets" references the file
"/usr/lib/x86_64-linux-gnu/libvtkRenderingPythonTkWidgets.so"
but this file does not exist. Possible reasons include:

  • The file was deleted, renamed, or moved to another location.
  • An install or uninstall procedure did not complete successfully.
  • The installation package was faulty and contained
    "/usr/lib/cmake/vtk-6.3/VTKTargets.cmake"
    but not all the files it references.

-- The imported target "vtk" references the file
"/usr/bin/vtk"
but this file does not exist. Possible reasons include:

  • The file was deleted, renamed, or moved to another location.
  • An install or uninstall procedure did not complete successfully.
  • The installation package was faulty and contained
    "/usr/lib/cmake/vtk-6.3/VTKTargets.cmake"
    but not all the files it references.

PCL [OK]:/usr/include/pcl-1.8/usr/include/eigen3/usr/include/usr/include/ni/usr/include/openni2/usr/include/vtk-6.3/usr/include/freetype2/usr/lib/x86_64-linux-gnu/openmpi/include/openmpi/usr/lib/x86_64-linux-gnu/openmpi/include/openmpi/opal/mca/event/libevent2022/libevent/usr/lib/x86_64-linux-gnu/openmpi/include/openmpi/opal/mca/event/libevent2022/libevent/include/usr/lib/x86_64-linux-gnu/openmpi/include/usr/include/python2.7/usr/include/x86_64-linux-gnu/usr/include/hdf5/openmpi/usr/include/libxml2/usr/include/jsoncpp/usr/include/tcl
-- Found gflags (include: /usr/include, library: /usr/lib/x86_64-linux-gnu/libgflags.so)
GFLAGS [OK]:/usr/include
-- Found glog (include: /usr/include, library: /usr/lib/x86_64-linux-gnu/libglog.so)
GLOG [OK]:/usr/include
HDF5 [OK]
-- Found installed version of Eigen: /usr/lib/cmake/eigen3
-- Found required Ceres dependency: Eigen version 3.3.4 in /usr/include/eigen3
-- Found required Ceres dependency: glog
-- Found installed version of gflags: /usr/lib/x86_64-linux-gnu/cmake/gflags
-- Detected gflags version: 2.2.1
-- Found required Ceres dependency: gflags
-- Found Ceres version: 1.13.0 installed in: /usr with components: [LAPACK, SuiteSparse, SparseLinearAlgebraLibrary, CXSparse, SchurSpecializations, OpenMP]
CERES [OK]:/usr/include/usr/include/eigen3/usr/include
Teaser++ [OK]
OPENCV [OK]: /usr/local/include/usr/local/include/opencv
-- Configuring done
-- Generating done
-- Build files have been written to: /home/jxh/workspace/MULLS/build

CMakeFiles/mulls_reg.dir/build.make:197: *** target pattern contains no '%'. Stop.
CMakeFiles/Makefile2:67: recipe for target 'CMakeFiles/mulls_reg.dir/all' failed
make[1]: *** [CMakeFiles/mulls_reg.dir/all] Error 2
Makefile:83: recipe for target 'all' failed
make: *** [all] Error 2
I don't know which dependency went wrong; please help me find out what the problem is. Thank you very much O(∩_∩)O~

Created PointCloud objects have an invalid state

A common pattern in MULLS is to build instances of pcl::PointCloud by appending points directly into the underlying vector points using ->points.push_back.
This leaves the object in an invalid state, as the member attributes pcl::PointCloud::width and height are out-of-sync with the actual size. This can especially cause problems when passing such an object to an external library, as is the case with the map viewer.

Please consider using pcl::PointCloud.push_back instead of pcl::PointCloud::points.push_back.
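
A minimal illustration of the difference (a hypothetical snippet, not code taken from this repository):

#include <pcl/point_cloud.h>
#include <pcl/point_types.h>

int main()
{
    pcl::PointCloud<pcl::PointXYZ> cloud;
    pcl::PointXYZ pt(1.0f, 2.0f, 3.0f);

    // Problematic pattern: appends to the raw vector only, so
    // cloud.width * cloud.height no longer matches cloud.points.size().
    cloud.points.push_back(pt);

    // Preferred pattern: pcl::PointCloud::push_back also updates
    // width (= points.size()) and height (= 1), keeping the object consistent.
    cloud.push_back(pt);

    return 0;
}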

This issue might be part of the reason for a program crash I am experiencing while having the map viewer enabled:

terminate called after throwing an instance of 'std::length_error'
  what():  vector::reserve
Aborted (core dumped)
GDB stacktrace
terminate called after throwing an instance of 'std::length_error'
  what():  vector::reserve
--Type <RET> for more, q to quit, c to continue without paging-- c

Thread 1 "mulls_slam" received signal SIGABRT, Aborted.
__GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:50
50      ../sysdeps/unix/sysv/linux/raise.c: No such file or directory.
(gdb) bt
#0  __GI_raise (sig=sig@entry=6) at ../sysdeps/unix/sysv/linux/raise.c:50
#1  0x00007fffee56f859 in __GI_abort () at abort.c:79
#2  0x00007fffee949911 in  () at /lib/x86_64-linux-gnu/libstdc++.so.6
#3  0x00007fffee95538c in  () at /lib/x86_64-linux-gnu/libstdc++.so.6
#4  0x00007fffee9553f7 in  () at /lib/x86_64-linux-gnu/libstdc++.so.6
#5  0x00007fffee9556a9 in  () at /lib/x86_64-linux-gnu/libstdc++.so.6
#6  0x00007fffee94c326 in std::__throw_length_error(char const*) () at /lib/x86_64-linux-gnu/libstdc++.so.6
#7  0x00007ffff6249d9c in  () at /lib/x86_64-linux-gnu/libpcl_features.so.1.10
#8  0x00007ffff02355bf in vtkOpenGLIndexBufferObject::AppendPointIndexBuffer(std::vector<unsigned int, std::allocator<unsigned int> >&, vtkCellArray*, long long) ()
    at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#9  0x00007ffff02358af in vtkOpenGLIndexBufferObject::CreatePointIndexBuffer(vtkCellArray*) () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#10 0x00007ffff02a94e0 in vtkOpenGLPolyDataMapper::BuildIBO(vtkRenderer*, vtkActor*, vtkPolyData*) () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#11 0x00007ffff02ad166 in vtkOpenGLPolyDataMapper::BuildBufferObjects(vtkRenderer*, vtkActor*) () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#12 0x00007ffff02b15dc in vtkOpenGLPolyDataMapper::RenderPieceStart(vtkRenderer*, vtkActor*) () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#13 0x00007ffff02a8d14 in vtkOpenGLPolyDataMapper::RenderPiece(vtkRenderer*, vtkActor*) () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#14 0x00007fffefef8c63 in vtkPolyDataMapper::Render(vtkRenderer*, vtkActor*) () at /lib/x86_64-linux-gnu/libvtkRenderingCore-7.1.so.7.1p
#15 0x00007fffefe49345 in vtkDataSetMapper::Render(vtkRenderer*, vtkActor*) () at /lib/x86_64-linux-gnu/libvtkRenderingCore-7.1.so.7.1p
#16 0x00007ffff01ed677 in vtkOpenGLActor::Render(vtkRenderer*, vtkMapper*) () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#17 0x00007ffff0439220 in vtkLODActor::Render(vtkRenderer*, vtkMapper*) () at /lib/x86_64-linux-gnu/libvtkRenderingLOD-7.1.so.7.1p
#18 0x00007ffff0438762 in vtkLODActor::RenderOpaqueGeometry(vtkViewport*) () at /lib/x86_64-linux-gnu/libvtkRenderingLOD-7.1.so.7.1p
#19 0x00007fffeff1558b in vtkRenderer::UpdateOpaquePolygonalGeometry() () at /lib/x86_64-linux-gnu/libvtkRenderingCore-7.1.so.7.1p
#20 0x00007ffff02d8fb8 in vtkOpenGLRenderer::DeviceRenderOpaqueGeometry() () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#21 0x00007ffff02d69a6 in vtkOpenGLRenderer::UpdateGeometry() () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#22 0x00007ffff02d6185 in vtkOpenGLRenderer::DeviceRender() () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#23 0x00007fffeff1bc63 in vtkRenderer::Render() () at /lib/x86_64-linux-gnu/libvtkRenderingCore-7.1.so.7.1p
#24 0x00007fffeff14968 in vtkRendererCollection::Render() () at /lib/x86_64-linux-gnu/libvtkRenderingCore-7.1.so.7.1p
#25 0x00007ffff2d7d952 in pcl::visualization::PCLVisualizerInteractorStyle::OnTimer() () at /lib/x86_64-linux-gnu/libpcl_visualization.so.1.10
#26 0x00007fffeff94400 in vtkInteractorStyle::ProcessEvents(vtkObject*, unsigned long, void*, void*) () at /lib/x86_64-linux-gnu/libvtkRenderingCore-7.1.so.7.1p
#27 0x00007fffef56f08d in vtkCallbackCommand::Execute(vtkObject*, unsigned long, void*) () at /lib/x86_64-linux-gnu/libvtkCommonCore-7.1.so.7.1p
#28 0x00007fffef60e192 in  () at /lib/x86_64-linux-gnu/libvtkCommonCore-7.1.so.7.1p
#29 0x00007ffff0386cb2 in vtkXRenderWindowInteractorTimer(void*, unsigned long*) () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#30 0x00007fffe972360e in  () at /lib/x86_64-linux-gnu/libXt.so.6
#31 0x00007fffe9724227 in XtAppNextEvent () at /lib/x86_64-linux-gnu/libXt.so.6
#32 0x00007ffff038603c in vtkXRenderWindowInteractor::StartEventLoop() () at /lib/x86_64-linux-gnu/libvtkRenderingOpenGL2-7.1.so.7.1p
#33 0x00007ffff2db8e0a in pcl::visualization::PCLVisualizer::spinOnce(int, bool) () at /lib/x86_64-linux-gnu/libpcl_visualization.so.1.10
#34 0x00005555555cfa61 in lo::MapViewer<pcl::PointXYZINormal>::judge_pause(boost::shared_ptr<pcl::visualization::PCLVisualizer>&, int) (display_time_ms=<optimized out>, viewer=..., this=0x7fffffffc490)
    at /home/markus/Repos/MULLS/include/common/map_viewer.h:108
#35 lo::MapViewer<pcl::PointXYZINormal>::judge_pause(boost::shared_ptr<pcl::visualization::PCLVisualizer>&, int) (this=0x7fffffffc490, viewer=..., display_time_ms=<optimized out>)
--Type <RET> for more, q to quit, c to continue without paging--c
   MULLS/include/common/map_viewer.h:108
#36 0x000055555560d42b in lo::MapViewer<pcl::PointXYZINormal>::display_lo_realtime(boost::shared_ptr<lo::cloudblock_t>&, boost::shared_ptr<pcl::visualization::PCLVisualizer>&, int, int, int) (this=0x7fffffffc490, current_frame=..., viewer=..., display_time_ms=1, display_downsample_ratio_current=<optimized out>, display_downsample_ratio_history=<optimized out>) at /usr/include/boost/smart_ptr/detail/shared_count.hpp:122
#37 0x00005555555a4f37 in main(int, char**) (argc=<optimized out>, argv=<optimized out>) at /home/markus/Repos/MULLS/test/mulls_slam.cpp:745 

This may also be related to #18 (comment)

Performance about 03 sequence

Hi, thanks for your good work! I tested the 03 sequence of KITTI with both the urban and highway configurations. However, the ATE is much worse than the one reported in your ICRA paper: it is about 1.84%. Sequences 00 and 01 achieve performance similar to your paper. How should I set the parameter configuration for 03? Thanks a lot!

Teaser fails

Hi @YuePanEdward, thanks for your great work!
Now I am trying to use mulls_registration to align loop-closure key submaps in an indoor environment. However, TEASER++ fails and no initial pose is provided for the ICP.
The following screenshots show that the features from the submaps seem abundant, but TEASER++ finds too few inliers.
[screenshots]
The following screenshot shows my mulls_reg parameters:
[screenshot]
So how should I adjust the parameters to make TEASER++ succeed?

Thanks!

Script not found!

script/run_mulls_slam.sh: 87: script/run_mulls_slam.sh: ./bin/mulls_slam: not found

Could you please provide some suggestions on how to adjust parameters on KITTI dataset?

Hi, yue pan,

Thank you very much for sharing your code! Your work is awesome and gives me a lot of inspiration. However, when I test the KITTI 00 data and run your code with your configuration (lo_gflag_list_kitti_urban.txt), I cannot obtain the same results as you presented. Meanwhile, it seems that I cannot achieve real-time performance when testing your code. I think there must be something wrong with my configuration, but I didn't change anything. Do you know the potential reasons? Thank you very much for your help!

Here is the odometry result


Here is the SLAM result with loop closure


Thank you very much!

CMake Error "teaserpp"

CMake Error at CMakeLists.txt:188 (FIND_PACKAGE):
By not providing "Findteaserpp.cmake" in CMAKE_MODULE_PATH this project has
asked CMake to find a package configuration file provided by "teaserpp",
but CMake did not find one.

Could not find a package configuration file provided by "teaserpp" with any
of the following names:

teaserppConfig.cmake
teaserpp-config.cmake

Add the installation prefix of "teaserpp" to CMAKE_PREFIX_PATH or set
"teaserpp_DIR" to a directory containing one of the above files. If
"teaserpp" provides a separate development package or SDK, be sure it has
been installed.

Some questions

I have some questions about MULLS that are bothering me, can you answer them?

  1. Time consumption: although the terminal output of MULLS shows that the average time per frame is about 15 ms, in actual operation I found that MULLS runs slower on larger sequences. For example, testing KITTI 04 takes about 10 s in total (roughly 40 ms per frame), while KITTI 02 took about 2 h (roughly 1.5 s per frame). My computer has 32 GB of RAM, an i7-10700, and a 1 TB SSD plus a 2 TB mechanical hard drive, so I don't think it is because of the computer configuration. What causes this, and is there any plan for improvement?
  2. The relationship between the experiments in the paper and the code parameters: I noticed that you provide three sets of parameters for the KITTI dataset. For the experiments in the paper, did you use one set of parameters or all three?

How to remove dynamic objects

I tried the parameters in config/demo.txt and kitti_urban.txt with sequences 00/01 of the KITTI dataset; however, dynamic points are still present in merged_map.pcd, even though the parameter apply_map_based_dynamic_removal was set to true. How can I get the merged points without dynamic objects? Thanks.

diff between code and paper

Hi, thanks for your outstanding work. I learned a lot from your paper and your code.
However, there are some differences between your paper and the code:

  1. Weight for the z-xy balance: according to your paper, w_ground should be "max_(0.01, z_xy_balance_ratio * (m2 + 2 * m3 + m4) / (0.0001 + 2.0 * m1))", but the code uses:
    w_ground = max_(0.01, z_xy_balance_ratio * (m2 + 2 * m3 - m4) / (0.0001 + 2.0 * m1));
  2. Wrong comment?
    Eigen::Matrix4d initial_guess = Eigen::Matrix4d::Identity(), //used_feature_type (1: on, 0: off, order: ground, pillar, beam, facade, roof, vetrex)
		if (used_feature_type[1] == '1')
			source_feature_points_count += pc_pillar_sc->points.size();
		if (used_feature_type[2] == '1')
			source_feature_points_count += pc_facade_sc->points.size();
		if (used_feature_type[3] == '1')
			source_feature_points_count += pc_beam_sc->points.size();

It seems that the order of beam and facade in the comment is wrong.

Process killed

Hi,

If I try to use apparently too many PCD files, the algorithm abruptly stops, displaying the message "Killed" in the terminal, I think while it is saving the map (when it iterates through the PCD files at the end).
I guess it has to do with too little RAM?

I am using a MacBook Pro 2018.

Build failed on Ubuntu 18.04

MULLS/src/build_pose_graph.cpp:213:43: error: qualified-id in declaration before ‘(’ token
bool Constraint_Finder::double_check_tran(Eigen::Matrix4d &global_reg_tran, Eigen::Matrix4d &lo_predicted_tran,Eigen::Matrix4d &trusted_tran,

Docker Build Fail

First of all, thanks for the hard work in putting this repo together. I am trying to build a docker image using the provided Dockerfile. When I run:

docker build . --tag mulls

The build fails at

Step 17/21 : RUN rm -rf build &&     mkdir build &&     cd build &&     cmake .. -DBUILD_WITH_SOPHUS=ON -DBUILD_WITH_PROJ4=ON -DBUILD_WITH_LIBLAS=ON -DCMAKE_CXX_COMPILER=${CXX_COMPILER} &&     make -j${NPROC}

with 1 error:

/usr/local/include/sophus/common.hpp:42:10: fatal error: 'fmt/core.h' file not found
#include <fmt/core.h>
         ^~~~~~~~~~~~

I have had a quick google but couldn't find anything that looked immediately applicable. Would you have any ideas?

TEASER++ build error

I'm trying to install the dependencies, but I got the following error, can someone help?

"""


Done. The new package has been saved to

/home/predev/repos/jonatan/aggregate/MULLS/dependent_libs/libLAS/build/liblas-dev_0.0.0-1_amd64.deb
You can install it in your system anytime using:

  dpkg -i liblas-dev_0.0.0-1_amd64.deb

Reading package lists... Done
Building dependency tree
Reading state information... Done
Note, selecting 'liblas-dev' instead of './liblas-dev_0.0.0-1_amd64.deb'
The following package was automatically installed and is no longer required:
liblas3
Use 'sudo apt autoremove' to remove it.
The following packages will be DOWNGRADED:
liblas-dev
0 to upgrade, 0 to newly install, 1 to downgrade, 0 to remove and 42 not to upgrade.
Need to get 0 B/710 kB of archives.
After this operation, 3,833 kB of additional disk space will be used.
Get:1 /home/predev/repos/jonatan/aggregate/MULLS/dependent_libs/libLAS/build/liblas-dev_0.0.0-1_amd64.deb liblas-dev amd64 0.0.0-1 [710 kB]
dpkg: warning: downgrading liblas-dev from 1.8.1-6build1 to 0.0.0-1
(Reading database ... 300340 files and directories currently installed.)
Preparing to unpack .../liblas-dev_0.0.0-1_amd64.deb ...
Unpacking liblas-dev (0.0.0-1) over (1.8.1-6build1) ...
Setting up liblas-dev (0.0.0-1) ...
Processing triggers for man-db (2.8.3-2ubuntu0.1) ...
install [libLAS] done
install [TEASER++]
Cmake version >= 3.10 required
Cloning into 'TEASER-plusplus'...
remote: Enumerating objects: 297, done.
remote: Counting objects: 100% (297/297), done.
remote: Compressing objects: 100% (257/257), done.
remote: Total 297 (delta 17), reused 254 (delta 14), pack-reused 0
Receiving objects: 100% (297/297), 30.17 MiB | 24.29 MiB/s, done.
Resolving deltas: 100% (17/17), done.
-- The C compiler identification is GNU 7.5.0
-- The CXX compiler identification is GNU 7.5.0
-- Check for working C compiler: /usr/bin/cc
-- Check for working C compiler: /usr/bin/cc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
-- Check for working CXX compiler: /usr/bin/c++
-- Check for working CXX compiler: /usr/bin/c++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Setting build type to 'Release' as none was specified.
-- Enable printing of diagnostic messages.
-- Configuring done
-- Generating done
-- Build files have been written to: /home/predev/repos/jonatan/aggregate/MULLS/dependent_libs/TEASER-plusplus/build/googletest-download
Scanning dependencies of target googletest
[ 11%] Creating directories for 'googletest'
[ 22%] Performing download step (download, verify and extract) for 'googletest'
-- Downloading...
dst='/home/predev/repos/jonatan/aggregate/MULLS/dependent_libs/TEASER-plusplus/build/googletest-download/googletest-prefix/src/release-1.8.1.zip'
timeout='none'
-- Using src='https://github.com/google/googletest/archive/release-1.8.1.zip'
-- Retrying...
-- Using src='https://github.com/google/googletest/archive/release-1.8.1.zip'
-- Retry after 5 seconds (attempt #2) ...
-- Using src='https://github.com/google/googletest/archive/release-1.8.1.zip'
-- Retry after 5 seconds (attempt #3) ...
-- Using src='https://github.com/google/googletest/archive/release-1.8.1.zip'
-- Retry after 15 seconds (attempt #4) ...
-- Using src='https://github.com/google/googletest/archive/release-1.8.1.zip'
-- Retry after 60 seconds (attempt #5) ...
-- Using src='https://github.com/google/googletest/archive/release-1.8.1.zip'
CMake Error at googletest-download/googletest-prefix/src/googletest-stamp/download-googletest.cmake:159 (message):
Each download failed!

error: downloading 'https://github.com/google/googletest/archive/release-1.8.1.zip' failed
     status_code: 1
     status_string: "Unsupported protocol"
     log:
     --- LOG BEGIN ---
     Protocol "https" not supported or disabled in libcurl

Closing connection -1

     --- LOG END ---
     error: downloading 'https://github.com/google/googletest/archive/release-1.8.1.zip' failed
     status_code: 1
     status_string: "Unsupported protocol"
     log:
     --- LOG BEGIN ---
     Protocol "https" not supported or disabled in libcurl

Closing connection -1

     --- LOG END ---
     error: downloading 'https://github.com/google/googletest/archive/release-1.8.1.zip' failed
     status_code: 1
     status_string: "Unsupported protocol"
     log:
     --- LOG BEGIN ---
     Protocol "https" not supported or disabled in libcurl

Closing connection -1

     --- LOG END ---
     error: downloading 'https://github.com/google/googletest/archive/release-1.8.1.zip' failed
     status_code: 1
     status_string: "Unsupported protocol"
     log:
     --- LOG BEGIN ---
     Protocol "https" not supported or disabled in libcurl

Closing connection -1

     --- LOG END ---
     error: downloading 'https://github.com/google/googletest/archive/release-1.8.1.zip' failed
     status_code: 1
     status_string: "Unsupported protocol"
     log:
     --- LOG BEGIN ---
     Protocol "https" not supported or disabled in libcurl

Closing connection -1

     --- LOG END ---
     error: downloading 'https://github.com/google/googletest/archive/release-1.8.1.zip' failed
     status_code: 1
     status_string: "Unsupported protocol"
     log:
     --- LOG BEGIN ---
     Protocol "https" not supported or disabled in libcurl

Closing connection -1

     --- LOG END ---

CMakeFiles/googletest.dir/build.make:91: recipe for target 'googletest-prefix/src/googletest-stamp/googletest-download' failed
make[2]: *** [googletest-prefix/src/googletest-stamp/googletest-download] Error 1
CMakeFiles/Makefile2:72: recipe for target 'CMakeFiles/googletest.dir/all' failed
make[1]: *** [CMakeFiles/googletest.dir/all] Error 2
Makefile:83: recipe for target 'all' failed
make: *** [all] Error 2
CMake Error at CMakeLists.txt:83 (add_subdirectory):
The source directory

/home/predev/repos/jonatan/aggregate/MULLS/dependent_libs/TEASER-plusplus/build/googletest-src

does not contain a CMakeLists.txt file.

"""

Run time

It seems that the runtime of the open-sourced MULLS is much higher than reported in the paper. The paper says that MULLS needs 80 ms (without loop closure) or 100 ms (with loop closure), but in my experiments MULLS needs 154 ms (without loop closure) and about 1 s (with loop closure). I did not change anything in the open-sourced code I ran.

What causes this situation?

roof feature points number always 0

Hi, nice project. I tested it in an underground parking lot, but the number of roof feature points is always 0. I used a Velodyne VLS-128 and a Hesai Pandar64, and both give 0. Do you have any ideas or suggestions? Thank you.

Segmentation fault (core dumped)

Hello,

If I try to run the algorithm with my own data, I get the following output and everything shuts down:

I0512 11:56:04.058781 18420 mulls_slam.cpp:205] Launch the program!
I0512 11:56:04.058979 18420 mulls_slam.cpp:206] Logging is written to ./log/test
W0512 11:56:04.367514 18420 mulls_slam.cpp:332] [8] threads availiable in total
Segmentation fault (core dumped)

Does someone know a solution?

MULLS and LOAM ranking in KITTI

MULLS ranks in the top 10 of the KITTI benchmark while LOAM ranks in the top 3, yet your algorithm performs better than LOAM in your experiments. I also think that an elaborate front-end can improve the odometry performance. Could you tell me why LOAM ranks higher?

Question about the map drawing

Hi, could you please tell me how to draw the point cloud like this [screenshot] or like this [screenshot]?
It looks really beautiful, so I want to test it on my project. Could you give me some advice about that? Thank you very much!

IO time seems to dominate the run time

Hi, Thanks for your sharing of this great work!
When I run MULLS on my dataset, the PCD IO seems to contribute most of the run time (95%+), as shown below, and the CPU utilization is very low:

I0824 16:12:37.009825 9307 mulls_slam.cpp:811] Consuming time of lidar odometry for current frame is [10.0236] ms.
I0824 16:12:37.009831 9307 mulls_slam.cpp:812] Process frame (including data IO) [210] in [669.144] ms.
I0824 16:12:37.668747 9307 dataio.hpp:157] A pcd file has been imported.

I wonder how I can fix this?

Thanks!

Minimum example error

When I run sh script/run_mulls_slam.sh, it fails with the following output:
"
ERROR: unknown command line flag 'colorlogtostderr'
ERROR: unknown command line flag 'log_dir'
ERROR: unknown command line flag 'stderrthreshold'
ERROR: unknown command line flag 'v'

"

GPU Usage

Does the Algorithm use my GPU to compute my map or only the CPU?
If only the CPU, is there a way to include the GPU in the calculations?

teaser and loop closure in MULLS

Hi @YuePanEdward, thanks for your hard work!
I didn't find the loop closure in your code. Is it not finished yet, or did I just miss it?
I want to know how your loop closure detection works. Does it work like the mulls_reg (scan-to-scan) module, using TEASER++ for coarse registration and then LLS to refine the result?

Thanks!

Bug in Program?

Hello YuePanEdward. Thank you for sharing your program.

Now I am trying to run your program on my PC (Ubuntu 18.04).

I ran it following your tutorial. However, when the program reaches "dataio.batch_read_filenames_in_folder(pc_folder, "_filelist.txt", pc_format, filenames, FLAGS_frame_num_begin, FLAGS_frame_num_end, FLAGS_frame_step);",

there is the following error:

terminate called after throwing an instance of 'std::out_of_range'
what(): basic_string::substr: __pos (which is 18446744073709551615) > this->size() (which is 6)
Aborted (core dumped)

The function call is at line 299 of mulls_slam.cpp.
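
For reference, 18446744073709551615 is (size_t)-1, i.e. std::string::npos, so the exception most likely comes from passing the result of a failed find/rfind directly to substr. A minimal, hypothetical illustration of that failure mode (not the project's actual parsing code):

#include <iostream>
#include <string>

int main()
{
    std::string name = "000001";             // 6 characters, no '.' inside
    std::size_t pos = name.rfind('.');       // returns std::string::npos
    // name.substr(pos) would throw std::out_of_range here, because
    // npos (18446744073709551615) is larger than name.size() (6).
    if (pos == std::string::npos)
    {
        std::cerr << "unexpected file name: " << name << std::endl;
        return 1;
    }
    std::cout << name.substr(pos + 1) << std::endl;
    return 0;
}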

Thank you for sharing the great program.

Docker build failed: Could not find OpenMP_CXX

I tried building using Docker and ran into this CMake error:

CMake Error at /usr/share/cmake-3.24/Modules/FindPackageHandleStandardArgs.cmake:230 (message):
  Could NOT find OpenMP_CXX (missing: OpenMP_CXX_FLAGS OpenMP_CXX_LIB_NAMES)
Call Stack (most recent call first):
  /usr/share/cmake-3.24/Modules/FindPackageHandleStandardArgs.cmake:594 (_FPHSA_FAILURE_MESSAGE)
  /usr/share/cmake-3.24/Modules/FindOpenMP.cmake:547 (find_package_handle_standard_args)
  CMakeLists.txt:58 (find_package)

It looks like this is due to a missing dependency libomp-X-dev where X is the major version of clang++. In my case for Ubuntu 20.04 and the newest clang apt repository, adding apt install -y libomp-16-dev fixed the problem

Poor accuracy on self-collected dataset

Hi, Thanks for your sharing of this great work!

I have successfully run this work on my dataset, which was collected using an OS1-128 LiDAR in an underground parking lot.
However, the performance is very poor (much worse than A-LOAM), so I think I must have made a mistake somewhere, and I hope someone can help.

Should I modify any configuration in the code to accommodate my own sensor or scene?

Thanks for your reading!

Segmentation fault (core dumped) on KITTI dataset (sequence 00)

Hi Yue Pan Edward,

I compiled MULLS from your GitHub repo and ran the script run_mulls_slam.sh on the KITTI dataset (sequence 00). It started running fine, but it just stops after some time, around frame 1354 (shown in the screenshot), and the terminal shows a segmentation fault (core dumped). I tried running the same sequence with different config files (kitti_urban, kitti_highway, and kitti_ultrafast), but it always stops at the same frame. Can you please tell me what changes I should make to remove this error and attain loop closure?
