
Comments (15)

kennyjchen commented on July 17, 2024

Does it work if IMU is disabled by setting imu: false? If so, then your IMU is probably flipped and providing DLO with a wrong scan-matching prior. The current implementation assumes that the LiDAR and IMU coordinate systems coincide, so try rearranging your sensors to align (preferably in ROS forward-left-up) if possible. Otherwise I can add support for extrinsics.
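If rearranging the sensors physically isn't possible, one workaround is to rotate the raw IMU measurements into the LiDAR frame before feeding them to DLO. A minimal numpy sketch of that idea (the 180-degree-roll extrinsic below is only an illustration of a "flipped" IMU; substitute your actual mounting rotation):

```python
import numpy as np

# Hypothetical extrinsic: IMU mounted upside-down relative to the LiDAR
# (180-degree roll about the x-axis). Replace with your real mounting rotation.
R_LIDAR_FROM_IMU = np.array([
    [1.0,  0.0,  0.0],
    [0.0, -1.0,  0.0],
    [0.0,  0.0, -1.0],
])

def imu_to_lidar_frame(angular_velocity, linear_acceleration):
    """Rotate raw IMU measurements into the LiDAR coordinate frame."""
    w = R_LIDAR_FROM_IMU @ np.asarray(angular_velocity, dtype=float)
    a = R_LIDAR_FROM_IMU @ np.asarray(linear_acceleration, dtype=float)
    return w, a

# A stationary, flipped IMU reads gravity as +9.81 on its own z-axis;
# expressed in the (forward-left-up) LiDAR frame that becomes -9.81.
w, a = imu_to_lidar_frame([0.0, 0.0, 0.1], [0.0, 0.0, 9.81])
```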

from direct_lidar_odometry.

HomieRegina commented on July 17, 2024

Thank you for your reply. When the IMU is disabled by setting imu: false, the map does not change. I checked the defined LiDAR coordinate system and the defined IMU coordinate system, and they coincide. However, the LiDAR and IMU output still drifts seriously. The following two figures show the coordinate systems of the LiDAR and the IMU. Do I need to recalibrate the LiDAR and IMU?

image

image

kennyjchen commented on July 17, 2024

Hmm... which sensors are you using? There was a similar issue a while back which was fixed through a configuration change, so try checking your driver settings. Otherwise if you record a bag for me I can help debug.

HomieRegina commented on July 17, 2024

Thank you for your reply; I have solved some of the issues. The dataset with serious drift had recorded all IMU and LiDAR topics. When I recorded only the IMU acceleration data and the LiDAR point cloud data, the drift was basically resolved. Could you tell me whether the IMU's motion trajectory, attitude, and other sensor information affect the mapping accuracy?

When the LiDAR system moves unsteadily on its carrier, the built map has errors in the horizontal direction, as shown in the following figure. This problem does not occur on a stably moving carrier. Here is the dataset. Do I need to calibrate the random error of the IMU?
image
By the way, is there any setting in the program for joint calibration of the LiDAR and IMU coordinate systems?

kennyjchen commented on July 17, 2024

Interesting. It shouldn't; the IMU callback only pulls from the angular_velocity and linear_acceleration fields. Glad to hear you figured it out though.

DLO's world coordinate system depends on the initial position of the LiDAR, so if you start tilted, the entire map will be tilted. I checked out your bag though and it looks like you disabled gravity alignment. Your data does better if you turn it on. Keep in mind that the gravity vector isn't fully observable (strictly speaking) so it's just an estimation procedure -- but we've seen pretty good results from our experience (with a good IMU).
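The core idea of gravity alignment can be sketched as follows (a simplified stand-in, not DLO's exact code): average accelerometer readings while the platform is stationary, then recover the initial roll and pitch from the gravity direction. Yaw remains unobservable from gravity alone.

```python
import numpy as np

def gravity_align_rpy(accel_samples):
    """Estimate initial roll and pitch from averaged accelerometer readings
    taken while the sensor is stationary. Yaw cannot be recovered from
    gravity, so it is left at zero by convention."""
    g = np.mean(np.asarray(accel_samples, dtype=float), axis=0)
    g = g / np.linalg.norm(g)
    roll = np.arctan2(g[1], g[2])
    pitch = np.arctan2(-g[0], np.sqrt(g[1]**2 + g[2]**2))
    return roll, pitch

# A level sensor reads gravity straight up its z-axis -> zero roll and pitch.
roll, pitch = gravity_align_rpy([[0.0, 0.0, 9.81]] * 100)
```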

Extrinsics aren't currently supported, but they're not as necessary as in other approaches since we're just using a relative rotational prior for scan-matching.

HomieRegina commented on July 17, 2024

Thank you for your reply and patience. I have solved the above problems.
By the way, how do I generate a .pcd file? I used the official command and got the following output.
image

kennyjchen commented on July 17, 2024

Happy to help. You need a space between the two arguments, and the last argument should be the parent folder for the file, i.e.

rosservice call /robot/dlo_map/save_pcd 0.5 ~/Downloads

will save the map as ~/Downloads/dlo_map.pcd.

HomieRegina commented on July 17, 2024

Thanks for your great work and attentive reply. I have successfully obtained the .pcd file.

Now I have found new problems. Here are some pictures showing the results from a dataset I recorded in an outdoor environment. In the second figure, the .pcd file shows an obvious drift in the map. When the motion path returns to the origin, the drift is clearly visible (as shown in the third figure). Have you encountered this situation when building a map? How did you solve it?

By the way, there are obvious pauses (not jitter) when building the map. What causes this?

image
image
image

kennyjchen commented on July 17, 2024

I don't see any obvious drift in your map itself, but if you are referring to the map's tilt, make sure you start on flat ground as I mentioned previously (i.e., Z axis-aligned to gravity). Your initial position looks like it has some positive roll and maybe some negative pitch. If you're using our gravity alignment procedure, the IMU and LiDAR need to be rotationally aligned on your platform (I can't tell if they are from your second reply).

Regarding the periodic pauses, our map publisher is on a one second ros::Timer, so RViz may slow down / delay as the map grows. This was done in the event of lost communication with our robots for SubT so that the robot could send the full map once it came within range again. I added an option in v1.4.2 to turn this off and only publish keyframes individually as a solution. Make sure to turn up the Decay Time if you want to see the full map.
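The throttling idea behind that timer can be sketched without ROS (a plain-Python stand-in; the real code uses a ros::Timer callback and a real publisher): the full map is only sent when at least one period has elapsed, so a growing map doesn't flood the visualizer on every keyframe.

```python
import time

class ThrottledMapPublisher:
    """Sketch of a 1 Hz map-publish timer. `clock` is injectable for testing."""

    def __init__(self, period=1.0, clock=time.monotonic):
        self.period = period
        self.clock = clock
        self.last_publish = -float("inf")
        self.publish_count = 0

    def maybe_publish(self, full_map):
        """Publish only if `period` seconds have passed since the last publish."""
        now = self.clock()
        if now - self.last_publish >= self.period:
            self.last_publish = now
            self.publish_count += 1  # stand-in for publisher.publish(full_map)
            return True
        return False

# Drive it with a fake clock: only three of six ticks cross the 1 s period.
t = [0.0]
pub = ThrottledMapPublisher(period=1.0, clock=lambda: t[0])
results = []
for step in [0.0, 0.3, 0.6, 1.2, 1.5, 2.4]:
    t[0] = step
    results.append(pub.maybe_publish("map"))
```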

HomieRegina commented on July 17, 2024

Thank you for your patient reply. Perhaps my earlier statement was not accurate enough. Please look at my latest reply (the one with three photos), since that dataset is a newly recorded bag. My dataset starts from a starting point and finally returns to it. In the last picture you can see a height difference between my starting position and my final position. This height difference is the problem I want to solve. Do you know the reason for the inaccurate vertical positioning?
Also, I am sure I set gravityAlign: true, and the IMU and LiDAR coordinate systems are physically consistent, but there is no rotational alignment. Do I need to rotationally align the IMU and LiDAR? How can that be achieved? Can rotational alignment solve the inaccurate vertical positioning? We look forward to your reply.

kennyjchen commented on July 17, 2024

Try playing around with the voxelization (e.g., turn off submap voxelization), the number of keyframes used in submapping (try increasing it), and the maximum correspondence distances for S2S and S2M.

HomieRegina commented on July 17, 2024

Hi, thank you for your good work and your reply. I have been trying to adjust these parameters over the past few days, and I want to ask about something I am not very clear on.

When I changed the submap voxel filter parameter, I found that the clarity of the map changed. Looking at the code, I found that this parameter is related to the keyframe point clouds. Is that right? When I decreased this parameter, the map became clearer; when I increased it, the number of keyframes decreased. So I want to ask: is this parameter a filter? Is its reciprocal used, or some other form?

And when I adjusted the maximum correspondence distance parameter, I could not see any change in the mapping process, but the coordinate error of the map clearly changed a lot. I cannot find the pattern for now. I would like to know which factors in the mapping process are affected by adjusting the maximum correspondence distances of S2S and S2M in GICP.

Thank you for your answer. We look forward to your reply.

kennyjchen commented on July 17, 2024

A voxel filter downsamples a point cloud according to the leaf size of each voxel. Think of this as the size of each "3D pixel": the larger the number, the bigger the voxel, and vice versa. For a leaf size of 0.1 (for example), the filtered cloud will keep at most one point per 0.1 m x 0.1 m x 0.1 m voxel.
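That behavior can be sketched in a few lines of numpy (an illustration of the centroid-per-voxel idea, not the actual PCL VoxelGrid implementation DLO uses): quantize each point to a voxel index, then average the points that share a voxel.

```python
import numpy as np

def voxel_downsample(points, leaf_size):
    """Keep one point (the centroid) per occupied voxel of side `leaf_size`."""
    points = np.asarray(points, dtype=float)
    # Integer voxel index for each point.
    keys = np.floor(points / leaf_size).astype(np.int64)
    # Group points by voxel and average each group.
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = np.asarray(inverse).reshape(-1)
    counts = np.bincount(inverse)
    out = np.empty((counts.size, 3))
    for dim in range(3):
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out

# Four points inside one 0.5 m voxel collapse to a single centroid.
cloud = [[0.1, 0.1, 0.1], [0.2, 0.2, 0.2], [0.3, 0.1, 0.4], [0.4, 0.4, 0.4]]
filtered = voxel_downsample(cloud, leaf_size=0.5)
```

So the parameter is a leaf size in meters, not a reciprocal: a smaller value keeps more points (a clearer map), a larger value keeps fewer.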

This can affect DLO's adaptive keyframing from our spaciousness metric, which computes the median point distance.
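A rough sketch of that metric (a simplification for illustration; DLO's actual implementation filters and smooths the value over time):

```python
import numpy as np

def spaciousness(scan_points):
    """Median Euclidean distance from the sensor origin to the scan's points,
    used as a rough measure of how open the surrounding environment is."""
    ranges = np.linalg.norm(np.asarray(scan_points, dtype=float), axis=1)
    return float(np.median(ranges))

# Three points at ranges 1 m, 2 m, and 3 m -> a median range of 2 m.
s = spaciousness([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0], [0.0, 0.0, 3.0]])
```

Downsampling the keyframe clouds more aggressively shifts this median, which in turn shifts when new keyframes are dropped.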

Maximum correspondence distance affects scan-matching and which pair of points to consider during optimization. In larger environments, a larger distance is generally good since points are probably more spread out. In smaller environments (like your dataset), bad correspondence matching can corrupt the optimization process and therefore the overall result. This holds true mainly for S2M; for S2S, the search is purely between two instantaneous scans. In that case, it's slightly dependent on walking speed and the rate of the LiDAR (but in general it's not as sensitive as S2M).
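The effect of that gate can be sketched with a brute-force matcher (an illustration only; GICP uses a k-d tree and per-point covariances, not this naive loop). Pairs farther apart than the maximum distance are rejected rather than allowed to pull the alignment toward a wrong match.

```python
import numpy as np

def correspondences(source, target, max_dist):
    """For each source point, find its nearest target point and keep the pair
    only if the distance is within `max_dist`."""
    source = np.asarray(source, dtype=float)
    target = np.asarray(target, dtype=float)
    pairs = []
    for i, p in enumerate(source):
        d = np.linalg.norm(target - p, axis=1)
        j = int(np.argmin(d))
        if d[j] <= max_dist:
            pairs.append((i, j))
    return pairs

# With a tight 0.5 m gate, only the genuinely close pair survives; the point
# 5 m from its nearest neighbor is rejected instead of corrupting the fit.
pairs = correspondences([[0, 0, 0], [10, 0, 0]],
                        [[0.1, 0, 0], [5, 0, 0]],
                        max_dist=0.5)
```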

Here is a nice tutorial on what voxel filtering is, and you can read more here about correspondence-based registration.

HomieRegina commented on July 17, 2024

Thanks for your reply. I will study this part carefully.

I recorded a dataset in a business district and built a map. We took the escalator from the second floor of the mall to the first floor, walked around the first floor, and returned to the starting point on the second floor. The mapping result is good.
image
image
image
image
You can see a diagonal streak across the second floor; don't worry, that is the escalator from the second floor to the third floor, as shown in the third figure.

kennyjchen commented on July 17, 2024

Nice! Thanks for sharing your results with us :^)

If there are no more questions, I'll be closing this thread. Feel free to reopen or create a new issue if you have any further questions or comments about our work.
