Comments (14)
Hi bjornph,
did you successfully test LSD-SLAM with the TUM RGB-D Benchmark dataset?
I'm trying to, but the software doesn't initialize well from the first images and therefore loses track very soon.
from orb_slam.
I had to stop working on it to try something else for some time. However I've looked at it for the last couple of days. Have you gotten it to work?
I got it to work well on some datasets and not on others. Which datasets are you trying to run? The ORB-SLAM paper has a nice overview of them in Table III.
My tests:
- fr3_sit_xyz: My error was 8.5 cm compared to their 7.73 cm, which is pretty damn close. All of the fr3_... datasets are pre-rectified, so I just used standard parameters and the camera info.
- fr2_desk: The best I've gotten is an error of ~30 cm, compared to their 4.57 cm, which is quite the difference. With ORB-SLAM I got an error of ~2 cm.
My test setup:
To record the needed parameters, I use:
- rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt
- rostopic echo -p /lsd_slam/keyframes/time > lsd_time.txt
I then match the corresponding keyframes with the ground truth on time, and run a script to obtain the scale difference. To get the absolute trajectory error, I use the Python scripts from http://vision.in.tum.de/data/datasets/rgbd-dataset/online_evaluation.
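For reference, the scale step can be sketched in a few lines of Python: after the timestamp matching, fit a least-squares scale between the two centered point sets (in the spirit of the Horn-style alignment that absor.m implements, although this sketch skips the rotation part):

```python
import numpy as np

def align_scale(est, gt):
    """Least-squares scale s minimizing ||gt_c - s * est_c||^2,
    where est_c/gt_c are the (N, 3) position sets with centroids removed."""
    est_c = np.asarray(est, float) - np.mean(est, axis=0)
    gt_c = np.asarray(gt, float) - np.mean(gt, axis=0)
    return float(np.sum(gt_c * est_c) / np.sum(est_c * est_c))

# Toy check: ground truth is exactly twice the estimated trajectory
est = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
print(align_scale(est, 2.0 * est))  # 2.0
```

The returned scale is what you would multiply the monocular trajectory by before computing the ATE.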
Notes/questions:
fr2_desk: The dataset is compressed. I don't know if this affects the performance. You can decompress it; try "rosbag decompress -h" for info. The camera_info included is the standard Kinect one, from http://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats#intrinsic_camera_calibration_of_the_kinect. I have also tried replacing it with the fr2 parameters, but that does not give good results. I find their use of calib files confusing, so when I try to undistort I run the images through image_proc.
Hope this helps some. Keep me updated if you get any breakthrough. I promise it won't take as long to answer next time.
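As a side note, the ATE number those scripts report is essentially an RMSE over the matched, aligned positions. A minimal sketch (simplified; the real evaluate_ate.py also performs the rigid alignment first):

```python
import numpy as np

def ate_rmse(est, gt):
    """RMSE of position differences between two already-aligned,
    timestamp-matched trajectories given as (N, 3) arrays."""
    diff = np.asarray(est, float) - np.asarray(gt, float)
    return float(np.sqrt(np.mean(np.sum(diff ** 2, axis=1))))

est = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
gt = np.array([[0.0, 0.1, 0.0], [1.0, -0.1, 0.0]])
print(ate_rmse(est, gt))  # ~0.1
```

This is only a sanity-check implementation for interpreting the numbers above, not a replacement for the benchmark tool.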
Thanks for your reply.
I'm trying it with almost all the datasets, but the tracking is often lost very soon unless I set large values for the KFUsageWeight and KFDistWeight thresholds. However, this results in a very poor-quality map.
Which value did you use for the keyframe threshold?
Furthermore, when the algorithm completes the sequence and I evaluate the ATE with the online tool, the keyframe poses and the ground truth don't have the same "shape" and the error is at least 1 meter. To gather the keyframe poses I use the same command you wrote, so I don't understand why this happens.
For the calibration I'm using the ROS default parameters for all the datasets, because they are the recommended ones on the TUM website.
Could you be so kind as to give me the script to calculate the scale? I've tried to write it myself, but without success.
- I'm using the default parameters for KFUsageWeight and KFDistWeight.
- On calibration: "We recommend to use the ROS default parameter set (i.e., without undistortion), as undistortion of the pre-registered depth images is not trivial" - My interpretation is that as long as you don't use depth images then it should not be a problem to use rectified color images.
- Scale and ATE calculation: see the small repository I uploaded [1]. I use a Matlab script [2] and link to it in my script "ate_test.py". Steps: 1. To record keyframes I use "rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt" and "rostopic echo -p /lsd_slam/keyframes/time > lsd_time.txt". 2. Process these with "python lsd_to_readable.py lsd_time.txt lsd_camToWorld.txt lsd.txt" (the last argument is the output text file). 3. Process it all with "python ate_test.py lsd.txt gt.txt" (e.g. rgbd_dataset_freiburg3_sitting_xyz-groundtruth.txt). This first uses "associate.py" to find corresponding timestamps, then the Matlab script "getScale.m", which in turn calls "absor.m" [2] to get the scale. Finally it performs the ATE test with "evaluate_ate.py".
- To link matlab and python, see [3]
- I have included my test results from fr3/sit_xyz in [1] so you can play around.
- There are some hardcoded parameters and generally lazy code in this, as I thought I was only going to use it myself. You have to change the path in "ate_test.py" at least. Ask if anything is confusing.
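For anyone reimplementing step 3: what associate.py does can be sketched as a greedy nearest-first pairing of timestamps within a tolerance (simplified; the real script also parses the files and handles comment lines):

```python
def associate(times_a, times_b, max_diff=0.02):
    """Pair timestamps from two lists, closest pairs first,
    discarding pairs further apart than max_diff seconds."""
    candidates = sorted((abs(a - b), a, b)
                        for a in times_a for b in times_b
                        if abs(a - b) < max_diff)
    used_a, used_b, matches = set(), set(), []
    for _, a, b in candidates:
        if a not in used_a and b not in used_b:
            used_a.add(a)
            used_b.add(b)
            matches.append((a, b))
    return sorted(matches)

# 1.00 vs 1.05 differ by more than 0.02 s, so only two pairs survive
print(associate([0.00, 1.00, 2.00], [0.01, 1.05, 1.99]))  # [(0.0, 0.01), (2.0, 1.99)]
```

Keyframe timestamps that fall outside the tolerance are simply dropped from the evaluation, which is why a wrong time format produces no correspondences at all.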
Hope this helps
[1] https://github.com/bjornph/lsd_files
[2] http://www.mathworks.com/matlabcentral/fileexchange/26186-absolute-orientation-horn-s-method
[3] http://se.mathworks.com/help/matlab/matlab_external/install-the-matlab-engine-for-python.html#responsive_offcanvas
@bjornph
Please excuse me for taking so long to answer.
I was quite busy these past days, so I only tried your code yesterday... and it works!
Thanks a lot, you've been very helpful.
I have one question left: in the LSD-SLAM paper when they tried the TUM benchmark dataset they used the depth image for the initialization. Did you do the same?
Thanks again
So nice that it worked! Some questions:
- Which datasets did you try, and what results did you get?
- Did you do any undistortion of the images? If so, how? Parameters, etc.
Your question: I am not sure what you mean, but if it's this part you are referring to: "For comparison we show respective results from semi-dense mono-VO [9], keypoint-based mono-SLAM [15], direct RGB-D SLAM [14] and keypoint-based RGB-D SLAM [7]. Note that [14] and [7] use depth information from the sensor, while the others do not.", then it says that RGB-D SLAM uses depth information, and the others, including LSD-SLAM, do not.
May I ask a question?
When I run "rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt"
I get: ERROR: Cannot load message class for [lsd_slam_viewer/keyframeMsg]. Are your messages built?
Thanks. @bjornph
@weichnn I have not looked at this in quite some time, so I am not sure. I would ask this question on the lsd_slam GitHub page. My guess is that you don't source the correct repositories in your bash file.
Good luck
From the project readme: "Instead, this is solved in LSD-SLAM by publishing keyframes and their poses separately:
keyframeGraphMsg contains the updated pose of each keyframe, nothing else.
keyframeMsg contains one frame with its pose, and - if it is a keyframe - its points in the form of a depth map."
So I use the topic /lsd_slam/graph now. Thanks. @bjornph
@bjornph Sorry, did you do any pre-processing of the ground-truth data or the keyframe trajectory that ORB-SLAM provides before using the "online evaluation"? I got very bad results with the freiburg1 sequences.
@dawei22 I do not recall the specific details of my test. What kind of results are you getting?
Hi @bjornph,
when I do the first steps to record data from the different topics you mentioned, I get weird time data. The lsd_to_readable script works fine, but then I get an error related to associate.py:
Traceback (most recent call last):
File "associate.py", line 117, in
second_list = read_file_list(args.second_file)
File "associate.py", line 68, in read_file_list
list = [(float(l[0]),l[1:]) for l in list if len(l)>1]
ValueError: could not convert string to float: {\fonttbl\f0\fmodern\fcharset0
Do you have any idea how to solve this? How did you manage to get time data in your example that is approximately the same as in the ground truth?
thanks !!
Hello everyone, I am trying to do the first steps: to record keyframes I use "rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt" and "rostopic echo -p /lsd_slam/keyframes/time > lsd_time.txt".
But my files lsd_camToWorld.txt and lsd_time.txt are empty.
PARAMETERS
- /rosdistro: indigo
- /rosversion: 1.11.21
NODES
auto-starting new master
process[master]: started with pid [81362]
ROS_MASTER_URI=http://haidara-virtual-machine:11311/
setting /run_id to dc6eec44-b7ba-11e7-8504-000c2967f3d5
process[rosout-1]: started with pid [81376]
started core service [/rosout]
haidara@haidara-virtual-machine:~/catkin_ws$ rosrun lsd_slam_core dataset _files:='/home/haidara/Downloads/fr1_rgb_calibration' _hz:=0 _calib:='/home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg'
Reading Calibration from file /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg ... found!
found ATAN camera model, building rectifier.
Input resolution: 640 480
In: 0.262383 -0.953104 -0.005358 0.002628 1.163314
Out: 0.262383 -0.953104 -0.005358 0.002628 1.163314
Output resolution: 640 480
Prepped Warp matrices
Started mapping thread!
Started constraint search thread!
Started optimization thread
found 68 image files in folder /home/haidara/Downloads/fr1_rgb_calibration!
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg! skipping.
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/cameraCalibration.cfg
Doing Random initialization!
started image display thread!
Done Random initialization!
warning: reciprocal tracking on new frame failed badly, added odometry edge (Hacky).
TRACKING LOST for frame 22 (0.47% good Points, which is 52.07% of available points, DIVERGED)!
failed to load image /home/haidara/Downloads/fr1_rgb_calibration/ost.txt! skipping.
Finalizing Graph... finding final constraints!!
Optizing Full Map!
Done optizing Full Map! Added 0 constraints.
Finalizing Graph... optimizing!!
doing final optimization iteration!
Finalizing Graph... publishing!!
Done Finalizing Graph.!!
... waiting for SlamSystem's threads to exit
Exited mapping thread
Exited constraint search thread
Exited optimization thread
DONE waiting for SlamSystem's threads to exit
waiting for image display thread to end!
ended image display thread!
done waiting for image display thread to end!
haidara@haidara-virtual-machine:~/catkin_ws$
haidara@haidara-virtual-machine:~$ rostopic echo -p /lsd_slam/keyframes/camToWorld > lsd_camToWorld.txt
Can someone please help me out on this one.
Thanks in advance.
Hello bjorn, I'm also trying to use the TUM online tool to evaluate LSD-SLAM and followed your instructions from the previous comments. But I ran into a problem: my lsd_time.txt is being generated with wrong information, or at least not in the format that TUM uses to compare. Comparing with your file, the "field" values in my lsd_time are wrong; I get values like 1.79999876022 while you get values like 1341845820.99. So, since my lsd_time is in the wrong format, the TUM tool can't find any timestamp correspondences. I also ran your scripts and got the error "Index exceeds matrix dimensions", caused by the values from lsd_time, because when I ran them with your files everything worked fine.
Do you know why i'm getting these wrong values in lsd_time?
Thanks