orb_slam's People

Contributors

raulmur

orb_slam's Issues

Failure to initialize the relative pose of the first two frames

Hi @raulmur,
I am sorry to trouble you, but I am facing a very hard problem.
I was using your method to compute the relative pose of the first two frames,
but I always get two solutions, and the values of "bestGood" and "secondBestGood" are nearly the same.
I am looking forward to hearing from you, and best wishes.
Thanks in advance.

                                                    sincerely
                                                       Li Qile
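
For context, the monocular initializer scores each candidate reconstruction by the number of points that triangulate with sufficient parallax and low reprojection error, and it rejects the result when the runner-up hypothesis scores almost as well, which matches the symptom described above. A minimal sketch of that kind of ambiguity test (the thresholds are illustrative, not necessarily the exact values used in the code):

// Hypothetical ambiguity test: accept a two-view reconstruction only if it is
// clearly better than the runner-up hypothesis. Thresholds are illustrative.
bool AcceptReconstruction(int bestGood, int secondBestGood, int minTriangulated)
{
    if (bestGood < minTriangulated)
        return false;                         // too few well-triangulated points
    return secondBestGood < 0.75 * bestGood;  // otherwise the geometry is ambiguous
}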

How to visualize the map with color?

Hello, in the video below:
https://www.youtube.com/watch?t=30&v=HlBmq70LKrQ

There are two versions of the map. The first is the one containing dots in red and black:
http://imgur.com/ra566zO

And the other version is colorful:
http://imgur.com/xn2RG5o

I've tried to find a topic publishing the colorful version, but I haven't found one (I've seen /ORB_SLAM/Map_array, but it's an empty topic). I've also checked the ROS parameters and found nothing related, so I can only get the first version in my rviz:
http://imgur.com/UDXhtgO

Should I modify the code or do something else to see the colorful version?
Thanks in advance.

Code for learning vocab tree

@raulmur:

Hi,
I was wondering if you could point me to the code you used to train the vocabulary tree. Is it something you put together, or is it part of DBoW2?
Thanks
Nikhil

only seeing polygons in rviz

Hi guys,

ORB-SLAM itself seems to be working fine (features tracking well), but I can't visualize the 3D map in rviz at all. I see very weird shapes/polygons instead. Here's a screenshot. Any help would be appreciated.

Thanks,
Sid

How to access the pose of the camera?

Hello, @raulmur
I am sorry to trouble you, but I am facing some problems while porting your code to an ARM board.

The first problem is that I have no idea how to get the camera pose.
I have looked into your code and found the following lines:
cv::Mat Rwc = mCurrentFrame.mTcw.rowRange(0,3).colRange(0,3).t();
cv::Mat twc = -Rwc*mCurrentFrame.mTcw.rowRange(0,3).col(3);
Is Rwc the rotation matrix of the camera in the world coordinate system?
I am also wondering how to access the (x, y, z) coordinates of the camera in the world coordinate system:
is it twc or mCurrentFrame.mTcw.rowRange(0,3).col(3)?

The second problem is how to remove the viewer and rviz from the program, since I only need the pose of the camera. Also, how can I find out all the topics ORB-SLAM publishes?

I am looking forward to hearing from you.
Best wishes.

                                                                                                             yours, sincerely
                                                                                                             Li Qile
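
For reference, assuming the convention in the snippet quoted above (mTcw maps world coordinates into the camera frame), the camera-to-world rotation and the camera position in the world frame can be recovered as below; twc, not the raw fourth column of mTcw, is the (x, y, z) position of the camera. A minimal sketch:

#include <opencv2/core/core.hpp>

// Tcw is a 4x4 CV_32F transform that maps world points into the camera frame.
// The camera-to-world rotation is the transpose of the top-left 3x3 block,
// and the camera centre in world coordinates is -Rcw^T * tcw.
void CameraPoseInWorld(const cv::Mat& Tcw, cv::Mat& Rwc, cv::Mat& twc)
{
    cv::Mat Rcw = Tcw.rowRange(0,3).colRange(0,3);
    cv::Mat tcw = Tcw.rowRange(0,3).col(3);
    Rwc = Rcw.t();     // rotation of the camera in the world frame
    twc = -Rwc * tcw;  // (x, y, z) position of the camera in the world frame
}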

Used equipment

Hi,
Can you please tell me which camera (and other equipment, PC, etc.) you used? Thank you.
Best regards.

error: Wrong path to settings

When I launch ORB-SLAM from the terminal with "rosrun ORB_SLAM ORB_SLAM PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE", I get the following error: [ERROR] [1438841720.919284826]: Wrong path to settings. Path must be absolut or relative to ORB_SLAM package directory.
PS: I have set the environment variable PATH_TO_SETTINGS_FILE to the path of the ORB_SLAM directory.

Error during make Indigo

Hi,
I got the error message below while building ORB_SLAM with make. I followed the manual exactly.

Regards,
husdo

/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h: In instantiation of ‘DBoW2::TemplatedVocabulary<TDescriptor, F>::~TemplatedVocabulary() [with TDescriptor = cv::Mat; F = DBoW2::FORB]’:
/opt/ros/indigo/share/ORB_SLAM/src/main.cc:86:29: required from here
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:510:3: warning: deleting object of abstract class type ‘DBoW2::GeneralScoring’ which has non-virtual destructor will cause undefined behaviour [-Wdelete-non-virtual-dtor]
delete m_scoring_object;
^
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h: In instantiation of ‘void DBoW2::TemplatedVocabulary<TDescriptor, F>::createScoringObject() [with TDescriptor = cv::Mat; F = DBoW2::FORB]’:
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:420:23: required from ‘DBoW2::TemplatedVocabulary<TDescriptor, F>::TemplatedVocabulary(int, int, DBoW2::WeightingType, DBoW2::ScoringType) [with TDescriptor = cv::Mat; F = DBoW2::FORB]’
/opt/ros/indigo/share/ORB_SLAM/src/main.cc:86:29: required from here
/opt/ros/indigo/share/ORB_SLAM/Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h:446:3: warning: deleting object of abstract class type ‘DBoW2::GeneralScoring’ which has non-virtual destructor will cause undefined behaviour [-Wdelete-non-virtual-dtor]
delete m_scoring_object;
^
[ 11%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/Tracking.cc.o
[ 16%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/LocalMapping.cc.o
[ 22%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/LoopClosing.cc.o
[ 27%] Building CXX object CMakeFiles/ORB_SLAM.dir/src/ORBextractor.cc.o
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::ComputeKeyPoints(std::vector<std::vector<cv::KeyPoint> >&)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:606:63: error: ‘FAST’ was not declared in this scope
FAST(cellImage,cellKeyPoints[i][j],fastTh,true);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:615:34: error: ‘ORB’ has not been declared
if( scoreType == ORB::HARRIS_SCORE )
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:682:17: error: ‘KeyPointsFilter’ has not been declared
KeyPointsFilter::retainBest(keysCell,nToRetain[i][j]);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:698:13: error: ‘KeyPointsFilter’ has not been declared
KeyPointsFilter::retainBest(keypoints,nDesiredFeatures);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::operator()(cv::InputArray, cv::InputArray, std::vector<cv::KeyPoint>&, cv::OutputArray)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:759:82: error: ‘GaussianBlur’ was not declared in this scope
GaussianBlur(workingMat, workingMat, Size(7, 7), 2, 2, BORDER_REFLECT_101);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc: In member function ‘void ORB_SLAM::ORBextractor::ComputePyramid(cv::Mat, cv::Mat)’:
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:799:78: error: ‘INTER_LINEAR’ was not declared in this scope
resize(mvImagePyramid[level-1], mvImagePyramid[level], sz, 0, 0, INTER_LINEAR);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:799:90: error: ‘resize’ was not declared in this scope
resize(mvImagePyramid[level-1], mvImagePyramid[level], sz, 0, 0, INTER_LINEAR);
^
/opt/ros/indigo/share/ORB_SLAM/src/ORBextractor.cc:802:80: error: ‘INTER_NEAREST’ was not declared in this scope
resize(mvMaskPyramid[level-1], mvMaskPyramid[level], sz, 0, 0, INTER_NEAREST);
^
make[2]: *** [CMakeFiles/ORB_SLAM.dir/src/ORBextractor.cc.o] Error 1
make[1]: *** [CMakeFiles/ORB_SLAM.dir/all] Error 2
make: *** [all] Error 2
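
For what it's worth, the undeclared names in the log above (FAST, ORB, KeyPointsFilter, GaussianBlur, resize, INTER_LINEAR, INTER_NEAREST) are all declared in OpenCV's features2d and imgproc modules, so one plausible cause is that ORBextractor.cc does not see the right headers for the installed OpenCV version. A hedged sketch of the includes that would bring them into scope (header paths shown in the OpenCV 2.4 style):

// Candidate includes for ORBextractor.cc; exact header paths depend on the
// installed OpenCV version (these are the OpenCV 2.4-style module headers).
#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>       // GaussianBlur, resize, INTER_LINEAR, INTER_NEAREST
#include <opencv2/features2d/features2d.hpp> // FAST, ORB, KeyPointsFilter

using namespace cv;  // the file calls these functions unqualified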

Camera specs

Hi Raul,
thanks for your work.
Can you tell me which camera you used, and whether there are camera specs that improve performance more than others?
Thanks

Mauro

test on KITTI VO seq 00: Not initialized -> Trying to initialize

Dear all,

I have installed ORB_SLAM and run the example successfully. Now I am testing on KITTI VO seq 00, as the authors did in their paper.
But when I start ORB_SLAM on KITTI VO seq 00, I just get "Not initialized -> Trying to initialize" and see street-view images with green edges in /ORB_SLAM/Frame; please see the link:
https://www.dropbox.com/s/0oi3iu3nqzwpv1h/47.png?dl=0

Could someone give me some hints?

Thanks in advance~

Milton

Tracking fails half way for KITTI seq 00

Environment: ROS Indigo, Ubuntu 14.04 (Trusty) on VMware Workstation, CPU i7 2.4 GHz, 8 GB RAM
Dataset: KITTI\data_odometry_gray\dataset\sequences\00\image_0
Bag file: created with rosrun BagFromImages BagFromImages IMAGE_PATH 10.0 image_0.bag
settings.yaml:
%YAML:1.0

#--------------------------------------------------------------------------------------------
# Camera Parameters. Adjust them!
#--------------------------------------------------------------------------------------------

# Camera calibration parameters (OpenCV)
Camera.fx: 718.856
Camera.fy: 718.856
Camera.cx: 607.1928
Camera.cy: 185.2157

# Camera distortion parameters (OpenCV)
Camera.k1: 0
Camera.k2: 0
Camera.p1: 0.0
Camera.p2: 0.0

# Camera frames per second
Camera.fps: 10.0

# Color order of the images (0: BGR, 1: RGB. It is ignored if images are grayscale)
Camera.RGB: 1

#--------------------------------------------------------------------------------------------
# Changing the parameters below could seriously degrade the performance of the system
#--------------------------------------------------------------------------------------------

# ORB Extractor: Number of features per image
ORBextractor.nFeatures: 2000

# ORB Extractor: Scale factor between levels in the scale pyramid
ORBextractor.scaleFactor: 1.2

# ORB Extractor: Number of levels in the scale pyramid
ORBextractor.nLevels: 8

# ORB Extractor: Fast threshold (lower is less restrictive)
ORBextractor.fastTh: 20

# ORB Extractor: Score to sort features. 0 -> Harris Score, 1 -> FAST Score
ORBextractor.nScoreType: 1

# Constant Velocity Motion Model (0 - disabled, 1 - enabled [recommended])
UseMotionModel: 1

During execution, tracking always fails and the relocalization module is invoked, regardless of the motion model and nScoreType settings.
Do you have any idea why the result shown on the ORB-SLAM project website cannot be reproduced?

error: undefined symbol

Hi, I am running your code with a dataset from tum-vision,
but an error occurred and the process died.

[ INFO] [1429678011.002510623]: New Map created with 425 points /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM: symbol lookup error: /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM: undefined symbol: _ZN3g2o17EdgeSE3ProjectXYZC1Ev
[ORB_SLAM-3] process has died [pid 13939, exit code 127, cmd /home/hongbohuang/catkin_ws/src/ORB_SLAM/bin/ORB_SLAM Data/ORBvoc.yml Data/Settings.yaml __name:=ORB_SLAM __log:=/home/hongbohuang/.ros/log/6cfb0986-e893-11e4-be96-bcaec584f5df/ORB_SLAM-3.log]. log file: /home/hongbohuang/.ros/log/6cfb0986-e893-11e4-be96-bcaec584f5df/ORB_SLAM-3*.log

Could you please tell me what's wrong with my setup? Thanks in advance.
best

ORB-SLAM dies after 5 minutes

Hi,

when using ORB-SLAM in our applications (onboard an MAV) everything works fine until about 5 minutes into flight. Usually around that time, the ORB-SLAM process dies. Is there a way to see/check why it dies? Is it a memory issue? It just gives us exit code -9

About development tools

Hi @raulmur, I am about to start studying your algorithm and code.
Could you please give me some suggestions about development tools? What kind of tools do you use to develop ORB-SLAM (e.g. vim, emacs, or other IDEs)?
Thanks a lot!

Subpixel accuracy for tracking

Did you do any subpixel refinement on the tracked features?
FAST does not give subpixel detections, and descriptor matching does not do subpixel refinement either (unless you did it somewhere in the code and I didn't find it).
Will this cause the odometry to drift more compared to template-based tracking methods (KLT or ESM)?
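
As context for the question above: ORB keypoints are detected at integer pixel locations and descriptor matching does not refine them further. If subpixel positions were wanted, a post-detection refinement step could in principle be added; a minimal sketch with OpenCV's cornerSubPix (purely illustrative, not something the released ORB-SLAM code does):

#include <opencv2/core/core.hpp>
#include <opencv2/imgproc/imgproc.hpp>
#include <vector>

// Refine integer keypoint locations to subpixel accuracy on a grayscale image.
// Illustrative only; ORB-SLAM matches descriptors at the detected locations.
void RefineSubpixel(const cv::Mat& gray, std::vector<cv::KeyPoint>& keypoints)
{
    std::vector<cv::Point2f> pts;
    for (const cv::KeyPoint& kp : keypoints)
        pts.push_back(kp.pt);

    cv::cornerSubPix(gray, pts, cv::Size(3,3), cv::Size(-1,-1),
                     cv::TermCriteria(cv::TermCriteria::EPS + cv::TermCriteria::COUNT, 20, 0.03));

    for (size_t i = 0; i < keypoints.size(); ++i)
        keypoints[i].pt = pts[i];
}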

saving/loading maps

This is pretty awesome, thanks for sharing!

Is there currently a way to save and load maps?

Using ATAN camera model

I'm using a very wide FOV camera (a GoPro), so I'd like to use the FOV camera model (Devernay and Faugeras) instead of the OpenCV model provided. I believe I've modified the code correctly: I've double-checked my math by hand (by observing and verifying the final undistorted coordinates I feed back into the ORB-SLAM code), and the features track without problems near the center of the frame. However, I'm having trouble with disappearing features at the edges of the frame (roughly the outer 20% of each edge), so I wanted to check here whether the authors implemented any checks that delete features whose calibrated positions fall too far outside the image bounds, e.g. a feature at (10,10) in an image of size (1920,1080) that is corrected to (-150,-50).

In case I haven't analyzed the ORB-SLAM code well, I should probably also mention what changes I've made to switch to the FOV model. In Data/Settings.yaml, I set fx,fy,cx,cy according to my calibration, and set k1,k2,p1,p2 to 0. I modified the call to cv::undistortPoints made in Frame.cc:UndistortKeyPoints and removed the last parameter. This allowed cv::undistortPoints to return normalized coordinates, which I then fed into my own function to account for radial distortion. I set the contents of mat to the values my function returned.

Thanks,
Sid
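
One thing that may be worth checking in the original code is how the frame's undistorted image bounds are computed, since keypoints that undistort to positions outside those bounds can be dropped from the feature grid. For reference, the FOV model mentioned above has a single parameter omega, and undistorting a normalized point amounts to rescaling it by the ratio of undistorted to distorted radius; a sketch under the Devernay-Faugeras model (function and variable names are hypothetical, and this is not part of the ORB-SLAM code base):

#include <opencv2/core/core.hpp>
#include <cmath>

// Devernay-Faugeras FOV model: undistort a normalized image point.
// 'omega' is the single distortion parameter of the model;
// r_u = tan(r_d * omega) / (2 * tan(omega / 2)), where r_d is the distorted radius.
cv::Point2f UndistortFOV(const cv::Point2f& pd, float omega)
{
    float rd = std::sqrt(pd.x * pd.x + pd.y * pd.y);
    if (rd < 1e-8f || omega < 1e-8f)
        return pd;                                        // nothing to undo
    float ru = std::tan(rd * omega) / (2.0f * std::tan(omega / 2.0f));
    float scale = ru / rd;
    return cv::Point2f(pd.x * scale, pd.y * scale);
}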

Is there a restart key available?

When ORB_SLAM is running, what should I do if I want to go back to the initialization stage?

Is there a key for restarting the program?

Erreur de segmentation (core dumped)

Hi all, I'm trying to install ORB_SLAM, but when I do rosrun ORB_SLAM ORB_SLAM...... I get this error: "Erreur de segmentation (core dumped)" (segmentation fault).
Any help please, and thank you.

What's the reason for picking ORB?

I would like to know the reason for picking ORB as the feature descriptor.
There are other rotational and scale invariant binary descriptors.
If I switch ORB to FREAK (or BRISK), will the system work equally well?

Always" not initialize"

Dear Raul Mur
Hello, this is Li Qile
I have been running your code with my own data recently, but ORB-SLAM cannot initialize; I can only see green lines in /ORB_SLAM/Frame.
May I ask you for some suggestions to fix this situation?
Thanks a lot
Best
Li Qile

Tracking on Corner-Poor Environments: Alternate Feature Detectors/Descriptors

Hello,
It would be interesting to combine this SLAM system with a feature detector and descriptor other than ORB, primarily to compare performance in special use cases, such as environments with few corners. However, the ORB version implemented here is greatly extended with specific functionality, which probably wouldn't be easy to convert.

  • Does anyone have an idea on this subject?
  • Can a SIFT feature detector be combined with an ORB descriptor? (A rough sketch follows below.)
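
On the second bullet, detectors and descriptors can be mixed at the OpenCV level; whether the result behaves well inside this SLAM system is a separate question, since the pipeline also relies on ORB's octave and angle fields and on a DBoW2 vocabulary trained for the chosen descriptor. A rough sketch, assuming OpenCV 3 with the contrib xfeatures2d module (not the OpenCV 2.4 setup the ROS package targets):

#include <opencv2/core.hpp>
#include <opencv2/features2d.hpp>
#include <opencv2/xfeatures2d.hpp>   // SIFT lives in the contrib module in OpenCV 3
#include <vector>

// Detect keypoints with SIFT, then describe them with ORB.
void SiftDetectOrbDescribe(const cv::Mat& gray,
                           std::vector<cv::KeyPoint>& keypoints,
                           cv::Mat& descriptors)
{
    cv::Ptr<cv::xfeatures2d::SIFT> sift = cv::xfeatures2d::SIFT::create();
    sift->detect(gray, keypoints);

    cv::Ptr<cv::ORB> orb = cv::ORB::create();
    orb->compute(gray, keypoints, descriptors);  // 32-byte binary descriptors
}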

Error Make Indigo

Hello,

I am currently using Ubuntu 14.04 + ROS Indigo. I have cloned the ORB_SLAM repository into my catkin workspace and all of the necessary dependencies are installed, but when I launch "ExampleGroovyHydro.launch" I get the following error:

ERROR: cannot launch node of type [ORB_SLAM/ORB_SLAM]: can't locate node [ORB_SLAM] in package [ORB_SLAM]

And indeed I cannot find the ORB_SLAM node within the package. Also when I run the following instruction: rosrun ORB_SLAM ORB_SLAM PATH_TO_VOCABULARY PATH_TO_SETTINGS_FILE, I get:

[rosrun] Couldn't find executable named ORB_SLAM below /home/nils/catkin_ws/src/ORB_SLAM.

Does anyone know how I could solve this error? I would appreciate it a lot. Regards.

Tf Transform

Hi Raul,
Thanks for your contribution. I was wondering why the tf frames behave strangely in rviz. I have found that ORB-SLAM/Camera is fixed and the world frame is moving; why? Sometimes I have also noticed that ORB-SLAM/Camera is rotating! How can I convert it to real-world coordinates? Does the tf transform give us the camera pose information? Thanks in advance.
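
On reading the camera pose from tf: assuming the transform is broadcast between frames named ORB_SLAM/World and ORB_SLAM/Camera (check the published tf tree for the exact names), a minimal listener sketch looks like this:

#include <ros/ros.h>
#include <tf/transform_listener.h>

int main(int argc, char** argv)
{
    ros::init(argc, argv, "orb_slam_pose_listener");
    ros::NodeHandle nh;
    tf::TransformListener listener;
    ros::Rate rate(30.0);

    while (nh.ok())
    {
        tf::StampedTransform transform;
        try
        {
            // Frame names assumed; verify them against the published tf tree.
            listener.lookupTransform("ORB_SLAM/World", "ORB_SLAM/Camera",
                                     ros::Time(0), transform);
            tf::Vector3 t = transform.getOrigin();  // camera position in the world frame
            ROS_INFO("camera at (%.3f, %.3f, %.3f)", t.x(), t.y(), t.z());
        }
        catch (tf::TransformException& ex)
        {
            ROS_WARN("%s", ex.what());
        }
        rate.sleep();
    }
    return 0;
}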

error

I was running ORB-SLAM on my own dataset and got this error:

terminate called after throwing an instance of 'std::bad_alloc'
what(): std::bad_alloc
[ORB_SLAM-3] process has died [pid 7918, exit code -6, cmd /home/shabhu/Work/rosfuerte/ORB_SLAM/bin/ORB_SLAM Data/ORBvoc.yml Data/Settings.yaml __name:=ORB_SLAM __log:=/home/shabhu/.ros/log/28136f02-e8d9-11e4-bce3-18a905b9cd56/ORB_SLAM-3.log].
log file: /home/shabhu/.ros/log/28136f02-e8d9-11e4-bce3-18a905b9cd56/ORB_SLAM-3*.log

Can you help me out?

Error while building ORB_SLAM

Hello,

I am trying to build ORB_SLAM package. I am using ROS-Indigo on ubuntu 14.04. I get the following error on step 6 (https://github.com/raulmur/ORB_SLAM).

~/ORB_SLAM/build$ cmake .. -DROS_BUILD_TYPE=Release
-- Found PythonInterp: /usr/bin/python (found version "2.7.6")
[rosbuild] Building package ORB_SLAM
Failed to invoke /opt/ros/indigo/bin/rospack deps-manifests ORB_SLAM
[rospack] Error: package 'ORB_SLAM' depends on non-existent package 'opencv2' and rosdep claims that it is not a system dependency. Check the ROS_PACKAGE_PATH or try calling 'rosdep update'

CMake Error at /opt/ros/indigo/share/ros/core/rosbuild/public.cmake:129 (message):

Failed to invoke rospack to get compile flags for package 'ORB_SLAM'. Look above for errors from rospack itself. Aborting. Please fix the broken dependency!

Call Stack (most recent call first): /opt/ros/indigo/share/ros/core/rosbuild/public.cmake:207 (rosbuild_invoke_rospack)
CMakeLists.txt:4 (rosbuild_init)

-- Configuring incomplete, errors occurred!
See also "/home/steve/ORB_SLAM/build/CMakeFiles/CMakeOutput.log".

Can somebody please help me out?

Dictionary size

1. What is the size of the visual dictionary you are using? I mean, how many nodes are there? I haven't seen such good relocalization in any other SLAM algorithm.
2. What does the motion model in the settings.yaml file do? Are you using an EKF with a constant velocity model?

thanks

How can I run this on ROS (Jade)?

When I install ORB_SLAM, I have a problem.
/******************************************************************************
Failed to invoke /opt/ros/jade/bin/rospack deps-manifests ORB_SLAM-master
[rospack] Error: package 'ORB_SLAM-master' depends on non-existent package 'opencv2' and rosdep claims that it is not a system dependency. Check the ROS_PACKAGE_PATH or try calling 'rosdep update'
******************************************************************************/
How can I solve this problem? Thanks.

rosrun image_view

Hi all,
I get a black window when I run:
rosrun image_view image_view image:=/ORB_SLAM/Frame _autosize:=true. Any suggestions, please?
PS: I'm using a Logitech camera. My frames.pdf contains "no tf data received".
Thank you a lot.

ORB_SLAM/Frame, Map, Camera, World

Hi,

Can you show me where I can find Frame, Map, Camera, and World? I can't find them under ORB_SLAM. In general, I think I didn't understand these two parts of the instructions:
point 2, "The last processed frame is published to the topic /ORB_SLAM/Frame. You can visualize it using image_view", and point 3.

Thank you a lot.

How to compute the relative pose

Hello, I have been looking into your code recently,
but I could not understand how the relative pose between the first two frames is computed.
Could you please give me some references? Thanks.
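
For orientation only: ORB-SLAM's initializer computes a homography and a fundamental matrix in parallel and selects between them, which is not reproduced here, but the bare two-view relative pose problem from matched points looks like the following sketch (OpenCV 3 assumed; the acceptance threshold is illustrative):

#include <opencv2/core.hpp>
#include <opencv2/calib3d.hpp>
#include <vector>

// Recover the relative pose (R, and t up to scale) between two views from
// matched, undistorted pixel coordinates. Generic two-view sketch, not the
// ORB-SLAM initializer, which also evaluates a homography for planar or
// low-parallax scenes.
bool RelativePose(const std::vector<cv::Point2f>& pts1,
                  const std::vector<cv::Point2f>& pts2,
                  double focal, const cv::Point2d& pp,
                  cv::Mat& R, cv::Mat& t)
{
    cv::Mat inliers;
    cv::Mat E = cv::findEssentialMat(pts1, pts2, focal, pp, cv::RANSAC, 0.999, 1.0, inliers);
    if (E.empty())
        return false;
    int nInliers = cv::recoverPose(E, pts1, pts2, R, t, focal, pp, inliers);
    return nInliers > 50;  // illustrative acceptance threshold
}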

ORB_SLAM initialization failure on Indigo, Ubuntu 14.04

Hello
I have remapped the camera topics correctly, but I am getting this error:
ORB-SLAM Copyright (C) 2014 Raul Mur-Artal
This program comes with ABSOLUTELY NO WARRANTY;
This is free software, and you are welcome to redistribute it
under certain conditions. See LICENSE.txt.

Loading ORB Vocabulary. This could take a while.

Wrong path to vocabulary. Path must be absolut or relative to ORB_SLAM package directory.
[orb_slam-3] process has died [pid 30997, exit code 1, cmd /home/farhan/catkin_ws/devel/lib/orb_slam/orb_slam Data/ORBvoc.yml Data/Settings.yaml /camera/image_raw:=/creative_cam/image_color __name:=orb_slam __log:=/home/farhan/.ros/log/2e26604a-16ae-11e5-9789-d8fc9361bda4/orb_slam-3.log].
log file: /home/farhan/.ros/log/2e26604a-16ae-11e5-9789-d8fc9361bda4/orb_slam-3*.log

Thanks
Farhan

Some questions on your test process and result (TUM RGB-D Benchmark)

I am testing ORB-SLAM (and LSD-SLAM) on the TUM RGB-D Benchmark dataset, and I have some questions about your test process:

ORB questions:

  • Did you only test with keyframes, or did you test with the poses of all the frames?
  • Did you match on timing to find the correct correspondence between ORB-SLAM poses and ground-truth poses?
  • Does the hardware performance affect the RMSE error?
  • Does ORB-SLAM rectify images by itself, or do they have to be pre-rectified?
  • Did you use standard ROS kinect intrinsic parameters, or did you use the fr(1,2,3) calibrated parameters?

LSD-SLAM questions:

  • Did you only use Keyframes, or poses for all the frames?

Thank you

Missing include DBoW2

In Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h

I needed to add:

#include <limits>

to fix:
error: ‘numeric_limits’ is not a member of ‘std’

camera poses of all the frames

Hi Raul,

Thanks for sharing the code.
I want to save the camera poses of all the frames, not only the keyframes. Could you please tell me how to do this?

thanks,
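
One lightweight approach, sketched here under the convention that the current frame holds a world-to-camera transform mTcw (see the pose discussion earlier in this list), is to append the pose of every processed frame to a text file from the tracking loop. This is not a feature of the released code; the helper below is hypothetical and the call site is up to you:

#include <fstream>
#include <opencv2/core/core.hpp>

// Append one camera pose per processed frame to a text file.
// Tcw is the world-to-camera transform held by the current frame
// (e.g. mCurrentFrame.mTcw in the tracking thread, stored as CV_32F).
void LogFramePose(std::ofstream& out, double timestamp, const cv::Mat& Tcw)
{
    if (Tcw.empty())
        return;                                  // tracking lost for this frame
    cv::Mat Rwc = Tcw.rowRange(0,3).colRange(0,3).t();
    cv::Mat twc = -Rwc * Tcw.rowRange(0,3).col(3);
    out << timestamp << " "
        << twc.at<float>(0) << " " << twc.at<float>(1) << " " << twc.at<float>(2)
        << std::endl;
}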

Could you provide different kinds of ORB vocabulary

I have installed ORB_SLAM on the Odroid U3.
However, the ORBvoc file is too big to use on the Odroid board.
I would appreciate it if you could offer another, smaller ORB vocabulary file just for indoor environments.

Scale changes after turning

Hi @raulmur,
Thanks for sharing your ORB-SLAM code. Sorry to trouble you, but I have a problem while tracking.
I first walk straight towards the north and then turn to the east.
ORB-SLAM estimates the rotation very well, but the scale of the distance changes noticeably after turning east.
The walking distances towards the north and towards the east are nearly the same, but in rviz the distance to the east is clearly shorter than the distance to the north.
Could you give me some suggestions?
Thanks and best regards

Different error building ORB-SLAM

Installation: Ubuntu 14.04
ROS: Indigo
Output of $ROS_PACKAGE_PATH=/home/josh/Desktop/ORB-SLAM:/opt/ros/indigo/share:/opt/ros/indigo/stacks

Error occurs in building ORB-SLAM at step 3-6. Output of error:

josh@josh-Surface-with-Windows-8-Pro:~/Desktop/ORB_SLAM/build$ cmake .. -DROS_BUILD_TYPE=RELEASE
[rosbuild] Building package ORB_SLAM
[rosbuild] Error from directory check: /opt/ros/indigo/share/ros/core/rosbuild/bin/check_same_directories.py /home/josh/Desktop/ORB_SLAM
1
Traceback (most recent call last):
File "/opt/ros/indigo/share/ros/core/rosbuild/bin/check_same_directories.py", line 46, in
raise Exception
Exception
CMake Error at /opt/ros/indigo/share/ros/core/rosbuild/private.cmake:102 (message):
[rosbuild] rospack found package "ORB_SLAM" at "", but the current
directory is "/home/josh/Desktop/ORB_SLAM". You should double-check your
ROS_PACKAGE_PATH to ensure that packages are found in the correct
precedence order.
Call Stack (most recent call first):
/opt/ros/indigo/share/ros/core/rosbuild/public.cmake:177 (_rosbuild_check_package_location)
CMakeLists.txt:4 (rosbuild_init)

-- Configuring incomplete, errors occurred!
See also "/home/josh/Desktop/ORB_SLAM/build/CMakeFiles/CMakeOutput.log".

Further research on answers.ros.org indicates that using CMake this way isn't supported, given the error above (source: http://answers.ros.org/question/65801/ros-inside-part-of-a-c-project/ ).

Does anyone have any ideas on how to make this work? This would be an impressive project to get up and running.

Sincerely,
Josh Conway

How to create ORB vocabulary from dataset?

While I have managed to get your example file to work, I struggle with providing an ORB vocabulary from my own dataset. I have a set of image files converted to a .bag file.

How do you extract ORB features from an image dataset to provide to ORB-SLAM?
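
For anyone attempting this, the bundled DBoW2 TemplatedVocabulary exposes a create() method that builds a vocabulary from per-image descriptor sets, so one route is to extract ORB descriptors per image and feed them in. A rough sketch, assuming an OpenCV 2.4-style extractor and the Thirdparty/DBoW2 headers shipped with the project; the branching factor and depth are illustrative, and a vocabulary trained on a small custom set will generalize less well than the released ORBvoc:

#include <string>
#include <vector>
#include <opencv2/core/core.hpp>
#include <opencv2/features2d/features2d.hpp>
#include "Thirdparty/DBoW2/DBoW2/TemplatedVocabulary.h"
#include "Thirdparty/DBoW2/DBoW2/FORB.h"

typedef DBoW2::TemplatedVocabulary<cv::Mat, DBoW2::FORB> ORBVocabulary;

// Build an ORB vocabulary from a set of training images.
void BuildVocabulary(const std::vector<cv::Mat>& images, const std::string& outFile)
{
    cv::ORB orb(2000);                            // OpenCV 2.4-style ORB extractor
    std::vector<std::vector<cv::Mat> > features;  // one descriptor list per image

    for (size_t i = 0; i < images.size(); ++i)
    {
        std::vector<cv::KeyPoint> keypoints;
        cv::Mat descriptors;
        orb(images[i], cv::Mat(), keypoints, descriptors);

        std::vector<cv::Mat> perImage;
        for (int r = 0; r < descriptors.rows; ++r)
            perImage.push_back(descriptors.row(r).clone());  // FORB expects one row per descriptor
        features.push_back(perImage);
    }

    ORBVocabulary voc(10, 5, DBoW2::TF_IDF, DBoW2::L1_NORM);  // k = 10, L = 5 (illustrative)
    voc.create(features);
    voc.save(outFile);  // check the bundled TemplatedVocabulary for the exact save overloads
}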

multiple camera with orb_slam

Hi

I have a question about how to integrate multiple cameras into ORB_SLAM in such a way that each camera can use the map generated by the other one (rather than generating a new map) for localization.

Thanks
