arrenglover / openfabmap
Open-source C++ code for the FAB-MAP visual place recognition algorithm
License: Other
In an effort to improve efficiency, some of my in-progress code may benefit from using Eigen for matrix math instead of OpenCV. Much of this work manipulates small vectors and matrices (dimensions < 5), for which OpenCV is agonizingly slow.
For discussion
Hi. I'm trying to install openFABMAP and am having a few problems.
When I run make after configuring the build, I get the following message.
[ 66%] Built target openFABMAP
[ 83%] Building CXX object CMakeFiles/openFABMAPcli.dir/samples/openFABMAPcli.cpp.o
/home/kd/trunk/samples/openFABMAPcli.cpp: In function ‘int openFABMAP(std::__cxx11::string, of2::FabMap*, std::__cxx11::string, std::__cxx11::string, bool)’:
/home/kd/trunk/samples/openFABMAPcli.cpp:571:16: error: ambiguous overload for ‘operator=’ (operand types are ‘cv::Mat’ and ‘int’)
confusion_mat = 0; // init to 0's
^
In file included from /usr/local/include/opencv2/core/core.hpp:4922:0,
from /usr/local/include/opencv2/opencv.hpp:49,
from /home/kd/trunk/samples/../include/openfabmap.hpp:38,
from /home/kd/trunk/samples/openFABMAPcli.cpp:31:
/usr/local/include/opencv2/core/mat.hpp:281:13: note: candidate: cv::Mat& cv::Mat::operator=(const cv::Mat&)
inline Mat& Mat::operator = (const Mat& m)
^
In file included from /usr/local/include/opencv2/opencv.hpp:49:0,
from /home/kd/trunk/samples/../include/openfabmap.hpp:38,
from /home/kd/trunk/samples/openFABMAPcli.cpp:31:
/usr/local/include/opencv2/core/core.hpp:1795:10: note: candidate: cv::Mat& cv::Mat::operator=(const Scalar&)
Mat& operator = (const Scalar& s);
^
CMakeFiles/openFABMAPcli.dir/build.make:62: recipe for target 'CMakeFiles/openFABMAPcli.dir/samples/openFABMAPcli.cpp.o' failed
make[2]: *** [CMakeFiles/openFABMAPcli.dir/samples/openFABMAPcli.cpp.o] Error 1
CMakeFiles/Makefile2:67: recipe for target 'CMakeFiles/openFABMAPcli.dir/all' failed
make[1]: *** [CMakeFiles/openFABMAPcli.dir/all] Error 2
Makefile:83: recipe for target 'all' failed
make: *** [all] Error 2
I am currently using OpenCV 2.4. I would really appreciate any help.
Thank you!
/home/mpkuse/Downloads/openfabmap/src/chowliutree.cpp: In member function ‘bool of2::ChowLiuTree::reduceEdgesToMinSpan(std::__cxx11::listof2::ChowLiuTree::info&)’:
/home/mpkuse/Downloads/openfabmap/src/chowliutree.cpp:272:5: error: ‘map’ is not a member of ‘std’
std::map<int, int> groups; std::map<int, int>::iterator groupIt;
^
/home/mpkuse/Downloads/openfabmap/src/chowliutree.cpp:272:14: error: expected primary-expression before ‘int’
std::map<int, int> groups; std::map<int, int>::iterator groupIt;
^
/home/mpkuse/Downloads/openfabmap/src/chowliutree.cpp:272:32: error: ‘map’ is not a member of ‘std’
std::map<int, int> groups; std::map<int, int>::iterator groupIt;
^
/home/mpkuse/Downloads/openfabmap/src/chowliutree.cpp:272:41: error: expected primary-expression before ‘int’
std::map<int, int> groups; std::map<int, int>::iterator groupIt;
^
/home/mpkuse/Downloads/openfabmap/src/chowliutree.cpp:273:53: error: ‘groups’ was not declared in this scope
for(int i = 0; i < imgDescriptors[0].cols; i++) groups[i] = i;
^
/home/mpkuse/Downloads/openfabmap/src/chowliutree.cpp:278:12: error: ‘groups’ was not declared in this scope
if(groups[edge->word1] != groups[edge->word2]) {
^
/home/mpkuse/Downloads/openfabmap/src/chowliutree.cpp:281:17: error: ‘groupIt’ was not declared in this scope
for(groupIt = groups.begin(); groupIt != groups.end(); groupIt++)
^
CMakeFiles/openFABMAP.dir/build.make:158: recipe for target 'CMakeFiles/openFABMAP.dir/src/chowliutree.cpp.o' failed
make[2]: *** [CMakeFiles/openFABMAP.dir/src/chowliutree.cpp.o] Error 1
CMakeFiles/Makefile2:104: recipe for target 'CMakeFiles/openFABMAP.dir/all' failed
make[1]: *** [CMakeFiles/openFABMAP.dir/all] Error 2
Will Maddern extended the modified sequential clustering, implementing some of the TODO functionality and changing the Mahalanobis distance to use a nearest-neighbour technique for speed-ups. This code should be integrated back into the bowmsctrainer. The OpenMP changes might conflict with these changes and need to be dealt with.
The files are present in the repository but not included in the project.
Hi everyone, I've followed the visualisation steps and got the result.txt file, but I cannot see the image thumbnails. I thought it might be a problem with the monitor size and thumbnail size, so I looked in the settings file, but I cannot find those two parameters. Could anyone tell me where to find them?
Dear all,
I came across the following make error. It seems the original code supports OpenCV 2.3 while my system has OpenCV 2.4.8. How should I modify openFABMAPcli.cpp to get it to build?
Thanks!
root@milton-OptiPlex-9010:/data/code/openfabmap/build# make
Scanning dependencies of target openFABMAP
[ 12%] Building CXX object CMakeFiles/openFABMAP.dir/src/chowliutree.cpp.o
[ 25%] Building CXX object CMakeFiles/openFABMAP.dir/src/inference.cpp.o
[ 37%] Building CXX object CMakeFiles/openFABMAP.dir/src/fabmap.cpp.o
[ 50%] Linking CXX static library lib/libopenFABMAP.a
[ 75%] Built target openFABMAP
Scanning dependencies of target openFABMAPcli
[ 87%] Building CXX object CMakeFiles/openFABMAPcli.dir/samples/openFABMAPcli.cpp.o
In file included from /usr/include/opencv2/nonfree/nonfree.hpp:46:0,
from /data/code/openfabmap/samples/openFABMAPcli.cpp:58:
/usr/include/opencv2/nonfree/features2d.hpp:73:21: error: ‘vector’ has not been declared
vector& keypoints) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:73:27: error: expected ‘,’ or ‘...’ before ‘<’ token
vector& keypoints) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:77:21: error: ‘vector’ has not been declared
vector& keypoints,
^
/usr/include/opencv2/nonfree/features2d.hpp:77:27: error: expected ‘,’ or ‘...’ before ‘<’ token
vector& keypoints,
^
/usr/include/opencv2/nonfree/features2d.hpp:76:10: error: ‘void cv::SIFT::operator()(cv::InputArray, cv::InputArray, int) const’ cannot be overloaded
void operator()(InputArray img, InputArray mask,
^
/usr/include/opencv2/nonfree/features2d.hpp:72:10: error: with ‘void cv::SIFT::operator()(cv::InputArray, cv::InputArray, int) const’
void operator()(InputArray img, InputArray mask,
^
/usr/include/opencv2/nonfree/features2d.hpp:81:5: error: ‘AlgorithmInfo’ does not name a type
AlgorithmInfo* info() const;
^
/usr/include/opencv2/nonfree/features2d.hpp:83:49: error: ‘vector’ has not been declared
void buildGaussianPyramid( const Mat& base, vector& pyr, int nOctaves ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:83:55: error: expected ‘,’ or ‘...’ before ‘<’ token
void buildGaussianPyramid( const Mat& base, vector& pyr, int nOctaves ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:84:33: error: ‘vector’ does not name a type
void buildDoGPyramid( const vector& pyr, vector& dogpyr ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:84:39: error: expected ‘,’ or ‘...’ before ‘<’ token
void buildDoGPyramid( const vector& pyr, vector& dogpyr ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:85:39: error: ‘vector’ does not name a type
void findScaleSpaceExtrema( const vector& gauss_pyr, const vector& dog_pyr,
^
/usr/include/opencv2/nonfree/features2d.hpp:85:45: error: expected ‘,’ or ‘...’ before ‘<’ token
void findScaleSpaceExtrema( const vector& gauss_pyr, const vector& dog_pyr,
^
/usr/include/opencv2/nonfree/features2d.hpp:89:40: error: ‘vector’ has not been declared
void detectImpl( const Mat& image, vector& keypoints, const Mat& mask=Mat() ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:89:46: error: expected ‘,’ or ‘...’ before ‘<’ token
void detectImpl( const Mat& image, vector& keypoints, const Mat& mask=Mat() ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:90:41: error: ‘vector’ has not been declared
void computeImpl( const Mat& image, vector& keypoints, Mat& descriptors ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:90:47: error: expected ‘,’ or ‘...’ before ‘<’ token
void computeImpl( const Mat& image, vector& keypoints, Mat& descriptors ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:125:28: error: ‘vector’ has not been declared
CV_OUT vector& keypoints) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:125:34: error: expected ‘,’ or ‘...’ before ‘<’ token
CV_OUT vector& keypoints) const;
^
In file included from /usr/include/opencv2/nonfree/nonfree.hpp:46:0,
from /data/code/openfabmap/samples/openFABMAPcli.cpp:58:
/usr/include/opencv2/nonfree/features2d.hpp:128:28: error: ‘vector’ has not been declared
CV_OUT vector& keypoints,
^
/usr/include/opencv2/nonfree/features2d.hpp:128:34: error: expected ‘,’ or ‘...’ before ‘<’ token
CV_OUT vector& keypoints,
^
/usr/include/opencv2/nonfree/features2d.hpp:127:10: error: ‘void cv::SURF::operator()(cv::InputArray, cv::InputArray, int) const’ cannot be overloaded
void operator()(InputArray img, InputArray mask,
^
In file included from /usr/include/opencv2/nonfree/nonfree.hpp:46:0,
from /data/code/openfabmap/samples/openFABMAPcli.cpp:58:
/usr/include/opencv2/nonfree/features2d.hpp:124:10: error: with ‘void cv::SURF::operator()(cv::InputArray, cv::InputArray, int) const’
void operator()(InputArray img, InputArray mask,
^
In file included from /usr/include/opencv2/nonfree/nonfree.hpp:46:0,
from /data/code/openfabmap/samples/openFABMAPcli.cpp:58:
/usr/include/opencv2/nonfree/features2d.hpp:132:5: error: ‘AlgorithmInfo’ does not name a type
AlgorithmInfo* info() const;
^
/usr/include/opencv2/nonfree/features2d.hpp:142:40: error: ‘vector’ has not been declared
void detectImpl( const Mat& image, vector& keypoints, const Mat& mask=Mat() ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:142:46: error: expected ‘,’ or ‘...’ before ‘<’ token
void detectImpl( const Mat& image, vector& keypoints, const Mat& mask=Mat() ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:143:41: error: ‘vector’ has not been declared
void computeImpl( const Mat& image, vector& keypoints, Mat& descriptors ) const;
^
/usr/include/opencv2/nonfree/features2d.hpp:143:47: error: expected ‘,’ or ‘...’ before ‘<’ token
void computeImpl( const Mat& image, vector& keypoints, Mat& descriptors ) const;
^
In file included from /usr/include/_G_config.h:15:0,
from /usr/include/libio.h:31,
from /usr/include/stdio.h:74,
from /usr/include/c++/4.8/cstdio:42,
from /usr/local/include/opencv2/core/operations.hpp:52,
from /usr/local/include/opencv2/core.hpp:3106,
from /usr/local/include/opencv2/core/core.hpp:48,
from /data/code/openfabmap/samples/../include/inference.hpp:57,
from /data/code/openfabmap/samples/../include/fabmap.hpp:57,
from /data/code/openfabmap/samples/../include/openfabmap.hpp:58,
from /data/code/openfabmap/samples/openFABMAPcli.cpp:54:
/data/code/openfabmap/samples/openFABMAPcli.cpp: In function ‘cv::Ptrcv::Feature2D generateDetector(cv::FileStorage&)’:
/data/code/openfabmap/samples/openFABMAPcli.cpp:668:42: error: conversion from ‘long int’ to non-scalar type ‘cv::Ptrcv::Feature2D’ requested
cv::Ptrcv::FeatureDetector detector = NULL;
^
/data/code/openfabmap/samples/openFABMAPcli.cpp:677:19: error: expected type-specifier
detector = new cv::DynamicAdaptedFeatureDetector(
^
/data/code/openfabmap/samples/openFABMAPcli.cpp:677:19: error: expected ‘;’
/data/code/openfabmap/samples/openFABMAPcli.cpp:687:19: error: expected type-specifier
detector = new cv::StarFeatureDetector(
^
/data/code/openfabmap/samples/openFABMAPcli.cpp:687:19: error: expected ‘;’
/data/code/openfabmap/samples/openFABMAPcli.cpp:699:32: error: cannot allocate an object of abstract type ‘cv::FastFeatureDetector’
["NonMaxSuppression"] > 0);
^
In file included from /usr/local/include/opencv2/features2d/features2d.hpp:48:0,
from /data/code/openfabmap/samples/../include/bowmsctrainer.hpp:58,
from /data/code/openfabmap/samples/../include/openfabmap.hpp:59,
from /data/code/openfabmap/samples/openFABMAPcli.cpp:54:
/usr/local/include/opencv2/features2d.hpp:388:20: note: because the following virtual functions are pure within ‘cv::FastFeatureDetector’:
class CV_EXPORTS_W FastFeatureDetector : public Feature2D
^
/usr/local/include/opencv2/features2d.hpp:401:26: note: virtual void cv::FastFeatureDetector::setThreshold(int)
CV_WRAP virtual void setThreshold(int threshold) = 0;
^
/usr/local/include/opencv2/features2d.hpp:402:25: note: virtual int cv::FastFeatureDetector::getThreshold() const
CV_WRAP virtual int getThreshold() const = 0;
^
/usr/local/include/opencv2/features2d.hpp:404:26: note: virtual void cv::FastFeatureDetector::setNonmaxSuppression(bool)
CV_WRAP virtual void setNonmaxSuppression(bool f) = 0;
^
/usr/local/include/opencv2/features2d.hpp:405:26: note: virtual bool cv::FastFeatureDetector::getNonmaxSuppression() const
CV_WRAP virtual bool getNonmaxSuppression() const = 0;
^
/usr/local/include/opencv2/features2d.hpp:407:26: note: virtual void cv::FastFeatureDetector::setType(int)
CV_WRAP virtual void setType(int type) = 0;
^
/usr/local/include/opencv2/features2d.hpp:408:25: note: virtual int cv::FastFeatureDetector::getType() const
CV_WRAP virtual int getType() const = 0;
^
/data/code/openfabmap/samples/openFABMAPcli.cpp:733:19: error: expected type-specifier
detector = new cv::MserFeatureDetector(
^
/data/code/openfabmap/samples/openFABMAPcli.cpp:733:19: error: expected ‘;’
In file included from /usr/include/_G_config.h:15:0,
from /usr/include/libio.h:31,
from /usr/include/stdio.h:74,
from /usr/include/c++/4.8/cstdio:42,
from /usr/local/include/opencv2/core/operations.hpp:52,
from /usr/local/include/opencv2/core.hpp:3106,
from /usr/local/include/opencv2/core/core.hpp:48,
from /data/code/openfabmap/samples/../include/inference.hpp:57,
from /data/code/openfabmap/samples/../include/fabmap.hpp:57,
from /data/code/openfabmap/samples/../include/openfabmap.hpp:58,
from /data/code/openfabmap/samples/openFABMAPcli.cpp:54:
/data/code/openfabmap/samples/openFABMAPcli.cpp: In function ‘cv::Ptrcv::Feature2D generateExtractor(cv::FileStorage&)’:
/data/code/openfabmap/samples/openFABMAPcli.cpp:763:47: error: conversion from ‘long int’ to non-scalar type ‘cv::Ptrcv::Feature2D’ requested
cv::Ptrcv::DescriptorExtractor extractor = NULL;
^
make[2]: *** [CMakeFiles/openFABMAPcli.dir/samples/openFABMAPcli.cpp.o] Error 1
make[1]: *** [CMakeFiles/openFABMAPcli.dir/all] Error 2
make: *** [all] Error 2
root@milton-OptiPlex-9010:/data/code/openfabmap/build#
Hi,
I found there's an openFABMAP implementation in OpenCV 2.4:
http://docs.opencv.org/2.4/modules/contrib/doc/openfabmap.html
but I cannot find the source code in the official OpenCV repos (Itseez/opencv, opencv_contrib, opencv_attic).
Did I miss anything?
Thanks!
Question from @wengxiuling36:
Sorry to bother you. I saw your paper "Towards Hierarchical Place Recognition for Long-Term Autonomy".
Could I ask how to get precision-recall curves using the New College dataset? I downloaded NewCollegeGroundTruth.mat. FabMap 1.0 outputs a matrix psame, but this psame matrix is not a 0-1 matrix, and the high-probability entries on the main diagonal indicate the detection of new places.
So I set the main-diagonal elements of psame to 0. The MATLAB code:
psame = psame + diag(-diag(psame));
psame = (psame >= 0.99);
npsame = length(find(psame == 1));
tp = length(intersect(find(psame == 1), find(truth == 1)));
ntruth = length(find(truth == 1));
r = tp / ntruth;
p = tp / npsame;
Is this right? I find ntruth is too big and tp is too small compared to npsame.
On my computer clustering took two days and still has not completed.
Here is the configuration file:
Windows 10
VocabularyTrainData: 1.5 GB
Train video duration: 3 min
%YAML:1.0
# openFABMAPexe Settings File
#---------------------------------------------------------------------------
FilePaths:
  # The training data video should be of disjoint, non-overlapping scenes
  # and should not visit the same location more than once
  TrainPath: "E:\\PositioningSystem\\openfabmap-2.04\\build\\bin\\Release\\test.mp4"
  # The test video should not overlap with the training video but should
  # have similar environment types.
  TestPath: "E:\\PositioningSystem\\openfabmap-2.04\\build\\bin\\Release\\test.mp4"
  # All feature descriptors extracted from the training data. Used to
  # create the vocabulary/codebook
  TrainFeatDesc: "E:\\PositioningSystem\\openfabmap-2.04\\build\\bin\\Release\\SIFT_SIFT_ADAPTIVE_vocabularytraindata.yml"
  # All bag-of-words type whole image descriptors extracted from the
  # training data. Used to create the Chow-Liu tree and used in the
  # FabMap 'new place' likelihood calculation
  TrainImagDesc: "E:\\PositioningSystem\\openfabmap-2.04\\build\\bin\\Release\\BOWtraindata.yml"
  # The vocabulary/codebook itself
  Vocabulary: "E:\\PositioningSystem\\openfabmap-2.04\\build\\bin\\Release\\vocabulary.yml"
  # The Chow-Liu tree itself
  ChowLiuTree: "E:\\PositioningSystem\\openfabmap-2.04\\build\\bin\\Release\\tree.yml"
  # The FabMap test
  TestImageDesc: "E:\\PositioningSystem\\openfabmap-2.04\\build\\bin\\Release\\BOWtestdata.yml"
  # The FabMap results
  FabMapResults: "E:\\PositioningSystem\\openfabmap-2.04\\build\\bin\\Release\\results.txt"
#---------------------------------------------------------------------------
# openFABMAP running mode:
# "ShowFeatures"
# "GenerateVocabTrainData"
# "TrainVocabulary"
# "GenerateFABMAPTrainData"
# "TrainChowLiuTree"
# "GenerateFABMAPTestData"
# "RunOpenFABMAP"
Function: "TrainVocabulary"
#---------------------------------------------------------------------------
FeatureOptions:
  # Feature Detection Options
  # "FAST"
  # "STAR"
  # "SIFT"
  # "SURF"
  # "MSER"
  # "ORB"
  # "BRISK"
  DetectorType: "ORB"
  # Feature Detection Modes
  # "STATIC"
  # "ADAPTIVE"
  DetectorMode: "ADAPTIVE"
  # ADAPTIVE SETTINGS
  Adaptive:
    MinFeatures: 300
    MaxFeatures: 500
    MaxIters: 5
  # STATIC SETTINGS
  FastDetector:
    Threshold: 15
    NonMaxSuppression: 1
  StarDetector:
    MaxSize: 32 #45
    Response: 10 #30
    LineThreshold: 18 #10
    LineBinarized: 18 #8
    Suppression: 20 #5
  SiftDetector:
    EdgeThreshold: 10
    ContrastThreshold: 0.04
    # OPENCV2.4+ only
    NumFeatures: 200
    NumOctaveLayers: 3
    Sigma: 1.6
  SurfDetector:
    HessianThreshold: 1000 #400
    NumOctaves: 4
    NumOctaveLayers: 2
    Upright: 1
    Extended: 0
  MSERDetector:
    Delta: 5
    MinArea: 60
    MaxArea: 14400
    MaxVariation: 0.25
    MinDiversity: 0.2
    MaxEvolution: 200
    AreaThreshold: 1.01
    MinMargin: 0.003
    EdgeBlurSize: 5
  ORB:
    nFeatures: 500
    scaleFactor: 1.2
    nLevels: 4
    edgeThreshold: 31
    firstLevel: 0
    patchSize: 31
  BRISK:
    Threshold: 30
    Octaves: 3
    PatternScale: 1.0
  AGAST:
    Threshold: 20
    NonMaxSuppression: 1
  # Descriptor Extraction Options
  # "SIFT"
  # "SURF"
  # "ORB"
  # "BRISK"
  ExtractorType: "ORB"
#---------------------------------------------------------------------------
# An option to throw away frames with low numbers of different words.
# Setting this to 0 turns off this feature
BOWOptions:
  MinWords: 0
#---------------------------------------------------------------------------
VocabTrainOptions:
  # a smaller cluster size increases the specificity of each codeword
  # and will increase the number of words in the vocabulary
  ClusterSize: 0.45
#---------------------------------------------------------------------------
ChowLiuOptions:
  # a large amount of data is required to store all mutual information from
  # which the minimum spanning tree is created. e.g. an 8000-word codebook
  # requires 1.2 GB RAM. Increasing the threshold results in information being
  # discarded, and a lower memory requirement. Too high a threshold may result
  # in necessary information being discarded and the tree not being created.
  # A threshold of 0 takes longer and may fail due to memory requirements
  LowerInfoBound: 0.0005
#---------------------------------------------------------------------------
# Method to add a new location to the FabMap location list
# "All"
# "NewMaximumOnly"
FabMapPlaceAddition: "All"
openFabMapOptions:
  # Detector model
  PzGe: 0.39
  PzGne: 0
  # The method to calculate the likelihood of a 'new location'
  # Note, FAB-MAP 2.0 cannot use mean-field
  # "Sampled"
  # "Meanfield"
  NewPlaceMethod: "Sampled"
  # if using the "Sampled" method, how many samples from training data to use
  NumSamples: 3000
  # The option to switch off FabMap's feature dependency model (Chow-Liu tree)
  # in the likelihood calculations
  # "Naive"
  # "ChowLiu"
  BayesMethod: "ChowLiu"
  # The option to switch on and off the addition of a simple motion model
  # which assumes links between sequential additions to the query space.
  # 0 for False, 1 for True
  SimpleMotion: 0
  # Which version of openFABMAP to run
  # "FABMAP1"
  # "FABMAPLUT"
  # "FABMAPFBO"
  # "FABMAP2"
  FabMapVersion: "FABMAP2"
  # FabMap1:
  # no additional options
  FabMapLUT:
    # precision with which to store precomputed values (decimal places)
    Precision: 6
  FabMapFBO:
    # The log-likelihood bound beneath the best hypothesis at which other
    # hypotheses are always retained
    RejectionThreshold: 1e-6
    # The likelihood bound of a hypothesis 'overtaking' the current best
    # hypothesis, below which hypotheses are discarded. Used to compute delta
    PsGd: 1e-6
    # The largest value of delta when computing it via the bisection method
    BisectionStart: 512
    # The number of iterations to perform the bisection. Bisection accuracy =
    # BisectionStart / 2^BisectionIts
    BisectionIts: 9
  # FabMap2:
  # no additional options
#---------------------------------------------------------------------------
Hi, everyone. I wonder how to get the visualisation results. I have followed the wiki page "Visualising Results" and got results.txt. However, when I import the data into Matlab, it says the data is too big, and the computer dies. Did I miss some key step?
Besides, how can I visualise the recognition results in real time?
Hello, I'm developing a project using your openFABMAP implementation and link against your library. It was a little hard for me because I'm not an expert in CMake, but it now works well.
I needed to add this line to your CMake file:
set(CMAKE_POSITION_INDEPENDENT_CODE ON)
to enable -fPIC recompilation, and it works well.
This issue discusses the generator inference function FabMap::PeqGL.
This function behaves as expected in naive-Bayes mode, but Lzpq should be used in the inference when in Chow-Liu mode.
Internally, the function should use PzqGzpqeq instead of PzqGeq when in Chow-Liu mode, where PzqGzpqeq would refactor some of what currently happens in PzqGzpqL.
Sparse bag-of-words descriptors are currently stored as dense binary vectors. This makes large vocabularies (100k words) intractable, and it also makes FAB-MAP 2.0's computational complexity dependent on vocabulary size. Oxford's implementation stores BOW descriptors as lists of indices.
Hi, I was trying to use openFABMAP for my school project, and while trying "GenerateFABMAPTrainData" it fails with:
terminate called after throwing an instance of 'cv::Exception'
what(): OpenCV(4.2.0) ../modules/flann/src/miniflann.cpp:315: error: (-210:Unsupported format or combination of formats) in function 'buildIndex_'
type=0
Aborted (core dumped)
Any idea what might be causing it? I am using the ORB feature for detection and extraction.
Hi,
I am running openFABMAP through python.
I see that there was a pull request 7 years ago that included OpenMP speedups, and that they are enabled automatically when OpenMP is found. Yet when I monitor the CPU usage, only one core is ever in use; it switches between cores but never runs on two simultaneously.
When I ran "cmake .." in the build folder it found OpenMP, but the speedups seem not to be applied.
Any advice on this would be greatly appreciated.
When a loop closure is detected, the generator likelihoods for that place should be updated with the newest measurement. What currently happens:
This issue will be used to discuss a design for storing place generators, and updating them with new measurements.
Thanks for such a great implementation of FabMap!
Was an inverted index added for speed-ups in openFABMAP v2.02?
I've started using the Catch unit testing framework. I'll add it in an upcoming pull request, but we should think about what unit tests would be beneficial for FabMap.
Hi, Dear Aren Glover
As a researcher in visual place recognition, I am trying to test the performance of different algorithms. Unfortunately, the URL for the "Gardens Point" dataset seems to be unavailable because of an automatic redirect to QUT.
Do you have any idea how to download this dataset?
Thank you very much.
P.S. The previous URL is: https://wiki.qut.edu.au/display/cyphy/Day+and+Night+with+Lateral+Pose+Change+Datasets
When I installed the package, I modified the settings file (the avi file path) and ran the command "bin/openFABMAPcli -s ../samples/settings.yml", but I get the following error:
"Could not create Descriptor Extractor. Please specify extractor type in settings file"
Feature Extractor error.
Hello, I am having this error while running the given GitHub procedure on Linux: "Could not create detector class. Specify detector mode (static/adaptive) in the settings file. Feature Detector error." I have tried specifying both options but get the same error.
The project needs to be updated to work with newer versions of OpenCV.
I am sorry to bother you. I think it's a little problem, but I can't figure it out :(
I follow the function steps sequentially. In the second function I load train.avi and generate vocabularytraindata.yml. The third function needs to load this YAML file, but I get an OpenCV error:
OpenCV Error: Parsing error (vocabularytraindata.yml(132841098): Missing , between the elements) in icvYMLParseValue
I am sure that vocabularytraindata.yml has been generated, and I have set the file path in settings.yml. Has anyone had the same problem?
It would be nice to have the code (at least the public API) documented with Doxygen.
This work is in progress (will come across in a later pull request).
Sorry to bother you, but I have an issue similar to #28 with OpenCV 4.2. It states "Could not create detector class. Specify detector mode (static/adaptive) in settings file", but I've already selected the static detector mode. I am new to this and don't know if I did anything wrong. Any advice would be appreciated.
Hi guys,
I've managed to change settings.yaml to run my dataset, but it does not seem to generate results.txt.
Is there anything I have to do to get results.txt?