vdvchen / sgmnet
Implementation of "Learning to Match Features with Seeded Graph Matching Network", ICCV 2021
License: MIT License
Hi @vdvchen, when running SGMNet+SP I hit an error: SuperPoint produces 256-dimensional descriptors, but the released SGMNet weights are 128-dimensional. Could you send SGMNet weights trained on SuperPoint features to my email ([email protected])?
aug_desc1, aug_desc2 = x1_pos_embedding + desc1, x2_pos_embedding + desc2
RuntimeError: The size of tensor a (128) must match the size of tensor b (256) at non-singleton dimension 1
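For anyone hitting the same RuntimeError: the shapes really are incompatible, because the released weights expect 128-d descriptors while SuperPoint emits 256-d ones. A minimal fail-fast check can surface this before the forward pass; the helper below is a hypothetical illustration, not part of the SGMNet codebase:

```python
def check_descriptor_dim(desc_dim: int, net_channels: int) -> None:
    """Fail fast when the extractor's descriptor dimension does not match
    the channel width of the loaded matcher weights."""
    if desc_dim != net_channels:
        raise ValueError(
            f"descriptor dim {desc_dim} != network channels {net_channels}; "
            "load weights trained for this extractor or re-project the descriptors"
        )

# SuperPoint descriptors are 256-d; the released SIFT/RootSIFT weights expect 128-d:
try:
    check_descriptor_dim(desc_dim=256, net_channels=128)
except ValueError as e:
    print("mismatch detected:", e)
```

Running such a check at model-construction time turns the opaque broadcasting error into a readable message.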
Hi! I have recently been having problems downloading the SGMNet dataset, mainly downloading:
1.gl3d_cams
2.gl3d_depths
3.gl3d_ct
For these three archives, running the corresponding bash command reports that the download cannot be performed. Checking the download URLs shows the website is no longer reachable, which is why the download fails. Could you please share updated links that can be downloaded? Thank you very much!
Hi! Can you answer a couple of questions
Hi @vdvchen ,
I tried, but this source code does not seem to support running on CPU.
How can I run it on CPU?
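In case it helps: PyTorch checkpoints saved on GPU can usually be loaded on a CPU-only machine by remapping storages with `map_location`. The sketch below is generic, not SGMNet-specific; the tiny checkpoint is created on the fly as a stand-in for the real weight file, and any hard-coded `.cuda()` calls in the code would still need to be replaced with `.to(device)`:

```python
import os
import tempfile
import torch

# Stand-in checkpoint (substitute the path of the real SGMNet weight file).
ckpt_path = os.path.join(tempfile.mkdtemp(), "model_best.pth")
torch.save({"w": torch.ones(2, 2)}, ckpt_path)

# map_location remaps CUDA storages to CPU at load time.
state = torch.load(ckpt_path, map_location=torch.device("cpu"))
model_weight = state["w"]
print(model_weight.device)  # cpu
```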
Thanks for your kind support last time, and thank you very much for sharing the training script; it is quite interesting to me.
Here, I would like to kindly ask about the data for training.
I have tried to follow the instructions to download the data from https://github.com/lzx551402/GL3D.
I have a question about which of the three datasets, (1) gl3d_imgs, (2) gl3d_raw_imgs, (3) gl3d_blended_images from https://github.com/lzx551402/GL3D#downloads, should be downloaded, or whether all of them are needed.
I have downloaded gl3d_raw_imgs; however, I received the error below. Does this mean that I did not download it correctly, or that I downloaded the wrong dataset?
My settings in the gl3d.yaml file are as follows. Should rawdata_dir be the cloned directory of https://github.com/lzx551402/GL3D? I am sorry, as this is not what the instructions say; I assumed it is the GL3D clone because dump.py also looks for GL3D/data/list/comb/imageset_train.txt:
data_name: gl3d_train
rawdata_dir: /mnt/HDD4TB2/GL3D
feature_dump_dir: /mnt/HDD4TB3/SGMNet/gl3d_desc_dir
dataset_dump_dir: /mnt/HDD4TB3/SGMNet/gl3d_dataset_dir
The error:
python dump.py --config_path configs/gl3d.yaml
dump.py:20: YAMLLoadWarning: calling yaml.load() without Loader=... is deprecated, as the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full details.
config = yaml.load(f)
Formatting data...
0%| | 0/109 [00:00<?, ?it/s]
multiprocessing.pool.RemoteTraceback:
"""
Traceback (most recent call last):
File "/home/gabby-suwichaya/anaconda3/envs/sgmnet/lib/python3.7/multiprocessing/pool.py", line 121, in worker
result = (True, func(*args, **kwds))
File "/home/gabby-suwichaya/anaconda3/envs/sgmnet/lib/python3.7/multiprocessing/pool.py", line 44, in mapstar
return list(map(*args))
File "/mnt/HDD4TB3/SGMNet/datadump/dumper/gl3d_train.py", line 147, in format_seq
pair_list=np.loadtxt(os.path.join(seq_dir,'geolabel','common_track.txt'),dtype=float)[:,:2].astype(int)
File "/home/gabby-suwichaya/anaconda3/envs/sgmnet/lib/python3.7/site-packages/numpy/lib/npyio.py", line 1067, in loadtxt
fh = np.lib._datasource.open(fname, 'rt', encoding=encoding)
File "/home/gabby-suwichaya/anaconda3/envs/sgmnet/lib/python3.7/site-packages/numpy/lib/_datasource.py", line 193, in open
return ds.open(path, mode, encoding=encoding, newline=newline)
File "/home/gabby-suwichaya/anaconda3/envs/sgmnet/lib/python3.7/site-packages/numpy/lib/_datasource.py", line 533, in open
raise IOError("%s not found." % path)
OSError: /mnt/HDD4TB2/GL3D/data/586326ad712e276146904571/geolabel/common_track.txt not found.
"""
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "dump.py", line 27, in <module>
dataset.format_dump_data()
File "/mnt/HDD4TB3/SGMNet/datadump/dumper/gl3d_train.py", line 244, in format_dump_data
pool.map(self.format_seq,indices)
File "/home/gabby-suwichaya/anaconda3/envs/sgmnet/lib/python3.7/multiprocessing/pool.py", line 268, in map
return self._map_async(func, iterable, mapstar, chunksize).get()
File "/home/gabby-suwichaya/anaconda3/envs/sgmnet/lib/python3.7/multiprocessing/pool.py", line 657, in get
raise self._value
OSError: /mnt/HDD4TB2/GL3D/data/586326ad712e276146904571/geolabel/common_track.txt not found.
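The OSError above usually means rawdata_dir does not contain the expected per-sequence GL3D layout. A small pre-flight check can report every missing file before dumping starts, instead of dying inside the multiprocessing pool; this is a hypothetical helper built around the `geolabel/common_track.txt` path from the traceback:

```python
import os

def missing_common_track(rawdata_dir, seq_ids):
    """Return the sequence ids under <rawdata_dir>/data/<seq>/geolabel/ that
    are missing common_track.txt, so the dump can fail early with a readable
    report instead of a RemoteTraceback."""
    missing = []
    for seq in seq_ids:
        path = os.path.join(rawdata_dir, "data", seq, "geolabel", "common_track.txt")
        if not os.path.isfile(path):
            missing.append(seq)
    return missing

# Example with a scratch directory standing in for rawdata_dir:
import tempfile
root = tempfile.mkdtemp()
good = os.path.join(root, "data", "seq_ok", "geolabel")
os.makedirs(good)
open(os.path.join(good, "common_track.txt"), "w").close()
print(missing_common_track(root, ["seq_ok", "seq_missing"]))  # ['seq_missing']
```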
Hi! I would like to ask for guidance on reproducing the Aachen results for MNN+SIFT and SGMNet+SIFT.
As a starting point, I tried running MNN+SIFT on Aachen day-night, but I got a much worse result with 8196 keypoints: 23.5 / 33.7 / 41.8.
I am guessing this may be because I am using SIFT instead of RootSIFT, but I am not sure what other settings could differ.
Since I saw the results of SGMNet+SIFT and MNN+SIFT in both papers and on https://www.visuallocalization.net/details/17655/, I would like to know how to reproduce them (especially Table 4 in the paper).
Could you please share the settings? I would like to learn how to get a similar result.
So I would like to scope my question down as follows: for MNN+SIFT, is there any thresholding applied to the MNN matching?
For reference, my SGMNet matcher config:
matcher:
  name: SGM
  model_dir: ../weights/sgm/root
  seed_top_k: [256,256]
  seed_radius_coe: 0.01
  net_channels: 128
  layer_num: 9
  head: 4
  seedlayer: [0,6]
  use_mc_seeding: True
  use_score_encoding: False
  conf_bar: [1.11,0.1]  # set to [1,0.1] for sp
  sink_iter: [10,100]
  detach_iter: 1000000
  p_th: 0.2
matcher:
  name: SG
  model_dir: ../weights/sg/root
  net_channels: 128
  layer_num: 9
  head: 4
  use_score_encoding: True
  sink_iter: [100]
  p_th: 0.2
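Regarding the MNN thresholding question above: mutual nearest neighbor matching is itself parameter-free, though implementations sometimes add a similarity (or ratio) threshold on top. A minimal NumPy sketch of the idea, written as my own illustration rather than the repository's implementation:

```python
import numpy as np

def mutual_nearest_neighbors(desc1, desc2, sim_th=None):
    """Match L2-normalized descriptors (N1, D) and (N2, D) by mutual nearest
    neighbor; optionally drop pairs whose similarity falls below sim_th."""
    sim = desc1 @ desc2.T                      # cosine similarity matrix
    nn12 = sim.argmax(axis=1)                  # best match in desc2 for each desc1
    nn21 = sim.argmax(axis=0)                  # best match in desc1 for each desc2
    idx1 = np.arange(desc1.shape[0])
    mutual = nn21[nn12] == idx1                # keep only mutual agreements
    matches = np.stack([idx1[mutual], nn12[mutual]], axis=1)
    if sim_th is not None:                     # optional extra filtering
        keep = sim[matches[:, 0], matches[:, 1]] >= sim_th
        matches = matches[keep]
    return matches

# Tiny example: identical descriptor sets should match index-to-index.
d = np.eye(4, dtype=np.float32)
print(mutual_nearest_neighbors(d, d))
```

With `sim_th=None`, this is plain MNN with no thresholding at all, which is one plausible reading of the baseline.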
train/train.py, line 87
Why use 'step-start_step'? Should data['step'] be the same as 'step-start_step'?
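For context on the step question above: a common pattern when resuming training is to keep the global step for schedules and checkpoint names, while iteration-local state (e.g. indexing into a freshly restarted data iterator) uses the offset `step - start_step`. A toy illustration under that assumption, not a claim about what train.py actually does:

```python
def local_offsets(start_step, num_steps):
    """Pair each global step with its offset into the current (resumed) run.
    Schedules keep using the global step; per-run indices use the offset."""
    out = []
    for step in range(start_step, start_step + num_steps):
        out.append((step, step - start_step))  # (global step, local offset)
    return out

# Resuming at global step 100 for 3 steps:
print(local_offsets(100, 3))  # [(100, 0), (101, 1), (102, 2)]
```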
Hello, I read your paper. Your comparative experiments include a SIFT+SuperGlue combination; how do you handle the mismatch between the descriptor dimension and the matching network's input dimension?
Thanks!
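One common workaround for such a mismatch (my assumption for illustration, not necessarily what the paper does) is to insert a linear map from the descriptor dimension to the network's channel width; in practice that map, or the network's first layer, would be trained. A sketch using a random Gaussian projection:

```python
import numpy as np

def project_descriptors(desc, out_dim, seed=0):
    """Linearly map (N, in_dim) descriptors to (N, out_dim) and re-normalize.
    The random projection here is purely illustrative; a trained linear
    layer would replace it in a real pipeline."""
    rng = np.random.default_rng(seed)
    in_dim = desc.shape[1]
    proj = rng.standard_normal((in_dim, out_dim)) / np.sqrt(out_dim)
    out = desc @ proj
    return out / np.linalg.norm(out, axis=1, keepdims=True)

# 500 SIFT-like 128-d descriptors mapped to a 256-channel network input:
sift_desc = np.random.default_rng(1).standard_normal((500, 128))
net_input = project_descriptors(sift_desc, out_dim=256)
print(net_input.shape)
```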
When I run
python dump.py --config_path configs/gl3d.yaml
I encounter the following issue; it seems the script cannot find an hdf5 file. How can I solve it?
Traceback (most recent call last):
File "/usr/lib/python3.6/multiprocessing/pool.py", line 119, in worker
result = (True, func(*args, **kwds))
File "/usr/lib/python3.6/multiprocessing/pool.py", line 44, in mapstar
return list(map(*args))
File "/home/sshuang/SGMNet/datadump/dumper/gl3d_train.py", line 192, in format_seq
with h5py.File(os.path.join(self.config['feature_dump_dir'],fea_path1),'r') as fea1,
File "/usr/local/lib/python3.6/dist-packages/h5py/_hl/files.py", line 394, in __init__
swmr=swmr)
File "/usr/local/lib/python3.6/dist-packages/h5py/_hl/files.py", line 170, in make_fid
fid = h5f.open(name, flags, fapl=fapl)
File "h5py/_objects.pyx", line 54, in h5py._objects.with_phil.wrapper
File "h5py/_objects.pyx", line 55, in h5py._objects.with_phil.wrapper
File "h5py/h5f.pyx", line 85, in h5py.h5f.open
OSError: Unable to open file (unable to open file: name = '/home/sshuang/SGMNet/datadump/dump_desc/000000000000000000000009/00000007.jpg_sp_500.hdf5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
"""
The above exception was the direct cause of the following exception:
Traceback (most recent call last):
File "dump.py", line 27, in <module>
dataset.format_dump_data()
File "/home/sshuang/SGMNet/datadump/dumper/gl3d_train.py", line 244, in format_dump_data
pool.map(self.format_seq,indices)
File "/usr/lib/python3.6/multiprocessing/pool.py", line 266, in map
return self._map_async(func, iterable, mapstar, chunksize).get()
File "/usr/lib/python3.6/multiprocessing/pool.py", line 644, in get
raise self._value
OSError: Unable to open file (unable to open file: name = '/home/sshuang/SGMNet/datadump/dump_desc/000000000000000000000009/00000007.jpg_sp_500.hdf5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
thanks!
Why can't I find the hdf5 file? Does the code not generate hdf5 files in the folder?
Traceback (most recent call last):
File "dump.py", line 27, in <module>
dataset.format_dump_data()
File "/data5/huZhao/code/SGMNet/datadump/dumper/gl3d_train.py", line 244, in format_dump_data
pool.map(self.format_seq,indices)
File "/data5/huZhao/anaconda3/envs/sgmnet/lib/python3.7/multiprocessing/pool.py", line 268, in map
return self._map_async(func, iterable, mapstar, chunksize).get()
File "/data5/huZhao/anaconda3/envs/sgmnet/lib/python3.7/multiprocessing/pool.py", line 657, in get
raise self._value
FileNotFoundError: [Errno 2] Unable to open file (unable to open file: name = '/data5/huZhao/code/GL3D-2/dump_desc_dir/000000000000000000000009/00000010.jpg_root_1000.hdf5', errno = 2, error message = 'No such file or directory', flags = 0, o_flags = 0)
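The "No such file or directory" errors above mean the per-image feature files were never written under feature_dump_dir, typically because the feature-extraction stage did not run (or wrote to a different directory) before dump.py. Judging from the tracebacks, the expected file name follows an `<image>_<suffix>_<num_kpts>.hdf5` pattern; a hypothetical stdlib helper to list what is missing before re-running:

```python
import os

def missing_feature_files(feature_dump_dir, seq, images, suffix="root", num_kpts=1000):
    """List images whose dumped feature file (e.g. 00000010.jpg_root_1000.hdf5,
    as seen in the traceback) is absent, so feature extraction can be re-run
    before the dataset dump."""
    missing = []
    for img in images:
        fname = f"{img}_{suffix}_{num_kpts}.hdf5"
        if not os.path.isfile(os.path.join(feature_dump_dir, seq, fname)):
            missing.append(fname)
    return missing

# Scratch example standing in for feature_dump_dir:
import tempfile
dump_dir = tempfile.mkdtemp()
seq_dir = os.path.join(dump_dir, "000000000000000000000009")
os.makedirs(seq_dir)
open(os.path.join(seq_dir, "00000010.jpg_root_1000.hdf5"), "w").close()
print(missing_feature_files(dump_dir, "000000000000000000000009",
                            ["00000010.jpg", "00000011.jpg"]))
```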
Hi! Thank you so much for releasing the code.
Your paper is very impressive and contains so many interesting findings.
Here, I would like to kindly ask for the following setting in using SGMNet.
I have tried running your work on HPatches, and it seems to work pretty well with the SIFT setting (but with the dimension switched to 256).
However, I don't know what the proper setting is.