Comments (9)
Sorry for the late reply. This repo provides evaluation for the CMU Seasons dataset, not the extended version. We used this script to generate a SIFT database from the `.nvm` file; it is called from the reconstruction script. I think there is no `sift_queries.db` file, this must be a typo in the documentation.
from hfnet.
Thank you for your reply. I used the `magic_cmu_to_db.py` script to generate the `sift_database.db` for each slice, and found the `create_cmu_query_db.py` script for the `sift_queries.db` (not sure if it's needed, like you said, but I generated them anyway). Now trying to run the evaluation, hopefully that works!
When running `evaluate_cmu.py`, I get the following error, which seems to be related to PCA:

```
ValueError: n_components=1024 must be between 0 and min(n_samples, n_features)=446 with svd_solver='full'
```

The `min(n_samples, n_features)` number changes depending on the slice. Have you encountered this, or do you know how to fix the issue? I saw in another issue that you had mentioned possibly disabling PCA?
I've just run into the same problem. Have you solved it yet?
Hi SheldonHS, so sorry, I didn't end up solving this because it wasn't critical for me, and I've moved on to trying to run hfnet on a custom dataset. Let me know if you solve it, though!
Hi SheldonHS, is it resolved? I also encountered the same problem.
Hi @SheldonHS and @bxh1,
Not sure if you've solved it already, but I just found that disabling PCA does in fact allow the evaluation to run! In line 30 of `evaluate_cmu.py` I set `pca_dim` to 0 (instead of 1024).
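For anyone still hitting this: scikit-learn's PCA requires `n_components` to be at most `min(n_samples, n_features)`, and small slices have fewer than 1024 database images. Instead of disabling PCA entirely, one could clamp the dimensionality per slice. A minimal sketch of that idea (the function name and shapes here are illustrative, not the actual hfnet code):

```python
import numpy as np
from sklearn.decomposition import PCA


def reduce_descriptors(descriptors, pca_dim=1024):
    """Project descriptors with PCA, clamping n_components so it never
    exceeds min(n_samples, n_features), which scikit-learn enforces."""
    if pca_dim <= 0:
        return descriptors  # PCA disabled, matching pca_dim = 0
    n_components = min(pca_dim, descriptors.shape[0], descriptors.shape[1])
    pca = PCA(n_components=n_components, svd_solver='full')
    return pca.fit_transform(descriptors)


# e.g. a slice with only 446 database images of 4096-D global descriptors:
descs = np.random.RandomState(0).rand(446, 4096)
reduced = reduce_descriptors(descs, pca_dim=1024)
print(reduced.shape)  # (446, 446): clamped to n_samples
```

The trade-off is that retrieval descriptors then have different dimensionality per slice, so this only works if each slice is evaluated independently.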
Hi! After setting `pca_dim` to 0, I got another error. Could anyone help me? Thanks!

```
Traceback (most recent call last):
  File "colmap-helpers/magic_cmu_to_db.py", line 68, in <module>
    main()
  File "colmap-helpers/magic_cmu_to_db.py", line 37, in main
    image_id = db.add_image(name, camera_id)
  File "/dockerdata/xinkong/hfnet/colmap-helpers/internal/db_handling.py", line 178, in add_image
    prior_q[3], prior_t[0], prior_t[1], prior_t[2]))
sqlite3.IntegrityError: UNIQUE constraint failed: images.name
```
This is probably a different issue. It says that the entry already exists in the COLMAP database. Maybe the script crashed and you relaunched it without deleting the corresponding db?
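To illustrate what that error means, here is a toy sketch (a simplified stand-in, not the real COLMAP schema): the `images` table keeps image names unique, so re-running an import against a leftover `.db` re-inserts the same name and fails exactly like this:

```python
import sqlite3

# Toy table mirroring the UNIQUE constraint on images.name in a
# COLMAP database (simplified; the real schema has more columns).
conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE images '
             '(image_id INTEGER PRIMARY KEY, name TEXT UNIQUE, camera_id INTEGER)')


def add_image(conn, name, camera_id):
    cur = conn.execute('INSERT INTO images (name, camera_id) VALUES (?, ?)',
                       (name, camera_id))
    return cur.lastrowid


add_image(conn, 'slice2/query/img_0001.jpg', 1)
try:
    # A second run of the import script against the same .db file
    # inserts the same image name again and trips the constraint.
    add_image(conn, 'slice2/query/img_0001.jpg', 1)
except sqlite3.IntegrityError as err:
    print(err)  # UNIQUE constraint failed: images.name
```

So the simplest fix is to delete the half-written database file before relaunching the script.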