
Comments (14)

Relja commented on September 3, 2024

Did the serialAllFeats print any messages like not finding files? Are the sizes of the .bin files reasonable, i.e. num_images x dimensionality x 4 bytes?
I think if you don't have the images (to get them you should email me) or paths are not properly set in localPaths.m, then the feature extraction will silently fail and performance will be 0. Or maybe it failed the first time, then you set up the paths correctly, but the features were not recomputed (it looks if the .bin file is there and doesn't recompute it - this is bad if the existing file is corrupt) so the performance is still 0.
So - delete the .bin files, make sure dataset images are there and the paths are correct, rerun serialAllFeats, make sure the .bin sizes are correct, and see if this fixes it.
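That size check can be scripted. A minimal sketch, assuming the .bin files are raw single-precision floats as described above (the file names are the ones appearing in the log later in this thread, and dim= 4096 is the usual whitened NetVLAD dimensionality; adjust both to your setup):

```matlab
% Sanity-check the .bin sizes: num_images x dimensionality x 4 bytes (single).
% File names and dim below are assumptions -- adjust to your setup.
dbBin= '.\Data\netvlad\output\vd16_tokyoTM_conv5_3_vlad_preL2_intra_white_tokyo247_db.bin';
qBin=  '.\Data\netvlad\output\vd16_tokyoTM_conv5_3_vlad_preL2_intra_white_tokyo247_q.bin';
dim= 4096; % dimensionality of the whitened descriptor (check your network)
d= dir(dbBin); q= dir(qBin);
fprintf('db: %g images implied, q: %g images implied\n', ...
    d.bytes/(dim*4), q.bytes/(dim*4));
% The implied counts should exactly match the database / query set sizes;
% a non-integer value means the file is truncated or dim is wrong.
```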

from netvlad.

ionLi commented on September 3, 2024

I inspected the code and ran it again; the printed output is shown at the end. I don't think there is any problem with localPaths.m.
And we have obtained two .bin files:

[screenshot: the two generated .bin files]

The only place I haven't configured is paths.pretrainedCNNs; I don't know whether that could cause the wrong result:
[screenshot: localPaths.m with paths.pretrainedCNNs unset]

The printed output:

```
lujing =

'E:\学术和工作\netvlad\code_run\netvlad_v103\netvlad_v103\Databases\Tokyo247\'

serialAllFeats: Start
serialAllFeats 19-Jan-2023 10:44:32 1 / 7599
serialAllFeats 19-Jan-2023 10:45:26 2 / 7599; time 00:00:54; left 114:39:41; avg 54.3276 s
serialAllFeats 19-Jan-2023 10:45:27 3 / 7599; time 00:00:55; left 58:03:11; avg 27.5098 s
serialAllFeats 19-Jan-2023 10:45:27 4 / 7599; time 00:00:55; left 38:58:44; avg 18.4735 s
serialAllFeats 19-Jan-2023 10:45:29 8 / 7599; time 00:00:57; left 17:10:43; avg 8.1459 s
serialAllFeats 19-Jan-2023 10:45:32 16 / 7599; time 00:01:00; left 08:27:16; avg 4.0133 s
serialAllFeats 19-Jan-2023 10:45:38 32 / 7599; time 00:01:06; left 04:30:54; avg 2.1478 s
serialAllFeats 19-Jan-2023 10:45:51 64 / 7599; time 00:01:19; left 02:38:00; avg 1.2580 s
serialAllFeats 19-Jan-2023 10:46:16 128 / 7599; time 00:01:44; left 01:42:41; avg 0.8246 s
serialAllFeats 19-Jan-2023 10:47:08 256 / 7599; time 00:02:35; left 01:14:49; avg 0.6113 s
serialAllFeats 19-Jan-2023 10:48:50 512 / 7599; time 00:04:18; left 00:59:43; avg 0.5056 s
serialAllFeats 19-Jan-2023 10:50:29 759 / 7599; time 00:05:57; left 00:53:46; avg 0.4716 s
serialAllFeats 19-Jan-2023 10:52:16 1024 / 7599; time 00:07:44; left 00:49:43; avg 0.4537 s
serialAllFeats 19-Jan-2023 10:55:35 1518 / 7599; time 00:11:03; left 00:44:18; avg 0.4371 s
serialAllFeats 19-Jan-2023 10:59:11 2048 / 7599; time 00:14:39; left 00:39:44; avg 0.4294 s
serialAllFeats 19-Jan-2023 11:00:44 2277 / 7599; time 00:16:12; left 00:37:54; avg 0.4274 s
serialAllFeats 19-Jan-2023 11:05:54 3036 / 7599; time 00:21:22; left 00:32:08; avg 0.4225 s
serialAllFeats 19-Jan-2023 11:11:02 3795 / 7599; time 00:26:30; left 00:26:35; avg 0.4193 s
serialAllFeats 19-Jan-2023 11:13:04 4096 / 7599; time 00:28:32; left 00:24:25; avg 0.4182 s
serialAllFeats 19-Jan-2023 11:16:11 4554 / 7599; time 00:31:39; left 00:21:10; avg 0.4172 s
serialAllFeats 19-Jan-2023 11:21:19 5313 / 7599; time 00:36:47; left 00:15:50; avg 0.4155 s
serialAllFeats 19-Jan-2023 11:26:29 6072 / 7599; time 00:41:57; left 00:10:33; avg 0.4147 s
serialAllFeats 19-Jan-2023 11:31:40 6831 / 7599; time 00:47:08; left 00:05:18; avg 0.4141 s
serialAllFeats 19-Jan-2023 11:36:51 7590 / 7599; time 00:52:19; left 00:00:04; avg 0.4137 s
serialAllFeats 19-Jan-2023 11:36:55 7599 / 7599; time 00:52:23; left 00:00:00; avg 0.4137 s
serialAllFeats: Done
serialAllFeats: Start
serialAllFeats 19-Jan-2023 11:36:55 1 / 315
serialAllFeats 19-Jan-2023 11:36:57 2 / 315; time 00:00:01; left 00:09:22; avg 1.7901 s
serialAllFeats 19-Jan-2023 11:36:59 3 / 315; time 00:00:03; left 00:09:20; avg 1.7917 s
serialAllFeats 19-Jan-2023 11:37:01 4 / 315; time 00:00:05; left 00:09:14; avg 1.7758 s
serialAllFeats 19-Jan-2023 11:37:08 8 / 315; time 00:00:12; left 00:09:04; avg 1.7681 s
serialAllFeats 19-Jan-2023 11:37:21 16 / 315; time 00:00:26; left 00:08:41; avg 1.7370 s
serialAllFeats 19-Jan-2023 11:37:47 31 / 315; time 00:00:51; left 00:08:06; avg 1.7076 s
serialAllFeats 19-Jan-2023 11:37:48 32 / 315; time 00:00:52; left 00:08:02; avg 1.6988 s
serialAllFeats 19-Jan-2023 11:38:37 62 / 315; time 00:01:41; left 00:07:03; avg 1.6654 s
serialAllFeats 19-Jan-2023 11:38:40 64 / 315; time 00:01:45; left 00:07:00; avg 1.6668 s
serialAllFeats 19-Jan-2023 11:39:29 93 / 315; time 00:02:33; left 00:06:12; avg 1.6726 s
serialAllFeats 19-Jan-2023 11:40:18 124 / 315; time 00:03:22; left 00:05:15; avg 1.6443 s
serialAllFeats 19-Jan-2023 11:40:24 128 / 315; time 00:03:28; left 00:05:09; avg 1.6452 s
serialAllFeats 19-Jan-2023 11:41:08 155 / 315; time 00:04:12; left 00:04:23; avg 1.6395 s
serialAllFeats 19-Jan-2023 11:41:59 186 / 315; time 00:05:04; left 00:03:33; avg 1.6435 s
serialAllFeats 19-Jan-2023 11:42:51 217 / 315; time 00:05:55; left 00:02:42; avg 1.6442 s
serialAllFeats 19-Jan-2023 11:43:43 248 / 315; time 00:06:47; left 00:01:52; avg 1.6495 s
serialAllFeats 19-Jan-2023 11:43:56 256 / 315; time 00:07:00; left 00:01:38; avg 1.6496 s
serialAllFeats 19-Jan-2023 11:44:34 279 / 315; time 00:07:38; left 00:01:00; avg 1.6482 s
serialAllFeats 19-Jan-2023 11:45:28 310 / 315; time 00:08:32; left 00:00:09; avg 1.6600 s
serialAllFeats 19-Jan-2023 11:45:37 315 / 315; time 00:08:41; left 00:00:01; avg 1.6604 s
serialAllFeats: Done
testFromFn:
.\Data\netvlad\output\vd16_tokyoTM_conv5_3_vlad_preL2_intra_white_tokyo247_db.bin
.\Data\netvlad\output\vd16_tokyoTM_conv5_3_vlad_preL2_intra_white_tokyo247_q.bin
NaN 19-Jan-2023 11:45:39 1 / 315
0.0000 19-Jan-2023 11:45:39 2 / 315; time 00:00:00; left 00:02:19; avg 0.4454 s
0.0000 19-Jan-2023 11:45:40 3 / 315; time 00:00:00; left 00:02:15; avg 0.4315 s
0.0000 19-Jan-2023 11:45:40 4 / 315; time 00:00:01; left 00:02:12; avg 0.4245 s
0.0000 19-Jan-2023 11:45:42 8 / 315; time 00:00:02; left 00:02:06; avg 0.4105 s
0.0000 19-Jan-2023 11:45:45 16 / 315; time 00:00:06; left 00:02:00; avg 0.4030 s
0.0000 19-Jan-2023 11:45:51 31 / 315; time 00:00:11; left 00:01:53; avg 0.3993 s
0.0000 19-Jan-2023 11:45:51 32 / 315; time 00:00:12; left 00:01:53; avg 0.3990 s
0.0000 19-Jan-2023 11:46:03 62 / 315; time 00:00:24; left 00:01:40; avg 0.3969 s
0.0000 19-Jan-2023 11:46:04 64 / 315; time 00:00:25; left 00:01:40; avg 0.3969 s
0.0000 19-Jan-2023 11:46:16 93 / 315; time 00:00:36; left 00:01:28; avg 0.3979 s
0.0000 19-Jan-2023 11:46:28 124 / 315; time 00:00:49; left 00:01:16; avg 0.3986 s
0.0000 19-Jan-2023 11:46:30 128 / 315; time 00:00:50; left 00:01:14; avg 0.3987 s
0.0000 19-Jan-2023 11:46:40 155 / 315; time 00:01:01; left 00:01:04; avg 0.3997 s
0.0000 19-Jan-2023 11:46:53 186 / 315; time 00:01:13; left 00:00:51; avg 0.3991 s
0.0000 19-Jan-2023 11:47:05 217 / 315; time 00:01:26; left 00:00:39; avg 0.3986 s
0.0000 19-Jan-2023 11:47:17 248 / 315; time 00:01:38; left 00:00:27; avg 0.3986 s
0.0000 19-Jan-2023 11:47:21 256 / 315; time 00:01:41; left 00:00:23; avg 0.3991 s
0.0000 19-Jan-2023 11:47:30 279 / 315; time 00:01:50; left 00:00:14; avg 0.3990 s
0.0000 19-Jan-2023 11:47:42 310 / 315; time 00:02:03; left 00:00:02; avg 0.3986 s
0.0000 19-Jan-2023 11:47:44 315 / 315; time 00:02:05; left 00:00:00; avg 0.3985 s

rec@10= 0.0000, time= 125.5254 s, avgTime= 398.4932 ms

001 0.0000
002 0.0000
003 0.0000
004 0.0000
005 0.0000
010 0.0000
015 0.0000
020 0.0000
025 0.0000
030 0.0000
035 0.0000
040 0.0000
045 0.0000
050 0.0000
055 0.0000
060 0.0000
065 0.0000
070 0.0000
075 0.0000
080 0.0000
085 0.0000
090 0.0000
095 0.0000
100 0.0000
```

Relja commented on September 3, 2024

Hmm, that all looks fine really, so I'm not sure what to say - I've never had such a complaint. My best guess is that there is some small annoying mistake.

I'm a bit confused by the logs and screenshots, as they don't seem consistent. The folder shows modified times of 18:45 and 19:05, while the logs show computations finishing at 11:36 and 11:45 - the times are different, and the time needed to create the 2nd file is 2x shorter for some reason. So I just wonder if everything is consistent, e.g. whether by accident you switched the networks used to compute the files, etc. What are the actual sizes of the .bin files? E.g. if I'm not mistaken, for Tokyo 24/7 the db file should be around 240x larger than the q file.

If you don't find an error here - can you query with a database image and see if you get a reasonable result?

ionLi commented on September 3, 2024

I haven't solved the problem yet, but I found that the database images are PNGs, while dbImageFns in tokyo247.mat describes the paths with a "jpg" extension (the query images are JPGs). So I converted the database images from PNG to JPG and then reran the code.

So I want to ask: can the image format affect the result of the code?
Thanks!

The dbImageFns entries in tokyo247.mat:
[screenshot: dbImageFns listing .jpg paths]

Relja commented on September 3, 2024

Hi,
Oh, sorry about that - I don't remember the data containing PNGs, and nobody mentioned this to me. I certainly converted to JPG, and it's probably best if you do that too - the code uses vl_imreadjpeg, which, if I recall correctly, is more optimized than MATLAB's generic imread. I don't remember the exact command I used to convert to JPEG, but I believe it was just the default parameters of ImageMagick's convert tool. I never measured the difference between running on PNG versus JPEG - I would be surprised if it mattered for any normal-quality JPEG, and the network was trained on JPEGs anyway, so all should be good.

You wrote "I don't solve the problem" - does the conversion not solve it for you? Or have you not converted to JPEG yet and are still to try?
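For reference, the conversion can also be done directly in MATLAB; a minimal sketch (the folder path is a placeholder, imwrite uses its default JPEG quality, and the recursive `**` wildcard needs MATLAB R2016b or later):

```matlab
% Convert every .png under imgDir to .jpg (default JPEG quality).
% 'imgDir' is a placeholder -- point it at the Tokyo 24/7 image folder.
imgDir= 'path\to\tokyo247\images';
pngs= dir(fullfile(imgDir, '**', '*.png')); % recursive search (R2016b+)
for i= 1:numel(pngs)
    src= fullfile(pngs(i).folder, pngs(i).name);
    dst= [src(1:end-4), '.jpg']; % replace the extension
    imwrite(imread(src), dst, 'jpg');
end
```

Remember to delete the stale .bin files and rerun serialAllFeats afterwards, since existing feature files are not recomputed.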

ionLi commented on September 3, 2024

Sorry, I have only just checked my GitHub notifications.
I haven't solved the problem - there is still no recall. The Tokyo 24/7 db file is indeed about 240x larger than the q file, but I still can't get any recall.

Relja commented on September 3, 2024

That's really strange... Sorry, it's very hard for me to check it myself because I don't have a computer with a GPU that can run MATLAB (nor a MATLAB licence).
Since the files exist and look to be the correct size, feature extraction doesn't seem to be the problem (presumably you reran feature extraction after fixing png->jpg, because otherwise I don't see how it would work), unless the features are somehow nonsense. Can you try visually inspecting the process: for example, take a few queries, do the retrieval, and see what the retrieved images are. Do they look reasonable, are they positives for the query? So: are the features the problem, or is the retrieval wrong for some reason, or the evaluation metric computation?

ionLi commented on September 3, 2024

Thanks for your help. Should I use demoRetrieval.m to get the best matching results? If so, I will try to modify it over the next few days to inspect the process.

Relja commented on September 3, 2024

You can just follow what testFromFn does - it's simple: it loads the query and database features and calls testCore.
Then testCore uses recallAtN to compute the recall, based on the lambda functions that do the search and tell you whether a result is a positive. You instead just do the rawNnSearch yourself. Then look at the top 10 retrieved database IDs for a few queries - do they make sense visually (so: are the features bad, or is something broken in the search itself)? Are they positives according to the ground truth (db.isPosQ(iQuery, iDb) - is this broken)? Or, if all that looks fine, is recallAtN broken?
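A minimal sketch of that inspection, assuming qFeat/dbFeat hold one descriptor per column (as loaded in testFromFn) and that rawNnSearch returns the top-n database IDs for a query descriptor - check the actual signature in the code before running:

```matlab
% Hypothetical inspection loop: manually look at top-10 retrievals for a few
% queries and whether the ground truth marks them as positives.
nTop= 10;
for iQuery= 1:5
    ids= rawNnSearch(qFeat(:,iQuery), dbFeat, nTop); % assumed: returns sorted db IDs
    isPos= arrayfun(@(iDb) db.isPosQ(iQuery, iDb), ids(1:nTop));
    fprintf('query %d: top-10 db IDs = %s, positives = %s\n', ...
        iQuery, mat2str(ids(1:nTop)), mat2str(isPos));
    % Also open the corresponding database images to judge the matches visually.
end
```

If the retrieved images look visually correct but isPos is always false, the ground-truth check is the suspect; if the retrievals are visually wrong, the features or the search are.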

ionLi commented on September 3, 2024

Sorry, I was preparing for an exam for a while.

I found that there might be a problem with the feature extraction: when I did the feature query matching, I found that many of the features were exactly the same. For example, the Tokyo247 dataset has 75981 images in total and 315 images in the query set, but most of the columns in the dbFeat and qFeat arrays are identical.
[screenshot: identical columns in dbFeat and qFeat]

I think the problem is in the function vl_simplenn. If I understand correctly, setting the batch size to 10 extracts the features of 10 images at once. The code first subtracts the mean value of the 10 images in the three colour channels, then stacks them along the fourth dimension and feeds them to vl_simplenn, but the output res is the same every time; after reshaping into the corresponding 10 columns, they are all the same vector.
I don't know what causes this phenomenon.
Thanks!

Relja commented on September 3, 2024

OK, so feature extraction is broken somehow if it produces the same features for everything. I'm not sure what the reason would be, but this should be easy to experiment with, as at least you don't have to extract all the features, do the evaluation, etc. Just take a few images, extract their features, and see why they are all given the same descriptor.
E.g. the readme shows you how to do it for individual images:

```
im= vl_imreadjpeg({which('football.jpg')}); im= im{1}; % slightly convoluted because we need the full image path for vl_imreadjpeg, while imread is not appropriate - see help computeRepresentation
feats= computeRepresentation(net, im); % add 'useGPU', false if you want to use the CPU
```
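Building on that snippet, one quick check is to extract descriptors for two different images and verify that they differ (a sketch; the second image path is a placeholder):

```matlab
% Descriptors of two clearly different images should not be (near-)identical.
ims= vl_imreadjpeg({which('football.jpg'), 'path\to\some_other_image.jpg'}); % second path is a placeholder
f1= computeRepresentation(net, ims{1});
f2= computeRepresentation(net, ims{2});
fprintf('descriptor L2 difference: %g\n', norm(f1(:) - f2(:)));
% A near-zero difference would confirm the forward pass is producing constant
% features -- e.g. check that the network weights loaded correctly and that the
% input isn't all-constant after the mean subtraction.
```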

Relja commented on September 3, 2024

Hi, did you get this to work?

ionLi commented on September 3, 2024

> Hi, did you get this to work?

Yes, I have successfully run this on Linux. Thanks for your guidance.
Good luck with your research, sir.

Relja commented on September 3, 2024

Glad to hear that. Thanks, likewise!
