cuishuhao / BNM
Code of "Towards Discriminability and Diversity: Batch Nuclear-norm Maximization under Label Insufficient Situations" (CVPR 2020 oral)
License: MIT License
Hi, I would like to know where I can find the supplementary material for the paper; I would like to understand the derivation of the theory. Thanks!
Hi, I noticed that you seem to have participated in the VisDA-2021 Challenge:
https://competitions.codalab.org/competitions/33396#results
If that wasn't you, apologies for the bother!
If it was, I would like to ask whether you used the BNM technique to achieve such a good AUROC score, and to learn about the specific approach.
Thank you for your excellent work.
I also have a question: can we replace the nuclear norm with the rank of the matrix, and push the rank as close as possible to the number of classes?
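For context, the rank is integer-valued and has no useful gradient, so it cannot be maximized directly by SGD; the nuclear norm is its standard convex surrogate. A tiny sketch of the two quantities in PyTorch (illustrative only, not from the repo):

```python
import torch

A = torch.randn(8, 4)
rank = torch.linalg.matrix_rank(A)       # integer-valued, non-differentiable
nuc = torch.linalg.norm(A, ord='nuc')    # sum of singular values, differentiable
```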
Hi, first of all, thanks for the code and your paper; it's really excellent work.
While debugging, I found that the loss value is negative. Is that right?
I debugged BNM in DA on the Office-31 dataset, with Amazon as source and DSLR as target.
The "transfer loss" reaches about -0.8 and the classifier loss about 0.02 after 2000 iterations.
I also found that if we only compute the classifier loss, the target accuracy reaches 100% near iteration 1800, which suggests that cutting the BNM loss does no harm when transferring the net from source to target.
Am I missing something?
By the way, we are using PyTorch 1.9.0.
We look forward to your reply. Thanks again.
In the code, you compute transfer_loss = -torch.mean(s_tgt), using the mean rather than the sum. Should this actually be the sum, or the mean?
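For reference, the negative value follows directly from the construction: BNM maximizes the nuclear norm of the softmax prediction matrix by minimizing its negative, so the transfer loss is negative by design. A minimal sketch (shapes and variable names are hypothetical, not copied from the repo):

```python
import torch
import torch.nn.functional as F

logits = torch.randn(32, 10)          # batch of 32 samples, 10 classes (hypothetical)
probs = F.softmax(logits, dim=1)      # prediction matrix A, shape (B, C)
s = torch.linalg.svdvals(probs)       # singular values, length min(B, C)
transfer_loss = -torch.mean(s)        # negative by construction: s_i >= 0
```

Since singular values are non-negative and the softmax matrix is nonzero, `transfer_loss` is always negative; minimizing it maximizes the nuclear norm.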
Hi @cuishuhao, thanks for the code release.
I have some confusion about the details of the paper.
According to inequality (5) in the original paper, the two norms bound each other. Does that mean optimizing the F-norm is equivalent to optimizing the nuclear norm?
Thanks
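For what it's worth, the two-sided bound in question, ‖A‖_F ≤ ‖A‖_* ≤ √D·‖A‖_F with D = min(B, C) (my reading of inequality (5)), can be checked numerically:

```python
import torch

A = torch.randn(16, 5)
s = torch.linalg.svdvals(A)
fro = torch.linalg.norm(A)              # Frobenius norm: sqrt(sum of s_i^2)
nuc = s.sum()                           # nuclear norm: sum of s_i
D = min(A.shape)                        # here min(16, 5) = 5
lower_ok = fro <= nuc + 1e-5            # ||A||_F <= ||A||_*
upper_ok = nuc <= (D ** 0.5) * fro + 1e-4   # ||A||_* <= sqrt(D) * ||A||_F
```

The bound shows the two norms move together only up to a factor of √D, which is why they need not have their optima at exactly the same point.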
Thank you very much for the excellent work and for sharing the code so generously!
In Section 3.1 you mention that the maximum of the F-norm and the minimum of entropy can be achieved at the same value, with details in the Supplementary. Where can I find the Supplementary? I couldn't find it on the arXiv page.
Sorry for the bother, and thanks again.
There is a visda-2017 list in DA/data/, but no corresponding training script in the README. Does this code support training on the VisDA dataset?
Hi @cuishuhao, thanks for your code implementation. When I tried to reproduce the results in the paper on the Office-31 dataset, specifically transferring from DSLR to Amazon, the final accuracy was around 69% (CDAN+BNM). Could you try to reproduce the results with this version of the code and post them here? Did you change any hyperparameters for this transfer scenario?
Thanks,
Hi:
I'm impressed by your paper and am trying to apply the BNM loss to a domain adaptation problem. However, the torch.svd() function often crashes; my batch size and feature dimension are 128×1000. How did you work around the torch.svd() problem?
Best wishes!
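Two workarounds are commonly suggested for SVD convergence failures of this kind: add tiny jitter to break exact degeneracies, and fall back to CPU when the GPU solver fails. A sketch of that pattern (these are general fixes, not the authors' official solution):

```python
import torch

def robust_svdvals(A, eps=1e-4):
    """Singular values with two common workarounds for SVD
    convergence failures: jitter, then CPU fallback."""
    try:
        return torch.linalg.svdvals(A)
    except RuntimeError:
        jittered = A + eps * torch.randn_like(A)   # break exact degeneracies
        return torch.linalg.svdvals(jittered.cpu()).to(A.device)

s = robust_svdvals(torch.randn(128, 1000))         # length min(128, 1000) = 128
```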
Hi, I am studying your approach through your implementation. In the paper you use Equation (12) to compute the BNM loss, where the divisor is the batch size B. But in BNM/DA/BNM/train_image.py (line 164) the loss is computed with torch.mean(). If the class number C is smaller than the batch size, the SVD produces an s_tgt of length C instead of B, so torch.mean() divides by C. Wouldn't that be inconsistent with the original equation? Why not divide explicitly by the batch size?
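To make the discrepancy concrete (a sketch, not the repo's code): with B > C, the SVD returns min(B, C) = C singular values, so torch.mean() divides by C while a literal reading of Eq. (12) divides by B. The two losses differ only by the constant factor C/B:

```python
import torch
import torch.nn.functional as F

B, C = 36, 12                                  # batch larger than class count
probs = F.softmax(torch.randn(B, C), dim=1)
s = torch.linalg.svdvals(probs)                # length min(B, C) = C

loss_mean = -s.mean()                          # divides by C, as torch.mean() does
loss_eq12 = -s.sum() / B                       # divides by batch size B
```

Because the difference is a fixed scale factor, it mainly interacts with the loss-weighting hyperparameter rather than changing the optimization direction.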
Hello, if I have a tensor of size [8, 3, 512, 512], how can I compute BNM?
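BNM is defined on a 2-D prediction matrix, so a 4-D tensor has to be flattened first. One common choice (my assumption, not prescribed by the paper) is to treat every spatial location as a sample and the channel axis as the class axis:

```python
import torch
import torch.nn.functional as F

x = torch.randn(8, 3, 512, 512)               # (N, C, H, W)
A = x.permute(0, 2, 3, 1).reshape(-1, 3)      # (N*H*W, C): each pixel is a "sample"
A = F.softmax(A, dim=1)                       # rows become class distributions
loss = -torch.linalg.svdvals(A).mean()        # BNM-style loss on the flat matrix
```

In practice one would usually subsample the pixels (or compute per-image) to keep the SVD cheap.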
Hello, the diversity ratio is measured as the mean number of predicted categories divided by the mean number of ground-truth categories. Is there code to calculate the diversity ratio? Thanks.
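As I read that definition, per batch it compares the count of distinct predicted classes with the count of distinct ground-truth classes (the paper averages these counts over batches). A minimal per-batch sketch (my interpretation, not the authors' code):

```python
import torch

def diversity_ratio(logits, labels):
    """Distinct predicted classes / distinct ground-truth classes in one batch."""
    pred = logits.argmax(dim=1)
    return pred.unique().numel() / labels.unique().numel()

logits = torch.tensor([[2.0, 0.1, 0.1],
                       [0.1, 2.0, 0.1],
                       [2.0, 0.1, 0.1]])      # predicts classes {0, 1}
labels = torch.tensor([0, 1, 2])              # ground truth covers {0, 1, 2}
ratio = diversity_ratio(logits, labels)       # 2 / 3
```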