Comments (10)
@harrygcoppock Thanks Harry.
from self-attention-gan.
My model often learns a negative gamma. I do not think this means attention has a negative effect; what matters is the magnitude, since out = input + gamma * attentionOutput.
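For readers new to the layer, here is a minimal sketch of a SAGAN-style self-attention block with the learnable scalar gamma discussed above. This is an illustration, not the repo's exact code; the C//8 channel reduction and the query/key/value names follow the paper's convention:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SelfAttention(nn.Module):
    """Minimal SAGAN-style self-attention with a learnable scalar gamma."""

    def __init__(self, in_dim):
        super().__init__()
        # 1x1 convolutions project features into query/key/value spaces
        self.query = nn.Conv2d(in_dim, in_dim // 8, 1)
        self.key = nn.Conv2d(in_dim, in_dim // 8, 1)
        self.value = nn.Conv2d(in_dim, in_dim, 1)
        # gamma starts at 0, so the block is initially an identity mapping
        self.gamma = nn.Parameter(torch.zeros(1))

    def forward(self, x):
        b, c, h, w = x.size()
        q = self.query(x).view(b, -1, h * w).permute(0, 2, 1)  # B x N x C'
        k = self.key(x).view(b, -1, h * w)                     # B x C' x N
        attn = F.softmax(torch.bmm(q, k), dim=-1)              # B x N x N
        v = self.value(x).view(b, -1, h * w)                   # B x C x N
        out = torch.bmm(v, attn.permute(0, 2, 1)).view(b, c, h, w)
        # The sign of gamma only flips the direction of the attention
        # residual; its magnitude controls how strongly it is mixed in.
        return x + self.gamma * out
```

Because gamma is initialised to zero, the layer is an exact identity at the start of training, which is the behaviour the thread below discusses.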
@harrygcoppock @SeokjuLee Hi, I've also got a negative gamma. May I ask what value you got?
Sorry, I never saved the gamma values and cannot remember. My guess is that it slowly increased to 1 or -1 and plateaued at that magnitude, but that is just a guess.
@harrygcoppock Thanks for your quick reply. I got a gamma of about -0.01. Do you think this is normal?
The point of the gamma parameter is that at the start of training (when gamma = 0) the model can quickly learn the easier convolutional features, then slowly introduce information from the attention layer. I recall the paper argues that learning the attention features is a somewhat harder task. If your gamma value remains low, maybe the model is struggling to make use of the attention layer? However, this is just speculation. You could first investigate the relative magnitudes of the skip connection and of the attention output before gamma is applied: if |attention output| >> |skip connection|, then a gamma of 0.01 could still be significant. Finally, did the gamma value plateau at this level?
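The magnitude comparison suggested above could be checked with a tiny helper. `attention_scale_ratio` is a name invented here for illustration, not a function from the repo:

```python
import torch

def attention_scale_ratio(skip, attn_out):
    """Return |attn_out| / |skip|.

    If this ratio is large, even a small gamma (e.g. 0.01) can make the
    gamma-scaled attention term comparable in size to the skip connection.
    """
    return (attn_out.norm() / skip.norm()).item()

# Toy example: attention output 100x larger than the skip connection.
skip = torch.ones(2, 8, 4, 4)
attn_out = 100.0 * torch.ones(2, 8, 4, 4)
ratio = attention_scale_ratio(skip, attn_out)
# Here gamma = 0.01 would make |gamma * attn_out| equal to |skip|.
```

In a real run you would call this on the tensors inside the attention layer's forward pass, before gamma is applied.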
@harrygcoppock Yes, the value plateaued at this level. I'm a rookie at self-attention, so I have no sense of whether -0.01 is a normal value.
I too am relatively new to the field. If you want to find out why, all I can suggest is my point above, plus the possibility that self-attention simply is not helping you here: maybe the problem set is not a good match, or the attention layer was initialised poorly.
@harrygcoppock Thanks a lot for your kind help! I will do more checks. Besides, may I ask what a normal value of gamma is when self-attention works well? 0.1, or something else?
@csyhping Hi, I can't remember the exact value, but I think I got a similar result. As Harry commented, the magnitude of gamma gradually increased from zero.
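As a toy illustration (not the repo's training loop) of why a gamma initialised at zero can drift to either sign, consider a scalar parameter optimised against a loss whose optimum happens to be negative. Only the magnitude of the plateau matters for how strongly attention contributes:

```python
import torch

# Hypothetical toy setup: a scalar "gamma" starting at zero, with a
# quadratic loss whose minimum is at gamma = -0.5.
gamma = torch.zeros(1, requires_grad=True)
opt = torch.optim.SGD([gamma], lr=0.1)
history = []
for _ in range(50):
    loss = ((gamma + 0.5) ** 2).sum()
    opt.zero_grad()
    loss.backward()
    opt.step()
    history.append(gamma.item())
# gamma drifts smoothly from 0 toward -0.5 and plateaus there,
# mirroring the "gradually increased in magnitude" behaviour above.
```

Nothing in the gamma formulation penalises a negative sign; gradient descent simply moves it toward whichever value lowers the loss.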
Related Issues (20)
- detach fake image when updating the discriminator
- model.py is working only for imsize=64 HOT 1
- Missing one 1x1 conv on output from attention layer? HOT 4
- UnboundLocalError: local variable 'dataset' referenced before assignment HOT 4
- dropbox link missing HOT 1
- The code is different from the original paper HOT 2
- RuntimeError: cublas runtime error : the GPU program failed to execute at /pytorch/aten/src/THC/THCBlas.cu:450 HOT 1
- the meaning of Gamma in Attention model HOT 2
- How to make the repo available for input image of 256x256 size? HOT 1
- torch.bmm(), CUDA out of memory. HOT 3
- Negative self.gamma parameter??
- Confused by self-attention layer positioning in Discriminator HOT 1
- self.gamma*out considered as "in place" operation
- Trying the Self-Attention-GAN with dog images
- Add examples to work with audio files as well
- I have been researching and improving GANs for a long time; if you are interested in GANs or deep learning, contact me. wechat: lovedaixiaobaby
- How can I determine if the model has converged?
- the download.sh can't use