
Comments (11)

MenghaoGuo commented on July 17, 2024

Hi, thanks for your attention.

  1. I visualize the attention map by computing the relationships between different points using the PCT model (including neighbor embedding).
  2. The norm dimension is right. The first softmax does not perform normalization; its role is to eliminate the impact of scale. The second normalization performs the actual normalization. For a detailed explanation, please read this paper.
  3. 12G of memory seems not enough for part segmentation. We run our experiments on a 3090 or RTX.
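
A minimal numpy sketch of the double normalization described in point 2 (the function name and the 2-D single-sample layout are illustrative; the actual PCT code operates on batched (B, N, N) tensors):

```python
import numpy as np

def normalize_attention(energy):
    """energy: (N, N) raw query-key scores for one sample."""
    # First: a softmax along the last axis removes the effect of scale
    e = np.exp(energy - energy.max(axis=-1, keepdims=True))
    attention = e / e.sum(axis=-1, keepdims=True)
    # Second: an L1 normalization along the other axis performs the
    # actual normalization, so each column sums to 1
    attention = attention / (1e-9 + attention.sum(axis=0, keepdims=True))
    return attention

A = normalize_attention(np.random.randn(4, 4))
```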

from pct.

ja604041062 commented on July 17, 2024

Thanks for your reply!
But you have four attention layers, so which attention layer are you visualizing?
Do you visualize the other attention layers as well?
In my experiments, the 1st and 3rd attention layers attend only to neighboring points (the 1st much more widely than the 3rd), while the other attention layers focus on irrelevant areas. I don't know what the 2nd and 4th attention blocks mean. (I visualize the SPCT from your paper.)


MenghaoGuo commented on July 17, 2024
  1. We visualize the average of 4 attention layers.
  2. Yes, I also tried to visualize all the attention layers of PCT, and they produce similar results. However, we did not try to visualize SPCT.
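
The averaging mentioned in point 1 can be sketched as follows (assuming each layer yields an N×N attention map; random stand-in data):

```python
import numpy as np

N = 6
# stand-ins for the four layers' (N, N) attention maps
layer_maps = [np.random.rand(N, N) for _ in range(4)]

# element-wise sum over the four maps, divided by 4
avg_attention = sum(layer_maps) / 4
```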


ja604041062 commented on July 17, 2024

How do you visualize the average of the 4 attention layers? Do you element-wise add all four N*N attention maps, divide by 4, and visualize the result?

OK, it seems that PCT is much stronger than SPCT, haha.

Thank you for your reply!


suyukun666 commented on July 17, 2024


Hi! Can you share your visualization.py? I have tried many times but failed. Many thanks!


ja604041062 commented on July 17, 2024

```python
import matplotlib
matplotlib.use('TkAgg')
import matplotlib.pyplot as plt
from matplotlib import cm

def Visualize_Attention_Map(u, xyz, attention, axis):
    # xyz: (3, N) tensor of point coordinates -> (N, 3) for plotting
    coordi = xyz.cpu().permute(1, 0)
    fig = plt.figure(dpi=100, frameon=False)
    ax = fig.add_subplot(projection='3d')

    if not axis:
        plt.axis('off')

    # Normalize the attention weights to [0, 1] and map them to colors
    the_fourth_dimension = attention.cpu().numpy()
    the_fourth_dimension = (the_fourth_dimension - the_fourth_dimension.min()) / \
                           (the_fourth_dimension.max() - the_fourth_dimension.min())
    colors = cm.cividis(the_fourth_dimension)

    # Plot all points colored by attention, highlight the query point in red
    ax.scatter(coordi[:, 0], coordi[:, 1], coordi[:, 2], c=colors, marker='o', s=10)
    ax.scatter(coordi[u, 0], coordi[u, 1], coordi[u, 2], c='r', s=100)

    # Add a colorbar for the attention weights
    colmap = cm.ScalarMappable(cmap=cm.cividis)
    colmap.set_array(the_fourth_dimension)
    fig.colorbar(colmap, ax=ax)
    plt.show()
```

Inputs:
  u: (0 to N-1) index of the query point you want to inspect
  xyz: (3, N) the coordinates of the points
  attention: (N,) the attention weights of the query point over all points
  axis: (True or False) whether to draw the axes

I use matplotlib to plot the map, and here is the code. Tell me if you have any problems.


suyukun666 commented on July 17, 2024


Thanks!! But I have a problem. The attention tensors are of shape (Batch, 256, 256) (the same for sa1, sa2, sa3, and sa4), but your instructions say attention should be of shape (N,). How should I convert it?


ja604041062 commented on July 17, 2024

You can just call this function from your model. Here is my partial code:

```python
#################################### partial code for model ####################################
x, attention1 = self.sa1(x)  # each SA layer also returns its (B, N, N) attention map
x, attention2 = self.sa2(x)
x, attention3 = self.sa3(x)
x, attention4 = self.sa4(x)

#################################### full code for visualize ####################################
for i in range(2048):
    coordi = xyz[8].cpu()        # coordinates of sample 8 in the batch
    atten = attention1[8, i, :]  # row i of the first attention map
    Visualize_Attention_Map(i, coordi, atten, False)
```

I simply return the attention maps and use a for loop to visualize the attention map for each point.
See the inputs of Visualize_Attention_Map and you will see why the size of attention is N.
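
To make the shapes concrete: slicing one batch sample and one query row reduces a (B, N, N) attention tensor to the (N,) vector the plotting function expects (a numpy sketch with stand-in data; the indices are illustrative):

```python
import numpy as np

B, N = 16, 256
attention1 = np.random.rand(B, N, N)       # stand-in for one layer's attention output
attention1 /= attention1.sum(axis=-1, keepdims=True)

b, i = 8, 0                                # batch sample and query-point index
per_point = attention1[b, i, :]            # shape (N,): weights of point i over all N points
```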


queenie88 commented on July 17, 2024


I used this code to visualize the attention maps, but the results are not the same as in the paper. Did you manage to reproduce the results? I look forward to your reply!


mmiku1 commented on July 17, 2024

Could you release your PyTorch reimplementation of the part segmentation code? My own reproduction performs very poorly. If possible, I would like to see your reproduced code. Thank you very much.


mmiku1 commented on July 17, 2024

Hi, first of all I want to thank you for the proposed method, which benefited me a lot. I reproduced your code in PyTorch and tried to visualize the attention map in the part segmentation task, but when I use the right wing as the query point, it does not attend to the left wing as in the visualization in your paper. So I want to know how you produced the visualization results shown in the paper.

In addition, another issue points out that the dimension of the softmax is wrong: since your multiplication is Value * Attention, I think the softmax dimension in the attention should be 1, not -1 (or 2); please correct me if I am mistaken. Also, the dimensions of the softmax and the L1 norm differ (the softmax uses -1 but the L1 norm uses 1). Why?

    Line 211: self.softmax = nn.Softmax(dim=-1)
    Line 220: attention = attention / (1e-9 + attention.sum(dim=1, keepdims=True))

Also, I want to know how you do neighbor embedding in part segmentation. The paper says the number of output points is N, which means you do not downsample the points yet still apply the SG (sampling and grouping) module twice. But when I reproduce the same method, I get CUDA out of memory on an RTX 2080Ti (VRAM: 12G). Is my VRAM not big enough, or is my understanding of the paper's description wrong?

I'm looking forward to your reply, and thank you for your contribution.

Can you release the part segmentation code you reproduced in PyTorch? The results of my own reproduction are very poor. If possible, I hope to see your reproduced code. Thank you very much.

