pointsift's People

Contributors

jmydurant

pointsift's Issues

Compile TF operator

Hi, how should the file pointsift_op.py be written? Can you give me some suggestions?
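
Not an official answer, but judging from the tracebacks later in this thread, pointSIFT_op.py is essentially a thin wrapper that loads the compiled shared library with tf.load_op_library. A minimal sketch, assuming tf_pointSIFT_so.so has already been built by tf_pointSIFT_compile.sh; the op attribute name used below is a hypothetical placeholder for whatever the C++ sources actually register:

    # Minimal custom-op wrapper sketch (what a pointSIFT_op.py-style file usually contains).
    import os
    import tensorflow as tf

    BASE_DIR = os.path.dirname(os.path.abspath(__file__))
    # Load the ops compiled by tf_pointSIFT_compile.sh.
    _sift_lib = tf.load_op_library(os.path.join(BASE_DIR, 'tf_pointSIFT_so.so'))

    def pointSIFT_select(xyz, radius):
        # Hypothetical forwarding call; the real attribute name must match the
        # REGISTER_OP name declared in the repository's C++ code.
        return _sift_lib.point_sift_select(xyz, radius)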

Network architecture code issue in 'pointSIFT_pointnet.py'

Thanks for your work! But I think lines 41 to 42 of 'pointSIFT_pointnet.py' should be
_, l2_points_2, _ = pointSIFT_module(l2_xyz, l2_points_1, radius=0.5...)
_, l2_points_3, _ = pointSIFT_module(l2_xyz, l2_points_2, radius=0.5...)

I am not sure about it. Thank you for your reply.

NotFoundError: /home/yxh/pointSIFT-master/tf_utils/tf_ops/pointSIFT_op/tf_pointSIFT_so.so: cannot open shared object file: No such file or directory

ipdb> Traceback (most recent call last):
  File "<string>", line 1, in <module>
    debugfile('/home/yxh/pointSIFT-master/train_and_eval_scannet.py', wdir='/home/yxh/pointSIFT-master')
  File "/home/yxh/anaconda2/envs/tensorflow/lib/python3.5/site-packages/spyder/utils/site/sitecustomize.py", line 728, in debugfile
    debugger.run("runfile(%r, args=%r, wdir=%r)" % (filename, args, wdir))
  File "/home/yxh/anaconda2/envs/tensorflow/lib/python3.5/bdb.py", line 431, in run
    exec(cmd, globals, locals)
  File "<string>", line 1, in <module>
  File "/home/yxh/anaconda2/envs/tensorflow/lib/python3.5/site-packages/spyder/utils/site/sitecustomize.py", line 705, in runfile
    execfile(filename, namespace)
  File "/home/yxh/anaconda2/envs/tensorflow/lib/python3.5/site-packages/spyder/utils/site/sitecustomize.py", line 102, in execfile
    exec(compile(f.read(), filename, 'exec'), namespace)
  File "/home/yxh/pointSIFT-master/train_and_eval_scannet.py", line 12, in <module>
    import models.pointSIFT_pointnet as SEG_MODEL
  File "/home/yxh/pointSIFT-master/models/pointSIFT_pointnet.py", line 6, in <module>
    from tf_utils.pointSIFT_util import pointSIFT_module, pointSIFT_res_module, pointnet_fp_module, pointnet_sa_module
  File "/home/yxh/pointSIFT-master/tf_utils/pointSIFT_util.py", line 6, in <module>
    from tf_utils.tf_ops.pointSIFT_op.pointSIFT_op import pointSIFT_select, pointSIFT_select_four
  File "/home/yxh/pointSIFT-master/tf_utils/tf_ops/pointSIFT_op/pointSIFT_op.py", line 14, in <module>
    pointSIFT_module = tf.load_op_library(os.path.join(BASE_DIR, 'tf_pointSIFT_so.so'))
  File "/home/yxh/anaconda2/envs/tensorflow/lib/python3.5/site-packages/tensorflow/python/framework/load_library.py", line 56, in load_op_library
    lib_handle = py_tf.TF_LoadLibrary(library_filename, status)
  File "/home/yxh/anaconda2/envs/tensorflow/lib/python3.5/site-packages/tensorflow/python/framework/errors_impl.py", line 473, in __exit__
    c_api.TF_GetCode(self.status.status))
tensorflow.python.framework.errors_impl.NotFoundError: /home/yxh/pointSIFT-master/tf_utils/tf_ops/pointSIFT_op/tf_pointSIFT_so.so: cannot open shared object file: No such file or directory
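
This particular NotFoundError just means the compiled library is not at the expected path, i.e. tf_pointSIFT_compile.sh has presumably not been run successfully for the pointSIFT_op folder. A minimal sketch (path taken from the traceback above) to verify the file exists before starting training:

    # Sanity check that the compiled op library is where pointSIFT_op.py expects it.
    import os

    so_path = '/home/yxh/pointSIFT-master/tf_utils/tf_ops/pointSIFT_op/tf_pointSIFT_so.so'
    if not os.path.isfile(so_path):
        raise FileNotFoundError('tf_pointSIFT_so.so not found; run tf_pointSIFT_compile.sh in '
                                + os.path.dirname(so_path) + ' first')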

Can anyone reproduce the ScanNet results from the paper?

I have not changed the pointSIFT code, but I cannot reproduce the result reported in the paper. The paper reports 86.1, while I can only get 84.8. Can anyone reproduce the ScanNet results from the paper? Maybe I made a mistake somewhere. Thank you!

training and test problem

Thank you for sharing your project.
When I apply PointSIFT to our data, training looks normal; however, in the test phase the loss is large and the accuracy is low. Could you give me some suggestions? Thank you.

fatal error: kdt.h: No such file or directory

I am having trouble while running tf_pointSIFT_compile.sh.
############
main.cpp:11:17: fatal error: kdt.h: No such file or directory
############
Is kdt.h part of kdt-0.3.tar.gz? I tried installing that, but it didn't work.
Help, please!

Is there a bug in the use of pointSIFT_module in the code?

According to https://blog.csdn.net/JiangZuning/article/details/87904417, when the points are upsampled from [B, C, 64] to [B, C, 256], the code calls pointSIFT_module three times, like this:

    l2_points = pointnet_fp_module(l2_xyz, l3_xyz, l2_points, l3_points, [512,512], is_training, bn_decay, scope='fa_layer2')
    _, l2_points_1, _ = pointSIFT_module(l2_xyz, l2_points, radius=0.5, out_channel=512, is_training=is_training, bn_decay=bn_decay, scope='fa_layer2_c0')
    _, l2_points_2, _ = pointSIFT_module(l2_xyz, l2_points, radius=0.5, out_channel=512, is_training=is_training, bn_decay=bn_decay, scope='fa_layer2_c1')
    _, l2_points_3, _ = pointSIFT_module(l2_xyz, l2_points, radius=0.5, out_channel=512, is_training=is_training, bn_decay=bn_decay, scope='fa_layer2_c2')

But the parameters are identical: the second call's input is again l2_points rather than the first pointSIFT_module's output, and the third call's input is l2_points rather than l2_points_2. This cannot enlarge the receptive field; l2_points_1, l2_points_2, and l2_points_3 all have the same receptive field.

The same procedure appears when the points are upsampled from [B, C, 256] to [B, C, 1024]; the code is:

    l1_points = pointnet_fp_module(l1_xyz, l2_xyz, l1_points, l2_points, [256,256], is_training, bn_decay, scope='fa_layer3')
    _, l1_points_1, _ = pointSIFT_module(l1_xyz, l1_points, radius=0.25, out_channel=256, is_training=is_training, bn_decay=bn_decay, scope='fa_layer3_c0')
    _, l1_points_2, _ = pointSIFT_module(l1_xyz, l1_points_1, radius=0.25, out_channel=256, is_training=is_training, bn_decay=bn_decay, scope='fa_layer3_c1')
    l1_points = tf.concat([l1_points_1, l1_points_2], axis=-1)

Here the second pointSIFT_module's input is the first pointSIFT_module's output l1_points_1, so it can enlarge the receptive field.
So is this a bug in the code, or just a trick for training?
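
For reference, the chained wiring suggested in the 'network architecture code issue' thread above would look like the sketch below, where each pointSIFT_module receives the previous call's output. Whether the repository's repeated-input version is a bug or an intentional trick is exactly what this issue asks, so treat this only as an illustration of the alternative:

    # Alternative wiring: feed each pointSIFT_module the previous output so the
    # effective receptive field grows with each call (other arguments as in the repo code).
    _, l2_points_1, _ = pointSIFT_module(l2_xyz, l2_points, radius=0.5, out_channel=512, is_training=is_training, bn_decay=bn_decay, scope='fa_layer2_c0')
    _, l2_points_2, _ = pointSIFT_module(l2_xyz, l2_points_1, radius=0.5, out_channel=512, is_training=is_training, bn_decay=bn_decay, scope='fa_layer2_c1')
    _, l2_points_3, _ = pointSIFT_module(l2_xyz, l2_points_2, radius=0.5, out_channel=512, is_training=is_training, bn_decay=bn_decay, scope='fa_layer2_c2')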

S3DIS

Thanks for your great work!!!!

I have a question about S3DIS. As we know, SEGCloud trains only on Areas 1-4 and 6 and tests on Area 5, while others use six-fold cross-validation. The results therefore come from different experimental setups, and I don't think they should be compared directly.

BTW, I am looking forward to your S3DIS code release.

Thanks!!

Question about pointSIFT_res_module and pointSIFT_module

@Fang-Haoshu @jmydurant @Jeff-sjtu Thanks for your excellent work. I have a question about the purpose of pointSIFT_res_module and pointSIFT_module. I can't find any comments in the code. I guess the latter is meant for scale awareness, but I'm not sure why the former is used so many times in the code; the two-stage network architecture suggests it may be used three times. Could you explain the reason and add some comments about them?
Thanks, looking forward to your reply!

Inference time

How long does inference take (for example, with a batch size of 32)?

Predict and Output

Can anyone give example code showing how to run prediction with the trained model? I got stuck while trying to visualize something...

Extracting features with pointSIFT

Hello,
I have a question about feature extraction: if I only need the features extracted by pointSIFT, do I still need to train the model on my own dataset, or can I use it just like the usage example you give?

S3DIS

Thanks for your amazing work. Do you still plan to release your code for the S3DIS dataset?

Asking about the evaluation method

Hi @jmydurant, thanks for your awesome code share!

I am interested in how you evaluate your algorithm on the ScanNet dataset. Is it the same as the way PointNet does it?

PointNet uses IoU to evaluate semantic segmentation performance; the details can be found in #1 and #2.

In your code, I only found the evaluate_one_epoch function in train_and_eval_scannet.py, but it does not evaluate the whole ScanNet dataset.

Looking forward to your response, thanks a lot!

tensorflow.python.framework.errors_impl.NotFoundError: /home/cc/pointSIFT-master/tf_utils/tf_ops/pointSIFT_op/tf_pointSIFT_so.so: undefined symbol: _ZN10tensorflow8internal21CheckOpMessageBuilder9NewStringEv

I am using Python 2.7, TensorFlow 1.4, and CUDA 9.1. I successfully compiled the files in the four folders: sampling, pointSIFT_op, interpolation, and grouping. But when I run the training with python train_and_eval_scannet.py, the error below always pops up. Any help would be greatly appreciated!

/home/cc/anaconda2/lib/python2.7/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from float to np.floating is deprecated. In future, it will be treated as np.float64 == np.dtype(float).type.
  from ._conv import register_converters as _register_converters
Traceback (most recent call last):
  File "train_and_eval_scannet.py", line 14, in <module>
    import pointSIFT_pointnet as SEG_MODEL
  File "/home/cc/pointSIFT-master/models/pointSIFT_pointnet.py", line 6, in <module>
    from pointSIFT_util import pointSIFT_module, pointSIFT_res_module, pointnet_fp_module, pointnet_sa_module
  File "/home/cc/pointSIFT-master/tf_utils/pointSIFT_util.py", line 12, in <module>
    from pointSIFT_op import pointSIFT_select, pointSIFT_select_four
  File "/home/cc/pointSIFT-master/tf_utils/tf_ops/pointSIFT_op/pointSIFT_op.py", line 14, in <module>
    pointSIFT_module = tf.load_op_library(os.path.join(BASE_DIR, 'tf_pointSIFT_so.so'))
  File "/home/cc/anaconda2/lib/python2.7/site-packages/tensorflow/python/framework/load_library.py", line 56, in load_op_library
    lib_handle = py_tf.TF_LoadLibrary(library_filename, status)
  File "/home/cc/anaconda2/lib/python2.7/site-packages/tensorflow/python/framework/errors_impl.py", line 473, in __exit__
    c_api.TF_GetCode(self.status.status))
tensorflow.python.framework.errors_impl.NotFoundError: /home/cc/pointSIFT-master/tf_utils/tf_ops/pointSIFT_op/tf_pointSIFT_so.so: undefined symbol: _ZN10tensorflow8internal21CheckOpMessageBuilder9NewStringEv
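
One common cause of this undefined-symbol error (an assumption here, not something confirmed in this thread) is that tf_pointSIFT_so.so was compiled against a different TensorFlow installation, or with a mismatched C++ ABI setting, than the TensorFlow that Python imports at runtime. A small sketch to print which TensorFlow the interpreter actually uses, so the include/library paths in tf_pointSIFT_compile.sh can be checked against it:

    # Print the TensorFlow version and the paths the compile script should target.
    import tensorflow as tf

    print('TF version     :', tf.__version__)
    print('TF include dir :', tf.sysconfig.get_include())
    print('TF library dir :', tf.sysconfig.get_lib())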

Multi-scale network

Notice that pointSIFT_select() returns idx of shape (b, n, 8); does that mean it always finds the 8 nearest neighbor points?
Can someone give a hint on how to construct a multi-scale network with pointSIFT_res_module?
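
Not an authoritative answer, but one way to build a multi-scale block is to run the module at several radii on the same points and concatenate the resulting features. The sketch below uses pointSIFT_module, whose call signature appears in other issues in this thread, and assumes the same pattern carries over to pointSIFT_res_module; the radii and channel counts are placeholders:

    # Hypothetical multi-scale block: same points, two search radii, features concatenated.
    _, feat_fine, _ = pointSIFT_module(l1_xyz, l1_points, radius=0.1, out_channel=128, is_training=is_training, bn_decay=bn_decay, scope='ms_c0')
    _, feat_coarse, _ = pointSIFT_module(l1_xyz, l1_points, radius=0.25, out_channel=128, is_training=is_training, bn_decay=bn_decay, scope='ms_c1')
    l1_points_ms = tf.concat([feat_fine, feat_coarse], axis=-1)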

Empirical choices of searching radius in pointSIFT_pointnet.py

Do you have any empirical guidance for choosing the search radius of pointSIFT_res_module? In models/pointSIFT_pointnet.py, the radius ranges from 0.1 to 0.5 for c0-c3. What is the general principle for setting these hierarchical radius values if I have a new dataset with a different point density?
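
Not guidance from the authors, but one heuristic for transferring these radii to a dataset with a different point density is to scale them by the typical nearest-neighbour spacing of the new data relative to the original crops. A minimal numpy sketch for estimating that spacing:

    # Estimate the median nearest-neighbour distance of a point cloud (heuristic only).
    import numpy as np

    def median_nn_distance(points, sample=2048):
        # points: (N, 3) array; subsample so the O(n^2) distance matrix stays small.
        idx = np.random.choice(len(points), min(sample, len(points)), replace=False)
        p = points[idx]
        d2 = np.sum((p[:, None, :] - p[None, :, :]) ** 2, axis=-1)
        np.fill_diagonal(d2, np.inf)  # ignore self-distances
        return float(np.median(np.sqrt(d2.min(axis=1))))

    # If the new dataset's spacing is roughly k times the original's, scaling the
    # 0.1-0.5 radii by k keeps a comparable number of neighbours in each search ball.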

PointSIFT for 3D shape classification

Thanks for sharing your work. I am interested in whether this pointSIFT module can be used in a classification task. Have you done any experiments on classification?

Question about the multi-scale representation in the paper

Thanks for your nice work, but I have some questions about the code and PointSIFT.
First, in Section 4 of the paper "PointSIFT: A SIFT-like Network Module for 3D Point Cloud Semantic Segmentation", multiple OE units are stacked to integrate multi-scale features. However, I did not find this kind of structure in the code.
Second, pointSIFT_res_module looks like two stacked OE units. Is there any specific meaning to this design?
Third, I am confused about lines 41-43 in pointSIFT_pointnet.py, as they have the same inputs and parameters. What is the meaning of this structure?

need ckpt and meta file

Hi, I need the ckpt and meta files, but I do not have time to train the model. Could anyone kindly send me the files? Thanks a lot!

Could you offer the code related to S3DIS

Hello~ Your work is excellent.

I see that you have released the training code for ScanNet.

However, I cannot find the code related to S3DIS. I would like to test it, so could you provide the complete code?

Thank you very much!

About mIoU

Could you please provide the code for computing mIoU? I couldn't find it in the repository. Thank you very much!
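
Not the authors' evaluation script, but per-class IoU and mIoU can be computed from a confusion matrix. A minimal numpy sketch, assuming predictions and ground-truth labels are flat integer arrays with values in [0, num_classes):

    # Hedged sketch: per-class IoU and mean IoU from flat label arrays.
    import numpy as np

    def mean_iou(pred, label, num_classes):
        pred = np.asarray(pred).astype(np.int64)
        label = np.asarray(label).astype(np.int64)
        # Confusion matrix: rows = ground truth, columns = prediction.
        conf = np.bincount(label * num_classes + pred,
                           minlength=num_classes ** 2).reshape(num_classes, num_classes)
        tp = np.diag(conf).astype(np.float64)
        denom = conf.sum(axis=0) + conf.sum(axis=1) - np.diag(conf)
        iou = tp / np.maximum(denom, 1)   # avoid division by zero for empty classes
        present = conf.sum(axis=1) > 0    # ignore classes absent from the ground truth
        return iou, float(iou[present].mean())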

TF 1.4.1 needs cuDNN 6, not 5.1

The README may mislead users about the environment configuration. Note that TF 1.0.0~1.2.0 can use cuDNN 5.1, while TF 1.3.0~1.4 requires cuDNN 6.

tensorflow.python.framework.errors_impl.NotFoundError

Hello, I use Python 3.6.7 and TensorFlow 1.8 with CUDA 9.0, but I get this error. Do I have to use the same versions as you? Thanks :)

>>> from tf_utils.pointSIFT_util import pointSIFT_module
/home/yxu/anaconda3/lib/python3.6/site-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from float to np.floating is deprecated. In future, it will be treated as np.float64 == np.dtype(float).type.
  from ._conv import register_converters as _register_converters
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/data1/yxu/Github/pointSIFT/tf_utils/pointSIFT_util.py", line 6, in <module>
    from tf_utils.tf_ops.pointSIFT_op.pointSIFT_op import pointSIFT_select, pointSIFT_select_four
  File "/data1/yxu/Github/pointSIFT/tf_utils/tf_ops/pointSIFT_op/pointSIFT_op.py", line 14, in <module>
    pointSIFT_module = tf.load_op_library(os.path.join(BASE_DIR, 'tf_pointSIFT_so.so'))
  File "/home/yxu/anaconda3/lib/python3.6/site-packages/tensorflow/python/framework/load_library.py", line 56, in load_op_library
    lib_handle = py_tf.TF_LoadLibrary(library_filename)
tensorflow.python.framework.errors_impl.NotFoundError: libcudart.so.9.0: cannot open shared object file: No such file or directory
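
The message above says the dynamic loader cannot find the CUDA 9.0 runtime, which usually means LD_LIBRARY_PATH does not include the CUDA 9.0 lib64 directory (the prebuilt TensorFlow 1.8 wheels link against CUDA 9.0). A small diagnostic sketch; the /usr/local/cuda-9.0 path is only an assumption about a typical install location:

    # Check whether the loader can find libcudart.so.9.0 and what LD_LIBRARY_PATH contains.
    import ctypes
    import os

    print('LD_LIBRARY_PATH =', os.environ.get('LD_LIBRARY_PATH', '(not set)'))
    try:
        ctypes.CDLL('libcudart.so.9.0')
        print('libcudart.so.9.0 found')
    except OSError as err:
        print('libcudart.so.9.0 not found:', err)
        print('try: export LD_LIBRARY_PATH=/usr/local/cuda-9.0/lib64:$LD_LIBRARY_PATH')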

What's the meaning of these numbers, and can we change the parameters in this function?

This function is in scannet_dataset.py; I guess it is used for coordinate normalization. What is the meaning of those numbers (e.g. 1.5, 3.0, 31.0, 62.0)? And the 10 in the for loop?
Any help would be greatly appreciated!

def __getitem__(self, index):
    import numpy as np
    point_set = self.scene_points_list[index]
    semantic_seg = self.semantic_labels_list[index].astype(np.int32)
    coordmax = np.max(point_set, axis=0)
    coordmin = np.min(point_set, axis=0)
    # 1.5 x 1.5 x 3.0 is the size of the column cropped around a random centre point;
    # the z extent always covers the full scene height.
    smpmin = np.maximum(coordmax - [1.5, 1.5, 3.0], coordmin)
    smpmin[2] = coordmin[2]
    smpsz = np.minimum(coordmax - smpmin, [1.5, 1.5, 3.0])  # fixed: closing bracket was missing
    smpsz[2] = coordmax[2] - coordmin[2]
    cur_semantic_seg = None
    cur_point_set = None
    mask = None
    # Try up to 10 random crop centres until a sufficiently valid crop is found.
    for i in range(10):
        curcenter = point_set[np.random.choice(len(semantic_seg), 1)[0], :]
        curmin = curcenter - [0.75, 0.75, 1.5]   # half of the 1.5 x 1.5 x 3.0 crop size
        curmax = curcenter + [0.75, 0.75, 1.5]
        curmin[2] = coordmin[2]
        curmax[2] = coordmax[2]
        curchoice = np.sum((point_set >= (curmin - 0.2)) * (point_set <= (curmax + 0.2)), axis=1) == 3
        cur_point_set = point_set[curchoice, :]
        cur_semantic_seg = semantic_seg[curchoice]
        if len(cur_semantic_seg) == 0:
            continue
        mask = np.sum((cur_point_set >= (curmin - 0.01)) * (cur_point_set <= (curmax + 0.01)), axis=1) == 3
        # 31 x 31 x 62 is a voxel grid over the crop, used only to measure how much
        # of the crop's volume is actually occupied by points.
        vidx = np.ceil((cur_point_set[mask, :] - curmin) / (curmax - curmin) * [31.0, 31.0, 62.0])
        vidx = np.unique(vidx[:, 0] * 31.0 * 62.0 + vidx[:, 1] * 62.0 + vidx[:, 2])
        # Accept the crop if >= 70% of its points are labelled and >= 2% of the voxels are occupied.
        isvalid = np.sum(cur_semantic_seg > 0) / len(cur_semantic_seg) >= 0.7 and len(vidx) / 31.0 / 31.0 / 62.0 >= 0.02
        if isvalid:
            break
    # Resample the crop to a fixed number of points (self.npoints) with replacement.
    choice = np.random.choice(len(cur_semantic_seg), self.npoints, replace=True)
    point_set = cur_point_set[choice, :]

    semantic_seg = cur_semantic_seg[choice]
    mask = mask[choice]
    sample_weight = self.labelweights[semantic_seg]
    return point_set, semantic_seg, sample_weight

scannet

I have the same problems with ScanNet, and I want to know the format of the ScanNet dataset. The reason is that I want to train on my own dataset and don't know how to convert it to that format.

Thank you very much!
