
Comments (6)

pangsu0613 commented on August 10, 2024

Hi @urbansound8K, the "create reduced point cloud" subsection of https://github.com/traveller59/second.pytorch/tree/v1.5#prepare-dataset explains how to generate the reduced point clouds. Basically, it removes the points that fall outside the camera's view: KITTI only has front-facing cameras, while the LiDAR has a 360-degree field of view.
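
For intuition, here is a minimal sketch of that cropping idea (not the repository's exact implementation), assuming the usual KITTI calibration matrices P2 (3x4 camera projection) and R0_rect / Tr_velo_to_cam padded to 4x4 homogeneous form:

```python
import numpy as np

def reduce_point_cloud(points, P2, R0_rect, Tr_velo_to_cam, img_h, img_w):
    """Keep only LiDAR points that project into the front camera image.

    points: (N, 4) array of x, y, z, reflectance in LiDAR coordinates.
    P2, R0_rect, Tr_velo_to_cam: KITTI calibration matrices, with R0_rect
    and Tr_velo_to_cam padded to 4x4 homogeneous matrices.
    """
    xyz1 = np.hstack([points[:, :3], np.ones((points.shape[0], 1))])  # (N, 4) homogeneous
    cam = R0_rect @ Tr_velo_to_cam @ xyz1.T                           # (4, N) rectified camera frame
    img = P2 @ cam                                                    # (3, N) image plane
    u = img[0] / img[2]
    v = img[1] / img[2]
    in_front = cam[2] > 0                                             # drop points behind the camera
    in_image = (u >= 0) & (u < img_w) & (v >= 0) & (v < img_h)        # drop points outside the image
    return points[in_front & in_image]
```

In practice you do not need to write this yourself for KITTI; the "create reduced point cloud" step in SECOND's create_data.py (per the linked README) generates the velodyne_reduced files for you.
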
During CLOCs training, I run SECOND in real time simply because it is easy to do (and SECOND is pretty fast). If you prefer, you can run SECOND once, save its results, and use them for CLOCs training so that SECOND does not need to run online, but that requires modifying the codebase; we are working on this now and will push a new version when it is available. The https://github.com/pangsu0613/CLOCs#evaluation section describes how to run inference. I am not sure whether this answers your question; feel free to ping me if you have further questions.
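
If you do go the offline route, one possible way to decouple the two stages (the function and directory names below are purely illustrative, not part of the current CLOCs code) is to cache SECOND's per-frame candidates on disk and load them during fusion training:

```python
import pickle
from pathlib import Path

def save_second_detections(frame_id, boxes_3d, scores_3d, out_dir="second_raw_detections"):
    """Dump SECOND's raw 3D candidates for one frame to disk (illustrative only)."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    with open(Path(out_dir) / f"{frame_id}.pkl", "wb") as f:
        pickle.dump({"boxes_3d": boxes_3d, "scores_3d": scores_3d}, f)

def load_second_detections(frame_id, out_dir="second_raw_detections"):
    """Load the cached candidates instead of running SECOND online."""
    with open(Path(out_dir) / f"{frame_id}.pkl", "rb") as f:
        return pickle.load(f)
```
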


dominicfmazza commented on August 10, 2024

@urbansound8K Additionally, if you would still like to see the COCO results, modify line 113 of /home/$USER/anaconda3/envs/spconv-1.0/lib/python3.6/site-packages/numpy/core/function_base.py from num = operator.index(num) to num = operator.index(int(num)); that should let you see those results.
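
In other words, the patch is a one-line change (in this numpy build, line 113 sits inside linspace(), which is what ends up being called with a non-integer sample count during the COCO-style evaluation):

```python
# numpy/core/function_base.py, line 113 (inside linspace)

# before -- rejects a non-integer sample count:
num = operator.index(num)

# after -- cast the sample count to int first so the COCO-style eval runs:
num = operator.index(int(num))
```
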


pangsu0613 commented on August 10, 2024

Hello @urbansound8K, for the first question, see #16; the main problem is the numpy version, but since we don't need the COCO result, removing it is the simplest solution.
For the second question, follow https://github.com/pangsu0613/CLOCs#prepare-dataset-kitti carefully. Note that the "velodyne" folder holds the original KITTI velodyne point clouds, while "velodyne_reduced" is the one actually used by SECOND-V1.5 and contains the reduced point clouds. You need to download these reduced point clouds from the link in https://github.com/pangsu0613/CLOCs#prepare-dataset-kitti and put the downloaded files under "velodyne_reduced".
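
For orientation, the layout the README asks for is roughly the sketch below (the dataset root name is your choice; check the README for the authoritative structure):

```
KITTI_DATASET_ROOT
├── training
│   ├── calib
│   ├── image_2
│   ├── label_2
│   ├── velodyne           <-- original 360° KITTI point clouds
│   └── velodyne_reduced   <-- put the downloaded reduced point clouds here
└── testing
    ├── calib
    ├── image_2
    ├── velodyne
    └── velodyne_reduced
```
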


pangsu0613 commented on August 10, 2024

Hi @urbansound8K, you missed step 2 of https://github.com/pangsu0613/CLOCs#preparation: you also need to download the SECOND-V1.5 pretrained model from the link in the README and put it under the specified directory. Note that you need two pretrained models, one for SECOND-V1.5 and one for CLOCs, because the current codebase still runs SECOND on the fly.
BTW, you can follow the second point of https://github.com/pangsu0613/CLOCs#common-errors--solutions to hide these unrelated numba warnings; that will make your output cleaner.
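
If the README's fix is not at hand, one alternative (assuming a numba version where these warning classes live in numba.core.errors, as in the paths shown in your log) is to filter the warnings near the top of the training/evaluation entry script:

```python
# Hypothetical alternative: silence numba's object-mode / parallel warnings globally.
import warnings
from numba.core.errors import NumbaWarning

# NumbaDeprecationWarning and NumbaPerformanceWarning both derive from NumbaWarning,
# so this single filter hides all three kinds of messages shown in the log below.
warnings.simplefilter('ignore', category=NumbaWarning)
```
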


urbansound8K commented on August 10, 2024

Thanks a lot!
Could you tell me how you made the downsampled point cloud files, please?

And do you run inference with the trained model in a real-time application? Could you please guide me on how to run inference?


urbansound8K commented on August 10, 2024

Hello @pangsu0613

Could you possibly do me a favor, please?

I trained the model and followed the preparation steps in the README, but I still get weird results!

Predict_test: False
sparse_shape: [ 41 1600 1408]
num_class is : 1
load existing model
load existing model for fusion layer
Restoring parameters from CLOCs_SecCas_pretrained/fusion_layer-11136.tckpt
remain number of infos: 3769
Generate output labels...
/media/cute/New Volume/AAA/YYY/fusion/ClOCss/second/core/geometry.py:146: NumbaWarning:
Compilation is falling back to object mode WITH looplifting enabled because Function "points_in_convex_polygon_jit" failed type inference due to: No implementation of function Function(<built-in function getitem>) found for signature:

getitem(array(float32, 3d, C), Tuple(slice<a:b>, list(int64)<iv=None>, slice<a:b>))

There are 22 candidate implementations:

  • Of which 20 did not match due to:
Overload of function 'getitem': File: <built-in>: Line N/A.
    With argument(s): '(array(float32, 3d, C), Tuple(slice<a:b>, list(int64)<iv=None>, slice<a:b>))':
    No match.
  • Of which 2 did not match due to:
    Overload in function 'GetItemBuffer.generic': File: numba/core/typing/arraydecl.py: Line 162.
    With argument(s): '(array(float32, 3d, C), Tuple(slice<a:b>, list(int64)<iv=None>, slice<a:b>))':
    Rejected as the implementation raised a specific error:
    TypeError: unsupported array index type list(int64)<iv=None> in Tuple(slice<a:b>, list(int64)<iv=None>, slice<a:b>)
    raised from /home/cute/anaconda3/envs/spconv-1.0/lib/python3.6/site-packages/numba/core/typing/arraydecl.py:69

During: typing of intrinsic-call at /media/cute/New Volume/AAA/YYY/fusion/ClOCss/second/core/geometry.py (162)

File "core/geometry.py", line 162:
def points_in_convex_polygon_jit(points, polygon, clockwise=True):

vec1 = polygon - polygon[:, [num_points_of_polygon - 1] +
list(range(num_points_of_polygon - 1)), :]
^

@numba.jit
/media/cute/New Volume/AAA/YYY/fusion/ClOCss/second/core/geometry.py:146: NumbaWarning:
Compilation is falling back to object mode WITHOUT looplifting enabled because Function "points_in_convex_polygon_jit" failed type inference due to: Cannot determine Numba type of <class 'numba.core.dispatcher.LiftedLoop'>

File "core/geometry.py", line 170:
def points_in_convex_polygon_jit(points, polygon, clockwise=True):

cross = 0.0
for i in range(num_points):
^

@numba.jit
/home/cute/anaconda3/envs/spconv-1.0/lib/python3.6/site-packages/numba/core/object_mode_passes.py:152: NumbaWarning: Function "points_in_convex_polygon_jit" was compiled in object mode without forceobj=True, but has lifted loops.

File "core/geometry.py", line 157:
def points_in_convex_polygon_jit(points, polygon, clockwise=True):

# first convert polygon to directed lines
num_points_of_polygon = polygon.shape[1]
^

state.func_ir.loc))
/home/cute/anaconda3/envs/spconv-1.0/lib/python3.6/site-packages/numba/core/object_mode_passes.py:162: NumbaDeprecationWarning:
Fall-back from the nopython compilation path to the object mode compilation path has been detected, this is deprecated behaviour.

For more information visit https://numba.pydata.org/numba-doc/latest/reference/deprecation.html#deprecation-of-object-mode-fall-back-behaviour-when-using-jit

File "core/geometry.py", line 157:
def points_in_convex_polygon_jit(points, polygon, clockwise=True):

# first convert polygon to directed lines
num_points_of_polygon = polygon.shape[1]
^

state.func_ir.loc))
/home/cute/anaconda3/envs/spconv-1.0/lib/python3.6/site-packages/numba/core/typed_passes.py:327: NumbaPerformanceWarning:
The keyword argument 'parallel=True' was specified but no transformation for parallel execution was possible.

To find out why, try turning on parallel diagnostics, see https://numba.pydata.org/numba-doc/latest/user/parallel.html#diagnostics for help.

File "utils/eval.py", line 127:
@numba.jit(nopython=True,parallel=True)
def build_stage2_training(boxes, query_boxes, criterion, scores_3d, scores_2d, dis_to_lidar_3d,overlaps,tensor_index):
^

state.func_ir.loc))
/home/cute/anaconda3/envs/spconv-1.0/lib/python3.6/site-packages/numba/core/typed_passes.py:327: NumbaPerformanceWarning:
The keyword argument 'parallel=True' was specified but no transformation for parallel execution was possible.

To find out why, try turning on parallel diagnostics, see https://numba.pydata.org/numba-doc/latest/user/parallel.html#diagnostics for help.

File "utils/eval.py", line 231:
@numba.jit(nopython=True, parallel=True)
def d3_box_overlap_kernel(boxes, qboxes, rinc, criterion=-1):
^

state.func_ir.loc))
[100.0%][===================>][8.99it/s][06:59>00:00]
generate label finished(8.98/s). start eval:
validation_loss: 306.0920329661988
avg example to torch time: 8.388 ms
avg prep time: 6.362 ms
/home/cute/anaconda3/envs/spconv-1.0/lib/python3.6/site-packages/numba/core/typed_passes.py:327: NumbaPerformanceWarning:
The keyword argument 'parallel=True' was specified but no transformation for parallel execution was possible.

To find out why, try turning on parallel diagnostics, see https://numba.pydata.org/numba-doc/latest/user/parallel.html#diagnostics for help.

File "utils/eval.py", line 231:
@numba.jit(nopython=True, parallel=True)
def d3_box_overlap_kernel(boxes, qboxes, rinc, criterion=-1):
^

state.func_ir.loc))
Car AP@0.70, 0.70, 0.70:
bbox AP:38.95, 33.66, 29.18
bev AP:2.89, 2.67, 2.12
3d AP:0.97, 0.79, 0.65
aos AP:22.53, 19.04, 16.56
Car AP@0.70, 0.50, 0.50:
bbox AP:38.95, 33.66, 29.18
bev AP:12.86, 11.50, 9.78
3d AP:9.28, 8.18, 6.81
aos AP:22.53, 19.04, 16.56

Car AP@0.70, 0.70, 0.70:
bbox AP:38.95, 33.66, 29.18
bev AP:2.89, 2.67, 2.12
3d AP:0.97, 0.79, 0.65
aos AP:22.53, 19.04, 16.56
Car AP@0.70, 0.50, 0.50:
bbox AP:38.95, 33.66, 29.18
bev AP:12.86, 11.50, 9.78
3d AP:9.28, 8.18, 6.81
aos AP:22.53, 19.04, 16.56

