chunky3d's People

Contributors

dslawinski-fp, lgtm-migrator, marcinofulus, tgandor, wmalara

chunky3d's Issues

Problems with testing in conda-forge builds

There are a few issues with test dependencies (cf. #3), notably simpleitk and itk-thickness3d.
Having itk, but missing the itk-thickness3d causes an exception to be raised, failing unit tests.

The idea is to add skipUnless() guards to the unit tests for the cases where dependencies are not met
(which in practice means they could not have been met, because we try our best to meet them in all environments).
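
A minimal sketch of such a guard, assuming the optional dependency is probed once at module level (the HAS_ITK flag and the test name are illustrative, not the actual test suite):

import unittest

try:
    import itk  # noqa: F401  (itk-thickness3d plugs into this package)
    HAS_ITK = True
except ImportError:
    HAS_ITK = False


class TestThinning(unittest.TestCase):
    @unittest.skipUnless(HAS_ITK, "itk / itk-thickness3d not installed")
    def test_thinning(self):
        ...  # actual test body goes here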

scikit_image-0.19 marching_cubes_lewiner deprecated

The method no longer exists in scikit-image 0.19:

import chunky3d.vtk_utils as vtk_utils
/usr/local/lib/python3.7/site-packages/chunky3d/vtk_utils.py:514: in <module>
    from skimage.measure import marching_cubes_lewiner
E   ImportError: cannot import name 'marching_cubes_lewiner' from 'skimage.measure' (/usr/local/lib/python3.7/site-packages/skimage/measure/__init__.py)
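
A possible compatibility shim for vtk_utils.py (a sketch, assuming it is acceptable to fall back to the unified function; since scikit-image 0.17, skimage.measure.marching_cubes uses the Lewiner algorithm by default):

try:
    # scikit-image < 0.19
    from skimage.measure import marching_cubes_lewiner as marching_cubes
except ImportError:
    # scikit-image >= 0.19: only the unified function remains,
    # and it defaults to the Lewiner method
    from skimage.measure import marching_cubes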

sparse_func.unique() can break dense_data

It has been reported that:

M_sparse.make_dense_data(envelope=1)
M_sparse.update_grid_mask()
print(M_sparse._memory_blocks_with_holes)  # 0
unique_values = sf.unique(M_sparse, multiprocesses=1)
print(M_sparse._memory_blocks_with_holes)  # 1

This doesn't always happen; the likely cause is that Sparse.run() re-assigns the memory block to whatever func returns.

Proposed fix:
Allow such functions to skip the assignment entirely by returning None instead of data.
This costs almost nothing and may even save some performance.
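
A sketch of that contract as a standalone helper over a chunk dictionary (the real dispatch lives inside Sparse.run(); the names here are illustrative only):

def apply_to_chunks(grid, func):
    # Apply func to every chunk, but keep the old memory block when func
    # returns None (read-only workers such as a unique() scan).
    for key, chunk in grid.items():
        result = func(chunk)
        if result is not None:
            grid[key] = result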

Warning in label function

When using label with a Sparse of a dtype other than np.uint32, the following warning is raised:

warnings.warn("label in most cases require uint32 data to work properly")
.
I understand "label" may not work with floats but is it really the case that using other integer types (e.g. uint8) doesn't work correctly?

allowing unsigned shape property causes errors in slicing

See #9 (with discussion) - the problem was caused by allowing the shape to be unsigned in the first place.

NumPy uses int (signed!) by default in its shapes.

In [8]: y = np.zeros(np.arange(2, 5, dtype=np.uint16))

In [9]: type(y.shape[0])
Out[9]: int

In [10]: y.shape
Out[10]: (2, 3, 4)

Out[9] is int, not uint16!

Of course, the problem with such uints shows up under negation:

In [11]: -np.arange(2, 5, dtype=np.uint16)
Out[11]: array([65534, 65533, 65532], dtype=uint16)

We need -shape, e.g. to check for negative wrap-around indexing / slicing, so signed values would be convenient. But the fix should not be applied only where we do the lookup; we should fix it at the source.

And add a unit test for this.
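
A sketch of fixing it at the source, assuming the shape is stored internally and exposed as a property (the _shape attribute name is an assumption):

@property
def shape(self):
    # Coerce to plain Python ints so that expressions like -self.shape[0]
    # behave like NumPy's signed shape entries instead of wrapping around.
    return tuple(int(s) for s in self._shape)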

Too heavy requirements

The associated sparse_func.py has a lot of functionality that depends on third-party libraries such as VTK, SimpleITK, networkx, and a few others. Right now we're forcing users to install everything (which also makes using chunky3d with Python 3.8 harder, because of missing packages).

We could do one of two things: separate that functionality into another package (too much complication), or isolate these requirements and raise clear error messages when a dependency for non-core functionality is missing.

The full package could then be installed via a setuptools extra, i.e. pip install chunky3d[all], while a plain install would provide only the core functionality (with hard requirements like numpy and possibly numba).
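
A sketch of the corresponding setup.py change (the extras grouping and the exact package list are assumptions; version pins omitted):

from setuptools import setup, find_packages

setup(
    name="chunky3d",
    packages=find_packages(),
    install_requires=["numpy"],  # core, hard requirements only
    extras_require={
        # optional pieces used by parts of sparse_func and vtk_utils
        "all": ["vtk", "SimpleITK", "networkx", "itk", "itk-thickness3d"],
    },
)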

Msgpack 1.0.0 save / load test fails

This used to work, e.g. with msgpack 0.6.1

======================================================================
ERROR: test_save_compress (test_sparse.TestImage)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "chunky3d/chunky.py", line 1041, in load
    data = msgpack.unpackb(d, object_hook=m.decode, use_list=False)
  File "msgpack/_unpacker.pyx", line 202, in msgpack._cmsgpack.unpackb
msgpack.exceptions.ExtraData: unpack(b) received extra data.

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "test/test_sparse.py", line 117, in test_save_compress
    self.save_load(6)
  File "test/test_sparse.py", line 97, in save_load
    loaded_s = Sparse.load('save.msgpack')
  File "chunky3d/chunky.py", line 1043, in load
    data = msgpack.unpackb(zlib.decompress(d), object_hook=m.decode, use_list=False)
  File "msgpack/_unpacker.pyx", line 195, in msgpack._cmsgpack.unpackb
ValueError: tuple is not allowed for map key

======================================================================
ERROR: test_save_without_compress (test_sparse.TestImage)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "chunky3d/chunky.py", line 1041, in load
    data = msgpack.unpackb(d, object_hook=m.decode, use_list=False)
  File "msgpack/_unpacker.pyx", line 195, in msgpack._cmsgpack.unpackb
ValueError: tuple is not allowed for map key

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "test/test_sparse.py", line 120, in test_save_without_compress
    self.save_load(0)
  File "test/test_sparse.py", line 97, in save_load
    loaded_s = Sparse.load('save.msgpack')
  File "chunky3d/chunky.py", line 1043, in load
    data = msgpack.unpackb(zlib.decompress(d), object_hook=m.decode, use_list=False)
zlib.error: Error -3 while decompressing data: incorrect header check
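
The ValueError in both tracebacks comes from a default that changed in msgpack 1.0: unpackb() now uses strict_map_key=True, which rejects map keys that are not str or bytes, and the saved grid is keyed by tuples. A minimal, self-contained demonstration (the tuple-keyed dict stands in for the saved data; passing strict_map_key=False at the unpackb() calls in chunky.py is the assumed fix):

import msgpack

packed = msgpack.packb({(0, 0, 0): b"chunk"}, use_bin_type=True)

# Under the msgpack >= 1.0 defaults, this raises
# "ValueError: tuple is not allowed for map key":
#   msgpack.unpackb(packed, use_list=False)

# Relaxing the new default restores the old behaviour:
data = msgpack.unpackb(packed, use_list=False, strict_map_key=False)
print(data)  # {(0, 0, 0): b'chunk'}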


pickle and unpickle of sparse gives unusable object

Pickling and unpickling a Sparse object gives an object with some missing fields; for example, it cannot be saved with Sparse.save.

Reproducible example:

import pickle

import numpy as np
from chunky3d import Sparse

sparse = Sparse(shape=(10, 10, 10))
sparse.set((1, 1, 1), np.array([[[20]]]))
sparse.save("/tmp/tmp.sparse")
with open("/tmp/tmp.sparse.pickle", "wb") as f:
    pickle.dump(sparse, f)
with open("/tmp/tmp.sparse.pickle", "rb") as f:
    sparse_loaded = pickle.load(f)
    sparse_loaded.save("/tmp/tmp.sparse")

fails with:

 File "env1/lib/python3.10/site-packages/chunky3d/chunky.py", line 1277, in save
    "_grid": {k: v.to_dict() for k, v in self._grid.items()},
  File "env1/lib/python3.10/site-packages/chunky3d/chunky.py", line 1277, in <dictcomp>
    "_grid": {k: v.to_dict() for k, v in self._grid.items()},
  File "env1/lib/python3.10/site-packages/chunky3d/chunk.py", line 34, in to_dict
    return {b"origin": self.origin, b"spacing": self.spacing, b"ndarray": self}
  File "env1/lib/python3.10/site-packages/chunky3d/chunk.py", line 27, in origin
    return self._origin
AttributeError: 'Chunk' object has no attribute '_origin'. Did you mean: 'origin'?
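
The missing _origin is the usual symptom of pickling an np.ndarray subclass whose extra attributes are not included in ndarray's pickle state. A sketch of a possible fix on Chunk, assuming it subclasses np.ndarray and stores _origin and _spacing (attribute names inferred from the traceback; the rest of the class is omitted):

import numpy as np


class Chunk(np.ndarray):
    # ... __new__ / __array_finalize__ as in chunky3d/chunk.py ...

    def __reduce__(self):
        # Append the extra attributes to ndarray's pickle state.
        reconstruct, args, state = super().__reduce__()
        return reconstruct, args, state + (self._origin, self._spacing)

    def __setstate__(self, state):
        # Split our attributes back off, then let ndarray restore the rest.
        *nd_state, self._origin, self._spacing = state
        super().__setstate__(tuple(nd_state))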

Broadcasting in assignments

Compare the behavior of numpy and chunky3d:
In:

import numpy as np
import chunky3d

s = chunky3d.Sparse(shape=(4, 4, 4))
s[:3, :3, :3] = np.ones((2, 2, 2))
s[:, :, :]

Out:

array([[[1., 1., 0., 0.],
        [1., 1., 0., 0.],
        [0., 0., 0., 0.],
        [0., 0., 0., 0.]],

       [[1., 1., 0., 0.],
        [1., 1., 0., 0.],
        [0., 0., 0., 0.],
        [0., 0., 0., 0.]],

       [[0., 0., 0., 0.],
        [0., 0., 0., 0.],
        [0., 0., 0., 0.],
        [0., 0., 0., 0.]],

       [[0., 0., 0., 0.],
        [0., 0., 0., 0.],
        [0., 0., 0., 0.],
        [0., 0., 0., 0.]]])

NumPy:
In:

d = np.zeros((4, 4, 4))
d[:3, :3, :3] = np.ones((2, 2, 2))

Out:

ValueError                                Traceback (most recent call last)
<ipython-input-5-233969864b24> in <module>
      1 d = np.zeros((4, 4, 4))
----> 2 d[:3, :3, :3] = np.ones((2, 2, 2))

ValueError: could not broadcast input array from shape (2,2,2) into shape (3,3,3)

I think we could probably fix this easily using: https://docs.scipy.org/doc/numpy/reference/generated/numpy.broadcast_to.html
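
A sketch of how the setter could use it (_selection_shape and _write_selection are hypothetical placeholders for chunky3d's existing internals); np.broadcast_to either returns a view with the selection's shape or raises the same ValueError as NumPy's own assignment:

import numpy as np

def __setitem__(self, key, value):
    # Hypothetical: shape of the region selected by `key`.
    target_shape = self._selection_shape(key)
    # Broadcasts scalars and size-1 axes; raises ValueError for shapes
    # like (2, 2, 2) -> (3, 3, 3), matching NumPy's semantics.
    value = np.broadcast_to(np.asarray(value), target_shape)
    self._write_selection(key, value)  # hypothetical existing write path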

regression: after #3 in thinning() when multiprocessing>1

A problem appeared after #3 introduced local (in-function) imports.

A lambda transmitted by dill seems to have problems using the itk module.

This only comes up when multiprocessing > 1.

We need to do a few things:

  • fix this (the currently known method: bring back the global import, albeit wrapped in try-except; see the sketch below)
  • add a unit test for multiprocessing == 1 and for > 1
  • add more tests for sparse_func in general, to prevent such regressions.
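
A sketch of the guarded global import (the thinning() signature is an assumption; the point is that the module-level name is available in worker processes without dill having to resolve an in-function import):

# top of sparse_func.py (sketch)
try:
    import itk
except ImportError:
    itk = None


def thinning(sparse, multiprocesses=1):
    if itk is None:
        raise ImportError("thinning() requires the optional itk / itk-thickness3d packages")
    # ... existing implementation ...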
