
neuroglancer-scripts's People

Contributors

blowekamp, falkben, pacher, xgui3783, ylep


neuroglancer-scripts's Issues

Not enough memory

When I produce precomputed files from PNG files, the program takes up a lot of memory. Is there any way to reduce memory consumption?
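For example, could the slices be read in groups of one chunk depth instead of all at once? A rough sketch of what I mean (the function name and chunk depth below are made up, not part of the scripts):

import skimage.io

def iter_slice_blocks(slice_filenames, chunk_depth=64):
    # Yield one chunk-depth worth of slices at a time instead of the full stack,
    # so peak memory stays around one block of slices.
    for start in range(0, len(slice_filenames), chunk_depth):
        names = slice_filenames[start:start + chunk_depth]
        yield skimage.io.concatenate_images(map(skimage.io.imread, names))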

pypi version has bug

The version on PyPI (0.1.0) has a bug in which options are not passed correctly between functions; this appears to have been fixed 5 months ago in f0e1e79.

It would be easier to use if an up-to-date version were published on PyPI. Thanks!

UnboundLocalError: local variable 'triangles' referenced before assignment

Reproduce the error:

git clone https://github.com/HumanBrainProject/neuroglancer-scripts
cd neuroglancer-scripts
python3 -m venv venv/
. venv/bin/activate
pip install neuroglancer-scripts

mv ~/mygii.gii ./
mesh-to-precomputed \
  --coord-transform=-1,0,0,0,1,0,0,0,1 \
  mygii.gii \
  ./mymesh

Digging into the source code, in the GitHub repo:

mesh_to_precomputed.py

def mesh_file_to_precomputed(args):
    # truncated
    triangles = triangles_list[0].data

    if (coord_transform is not None
            and np.linalg.det(coord_transform[:3, :3]) < 0):
        # Flip the triangles to fix inside/outside
        triangles = np.flip(triangles, axis=1)
    # truncated

whereas in the pip-installed package:
mesh_to_precomputed.py

def mesh_file_to_precomputed(input_filename, output_filename,
                             coord_transform=None):
    # truncated
    if coord_transform is not None:
        if coord_transform.shape[0] == 4:
            assert np.all(coord_transform[3, :] == [0, 0, 0, 1])
        points = points.T
        points = np.dot(coord_transform[:3, :3], points)
        points += coord_transform[:3, 3, np.newaxis]
        points = points.T
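        # BUG: `triangles` is used in the branch below, but in this version it
        # is only assigned afterwards (from triangles_list), hence the
        # UnboundLocalError.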
        if np.linalg.det(coord_transform[:3, :3]) < 0:
            # Flip the triangles to fix inside/outside
            triangles = np.flip(triangles, axis=1)

    triangles_list = mesh.get_arrays_from_intent("NIFTI_INTENT_TRIANGLE")
    assert len(triangles_list) == 1
    triangles = triangles_list[0].data

    # truncated

[feature request] neuroglancer http accessor profiler

Is there interest in adding profiling functionality to neuroglancer-scripts (similar to scale-stats)?

Use case: we would like to be able to:

1/ monitor the "liveness" of a VM serving precomputed volumes

2/ simulate typical use cases (e.g. 10 concurrent users) and measure the performance (average/median/min/max response time) of image services

I would imagine the usage would be something like:

# N.B. not yet implemented
profile-precomputed [-c/--concurrent int=1] \
  [-n int=1] \
  [-t/--threshold_ms int=-1] \
  PRECOMPUTED_URL1 [PRECOMPUTED_URL2] ...

For each iteration, I imagine the script would select a random x, y, z and level, then fetch chunks at ±3 levels with an appropriate bounding box (clamped to the limits of the volume).
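Purely as a sketch of what one iteration could look like (nothing here exists in neuroglancer-scripts yet; chunk selection and the ±3-level logic are left out, and all names are made up):

import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

def fetch_chunk(url):
    # Time a single chunk request, including reading the body.
    start = time.perf_counter()
    with urllib.request.urlopen(url) as resp:
        resp.read()
    return time.perf_counter() - start

def profile(chunk_urls, concurrency=1):
    # Fetch the given chunk URLs with `concurrency` workers and report latency stats.
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        times = list(pool.map(fetch_chunk, chunk_urls))
    return {
        "min_ms": min(times) * 1000,
        "median_ms": statistics.median(times) * 1000,
        "mean_ms": statistics.mean(times) * 1000,
        "max_ms": max(times) * 1000,
    }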

We would be happy to put in the work for this utility, as we foresee needing to create such a utility in any case.

The question is: is neuroglancer-scripts the right place for such a utility?

Would love to hear your thoughts, @ylep.

[bug] RGB nifti encoding standard unclear

Despite #15, it seems that different NIfTI software (e.g. the Fiji NIfTI plugin) stores the RGB channel on the 3rd index rather than the 4th. The related PR #23 is supposed to fix this to some extent, but it would be better to clarify how RGB is expected to be encoded.
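As a stopgap, this is roughly how I check where a given file puts the channel axis (nibabel sketch; the filename is a placeholder, and it only covers files that store RGB as an extra axis rather than as an RGB datatype):

import nibabel as nib

img = nib.load("rgb_volume.nii.gz")  # placeholder filename
shape = img.shape
if len(shape) == 4 and shape[3] == 3:
    print("RGB on the 4th axis (X, Y, Z, C)")
elif len(shape) == 3 and shape[2] == 3:
    print("RGB on the 3rd axis")
else:
    print("shape:", shape, "dtype:", img.get_data_dtype())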

Unfortunately, this feature is outside of my capacity at the moment. Perhaps I can pick it up in a few months.

Apache config

Hi,

I am trying to set up Apache (Apache/2.4.37 on CentOS) for file serving, but I cannot get the .gz rewriting to work for the precomputed dataset. Chrome DevTools shows 404s for the URLs without the .gz extension. Could I be missing something here? I am looking at the following example: google/neuroglancer#357

Thank you,
-m

https://neuroglancer-scripts.readthedocs.io/en/latest/serving-data.html

# If you get a 403 Forbidden error, try to comment out the Options directives
# below (they may be disallowed by your server's AllowOverride setting).

<IfModule headers_module>
    # Needed to use the data from a Neuroglancer instance served from a
    # different server (see http://enable-cors.org/server_apache.html).
    Header set Access-Control-Allow-Origin "*"
</IfModule>

# Data chunks are stored in sub-directories, in order to avoid having
# directories with millions of entries. Therefore we need to rewrite URLs
# because Neuroglancer expects a flat layout.
Options FollowSymLinks
RewriteEngine On
RewriteRule "^(.*)/([0-9]+-[0-9]+)_([0-9]+-[0-9]+)_([0-9]+-[0-9]+)$" "$1/$2/$3/$4"

# Microsoft filesystems do not support colons in file names, but pre-computed
# meshes use a colon in the URI (e.g. 100:0). As :0 is the most common (only?)
# suffix in use, we will serve a file that has this suffix stripped.
RewriteCond "%{REQUEST_FILENAME}" !-f
RewriteRule "^(.*):0$" "$1"

<IfModule mime_module>
    # Allow serving pre-compressed files, which can save a lot of space for raw
    # chunks, compressed segmentation chunks, and mesh chunks.
    #
    # The AddType directive should in theory be replaced by a "RemoveType .gz"
    # directive, but with that configuration Apache fails to serve the
    # pre-compressed chunks (confirmed with Debian version 2.2.22-13+deb7u6).
    # Fixes welcome.
    Options Multiviews
    AddEncoding x-gzip .gz
    AddType application/octet-stream .gz
</IfModule>
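For reference, this is roughly how I test the rewrite and the pre-compressed serving outside of Neuroglancer (host and chunk path are placeholders for my own dataset):

import urllib.request

# Request a chunk URL without the .gz suffix and check that Apache answers
# 200 with a gzip Content-Encoding (i.e. it found the pre-compressed file).
url = "http://localhost/mydata/8_8_8/0-64_0-64_0-64"
req = urllib.request.Request(url, headers={"Accept-Encoding": "gzip"})
with urllib.request.urlopen(req) as resp:
    print(resp.status, resp.headers.get("Content-Encoding"))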

uint64 TIFF files not loading (scikit-image loader only supports strings as input not Path objects)

The scikit-image io.imread function expects a string for the filename input. It uses that string to determine whether the file is a TIFF file and, if so, uses the tifffile package instead of Pillow. Pillow does not support uint64 TIFFs.

When loading images, a Path object is used instead of a string, which causes an exception for uint64 TIFF files: https://github.com/HumanBrainProject/neuroglancer-scripts/blob/master/src/neuroglancer_scripts/scripts/slices_to_precomputed.py#L142-L143

If you replace the above lines with the following, uint64 images load correctly in scikit-image:

# Converting the Path objects to str lets skimage.io.imread recognize the
# .tif extension and dispatch to tifffile, which supports uint64.
block = skimage.io.concatenate_images(
    map(skimage.io.imread, [str(f) for f in slice_filenames[slice_slicing]]))

Conversion of nii file to segmentations: NG does not properly display converted segments if the original encoding is float instead of uint

When following the guide on converting a nii file to NG precomputed segmentations, I ran into a case where the nii file is encoded as float. While NG can display the converted chunks (hovering shows the right label), the segments do not get the colour map applied.

The nii had to be converted to one of the uint encodings (in my case, uint8).

Would it be wise to log a warning if the user has the --type=segmentation flag on and the nii file is encoded as float?

Related: it would be nice if the encoding conversion could be done by the script itself (or by a separate library, if there is a need for one).
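Until then, this is roughly the conversion I did with nibabel (filenames are placeholders; it assumes the float values are integral labels that fit in uint8):

import nibabel as nib
import numpy as np

img = nib.load("labels_float.nii.gz")
# Round the float labels and store them as uint8 so --type=segmentation works.
labels = np.rint(np.asarray(img.dataobj)).astype(np.uint8)
nib.save(nib.Nifti1Image(labels, img.affine), "labels_uint8.nii.gz")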

Support for chunking dataset in memory

Hi, great package :) I have a large nii volume which will not fit in 260 GB of RAM. Using mmap is not an option, since it would take more than a month per volume on my workstation. I am wondering whether something in between mmap and loading the full volume into memory is possible, for example loading and chunking portions of the volume (or manually specifying the maximum available memory). Thank you in advance :)
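For what it's worth, nibabel's array proxy can already read sub-blocks without loading the whole image, which might be a starting point for block-wise conversion (filename and block size are placeholders; with a gzipped .nii.gz the read is still sequential, so an uncompressed .nii behaves much better):

import nibabel as nib

img = nib.load("big_volume.nii")
# Slicing the array proxy reads only this region from disk instead of the
# whole volume.
block = img.dataobj[0:512, 0:512, 0:512]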

tqdm causes slices-to-precomputed to fail

I haven't yet figured out exactly why this is happening, but the progress bar in slices-to-precomputed is causing the script to fail prematurely for me.

When called with a relatively simple input:

slices-to-precomputed.exe data\ output\ --flat

Here's the stack trace:

Exception ignored in: <bound method tqdm.__del__ of writing chunks: 100%|████████████████████████████████████████████████████████████████████████████████████████████████████| 90/90 [00:18<00:00,  3.59chunks/s]>
Traceback (most recent call last):
  File "/mnt/c/Users/ben/Documents/AB_MouseRegistrationTests_MSGSW171registrationtest_0_2160_0_2560_0_16_Ch0_1/env/lib/python3.5/site-packages/tqdm/_tqdm.py", line 889, in __del__
    self.close()
  File "/mnt/c/Users/ben/Documents/AB_MouseRegistrationTests_MSGSW171registrationtest_0_2160_0_2560_0_16_Ch0_1/env/lib/python3.5/site-packages/tqdm/_tqdm.py", line 1095, in close
    self._decr_instances(self)
  File "/mnt/c/Users/ben/Documents/AB_MouseRegistrationTests_MSGSW171registrationtest_0_2160_0_2560_0_16_Ch0_1/env/lib/python3.5/site-packages/tqdm/_tqdm.py", line 454, in _decr_instances
    cls.monitor.exit()
  File "/mnt/c/Users/ben/Documents/AB_MouseRegistrationTests_MSGSW171registrationtest_0_2160_0_2560_0_16_Ch0_1/env/lib/python3.5/site-packages/tqdm/_monitor.py", line 52, in exit
    self.join()
  File "/usr/lib/python3.5/threading.py", line 1051, in join
    raise RuntimeError("cannot join current thread")
RuntimeError: cannot join current thread

When it fails in this way, it is nearly complete, but it does not get to creating the individual .gz files for that base resolution (it has created all the folders for each "block" of data).

If I comment out the lines for the progress bar in the loop for slices_to_raw_chunks the script completes without error.
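A lighter workaround than removing the progress bar might be to disable tqdm's monitor thread near the top of the script (untested here; monitor_interval is a tqdm class attribute, and the monitor thread's join is what raises the error above):

import tqdm

# Setting the monitor interval to 0 disables tqdm's monitor thread, whose
# exit/join is what fails with "cannot join current thread".
tqdm.tqdm.monitor_interval = 0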
