
ndtiffstorage's Introduction

Micro-Manager

Micro-Manager is an application to control microscope hardware, such as cameras, xy-stages, filter wheels, etc. It includes a hardware abstraction layer written in C++ and a user interface written in Java (Swing).

Go to micro-manager.org for documentation and binary downloads.

For support, see Micro-Manager Community.

The Micro-Manager community welcomes you! For our governance structures, go here.

Source code

This repository contains the Java projects that make up the Micro-Manager "MMStudio" GUI application. The device control layer is written in C++ and found in a separate repository, mmCoreAndDevices, which is currently a git submodule of this repository.

To checkout both repositories together:

git clone --recurse-submodules https://github.com/micro-manager/micro-manager.git

If you will be making changes to the code, make sure to enable pre-commit hooks as described in doc/pre-commit.md.

Branches

  • main - the main branch of development (Micro-Manager 2.x)
  • svn-mirror - git-svn mirror of the Micro-Manager 1.4 Subversion repository

Other branches are not official.

Developer information

For license information, please see doc/copyright.txt.

For build instructions, please see doc/how-to-build.md.

Additional information is available on the Micro-Manager website at https://micro-manager.org

Contributing

Start here: https://micro-manager.org/Building_and_debugging_Micro-Manager_source_code

ndtiffstorage's People

Contributors

carlkesselman, cgohlke, henrypinkard, ieivanov, manuelmaeritz, nicost, pavanramkumar, zack-insitro


ndtiffstorage's Issues

Feature Request: improve TIFF file portability

Imaging data generated using pycromanager can be opened with other readers within the TIFF ecosystem, although there are some restrictions outside of the pycromanager.Dataset route.

In the following, I am using the TIFF file generated by democam.py in micro-manager/pycro-manager#76 as a reference:

  1. ImageJ itself does not recognize the acquisition axes, i.e. the output stack is shown flat (instead of having, in this specific case, 2 sliders as in the original acquisition window).
  2. Reading the tags with tifffile reveals a superfluous 'channel' axis within 'ImagrStorageAxesPositions' / 'StorageSuperChannelName'; the order of axes is also not preserved, likely due to specifying a 'standard' axis name (I'm using Python 3.8, so dict should preserve order). See the inspection sketch below.
  3. The Micro-Manager Image file stack header appears to be incompletely written (see the comment in cgohlke/tifffile#23); while I generated the file in question, I cannot verify this myself.
  4. As far as I understand, there is no option for compressed storage (not a portability issue per se, but it would be nice).

I am unfortunately not sufficiently familiar with Java/TIFF internals to tackle this myself (my background is C++/Python and a few others, and I don't have the bandwidth to go beyond). It also does not appear to be an issue specific to an individual use case.
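As a minimal sketch of how point 2 can be reproduced, the tags can be inspected with tifffile directly (the filename is a placeholder for the file produced by democam.py):

import tifffile

with tifffile.TiffFile('darkframes_NDTiffStack.tif') as tif:  # placeholder filename
    page = tif.pages[0]
    for tag in page.tags:
        print(tag.name, tag.value)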

ndtiff Datasets cannot be pickled

Dask arrays created by the ndtiff library cannot be pickled. Pickling is important for multiprocessing applications.

For example,

import pickle
from ndtiff import Dataset

dataset = Dataset(r'\path\to\ndtiff\dataset')  # placeholder path
data_dask_array = dataset.as_array()
pickle.dumps(data_dask_array)

first throws an AttributeError: Can't pickle local object 'NDTiffDataset.as_array.<locals>.read_one_image', which is fixed by #108. It then throws TypeError: cannot pickle '_thread.RLock' object, which I don't think I'll be able to fix myself.

A workaround may be to open a new dask array in every worker, rather than pickling the opened dask array for distribution to workers.
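A minimal sketch of that workaround, assuming the ndtiff Dataset constructor and read_image keyword axes behave as elsewhere in these issues (the path and the per-frame computation are placeholders):

from multiprocessing import Pool

from ndtiff import Dataset

DATASET_PATH = '/path/to/ndtiff/dataset'  # placeholder

def frame_mean(time_index):
    # Each worker opens its own Dataset, so only the path and the index
    # cross the process boundary; nothing unpicklable is shipped.
    dataset = Dataset(DATASET_PATH)
    return dataset.read_image(time=time_index).mean()

if __name__ == '__main__':
    with Pool(4) as pool:
        print(pool.map(frame_mean, range(10)))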

Opening ndtiff dataset in two different processes

Hey @henrypinkard, I would like to open an ndtiff Dataset that is currently being acquired in a separate process for viewing with napari. I need to do this because I am running two Acquisitions in parallel and would like to view the data from both. Do you see a problem with opening a dataset mid-acquisition in a different process? Would the built-in thread lock protect against read/write conflicts?
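For reference, a sketch of what the viewer process might look like, assuming it is safe to read a dataset that is still being written (which is exactly the open question here):

import napari

from ndtiff import Dataset

# Open the in-progress acquisition in this process; the acquiring process
# keeps appending to the same directory.
dataset = Dataset('/path/to/acquisition/in/progress')  # placeholder path
viewer = napari.view_image(dataset.as_array())
napari.run()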

V3.2 breaks reading and writing code; is it really necessary to break backward compatibility?

The change from axis indices being integers to allowing arbitrary objects breaks code working with NDTiffStorage, and makes writing backward-compatible code difficult and awkward. Would it not be much easier to stick to integers as axis indices and provide an (optional) dictionary in the summary metadata mapping indices to Objects (or, highly preferable, just Strings)?

At the very least, according to Semver this should be a major version increase, i.e. 4.0 rather than 3.2.
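For illustration, a sketch of the backward-compatible scheme proposed above; the 'AxisValueNames' summary-metadata key is hypothetical, not part of the current format:

# Integer indices on disk, as pre-3.2 readers expect:
axes = {'channel': 1, 'z': 0}
summary_metadata = {
    'AxisValueNames': {'channel': {0: 'DAPI', 1: 'GFP'}},  # hypothetical key
}
# Old readers keep using the integers; new readers can resolve the names:
channel_names = summary_metadata['AxisValueNames']['channel']
print(channel_names[axes['channel']])  # -> 'GFP'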

How to remove data in a specific axes of a ndtiff file?

I am using pycromanager to acquire many very large datasets. I have a lot of very large ndtiff Dataset files where I want to keep most of the data, but I would like to remove an axis (e.g. channel 1, or z: 0) because it does not contain anything valuable. I like the ndtiff file format, so I would like to keep using it rather than using another package like tifffile to write the desired arrays to other files. How can I write a new dataset file in Python that includes only the axes I care about?
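There is no documented "delete axis" operation, so copying is the usual route. A rough sketch of pulling out only the data worth keeping, using the axis-selector keywords that as_array accepts (the write-back step is left open, since it depends on whether your ndtiff version exposes a writer):

import numpy as np

from ndtiff import Dataset

src = Dataset('/path/to/big/dataset')  # placeholder path
# Select only channel 0, dropping the unwanted channel entirely:
kept = np.asarray(src.as_array(channel=0))
# Writing `kept` back out as a new NDTiff dataset would require the
# library's writer API, which is not covered here; check the current docs.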

Please tag the release commits

From a packager's viewpoint, it would be great if you added git tags to the commits used for releasing a version on PyPI. For example, I need the test files corresponding to the released sdist.
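For example (version number illustrative), tagging the release commit and pushing the tag would look like:

git tag -a v2.3.0 -m "NDTiffStorage v2.3.0" <release-commit-sha>
git push origin v2.3.0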

NDTiffStorage in pycro-manager

@henrypinkard ... Acquisition events in pycro-manager create TIFF files which, however, appear to be flat (i.e. they aren't ND Hyperstacks). Is there a way to preserve the axes defined in events?

Further, none of the metadata shown in the micro-magellan window appears to be saved to TIFF.

PS: The code I'm using is

import pycromanager as pm

if __name__ == '__main__':

    exposures = [.01, 100, 200, 300, 400, 500, 600, 700, 800, 900, 1000]
    with pm.Acquisition(directory='.', name='darkframes') as acq:
        events = []
        for rep in range(50):
            for idx, exposure in enumerate(exposures):
                evt = {'axes': {'time': 0, 'repetition': rep, 'exposure': idx}, 'exposure': exposure}
                events.append(evt)

        acq.acquire(events)

Dataset.as_array() returns array of wrong size for stitched images

Hello,

I ran into unexpected behavior in the function as_array when using the stitched=True option: the returned dask array has the wrong shape. When converting it to a numpy array, however, the shape takes on the right dimensions.

I have two-channel images of dimensions 2016x852 px, arranged in two columns and eight rows (no overlap).

Experienced behavior:

d = data.as_array(stitched=True, z=0)
print(d)
>>> dask.array<read_one_image, shape=(2, 2016, 852), dtype=uint16, chunksize=(1, 2016, 852), chunktype=numpy.ndarray>
a = np.asarray(d)
print(a.shape)
>>> (2, 6816, 4032)

Note that, besides the fact that the dimensions are wrong, the width and the height of the images are swapped: in the dask array the columns come before the rows, whereas in the numpy array the rows come before the columns.

Expected behavior:

d = data.as_array(stitched=True, z=0)
print(d)
>>> dask.array<read_one_image, shape=(2, 6816, 4032), dtype=uint16, chunksize=(1, 6816, 4032), chunktype=numpy.ndarray>
a = np.asarray(d)
print(a.shape)
>>> (2, 6816, 4032)

Fix:

Changing the following lines

483    w = self.image_width if not stitched else self._tile_width
484    h = self.image_height if not stitched else self._tile_height

...

553    chunks += (w, h)

to

483    w = self.image_width if not stitched else self._tile_width*len(self.axes["column"])
484    h = self.image_height if not stitched else self._tile_height*len(self.axes["row"])

...

553    chunks += (h, w)

in nd_tiff_v2.py seems to solve this problem.

Suggestion:

This way, however, each chunk consists of an already-stitched image. For very large stitched images it might be better to have a single chunk for every acquired tile, i.e. a dask array with properties
dask.array<read_one_image, shape=(2, 6816, 4032), dtype=uint16, chunksize=(1, 852, 2016), chunktype=numpy.ndarray>
This might be useful for some people.
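A sketch of that per-tile chunking idea using dask.array.block, with the tile geometry from this report; the tile reader is a stub standing in for something like dataset.read_image(row=..., column=...):

import dask
import dask.array as da
import numpy as np

rows, cols = 8, 2
tile_h, tile_w = 852, 2016  # one tile, as in this report

def read_tile(r, c):
    # Stub: in practice this would read one tile from the dataset.
    return np.zeros((tile_h, tile_w), dtype=np.uint16)

tiles = [[da.from_delayed(dask.delayed(read_tile)(r, c),
                          shape=(tile_h, tile_w), dtype=np.uint16)
          for c in range(cols)]
         for r in range(rows)]
stitched = da.block(tiles)  # shape (6816, 4032), one chunk per tile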

The bug is related to the already fixed issue reported by ptbrown1729: micro-manager/pycro-manager#278

Otherwise very nice package! Thanks for developing this open source project!

Add pytest tests for dataset loading

Problem

If people are going to rely on pycromanager to load their datasets, then it would be good to have 100% test coverage of the dataset code; otherwise, the potential cost of breakage is high.

Proposed Solution

Add tests for the dataset code. This basically works by embedding some simple, known test images into the repo, then using the dataset-loading code to load them and asserting that everything is equal.

For example see how aicsimageio does it: https://github.com/AllenCellModeling/aicsimageio/tree/main/aicsimageio/tests/readers
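A minimal sketch of what such a test could look like; the dataset path and expected values are placeholders for small, known images committed to the repo:

import numpy as np

from ndtiff import Dataset

TEST_DATA = 'python/tests/data/tiny_acquisition'  # hypothetical committed dataset

def test_dataset_loads_expected_shape():
    dataset = Dataset(TEST_DATA)
    arr = np.asarray(dataset.as_array())
    assert arr.dtype == np.uint16
    assert arr.shape == (2, 4, 64, 64)  # illustrative expected shape

def test_pixel_values_match_reference():
    dataset = Dataset(TEST_DATA)
    expected = np.load('python/tests/data/tiny_acquisition.npy')  # hypothetical
    np.testing.assert_array_equal(np.asarray(dataset.as_array()), expected)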

Update requirements

Hi team, while working on the conda-forge package for the latest release,

conda-forge/ndtiff-feedstock#20

we noticed that the tarball does not include the requirements.txt file:
https://github.com/micro-manager/NDTiffStorage/blob/main/python/setup.py#L14

which will result in a broken package when installing from a tarball.

I am going to add a dummy requirements.txt, as the rest of the conda dependencies will be provided by the recipe's meta.yaml.

Using this sort of install procedure (relying on an external requirements.txt file) is not recommended.


On another note, it seems the package uses sortedcontainers, but it has not been declared as a dependency.
See:

from sortedcontainers import SortedSet

Could you look into adding the missing dependencies, and into other ways of specifying them? For instance, using a setup.cfg file as described here.
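For instance, a minimal setup.cfg sketch declaring the dependencies directly (the exact list is illustrative and should mirror what requirements.txt contains today):

[options]
install_requires =
    numpy
    dask
    sortedcontainers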

Cheers!

Change signature of MultiResNDTiffAPI.increaseMaxResolutionLevel

Is it OK to change this method signature in MultiResNDTiffAPI from:

void increaseMaxResolutionLevel(int newMaxResolutionLevel);

to:

Future<Integer> increaseMaxResolutionLevel(int newMaxResolutionLevel);

where Integer denotes the newMaxResolutionIndex?

This would make it possible to display the new resolution image immediately, rather than having to wait for a user action to update the UI. Currently, in Magellan's Explore mode, zooming out regularly leads to a completely black canvas that only gets filled in when the user moves the mouse or otherwise forces the display to update.

Document format and API

While some limited usage examples are provided here, the NDTiffStorage format and API are largely undocumented, and module dependencies are not clear (is the format usable/readable without installing Micro-Manager?).

See also: #10
