
nd2


.nd2 (Nikon NIS Elements) file reader.

This reader provides a pure-Python implementation of the Nikon ND2 SDK.

It used to wrap the official SDK with Cython, but has since been completely rewritten in pure Python (for performance, ease of distribution, and maintenance) while retaining complete API parity with the official SDK.

Note: This library is not affiliated with Nikon in any way, but we are grateful for assistance from the SDK developers at Laboratory Imaging.

Features include rich metadata retrieval, direct to_dask and to_xarray options for lazy and/or annotated arrays, and OME-TIFF export.

This library is tested against many nd2 files with the goal of maximizing compatibility and data extraction. (If you find an nd2 file that fails in some way, please open an issue with the file!)

install

pip install nd2

or from conda:

conda install -c conda-forge nd2

Legacy nd2 file support

Legacy nd2 (JPEG2000) files are also supported, but require imagecodecs. To install with support for these files, use the legacy extra:

pip install nd2[legacy]

Faster XML parsing

Much of the metadata in the file is stored as XML. If lxml is available in the environment, nd2 will use it instead of the much slower built-in xml module. To install with lxml support:

pip install nd2 lxml

Usage and API

Full API documentation is available at https://tlambert03.github.io/nd2

Quick summary below:

import nd2
import numpy as np

my_array = nd2.imread('some_file.nd2')                          # read to numpy array
my_array = nd2.imread('some_file.nd2', dask=True)               # read to dask array
my_array = nd2.imread('some_file.nd2', xarray=True)             # read to xarray
my_array = nd2.imread('some_file.nd2', xarray=True, dask=True)  # read to dask-xarray

# or open a file with nd2.ND2File
f = nd2.ND2File('some_file.nd2')

# (you can also use nd2.ND2File() as a context manager)
with nd2.ND2File('some_file.nd2') as ndfile:
    print(ndfile.metadata)
    ...


# ATTRIBUTES:   # example output
f.path          # 'some_file.nd2'
f.shape         # (10, 2, 256, 256)
f.ndim          # 4
f.dtype         # np.dtype('uint16')
f.size          # 1310720  (total voxel elements)
f.sizes         # {'T': 10, 'C': 2, 'Y': 256, 'X': 256}
f.is_rgb        # False (whether the file is rgb)
                # if the file is RGB, `f.sizes` will have
                # an additional {'S': 3} component

# ARRAY OUTPUTS
f.asarray()         # in-memory np.ndarray - or use np.asarray(f)
f.to_dask()         # delayed dask.array.Array
f.to_xarray()       # in-memory xarray.DataArray, with labeled axes/coords
f.to_xarray(delayed=True)   # delayed xarray.DataArray

# OME-TIFF OUTPUT (new in v0.10.0)
f.write_tiff('output.ome.tif')  # write to ome-tiff file

                    # see below for examples of these structures
# METADATA          # returns instance of ...
f.attributes        # nd2.structures.Attributes
f.metadata          # nd2.structures.Metadata
f.frame_metadata(0) # nd2.structures.FrameMetadata (frame-specific meta)
f.experiment        # List[nd2.structures.ExpLoop]
f.text_info         # dict of misc info
f.voxel_size()      # VoxelSize(x=0.65, y=0.65, z=1.0)

f.rois              # Dict[int, nd2.structures.ROI]
f.binary_data       # any binary masks stored in the file.  See below.
f.events()          # returns tabular "Recorded Data" view as shown in NIS Elements/Viewer,
                    # with info for each frame in the experiment.
                    # output can be passed to pandas.DataFrame

f.ome_metadata()    # returns metadata as an ome_types.OME object
                    # (requires ome-types package)

# allll the metadata we can find...
# no attempt made to standardize or parse it
# look in here if you're searching for metadata that isn't exposed in the above
# but try not to rely on it, as it's not guaranteed to be stable
f.unstructured_metadata()

f.close()           # don't forget to close when not using a context manager!
f.closed            # boolean, whether the file is closed

Metadata structures

These follow the structure of the Nikon SDK outputs (where relevant). Here are some example outputs:

attributes
Attributes(
    bitsPerComponentInMemory=16,
    bitsPerComponentSignificant=16,
    componentCount=2,
    heightPx=32,
    pixelDataType='unsigned',
    sequenceCount=60,
    widthBytes=128,
    widthPx=32,
    compressionLevel=None,
    compressionType=None,
    tileHeightPx=None,
    tileWidthPx=None,
    channelCount=2
)
metadata

Note: the metadata for legacy (JPEG2000) files will be a plain unstructured dict.

Metadata(
    contents=Contents(channelCount=2, frameCount=60),
    channels=[
        Channel(
            channel=ChannelMeta(
                name='Widefield Green',
                index=0,
                color=Color(r=91, g=255, b=0, a=1.0),
                emissionLambdaNm=535.0,
                excitationLambdaNm=None
            ),
            loops=LoopIndices(NETimeLoop=None, TimeLoop=0, XYPosLoop=1, ZStackLoop=2),
            microscope=Microscope(
                objectiveMagnification=10.0,
                objectiveName='Plan Fluor 10x Ph1 DLL',
                objectiveNumericalAperture=0.3,
                zoomMagnification=1.0,
                immersionRefractiveIndex=1.0,
                projectiveMagnification=None,
                pinholeDiameterUm=None,
                modalityFlags=['fluorescence']
            ),
            volume=Volume(
                axesCalibrated=[True, True, True],
                axesCalibration=[0.652452890023035, 0.652452890023035, 1.0],
                axesInterpretation=(
                    <AxisInterpretation.distance: 'distance'>,
                    <AxisInterpretation.distance: 'distance'>,
                    <AxisInterpretation.distance: 'distance'>
                ),
                bitsPerComponentInMemory=16,
                bitsPerComponentSignificant=16,
                cameraTransformationMatrix=[-0.9998932296054086, -0.014612644841559427, 0.014612644841559427, -0.9998932296054086],
                componentCount=1,
                componentDataType='unsigned',
                voxelCount=[32, 32, 5],
                componentMaxima=[0.0],
                componentMinima=[0.0],
                pixelToStageTransformationMatrix=None
            )
        ),
        Channel(
            channel=ChannelMeta(
                name='Widefield Red',
                index=1,
                color=Color(r=255, g=85, b=0, a=1.0),
                emissionLambdaNm=620.0,
                excitationLambdaNm=None
            ),
            loops=LoopIndices(NETimeLoop=None, TimeLoop=0, XYPosLoop=1, ZStackLoop=2),
            microscope=Microscope(
                objectiveMagnification=10.0,
                objectiveName='Plan Fluor 10x Ph1 DLL',
                objectiveNumericalAperture=0.3,
                zoomMagnification=1.0,
                immersionRefractiveIndex=1.0,
                projectiveMagnification=None,
                pinholeDiameterUm=None,
                modalityFlags=['fluorescence']
            ),
            volume=Volume(
                axesCalibrated=[True, True, True],
                axesCalibration=[0.652452890023035, 0.652452890023035, 1.0],
                axesInterpretation=(
                    <AxisInterpretation.distance: 'distance'>,
                    <AxisInterpretation.distance: 'distance'>,
                    <AxisInterpretation.distance: 'distance'>
                ),
                bitsPerComponentInMemory=16,
                bitsPerComponentSignificant=16,
                cameraTransformationMatrix=[-0.9998932296054086, -0.014612644841559427, 0.014612644841559427, -0.9998932296054086],
                componentCount=1,
                componentDataType='unsigned',
                voxelCount=[32, 32, 5],
                componentMaxima=[0.0],
                componentMinima=[0.0],
                pixelToStageTransformationMatrix=None
            )
        )
    ]
)
experiment
[
    TimeLoop(
        count=3,
        nestingLevel=0,
        parameters=TimeLoopParams(
            startMs=0.0,
            periodMs=1.0,
            durationMs=0.0,
            periodDiff=PeriodDiff(avg=16278.339965820312, max=16411.849853515625, min=16144.830078125)
        ),
        type='TimeLoop'
    ),
    XYPosLoop(
        count=4,
        nestingLevel=1,
        parameters=XYPosLoopParams(
            isSettingZ=True,
            points=[
                Position(stagePositionUm=[26950.2, -1801.6000000000001, 498.46000000000004], pfsOffset=None, name=None),
                Position(stagePositionUm=[31452.2, -1801.6000000000001, 670.7], pfsOffset=None, name=None),
                Position(stagePositionUm=[35234.3, 2116.4, 664.08], pfsOffset=None, name=None),
                Position(stagePositionUm=[40642.9, -3585.1000000000004, 555.12], pfsOffset=None, name=None)
            ]
        ),
        type='XYPosLoop'
    ),
    ZStackLoop(count=5, nestingLevel=2, parameters=ZStackLoopParams(homeIndex=2, stepUm=1.0, bottomToTop=True, deviceName='Ti2 ZDrive'), type='ZStackLoop')
]
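
A minimal sketch of inspecting these loops programmatically (using the open file f from above; type, count, and nestingLevel are the fields shown in the example output):

for loop in f.experiment:
    print(loop.type, loop.count, loop.nestingLevel)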
rois

ROIs found in the metadata are available at ND2File.rois, which is a dict of nd2.structures.ROI objects, keyed by the ROI ID:

{
    1: ROI(
        id=1,
        info=RoiInfo(
            shapeType=<RoiShapeType.Rectangle: 3>,
            interpType=<InterpType.StimulationROI: 4>,
            cookie=1,
            color=255,
            label='',
            stimulationGroup=0,
            scope=1,
            appData=0,
            multiFrame=False,
            locked=False,
            compCount=2,
            bpc=16,
            autodetected=False,
            gradientStimulation=False,
            gradientStimulationBitDepth=0,
            gradientStimulationLo=0.0,
            gradientStimulationHi=0.0
        ),
        guid='{87190352-9B32-46E4-8297-C46621C1E1EF}',
        animParams=[
            AnimParam(
                timeMs=0.0,
                enabled=1,
                centerX=-0.4228425369685782,
                centerY=-0.5194951478743071,
                centerZ=0.0,
                rotationZ=0.0,
                boxShape=BoxShape(
                    sizeX=0.21256931608133062,
                    sizeY=0.21441774491682075,
                    sizeZ=0.0
                ),
                extrudedShape=ExtrudedShape(sizeZ=0, basePoints=[])
            )
        ]
    ),
    ...
}
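
A minimal sketch of iterating over these ROIs (using the open file f from above; shapeType and label are fields of the RoiInfo structure shown here):

for roi_id, roi in f.rois.items():
    print(roi_id, roi.info.shapeType, roi.info.label)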
text_info
{
    'capturing': 'Flash4.0, SN:101412\r\nSample 1:\r\n  Exposure: 100 ms\r\n  Binning: 1x1\r\n  Scan Mode: Fast\r\nSample 2:\r\n  Exposure: 100 ms\r\n  Binning: 1x1\r\n  Scan Mode: Fast',
    'date': '9/28/2021  9:41:27 AM',
    'description': 'Metadata:\r\nDimensions: T(3) x XY(4) x λ(2) x Z(5)\r\nCamera Name: Flash4.0, SN:101412\r\nNumerical Aperture: 0.3\r\nRefractive Index: 1\r\nNumber of Picture Planes: 2\r\nPlane #1:\r\n Name: Widefield Green\r\n Component Count: 1\r\n Modality: Widefield Fluorescence\r\n Camera Settings:   Exposure: 100 ms\r\n  Binning: 1x1\r\n  Scan Mode: Fast\r\n Microscope Settings:   Nikon Ti2, FilterChanger(Turret-Lo): 3 (FITC)\r\n  Nikon Ti2, Shutter(FL-Lo): Open\r\n  Nikon Ti2, Shutter(DIA LED): Closed\r\n  Nikon Ti2, Illuminator(DIA): Off\r\n  Nikon Ti2, Illuminator(DIA) Iris intensity: 3.0\r\n  Analyzer Slider: Extracted\r\n  Analyzer Cube: Extracted\r\n  Condenser: 1 (Shutter)\r\n  PFS, state: On\r\n  PFS, offset: 7959\r\n  PFS, mirror: Inserted\r\n  PFS, Dish Type: Glass\r\n  Zoom: 1.00x\r\n  Sola, Shutter(Sola): Active\r\n  Sola, Illuminator(Sola) Voltage: 100.0\r\nPlane #2:\r\n Name: Widefield Red\r\n Component Count: 1\r\n Modality: Widefield Fluorescence\r\n Camera Settings:   Exposure: 100 ms\r\n  Binning: 1x1\r\n  Scan Mode: Fast\r\n Microscope Settings:   Nikon Ti2, FilterChanger(Turret-Lo): 4 (TRITC)\r\n  Nikon Ti2, Shutter(FL-Lo): Open\r\n  Nikon Ti2, Shutter(DIA LED): Closed\r\n  Nikon Ti2, Illuminator(DIA): Off\r\n  Nikon Ti2, Illuminator(DIA) Iris intensity: 1.5\r\n  Analyzer Slider: Extracted\r\n  Analyzer Cube: Extracted\r\n  Condenser: 1 (Shutter)\r\n  PFS, state: On\r\n  PFS, offset: 7959\r\n  PFS, mirror: Inserted\r\n  PFS, Dish Type: Glass\r\n  Zoom: 1.00x\r\n  Sola, Shutter(Sola): Active\r\n  Sola, Illuminator(Sola) Voltage: 100.0\r\nTime Loop: 3\r\n- Equidistant (Period 1 ms)\r\nZ Stack Loop: 5\r\n- Step: 1 µm\r\n- Device: Ti2 ZDrive',
    'optics': 'Plan Fluor 10x Ph1 DLL'
}
binary_data

This property returns an nd2.BinaryLayers object representing all of the binary masks in the nd2 file.

An nd2.BinaryLayers object is a sequence of individual nd2.BinaryLayer objects (one for each binary layer found in the file). Each BinaryLayer in the sequence is a named tuple that has, among other things, a name attribute and a data attribute that is a list of numpy arrays (one for each frame in the experiment) or None if the binary layer had no data in that frame.

The most common use case will be to cast either the entire BinaryLayers object or an individual BinaryLayer to a numpy.ndarray:

>>> import numpy as np
>>> import nd2
>>> nd2file = nd2.ND2File('path/to/file.nd2')
>>> binary_layers = nd2file.binary_data

# The output array will have shape
# (n_binary_layers, *coord_shape, *frame_shape).
>>> np.asarray(binary_layers)

For example, if the data in the nd2 file has shape (nT, nZ, nC, nY, nX) and there are 4 binary layers, then the output of np.asarray(nd2file.binary_data) will have shape (4, nT, nZ, nY, nX). (Note that the nC dimension is not present in the output array, and the binary layers are always along the first axis.)
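
A minimal sketch that verifies this shape, assuming a file whose sizes include 'Y' and 'X' and that actually contains binary layers:

>>> masks = np.asarray(nd2file.binary_data)   # (n_layers, *coord_shape, *frame_shape)
>>> coord_shape = tuple(v for k, v in nd2file.sizes.items() if k not in ('C', 'Y', 'X'))
>>> frame_shape = (nd2file.sizes['Y'], nd2file.sizes['X'])
>>> masks.shape == (len(nd2file.binary_data), *coord_shape, *frame_shape)
True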

You can also cast an individual BinaryLayer to a numpy array:

>>> binary_layer = binary_layers[0]
>>> np.asarray(binary_layer)
events()

This method returns the tabular data reported in the Image Properties > Recorded Data tab of the NIS Viewer.

(There will be a column for each tag in the CustomDataV2_0 section of the file's custom data, as well as any additional events found in the metadata.)

The format of the returned data is controlled by the orient argument:

  • 'records' : list of dicts - [{column -> value}, ...] (default)
  • 'dict' : dict of dicts - {column -> {index -> value}, ...}
  • 'list' : dict of lists - {column -> [value, ...]}

Not every column header appears in every event, so when orient is either 'dict' or 'list', float('nan') will be inserted to maintain a consistent length for each column.
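
A minimal sketch of requesting each orientation (using the open file f from above); example outputs for each follow below:

f.events()                 # orient='records' (default): list of row dicts
f.events(orient='list')    # dict of {column -> array of values}
f.events(orient='dict')    # dict of {column -> {index -> value}}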

# with `orient='records'` (DEFAULT)
[
    {
        'Time [s]': 1.32686654,
        'Z-Series': -2.0,
        'Exposure Time [ms]': 100.0,
        'PFS Offset': 0,
        'PFS Status': 0,
        'X Coord [µm]': 31452.2,
        'Y Coord [µm]': -1801.6,
        'Z Coord [µm]': 552.74,
        'Ti2 ZDrive [µm]': 552.74
    },
    {
        'Time [s]': 1.69089657,
        'Z-Series': -1.0,
        'Exposure Time [ms]': 100.0,
        'PFS Offset': 0,
        'PFS Status': 0,
        'X Coord [µm]': 31452.2,
        'Y Coord [µm]': -1801.6,
        'Z Coord [µm]': 553.74,
        'Ti2 ZDrive [µm]': 553.74
    },
    {
        'Time [s]': 2.04194662,
        'Z-Series': 0.0,
        'Exposure Time [ms]': 100.0,
        'PFS Offset': 0,
        'PFS Status': 0,
        'X Coord [µm]': 31452.2,
        'Y Coord [µm]': -1801.6,
        'Z Coord [µm]': 554.74,
        'Ti2 ZDrive [µm]': 554.74
    },
    {
        'Time [s]': 2.38194662,
        'Z-Series': 1.0,
        'Exposure Time [ms]': 100.0,
        'PFS Offset': 0,
        'PFS Status': 0,
        'X Coord [µm]': 31452.2,
        'Y Coord [µm]': -1801.6,
        'Z Coord [µm]': 555.74,
        'Ti2 ZDrive [µm]': 555.74
    },
    {
        'Time [s]': 2.63795663,
        'Z-Series': 2.0,
        'Exposure Time [ms]': 100.0,
        'PFS Offset': 0,
        'PFS Status': 0,
        'X Coord [µm]': 31452.2,
        'Y Coord [µm]': -1801.6,
        'Z Coord [µm]': 556.74,
        'Ti2 ZDrive [µm]': 556.74
    }
]

# with `orient='list'`
{
    'Time [s]': array([1.32686654, 1.69089657, 2.04194662, 2.38194662, 2.63795663]),
    'Z-Series': array([-2., -1.,  0.,  1.,  2.]),
    'Exposure Time [ms]': array([100., 100., 100., 100., 100.]),
    'PFS Offset': array([0, 0, 0, 0, 0], dtype=int32),
    'PFS Status': array([0, 0, 0, 0, 0], dtype=int32),
    'X Coord [µm]': array([31452.2, 31452.2, 31452.2, 31452.2, 31452.2]),
    'Y Coord [µm]': array([-1801.6, -1801.6, -1801.6, -1801.6, -1801.6]),
    'Z Coord [µm]': array([552.74, 553.74, 554.74, 555.74, 556.74]),
    'Ti2 ZDrive [µm]': array([552.74, 553.74, 554.74, 555.74, 556.74])
}

# with `orient='dict'`
{
    'Time [s]': {0: 1.32686654, 1: 1.69089657, 2: 2.04194662, 3: 2.38194662, 4: 2.63795663},
    'Z-Series': {0: -2.0, 1: -1.0, 2: 0.0, 3: 1.0, 4: 2.0},
    'Exposure Time [ms]': {0: 100.0, 1: 100.0, 2: 100.0, 3: 100.0, 4: 100.0},
    'PFS Offset': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0},
    'PFS Status': {0: 0, 1: 0, 2: 0, 3: 0, 4: 0},
    'X Coord [µm]': {0: 31452.2, 1: 31452.2, 2: 31452.2, 3: 31452.2, 4: 31452.2},
    'Y Coord [µm]': {0: -1801.6, 1: -1801.6, 2: -1801.6, 3: -1801.6, 4: -1801.6},
    'Z Coord [µm]': {0: 552.74, 1: 553.74, 2: 554.74, 3: 555.74, 4: 556.74},
    'Ti2 ZDrive [µm]': {0: 552.74, 1: 553.74, 2: 554.74, 3: 555.74, 4: 556.74}
}

You can pass the output of events() to pandas.DataFrame:

In [1]: pd.DataFrame(nd2file.events())
Out[1]:
     Time [s]  Z-Series  Exposure Time [ms]  PFS Offset  PFS Status  X Coord [µm]  Y Coord [µm]  Z Coord [µm]  Ti2 ZDrive [µm]
0    1.326867      -2.0               100.0           0           0       31452.2       -1801.6        552.74           552.74
1    1.690897      -1.0               100.0           0           0       31452.2       -1801.6        553.74           553.74
2    2.041947       0.0               100.0           0           0       31452.2       -1801.6        554.74           554.74
3    2.381947       1.0               100.0           0           0       31452.2       -1801.6        555.74           555.74
4    2.637957       2.0               100.0           0           0       31452.2       -1801.6        556.74           556.74
5    8.702229      -2.0               100.0           0           0       31452.2       -1801.6        552.70           552.70
6    9.036269      -1.0               100.0           0           0       31452.2       -1801.6        553.70           553.70
7    9.330319       0.0               100.0           0           0       31452.2       -1801.6        554.68           554.68
8    9.639349       1.0               100.0           0           0       31452.2       -1801.6        555.70           555.70
9    9.906369       2.0               100.0           0           0       31452.2       -1801.6        556.64           556.64
10  11.481439      -2.0               100.0           0           0       31452.2       -1801.6        552.68           552.68
11  11.796479      -1.0               100.0           0           0       31452.2       -1801.6        553.68           553.68
12  12.089479       0.0               100.0           0           0       31452.2       -1801.6        554.68           554.68
13  12.371539       1.0               100.0           0           0       31452.2       -1801.6        555.68           555.68
14  12.665469       2.0               100.0           0           0       31452.2       -1801.6        556.68           556.68
ome_metadata()

See the ome-types documentation for details on the OME type returned by this method.

In [1]: ome = nd2file.ome_metadata()

In [2]: print(ome)
OME(
    instruments=[<1 Instrument>],
    images=[<1 Image>],
    creator='nd2 v0.7.1'
)

In [3]: print(ome.to_xml())
<OME xmlns="http://www.openmicroscopy.org/Schemas/OME/2016-06"
     xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
     xsi:schemaLocation="http://www.openmicroscopy.org/Schemas/OME/2016-06 http://www.openmicroscopy.org/Schemas/OME/2016-06/ome.xsd"
     Creator="nd2 v0.7.1.dev2+g4ea166e.d20230709">
  <Instrument ID="Instrument:0">
    <Detector Model="Hamamatsu Dual C14440-20UP" SerialNumber="Hamamatsu Dual C14440-20UP" ID="Detector:0"/>
  </Instrument>
  <Image ID="Image:0" Name="test39">
    <AcquisitionDate>2023-07-08T09:30:55</AcquisitionDate>
    ...
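
To save this XML alongside the data, a minimal sketch using the to_xml() output shown above:

In [4]: from pathlib import Path

In [5]: Path('metadata.ome.xml').write_text(ome.to_xml())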

Contributing / Development

To test locally and contribute, clone this repo, then:

pip install -e .[dev]

To download sample data:

pip install requests
python scripts/download_samples.py

then run tests:

pytest

(and feel free to open an issue if that doesn't work!)

alternatives

Here are some other nd2 readers that I know of, though many of them are unmaintained:

  • pims_nd2 - pims-based reader. ctypes wrapper around the v9.00 (2015) SDK
  • nd2reader - pims-based reader, using reverse-engineered file headers. mostly tested on files from NIS Elements 4.30.02
  • nd2file - another pure-python, chunk map reader, unmaintained?
  • pyND2SDK - windows-only cython wrapper around the v9.00 (2015) SDK. not on PyPI

The motivating factors for this library were:

  • support for as many nd2 files as possible, with a large test suite and an emphasis on correctness
  • pims-independent delayed reader based on dask
  • axis-associated metadata via xarray


nd2's Issues

Symbols and libraries required in vendored binaries

To make sure the binaries are conda-forge compatible, I did the following:

$> docker run -it condaforge/linux-anvil-cos7-x86_64 bash
# now inside docker:
$> mamba create -n nd2 -c conda-forge compilers sysroot_linux-64=2.17 python pip cython numpy libtiff libzlib jpeg 
$> conda activate nd2
$> git clone https://github.com/tlambert03/nd2
$> cd nd2

$> LD_LIBRARY_PATH=$CONDA_PREFIX/lib ldd -r -v src/sdk/Linux/lib/liblimfile.so 
src/sdk/Linux/lib/liblimfile.so: /opt/conda/envs/nd2/lib/libtiff.so.5: no version information available (required by src/sdk/Linux/lib/liblimfile.so)
        linux-vdso.so.1 =>  (0x00007fff97fb4000)
        libz.so.1 => /opt/conda/envs/nd2/lib/libz.so.1 (0x00007f16e597e000)
        libtiff.so.5 => /opt/conda/envs/nd2/lib/libtiff.so.5 (0x00007f16e58e9000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f16e5190000)
        libstdc++.so.6 => /opt/conda/envs/nd2/lib/libstdc++.so.6 (0x00007f16e4fe5000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f16e4ce3000)
        libgcc_s.so.1 => /opt/conda/envs/nd2/lib/libgcc_s.so.1 (0x00007f16e58d0000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f16e4915000)
        libwebp.so.7 => /opt/conda/envs/nd2/lib/./libwebp.so.7 (0x00007f16e5847000)
        libzstd.so.1 => /opt/conda/envs/nd2/lib/./libzstd.so.1 (0x00007f16e4827000)
        liblzma.so.5 => /opt/conda/envs/nd2/lib/./liblzma.so.5 (0x00007f16e581e000)
        libLerc.so => /opt/conda/envs/nd2/lib/./libLerc.so (0x00007f16e478a000)
        libjpeg.so.9 => /opt/conda/envs/nd2/lib/./libjpeg.so.9 (0x00007f16e57df000)
        libdeflate.so.0 => /opt/conda/envs/nd2/lib/./libdeflate.so.0 (0x00007f16e57cf000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f16e5778000)
        librt.so.1 => /lib64/librt.so.1 (0x00007f16e4582000)

        Version information:
        src/sdk/Linux/lib/liblimfile.so:
                libgcc_s.so.1 (GCC_3.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libz.so.1 (ZLIB_1.2.0) => /opt/conda/envs/nd2/lib/libz.so.1
                libm.so.6 (GLIBC_2.2.5) => /lib64/libm.so.6
                libtiff.so.5 (LIBTIFF_4.0) => not found
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.14) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.6) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libstdc++.so.6 (GLIBCXX_3.4.20) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.8) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.17) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.9) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.9) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.14) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.11) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.5) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.15) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.22) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.21) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4) => /opt/conda/envs/nd2/lib/libstdc++.so.6
        /opt/conda/envs/nd2/lib/libz.so.1:
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/libtiff.so.5:
                libm.so.6 (GLIBC_2.2.5) => /lib64/libm.so.6
                liblzma.so.5 (XZ_5.0) => /opt/conda/envs/nd2/lib/./liblzma.so.5
                libjpeg.so.9 (LIBJPEG_9.0) => /opt/conda/envs/nd2/lib/./libjpeg.so.9
                libc.so.6 (GLIBC_2.3) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.11) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /lib64/libpthread.so.0:
                ld-linux-x86-64.so.2 (GLIBC_2.2.5) => /lib64/ld-linux-x86-64.so.2
                ld-linux-x86-64.so.2 (GLIBC_2.3) => /lib64/ld-linux-x86-64.so.2
                ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2
                libc.so.6 (GLIBC_2.14) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.2) => /lib64/libc.so.6
                libc.so.6 (GLIBC_PRIVATE) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/libstdc++.so.6:
                libm.so.6 (GLIBC_2.2.5) => /lib64/libm.so.6
                ld-linux-x86-64.so.2 (GLIBC_2.3) => /lib64/ld-linux-x86-64.so.2
                libgcc_s.so.1 (GCC_4.2.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libgcc_s.so.1 (GCC_3.4) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libgcc_s.so.1 (GCC_3.3) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libgcc_s.so.1 (GCC_3.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libc.so.6 (GLIBC_2.6) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.2) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /lib64/libm.so.6:
                ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
                libc.so.6 (GLIBC_PRIVATE) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/libgcc_s.so.1:
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /lib64/libc.so.6:
                ld-linux-x86-64.so.2 (GLIBC_2.3) => /lib64/ld-linux-x86-64.so.2
                ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2
        /opt/conda/envs/nd2/lib/./libwebp.so.7:
                libm.so.6 (GLIBC_2.2.5) => /lib64/libm.so.6
                libpthread.so.0 (GLIBC_2.3.2) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./libzstd.so.1:
                libpthread.so.0 (GLIBC_2.3.2) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./liblzma.so.5:
                librt.so.1 (GLIBC_2.2.5) => /lib64/librt.so.1
                libpthread.so.0 (GLIBC_2.3.3) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.3.2) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.6) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./libLerc.so:
                libgcc_s.so.1 (GCC_3.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
                libstdc++.so.6 (GLIBCXX_3.4.21) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.9) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4) => /opt/conda/envs/nd2/lib/libstdc++.so.6
        /opt/conda/envs/nd2/lib/./libjpeg.so.9:
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.7) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./libdeflate.so.0:
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /lib64/librt.so.1:
                libpthread.so.0 (GLIBC_2.3.2) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_PRIVATE) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.14) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.2) => /lib64/libc.so.6
                libc.so.6 (GLIBC_PRIVATE) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6


$> LD_LIBRARY_PATH=$CONDA_PREFIX/lib ldd -r -v src/sdk/Linux/lib/libnd2readsdk-shared.so 
ldd: warning: you do not have execution permission for `src/sdk/Linux/lib/libnd2readsdk-shared.so'
src/sdk/Linux/lib/libnd2readsdk-shared.so: /opt/conda/envs/nd2/lib/libtiff.so.5: no version information available (required by /home/conda/nd2/src/sdk/Linux/lib/liblimfile.so)
        linux-vdso.so.1 =>  (0x00007ffeb3fd2000)
        libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f03a8bbc000)
        liblimfile.so => /home/conda/nd2/src/sdk/Linux/lib/liblimfile.so (0x00007f03a87f0000)
        libstdc++.so.6 => /opt/conda/envs/nd2/lib/libstdc++.so.6 (0x00007f03a908f000)
        libgcc_s.so.1 => /opt/conda/envs/nd2/lib/libgcc_s.so.1 (0x00007f03a907a000)
        libc.so.6 => /lib64/libc.so.6 (0x00007f03a8422000)
        /lib64/ld-linux-x86-64.so.2 (0x00007f03a901e000)
        libz.so.1 => /opt/conda/envs/nd2/lib/libz.so.1 (0x00007f03a905f000)
        libtiff.so.5 => /opt/conda/envs/nd2/lib/libtiff.so.5 (0x00007f03a838e000)
        libm.so.6 => /lib64/libm.so.6 (0x00007f03a808c000)
        libwebp.so.7 => /opt/conda/envs/nd2/lib/./libwebp.so.7 (0x00007f03a8004000)
        libzstd.so.1 => /opt/conda/envs/nd2/lib/./libzstd.so.1 (0x00007f03a7f16000)
        liblzma.so.5 => /opt/conda/envs/nd2/lib/./liblzma.so.5 (0x00007f03a7eed000)
        libLerc.so => /opt/conda/envs/nd2/lib/./libLerc.so (0x00007f03a7e50000)
        libjpeg.so.9 => /opt/conda/envs/nd2/lib/./libjpeg.so.9 (0x00007f03a7e12000)
        libdeflate.so.0 => /opt/conda/envs/nd2/lib/./libdeflate.so.0 (0x00007f03a904c000)
        librt.so.1 => /lib64/librt.so.1 (0x00007f03a7c0a000)

        Version information:
        src/sdk/Linux/lib/libnd2readsdk-shared.so:
                libgcc_s.so.1 (GCC_3.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libstdc++.so.6 (GLIBCXX_3.4.20) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.9) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.21) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4) => /opt/conda/envs/nd2/lib/libstdc++.so.6
        /lib64/libpthread.so.0:
                ld-linux-x86-64.so.2 (GLIBC_2.2.5) => /lib64/ld-linux-x86-64.so.2
                ld-linux-x86-64.so.2 (GLIBC_2.3) => /lib64/ld-linux-x86-64.so.2
                ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2
                libc.so.6 (GLIBC_2.14) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.2) => /lib64/libc.so.6
                libc.so.6 (GLIBC_PRIVATE) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /home/conda/nd2/src/sdk/Linux/lib/liblimfile.so:
                libgcc_s.so.1 (GCC_3.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libz.so.1 (ZLIB_1.2.0) => /opt/conda/envs/nd2/lib/libz.so.1
                libm.so.6 (GLIBC_2.2.5) => /lib64/libm.so.6
                libtiff.so.5 (LIBTIFF_4.0) => not found
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.14) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.6) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libstdc++.so.6 (GLIBCXX_3.4.20) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.8) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.17) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.9) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.9) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.14) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.11) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.5) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.15) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.22) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4.21) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4) => /opt/conda/envs/nd2/lib/libstdc++.so.6
        /opt/conda/envs/nd2/lib/libstdc++.so.6:
                libm.so.6 (GLIBC_2.2.5) => /lib64/libm.so.6
                ld-linux-x86-64.so.2 (GLIBC_2.3) => /lib64/ld-linux-x86-64.so.2
                libgcc_s.so.1 (GCC_4.2.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libgcc_s.so.1 (GCC_3.4) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libgcc_s.so.1 (GCC_3.3) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libgcc_s.so.1 (GCC_3.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libc.so.6 (GLIBC_2.6) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.2) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/libgcc_s.so.1:
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /lib64/libc.so.6:
                ld-linux-x86-64.so.2 (GLIBC_2.3) => /lib64/ld-linux-x86-64.so.2
                ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2
        /opt/conda/envs/nd2/lib/libz.so.1:
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/libtiff.so.5:
                libm.so.6 (GLIBC_2.2.5) => /lib64/libm.so.6
                liblzma.so.5 (XZ_5.0) => /opt/conda/envs/nd2/lib/./liblzma.so.5
                libjpeg.so.9 (LIBJPEG_9.0) => /opt/conda/envs/nd2/lib/./libjpeg.so.9
                libc.so.6 (GLIBC_2.3) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.11) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /lib64/libm.so.6:
                ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
                libc.so.6 (GLIBC_PRIVATE) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./libwebp.so.7:
                libm.so.6 (GLIBC_2.2.5) => /lib64/libm.so.6
                libpthread.so.0 (GLIBC_2.3.2) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./libzstd.so.1:
                libpthread.so.0 (GLIBC_2.3.2) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./liblzma.so.5:
                librt.so.1 (GLIBC_2.2.5) => /lib64/librt.so.1
                libpthread.so.0 (GLIBC_2.3.3) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.3.2) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.6) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./libLerc.so:
                libgcc_s.so.1 (GCC_3.0) => /opt/conda/envs/nd2/lib/libgcc_s.so.1
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
                libstdc++.so.6 (GLIBCXX_3.4.21) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3.9) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (CXXABI_1.3) => /opt/conda/envs/nd2/lib/libstdc++.so.6
                libstdc++.so.6 (GLIBCXX_3.4) => /opt/conda/envs/nd2/lib/libstdc++.so.6
        /opt/conda/envs/nd2/lib/./libjpeg.so.9:
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.7) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /opt/conda/envs/nd2/lib/./libdeflate.so.0:
                libc.so.6 (GLIBC_2.3.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.4) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6
        /lib64/librt.so.1:
                libpthread.so.0 (GLIBC_2.3.2) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_PRIVATE) => /lib64/libpthread.so.0
                libpthread.so.0 (GLIBC_2.2.5) => /lib64/libpthread.so.0
                libc.so.6 (GLIBC_2.14) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.3.2) => /lib64/libc.so.6
                libc.so.6 (GLIBC_PRIVATE) => /lib64/libc.so.6
                libc.so.6 (GLIBC_2.2.5) => /lib64/libc.so.6

Looks like everything is in order. LD_LIBRARY_PATH is needed but this should be patched once conda-build does its magic.

Add better time-stamp extraction

Time stamp info is currently scattered in different places depending on the nd2 version.

While there is some information on the time-loop interval in reader.experiment,

the actual per-frame timestamps for newer files are at

channels = nd2file._rdr._frame_metadata(i).get('channels')
timestamp = channels[0]['time']['relativeTimeMs']

for legacy readers, it's likely at

nd2file._rdr._get_xml_dict(b"VIMD", n).get('TimeMSec')  # where n is the frame number

this should be unified in the public API
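
As a starting point, a hypothetical helper sketch that wraps the modern-file accessor described above (underscore-prefixed names are private and may change between versions):

def frame_timestamps_ms(nd2file):
    # per-frame relativeTimeMs for modern files, one entry per frame
    out = []
    for i in range(nd2file.attributes.sequenceCount):
        channels = nd2file._rdr._frame_metadata(i).get('channels')
        out.append(channels[0]['time']['relativeTimeMs'])
    return out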

error while loading nd2 image

  • nd2 version: 0.4.0
  • Python version: 3.9 / 3.10
  • Operating System: ubuntu 18.04 / win 10

Description

Loading an nd2 image leads to corrupted data. Rows are shifted by 1 pixel, and there is a residual error even after shifting rows back, compared to the image decoded by bioformats and a tif exported from NIS Elements.

Data accessible from here: https://cloud.mrc-lmb.cam.ac.uk/s/jJMMtqAaRALnnTq

What I Did

import nd2
import matplotlib.pyplot as plt
import numpy as np
import javabridge
import bioformats
import tifffile

javabridge.start_vm(class_path=bioformats.JARS, run_headless=True)
fname = 'test.nd2'
imgs = [
    nd2.imread(fname),                                     # nd2 reader
    np.stack([line[np.arange(k, k + line.shape[0]) % line.shape[0]]
              for k, line in enumerate(nd2.imread(fname))]),  # nd2 output with rows shifted back
    bioformats.load_image(fname, rescale=False),           # bioformats reference
    tifffile.imread(fname.replace('.nd2', '.tif'))]        # NIS Elements tif export reference
fig, ax = plt.subplots(1, len(imgs), figsize=(20, 40))
titles = ['nd2', 'nd2 shifted', 'bioformats', 'tiff']
for img, a, t in zip(imgs, ax, titles):
    a.imshow(img)
    a.axis('off')
    a.set_title(t)

# max absolute error of each image vs. the tiff reference
[np.abs(img.astype(float) - imgs[3].astype(float)).max() for img in imgs]

(image attachment)

Max absolute error with tiff reference: [2707.0, 1564.0, 0.0, 0.0]

plt.imshow(np.abs(imgs[1].astype(np.float32)-imgs[3].astype(np.float32)),vmax=100)
plt.colorbar()

(image attachment)

Some datasets appear skewed in Ubuntu

  • nd2 version: 0.4.0
  • Python version: 3.8.12
  • Operating System: Ubuntu

Description

Some large-scale (tiled) nd2 files appear skewed when opened in Ubuntu. The same files read correctly in Windows.

What I Did

I used the AICSImageIO plugin in napari or nd2.ND2File().to_dask(); the result is the same, as shown in the attachment.
(image attachment)

Ubuntu-report:

{
  "Version": "20.04",
  "OEM": {
    "Vendor": "Dell Inc.",
    "Product": "Precision 3630 Tower",
    "Family": "Precision",
    "DCD": "canonical-oem-somerville-bionic-amd64-20180608-47+beaver-turtlebay+X59"
  },
  "BIOS": {
    "Vendor": "Dell Inc.",
    "Version": "2.8.0"
  },
  "CPU": {
    "OpMode": "32-bit, 64-bit",
    "CPUs": "12",
    "Threads": "2",
    "Cores": "6",
    "Sockets": "1",
    "Vendor": "GenuineIntel",
    "Family": "6",
    "Model": "158",
    "Stepping": "10",
    "Name": "Intel(R) Core(TM) i7-8700 CPU @ 3.20GHz",
    "Virtualization": "VT-x"
  },
  "Arch": "amd64",
  "GPU": [
    {
      "Vendor": "10de",
      "Model": "1c30"
    }
  ],
  "RAM": 32.7,
  "Partitions": [
    485.3,
    0.8
  ],
  "Screens": [
    {
      "Size": "527mmx296mm",
      "Resolution": "1920x1080",
      "Frequency": "60.00"
    }
  ],
  "Autologin": false,
  "LivePatch": false,
  "Session": {
    "DE": "ubuntu:GNOME",
    "Name": "ubuntu",
    "Type": "x11"
  },
  "Language": "en_US",
  "Timezone": "Europe/Paris",
  "Install": {
    "Media": "Ubuntu 18.04 \"Bionic\" - Build amd64 LIVE Binary 20180608-09:38\n",
    "Type": "GTK",
    "DownloadUpdates": true,
    "Language": "en",
    "Minimal": false,
    "RestrictedAddons": false,
    "Stages": {
      "0": "language",
      "3": "user_done",
      "271": "done"
    }
  }
}

pip freeze:

aicsimageio==4.9.2
aicspylibczi==3.0.5
aiohttp==3.8.1
aiosignal==1.2.0
alabaster==0.7.12
apoc==0.7.0
appdirs==1.4.4
argon2-cffi==21.3.0
argon2-cffi-bindings==21.2.0
asciitree==0.3.3
astropy==5.0.2
asttokens==2.0.5
async-timeout==4.0.2
attrs==21.4.0
autopep8==1.6.0
Babel==2.9.1
backcall==0.2.0
bbii-decon==0.0.1
beautifulsoup4==4.11.1
bioformats-jar==6.7.0.post2
black==22.1.0
bleach==5.0.0
cachey==0.2.1
cellpose==2.0.5
cellpose-napari==0.1.4
certifi==2021.10.8
cffi==1.15.0
charset-normalizer==2.0.12
click==8.0.4
cloudpickle==2.0.0
cupy==10.2.0
cycler==0.11.0
dask==2022.2.0
debugpy==1.5.1
decorator==5.1.1
defusedxml==0.7.1
dnspython==2.2.1
docstring-parser==0.13
docutils==0.17.1
elementpath==2.5.0
email-validator==1.1.3
entrypoints==0.4
executing==0.8.2
fasteners==0.17.3
fastjsonschema==2.15.3
fastremap==1.12.2
fastrlock==0.8
fire==0.4.0
fonttools==4.33.3
freetype-py==2.2.0
frozenlist==1.3.0
fsspec==2022.2.0
HeapDict==1.0.1
hsluv==5.0.2
idna==3.3
imagecodecs==2022.2.22
imageio==2.10.5
imageio-ffmpeg==0.4.5
imagesize==1.3.0
importlib-metadata==4.11.1
importlib-resources==5.4.0
intervaltree==3.1.0
ipykernel==6.9.1
ipython==8.0.1
ipython-genutils==0.2.0
ipywidgets==7.7.0
jedi==0.18.1
Jinja2==3.0.3
joblib==1.1.0
JPype1==1.3.0
jsonschema==4.4.0
jupyter==1.0.0
jupyter-client==7.1.2
jupyter-console==6.4.3
jupyter-core==4.9.2
jupyterlab-pygments==0.2.2
jupyterlab-widgets==1.1.0
jupytext==1.13.8
kiwisolver==1.3.2
llvmlite==0.38.0
locket==0.2.1
loguru==0.6.0
lxml==4.8.0
magicgui==0.3.7
markdown-it-py==2.1.0
MarkupSafe==2.1.0
matplotlib==3.5.2
matplotlib-inline==0.1.3
mdit-py-plugins==0.3.0
mdurl==0.1.1
mistune==0.8.4
mrc==0.2.0
multidict==6.0.2
mypy-extensions==0.4.3
napari==0.4.15
napari-accelerated-pixel-and-object-classification==0.7.3
napari-aicsimageio==0.6.1
napari-assistant==0.2.0
napari-console==0.0.4
napari-cupy-image-processing==0.2.3
napari-elementary-numpy-operations==0.0.5
napari-plugin-engine==0.2.0
napari-pyclesperanto-assistant==0.18.0
-e git+https://github.com/aaristov/napari-segment.git@aef1dbb101ba39bd0880c62a8aba1947064dcfa7#egg=napari_segment
napari-skimage-regionprops==0.5.0
napari-stracking==0.1.9
napari-svg==0.1.6
napari-time-slicer==0.4.3
napari-tools-menu==0.1.11
napari-workflows==0.1.8
natsort==8.1.0
nbclient==0.6.2
nbconvert==6.5.0
nbformat==5.4.0
nd2==0.4.0
nest-asyncio==1.5.4
networkx==2.6.3
notebook==6.4.11
npe2==0.1.2
numba==0.55.1
numcodecs==0.9.1
numpy==1.21.6
numpydoc==1.2
ome-types==0.2.10
opencv-python-headless==4.5.5.64
packaging==21.3
pandas==1.4.1
pandocfilters==1.5.0
parso==0.8.3
partd==1.2.0
pathspec==0.9.0
pexpect==4.8.0
pickleshare==0.7.5
Pillow==8.4.0
Pint==0.18
platformdirs==2.5.1
pooch==1.6.0
prometheus-client==0.14.1
prompt-toolkit==3.0.28
psutil==5.9.0
psygnal==0.3.3
ptyprocess==0.7.0
pure-eval==0.2.2
pyclesperanto-prototype==0.17.1
pycodestyle==2.8.0
pycparser==2.21
pydantic==1.9.0
pyerfa==2.0.0.1
Pygments==2.11.2
pyopencl==2022.1.3
PyOpenGL==3.1.6
pyparsing==3.0.7
pyperclip==1.8.2
pypher==0.7.0
PyQt5==5.15.6
PyQt5-Qt5==5.15.2
PyQt5-sip==12.9.1
pyrsistent==0.18.1
python-dateutil==2.8.2
pytomlpp==1.0.10
pytools==2022.1.6
pytz==2021.3
PyWavelets==1.2.0
PyYAML==6.0
pyzmq==22.3.0
qtconsole==5.2.2
QtPy==2.0.1
readlif==0.6.5
requests==2.27.1
resource-backed-dask-array==0.1.0
scikit-image==0.19.2
scikit-learn==1.0.2
scipy==1.8.0
Send2Trash==1.8.0
setuptools-scm==6.4.2
six==1.16.0
snowballstemmer==2.2.0
sortedcontainers==2.4.0
soupsieve==2.3.2.post1
Sphinx==4.4.0
sphinxcontrib-applehelp==1.0.2
sphinxcontrib-devhelp==1.0.2
sphinxcontrib-htmlhelp==2.0.0
sphinxcontrib-jsmath==1.0.1
sphinxcontrib-qthelp==1.0.3
sphinxcontrib-serializinghtml==1.1.5
stack-data==0.2.0
stracking==0.1.9
superqt==0.3.0
termcolor==1.1.0
terminado==0.13.3
threadpoolctl==3.1.0
tifffile==2022.2.9
tinycss2==1.1.1
toml==0.10.2
tomli==2.0.1
toolz==0.11.2
torch==1.11.0
tornado==6.1
tqdm==4.62.3
traitlets==5.1.1
transforms3d==0.3.1
typer==0.4.0
typing_extensions==4.1.1
urllib3==1.26.8
vispy==0.9.6
wcwidth==0.2.5
webencodings==0.5.1
widgetsnbextension==3.6.0
wrapt==1.13.3
wurlitzer==3.0.2
xarray==2022.3.0
xmlschema==1.9.2
yarl==1.7.2
zarr==2.11.0
-e git+ssh://[email protected]/aaristov/zarr-tools.git@3ab710131768d8be8b6af88111916f646ce2153f#egg=zarr_tools
zipp==3.7.0

ND2 files saved with lossless compression do not open correctly.

  • nd2 version: 0.2.5
  • Python version: 3.9.9
  • Operating System: Windows 10 Pro

Description

Files acquired by NIS Elements JOBS can be saved as ND2 files with either no compression or lossless compression. When saved with no compression, the nd2 library opens these files correctly. When saved with lossless compression, nd2 returns incorrect data.

What I Did

I acquired two images of background (no sample) using JOBS, with and without compression. You can see the results below: the version with no compression returns values close to 100, as expected for a background image; the version with compression returns nonsense values.

dir = r'C:\Users\kthorn\ArrePath, Inc\ArrePath Science - Documents\Imaging experiments\062322 Format Tests/'
with nd2.ND2File(dir+'NoCompression.nd2') as nd2_file:
    nd2_image = nd2_file.asarray()
    imdims = nd2_image.shape

nd2_image[1,0,0:10,0]
array([124, 108, 131, 113, 148, 131, 120, 127, 121, 129], dtype=uint16)

with nd2.ND2File(dir+'LosslessCompression.nd2') as nd2_file:
    nd2_image = nd2_file.asarray()
    imdims = nd2_image.shape

nd2_image[1,0,0:10,0]
array([40056, 11979, 13782, 37643,  2349, 20231,  1135, 16315, 57624,
       54800], dtype=uint16)

I can provide the example files that reproduce this problem.

liblimfile.so: cannot open shared object file on CentOS

nd2.imread('test.nd2')
ImportError: liblimfile.so: cannot open shared object file: No such file or directory
% lsb_release -a
LSB Version:	:core-4.1-amd64:core-4.1-noarch:cxx-4.1-amd64:cxx-4.1-noarch:desktop-4.1-amd64:desktop-4.1-noarch:languages-4.1-amd64:languages-4.1-noarch:printing-4.1-amd64:printing-4.1-noarch
Distributor ID:	CentOS
Description:	CentOS Linux release 7.9.2009 (Core)
Release:	7.9.2009
Codename:	Core

Pip Installation fails - Double requirement given: numpy

  • nd2 version: 0.2.2
  • Python version: 3.8.10
  • Operating System: Ubuntu 20.x

Description

Cannot install the nd2 package using a simple pip install.

What I Did

pip install nd2[legacy]

$pip install nd2[legacy]
Collecting nd2[legacy]
  Using cached nd2-0.2.2.tar.gz (5.4 MB)
  Installing build dependencies ... error
  ERROR: Command errored out with exit status 1:
   command: /usr/bin/python3 /usr/lib/python3/dist-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-mbhnace4/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- setuptools wheel cython numpy 'numpy==1.14.5; python_version=='"'"'3.7'"'"'' 'numpy==1.17.3; python_version=='"'"'3.8'"'"'' 'numpy==1.19.3; python_version=='"'"'3.9'"'"'' 'numpy==1.21.3; python_version=='"'"'3.10'"'"''
       cwd: None
  Complete output (2 lines):
  Ignoring numpy: markers 'python_version == "3.7"' don't match your environment
  ERROR: Double requirement given: numpy==1.17.3 (already in numpy, name='numpy')
  ----------------------------------------
ERROR: Command errored out with exit status 1: /usr/bin/python3 /usr/lib/python3/dist-packages/pip install --ignore-installed --no-user --prefix /tmp/pip-build-env-mbhnace4/overlay --no-warn-script-location --no-binary :none: --only-binary :none: -i https://pypi.org/simple -- setuptools wheel cython numpy 'numpy==1.14.5; python_version=='"'"'3.7'"'"'' 'numpy==1.17.3; python_version=='"'"'3.8'"'"'' 'numpy==1.19.3; python_version=='"'"'3.9'"'"'' 'numpy==1.21.3; python_version=='"'"'3.10'"'"'' Check the logs for full command output.


$ pip list | grep numpy
numpy                       1.22.3              

Linux library paths for libnd2readsdk-shared.so not set correctly on install

  • nd2 version: latest main
  • Python version: 3.9
  • Operating System: Ubuntu 20.04

Description

When pulling the latest main and doing a local install with pip install . or pip install -e . from the nd2 folder, the shared library doesn't get installed correctly. When executing code from the directory where I installed the repo, everything works fine; when executing the code from a different working directory, I get import errors.

What I Did

Running my tests from the nd2 directory works without an ImportError (note that the Python kernel dies when reading the nd2; I am investigating and will open a separate issue).

(napari_latest) hilsenst@itservices-XPS-15-9500:~/Github/nd2$ pytest ~/GitlabEMBL/spacem-ht/src/spacem-mosaic/tests/test_io.py 
=========================================================================================== test session starts ===========================================================================================
platform linux -- Python 3.9.5, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
PyQt5 5.15.4 -- Qt runtime 5.15.2 -- Qt compiled 5.15.2
rootdir: /home/hilsenst/GitlabEMBL/spacem-ht/src/spacem-mosaic
plugins: order-1.0.0, napari-0.4.11, timeout-1.4.2, anyio-3.3.0, napari-plugin-engine-0.1.9, qt-4.0.2, hypothesis-6.14.4
collected 4 items                                                                                                                                                                                         

../../GitlabEMBL/spacem-ht/src/spacem-mosaic/tests/test_io.py ...Killed

Running my tests from some other directory fails with an ImportError:

(napari_latest) hilsenst@itservices-XPS-15-9500:~/Github$ pytest ~/GitlabEMBL/spacem-ht/src/spacem-mosaic/tests/test_io.py 
=========================================================================================== test session starts ===========================================================================================
platform linux -- Python 3.9.5, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
PyQt5 5.15.4 -- Qt runtime 5.15.2 -- Qt compiled 5.15.2
rootdir: /home/hilsenst/GitlabEMBL/spacem-ht/src/spacem-mosaic
plugins: order-1.0.0, napari-0.4.11, timeout-1.4.2, anyio-3.3.0, napari-plugin-engine-0.1.9, qt-4.0.2, hypothesis-6.14.4
collected 4 items                                                                                                                                                                                         

../GitlabEMBL/spacem-ht/src/spacem-mosaic/tests/test_io.py ...F                                                                                                                                     [100%]

================================================================================================ FAILURES =================================================================================================
___________________________________________________________________________________________ test_load_tiles_nd2 ___________________________________________________________________________________________

    def test_load_tiles_nd2():
>       tile_arr, channel_dict = load_tiles(Path(dataset_nd2))

../GitlabEMBL/spacem-ht/src/spacem-mosaic/tests/test_io.py:30: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../GitlabEMBL/spacem-ht/src/spacem-mosaic/spacem_mosaic/io.py:110: in load_tiles
    return load_tiles_nd2(file_path)
../GitlabEMBL/spacem-ht/src/spacem-mosaic/spacem_mosaic/io.py:54: in load_tiles_nd2
    f = ND2File(file_path)
../miniconda3/envs/napari_latest/lib/python3.9/site-packages/nd2/nd2file.py:42: in __init__
    self._rdr = get_reader(self._path)
../miniconda3/envs/napari_latest/lib/python3.9/site-packages/nd2/_util.py:24: in get_reader
    from ._sdk.latest import ND2Reader
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

>   from . import latest
E   ImportError: libnd2readsdk-shared.so: cannot open shared object file: No such file or directory

../miniconda3/envs/napari_latest/lib/python3.9/site-packages/nd2/_sdk/__init__.py:1: ImportError
============================================================================================ warnings summary =============================================================================================
../miniconda3/envs/napari_latest/lib/python3.9/site-packages/pims/cine.py:29
  /home/hilsenst/miniconda3/envs/napari_latest/lib/python3.9/site-packages/pims/cine.py:29: DeprecationWarning: Using or importing the ABCs from 'collections' instead of from 'collections.abc' is deprecated since Python 3.3, and in 3.10 it will stop working
    from collections import Iterable

-- Docs: https://docs.pytest.org/en/stable/warnings.html
========================================================================================= short test summary info =========================================================================================
FAILED ../GitlabEMBL/spacem-ht/src/spacem-mosaic/tests/test_io.py::test_load_tiles_nd2 - ImportError: libnd2readsdk-shared.so: cannot open shared object file: No such file or directory
================================================================================= 1 failed, 3 passed, 1 warning in 8.65s ==================================================================================
(napari_latest) hilsenst@itservices-XPS-15-9500:~/Github$ cd nd2
(napari_latest) hilsenst@itservices-XPS-15-9500:~/Github/nd2$ pytest ~/GitlabEMBL/spacem-ht/src/spacem-mosaic/tests/test_io.py 
=========================================================================================== test session starts ===========================================================================================
platform linux -- Python 3.9.5, pytest-6.2.4, py-1.10.0, pluggy-0.13.1

I suspect that the issue is somewhere here:

https://github.com/tlambert03/nd2/blob/main/setup.py#L10-L13

But I haven't been able to figure out exactly what goes wrong.
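
One quick sanity check (a debugging suggestion, not part of the original report) is to confirm which copy of nd2 each working directory actually imports, since running from the repo root will pick up the source tree rather than the installed package:

python -c "import nd2; print(nd2.__file__)"

If this prints a path inside the repo rather than inside site-packages, the compiled extension and bundled shared libraries from the installed package are not the ones being loaded.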

nd2[legacy] dependency

Not sure if anyone else has encountered this?

  • nd2 version: latest
  • Python version: 3.8.10
  • Operating System: Ubuntu

Description

When installing nd2[legacy] through pip outside of a conda environment, the wurlizter requirement (presumably a misspelling of wurlitzer) fails to install for the legacy extra.

What I Did

pip install nd2[legacy]

ERROR: Could not find a version that satisfies the requirement wurlizter; extra == "legacy" (from nd2[legacy]) (from versions: none)
ERROR: No matching distribution found for wurlizter; extra == "legacy" (from nd2[legacy])
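
Until the spelling is fixed in the package metadata, a possible workaround (assuming the legacy extra is meant to pull in imagecodecs and wurlitzer) is to install those dependencies directly:

pip install nd2 imagecodecs wurlitzer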

Getting ND2 series images in the order they were captured

  • nd2 version: 0.5.3
  • Python version: 3.11.4
  • Operating System: Windows 10

Description

Using a multipoint image file and the metadata on the stage position for each field of view in the series, I am trying to stitch the fields of view together into a single image. The problem I have encountered is that the series is loaded into the third dimension of an array in what seems to be an incorrect order. Is there something I am missing, or a workaround to this problem?

Thank you!

What I Did

import re

import nd2
import numpy as np
from skimage.util import img_as_float

# (image_path, fov_count, xy_positions_str, vx_per_um, fov_x_length and
# fov_y_length are defined earlier in my script)

# Get all image data
f = nd2.ND2File(image_path)
raw_image_array = img_as_float(f.asarray())

# Get metadata on the x, y, and z positions

# Create an empty dictionary
xy_positions_dict = {}

# Populate the dictionary with integers as keys from 0 to fov_count - 1,
# with the image array and positional data as values
for i in range(fov_count):

    # Correct for abnormal layering of FOV series
    k = i % 6 + int(i / 6)

    # Get tile array
    tile_array = raw_image_array[k,:,:]

    # Get full FOV string
    main_pattern = r'Position\([^)]+\)'
    full_string = re.findall(main_pattern, xy_positions_str)[i]

    # Get stage position
    stage_position = full_string[full_string.find("[")+len("["):full_string.find("]")]
    stage_position = [float(x) for x in stage_position.split(",")]

    # Get PFS offset
    pfs_off = full_string[full_string.find("pfsOffset=")+len("pfsOffset="):full_string.find(", name")]

    xy_positions_dict[i] = {'Image array': tile_array, 'x': stage_position[0], 'y': stage_position[1], 'z': stage_position[2], 'PFS offset': pfs_off}

# Define stitching function
def stitch_image(positions_dict):

    # Find the bounding box of all stage positions
    x_values_min = min(positions_dict[i]['x'] for i in range(fov_count))
    x_values_max = max(positions_dict[i]['x'] for i in range(fov_count))
    y_values_min = min(positions_dict[i]['y'] for i in range(fov_count))
    y_values_max = max(positions_dict[i]['y'] for i in range(fov_count))

    # Create a new array large enough to hold all tiles
    array_x_size = int((x_values_max - x_values_min) * vx_per_um[0]) + fov_x_length
    array_y_size = int((y_values_max - y_values_min) * vx_per_um[1]) + fov_y_length
    output_array = np.zeros((array_x_size, array_y_size), dtype=np.float64)

    for key in positions_dict:

        # Get the image x and y values
        x = positions_dict[key]['x']
        y = positions_dict[key]['y']

        # Calculate the boundaries of the tile within the output array
        x_l = int(vx_per_um[0] * (x - x_values_min))
        x_r = x_l + fov_x_length
        y_t = int(vx_per_um[1] * (y - y_values_min))
        y_b = y_t + fov_y_length
        output_array[x_l:x_r, y_t:y_b] = positions_dict[key]['Image array']

    return output_array

# Execute image stitching function
unprocessed_output_array = stitch_image(xy_positions_dict)
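
For what it's worth, the per-point stage positions can also be read from the parsed experiment structure instead of regex-matching a metadata string. A minimal sketch, assuming the file contains an XYPosLoop and using the attribute names from nd2.structures:

import nd2

with nd2.ND2File(image_path) as f:
    for loop in f.experiment:
        if loop.type == "XYPosLoop":
            for i, point in enumerate(loop.parameters.points):
                # stagePositionUm holds the (x, y, z) stage position in microns
                print(i, point.stagePositionUm, point.pfsOffset)

The order of points in the loop should match the P axis of the array, which also helps diagnose whether the "abnormal layering" above is a real reordering or an indexing bug.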

only metadata

  • nd2 version: 0.2.2
  • Python version: 3.7.13
  • Operating System: Windows Server 2012

Description

Is there a way to open a file only for reading metadata, without the overhead generated by preparing to read image data?
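
For context, constructing an ND2File should only parse headers and metadata; pixel data is not read until asarray(), to_dask(), or similar is called. A minimal sketch of a metadata-only open:

import nd2

with nd2.ND2File('some_file.nd2') as f:
    # no image data is read here, only metadata
    attrs = f.attributes
    meta = f.metadata
    sizes = f.sizes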

Package not found on conda-forge channels

  • nd2 version:
  • Python version: 3.11.0
  • Operating System: macOS

Description

I'd like to install nd2 via conda-forge, but I'm getting a PackagesNotFoundError.

What I Did

Attempted to install nd2 into environment directly during environment creation and after creation using install command.

conda create -n py-nd2 -c conda-forge python=3.11 nd2
Collecting package metadata (current_repodata.json): done
Solving environment: failed with repodata from current_repodata.json, will retry with next repodata source.
Collecting package metadata (repodata.json): done
Solving environment: failed

PackagesNotFoundError: The following packages are not available from current channels:

  - nd2

Current channels:

  - https://conda.anaconda.org/conda-forge/osx-arm64
  - https://conda.anaconda.org/conda-forge/noarch
  - https://repo.anaconda.com/pkgs/main/osx-arm64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/osx-arm64
  - https://repo.anaconda.com/pkgs/r/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.


conda install -n py-nd2 -c conda-forge nd2
Collecting package metadata (current_repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.
Collecting package metadata (repodata.json): done
Solving environment: failed with initial frozen solve. Retrying with flexible solve.

PackagesNotFoundError: The following packages are not available from current channels:

  - nd2

Current channels:

  - https://conda.anaconda.org/conda-forge/osx-arm64
  - https://conda.anaconda.org/conda-forge/noarch
  - https://repo.anaconda.com/pkgs/main/osx-arm64
  - https://repo.anaconda.com/pkgs/main/noarch
  - https://repo.anaconda.com/pkgs/r/osx-arm64
  - https://repo.anaconda.com/pkgs/r/noarch

To search for alternate channels that may provide the conda package you're
looking for, navigate to

    https://anaconda.org

and use the search bar at the top of the page.

Shared library problems

  • nd2 version: master (2f1f137)
  • Python version: 3.10.4
  • Operating System: Linux 3.10.0-1160.62.1.el7.x86_64 (this is on HMS O2 with conda-forge python installed via mamba)

Description

I think I am running into two separate issues:

  1. When building locally, either with pip install or pip install -e, the shared library is not installed correctly (related to #24).
  2. Even when I run pytest from the root of the repo, to work around the previous problem, liblimfile.so complains about undefined symbols.

I spent a couple hours playing around and am rapidly running against the limits of my understanding of modern Python packaging, so any assistance or hints you could provide @tlambert03 would be much appreciated!

What I Did

git clone git@github.com:tlambert03/nd2.git
cd nd2
mamba create -n nd2test
mamba activate nd2test
mamba install python=3.10.4 pip gcc
pip install -e .[dev]

(installing gcc in the virtualenv is necessary because the default gcc on HMS O2 is ancient)

If I then run pytest from the root of the repo, I get the error

============================= test session starts ==============================
platform linux -- Python 3.10.4, pytest-7.1.2, pluggy-1.0.0
rootdir: /home/jqs1/_temp/nd2, configfile: setup.cfg
plugins: cov-3.0.0
collected 30 items / 2 errors

==================================== ERRORS ====================================
___________________ ERROR collecting tests/test_aicsimage.py ___________________
ImportError while importing test module '/home/jqs1/_temp/nd2/tests/test_aicsimage.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../mambaforge/envs/nd2test/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_aicsimage.py:11: in <module>
    from aicsimageio.readers.nd2_reader import ND2Reader
E   ModuleNotFoundError: No module named 'aicsimageio'
______________________ ERROR collecting tests/test_sdk.py ______________________
ImportError while importing test module '/home/jqs1/_temp/nd2/tests/test_sdk.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../mambaforge/envs/nd2test/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_sdk.py:5: in <module>
    from nd2._sdk import latest
src/nd2/_sdk/__init__.py:1: in <module>
    from . import latest
E   ImportError: src/sdk/Linux/x86_64/lib/liblimfile.so: symbol TIFFReadRGBATileExt, version LIBTIFF_4.0 not defined in file libtiff.so.5 with link time reference
=========================== short test summary info ============================
ERROR tests/test_aicsimage.py
ERROR tests/test_sdk.py
!!!!!!!!!!!!!!!!!!! Interrupted: 2 errors during collection !!!!!!!!!!!!!!!!!!!!
============================== 2 errors in 1.14s ===============================

If I then pip install aicsimageio and re-run pytest:

============================= test session starts ==============================
platform linux -- Python 3.10.4, pytest-7.1.2, pluggy-1.0.0
rootdir: /home/jqs1/_temp/nd2, configfile: setup.cfg
plugins: cov-3.0.0
collected 40 items / 1 error

==================================== ERRORS ====================================
______________________ ERROR collecting tests/test_sdk.py ______________________
ImportError while importing test module '/home/jqs1/_temp/nd2/tests/test_sdk.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
../../mambaforge/envs/nd2test/lib/python3.10/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
tests/test_sdk.py:5: in <module>
    from nd2._sdk import latest
src/nd2/_sdk/__init__.py:1: in <module>
    from . import latest
E   ImportError: src/sdk/Linux/x86_64/lib/liblimfile.so: symbol TIFFReadRGBATileExt, version LIBTIFF_4.0 not defined in file libtiff.so.5 with link time reference
=========================== short test summary info ============================
ERROR tests/test_sdk.py
!!!!!!!!!!!!!!!!!!!! Interrupted: 1 error during collection !!!!!!!!!!!!!!!!!!!!
=============================== 1 error in 2.08s ===============================

From the root of the repo, if I run python -c "from nd2._sdk import latest", I get:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/jqs1/_temp/nd2/src/nd2/_sdk/__init__.py", line 1, in <module>
    from . import latest
ImportError: /lib64/libstdc++.so.6: version `GLIBCXX_3.4.20' not found (required by src/sdk/Linux/x86_64/lib/liblimfile.so)

If I cd outside the repo, the same command gives:

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/home/jqs1/_temp/nd2/src/nd2/_sdk/__init__.py", line 1, in <module>
    from . import latest
ImportError: liblimfile.so: cannot open shared object file: No such file or directory

I was thinking this had something to do with the editable pip install, so I tried installing via pip install .[dev] (no -e). This produces the same results as above. Here's a clue: find /home/jqs1/mambaforge/envs/nd2test/lib/python3.10/site-packages/nd2/ -name "*.so" gives

/home/jqs1/mambaforge/envs/nd2test/lib/python3.10/site-packages/nd2/_sdk/latest.cpython-310-x86_64-linux-gnu.so

Namely, the Nikon SDK shared libraries (liblimfile.so and libnd2readsdk-shared.so) aren't even being installed. I noticed that the paths in MANIFEST.in are wrong. I believe

graft src/sdk/*/lib
graft src/sdk/*/include

should be

graft src/sdk/*/*/lib
graft src/sdk/*/*/include

Unfortunately, this change does not appear to fix the problem; the Nikon shared libraries are still not installed.
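
A quick way to check what actually landed in an installed copy (a sketch; run it from outside the repo so that the installed package, not the source tree, is imported):

import pathlib

import nd2

pkg_dir = pathlib.Path(nd2.__file__).parent
# list every shared library shipped inside the installed nd2 package
print(sorted(str(p.relative_to(pkg_dir)) for p in pkg_dir.rglob("*.so*")))

If only the latest.cpython-*.so extension shows up and the SDK libraries are missing, that confirms the MANIFEST.in/package-data problem described above.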

Potential security vulnerabilities in the shared libraries which nd2 depends on. Can you help upgrade to patch versions?

Hi, @tlambert03 , @VolkerH , I'd like to report a vulnerability issue in nd2_0.2.2.

Dependency Graph between Python and Shared Libraries

[Image: dependency graph between nd2 and its bundled shared libraries]

Issue Description

As shown in the dependency graph above, nd2 0.2.2 directly or transitively depends on 5 C libraries (.so). However, I noticed that one of these C libraries is vulnerable, containing the following CVEs:

libjpeg-0784ef09.so.62.2.0, from the C project libjpeg-turbo (version 1.5.2), exposes 2 vulnerabilities: CVE-2018-14498 and CVE-2017-15232.

Suggested Vulnerability Patch Versions

libjpeg-turbo has fixed the vulnerabilities in versions >=2.0.0

Python build tools cannot report vulnerable C libraries, which may introduce potential security issues into many downstream Python projects.
As a popular Python package (nd2 has 15,416 downloads per month), could you please upgrade the above shared libraries to their patch versions?

Thanks for your help~
Best regards,
MikeWazowski

order of axes

Apologies, this is not really an issue, but is there a way to set the order of axes on the object that is returned from nd2.ND2File? For example, when you read an nd2 file, the object has dimensions in the format TZCYX. Can I set this order to TCZYX, please?
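
One way to do this (a sketch that relies on xarray's labeled axes rather than an nd2-specific option) is to read the file as an xarray.DataArray and transpose by dimension name:

import nd2

with nd2.ND2File('some_file.nd2') as f:
    xarr = f.to_xarray()  # dims labeled e.g. ('T', 'Z', 'C', 'Y', 'X')
    tczyx = xarr.transpose('T', 'C', 'Z', 'Y', 'X')

The same reordering can be done on a plain numpy or dask array with transpose, using f.sizes to work out which axis is which.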

nd2 not working on M1 architecture

  • Python version: 3.10.2
  • Operating System: macOS Big Sur 11.6

Description

nd2 not working on M1 architecture.

What I Did

I tried running the package on an M1 iMac and ran into some issues opening files. I downloaded the test data and tried to open a file as follows.

import nd2

f = nd2.imread('nd2_test_data/aryeh_MeOh_high_fluo_007.nd2')

This generated the following error message.

Traceback (most recent call last):
  File "/Users/wniu/Documents/Code/test.py", line 3, in <module>
    f = nd2.imread('/Users/wniu/Downloads/nd2_test_data/aryeh_MeOh_high_fluo_007.nd2')
  File "/Users/wniu/miniforge3/envs/test/lib/python3.10/site-packages/nd2/nd2file.py", line 506, in imread
    with ND2File(file) as nd2:
  File "/Users/wniu/miniforge3/envs/test/lib/python3.10/site-packages/nd2/nd2file.py", line 54, in __init__
    self._rdr = get_reader(self._path)
  File "/Users/wniu/miniforge3/envs/test/lib/python3.10/site-packages/nd2/_util.py", line 24, in get_reader
    from ._sdk.latest import ND2Reader
  File "/Users/wniu/miniforge3/envs/test/lib/python3.10/site-packages/nd2/_sdk/__init__.py", line 1, in <module>
    from . import latest
ImportError: dlopen(/Users/wniu/miniforge3/envs/test/lib/python3.10/site-packages/nd2/_sdk/latest.cpython-310-darwin.so, 2): Symbol not found: _Lim_DestroyPicture
  Referenced from: /Users/wniu/miniforge3/envs/test/lib/python3.10/site-packages/nd2/_sdk/latest.cpython-310-darwin.so
  Expected in: flat namespace
 in /Users/wniu/miniforge3/envs/test/lib/python3.10/site-packages/nd2/_sdk/latest.cpython-310-darwin.so

read speed

  • nd2 version: 0.2.2
  • Python version: 3.7.13
  • Operating System: Windows Server 2012

Description

I have long microscopy movies, e.g. 30000 frames with otherwise only dimensions X and Y. All frames are analysed independently.
For reading .nd2 files, I've tested nd2reader and nd2 (which I'd prefer because of its better access to the metadata):
using nd2reader, I can read at a speed of about 800 frames/second (single-process);
using nd2, I get 20 frames/second (when converting frames from dask to numpy using .compute()). Only loading the frames as dask arrays, I get 2000 frames/second, but of course I need to do the computations in numpy.

Is there a way to load planes faster? I don't have enough memory to directly load the complete dataset.

What I Did

import time

tic = time.time()  # time the imports themselves
import numpy as np
import nd2
from nd2reader import ND2Reader
from tqdm import tqdm

filename_nd2 = "some_file.nd2"  # placeholder; set to the movie's path


def test_readspeed_nd2direct():
    print('testing read speed of nd2 file via nd2 package directly.')
    t00 = time.time()

    movie = nd2.ND2File(filename_nd2).to_dask()

    dt = np.round(time.time() - t00, 2)
    print(f"File loaded in {dt} seconds.")

    tic = time.time()
    n_frames = len(movie)
    with tqdm(total=n_frames, unit="frame") as progress_bar:
        for i in range(n_frames):
            frame = movie[i].compute()
            progress_bar.update()
    print('Loaded all frames at {:.2f} frames per second.'.format(
        (n_frames/(time.time()-tic))))


def test_readspeed_nd2reader():
    print('testing read speed of nd2 file using nd2reader')
    t00 = time.time()

    # with ND2Reader(filename_nd2) as movie:
    movie = ND2Reader(filename_nd2)
    dt = np.round(time.time() - t00, 2)
    print(f"File loaded in {dt} seconds.")

    tic = time.time()
    n_frames = len(movie)
    with tqdm(total=n_frames, unit="frame") as progress_bar:
        for i in range(n_frames):
            frame = movie[i]
            progress_bar.update()
    print('Loaded all frames at {:.2f} frames per second.'.format(
        (n_frames/(time.time()-tic))))
    movie.close()


if __name__ == '__main__':
    print('imported packages in {:.2f} seconds.'.format(
        time.time()-tic))

    test_readspeed_nd2reader()
    test_readspeed_nd2direct()

output

imported packages in 15.89 seconds.
testing read speed of nd2 file using nd2reader
File loaded in 0.06 seconds.
100%|████████████████████████████████████████████████████████████████████████████████████████████████████████| 30000/30000 [00:39<00:00, 753.40frame/s]
Loaded all frames in 753.22 frames per second.
testing read speed of nd2 file via nd2 package directly.
File loaded in 3.09 seconds.
  7%|███████▏                                                                                                | 2109/30000 [02:05<24:28, 19.00frame/s]
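
A per-plane read that skips the dask graph entirely may get closer to the SDK's raw speed. A minimal sketch, assuming the installed nd2 version exposes ND2File.read_frame (check the API docs for your version):

import nd2

with nd2.ND2File(filename_nd2) as f:
    for i in range(f.attributes.sequenceCount):
        frame = f.read_frame(i)  # one plane as a numpy array
        ...  # analyse the frame here

This keeps memory usage at a single frame while avoiding the per-chunk overhead of calling compute() on a delayed array.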

cannot find metadata

  • nd2 version: 0.4.6
  • Python version: 3.10.6
  • Operating System: Ubuntu 22.04

Description

I am searching for metadata but cannot find it.
I know the metadata exists because when using NIS elements software I can export as an excel file:
https://hhmionline-my.sharepoint.com/:x:/g/personal/busheyd_hhmi_org/EdGxdrwO_o9Ml1m3M6NS7WsBxXaAv2Ej9y7Ra-dwyFPR-g?e=aycWd2
I am specifically looking for data in sheet "Recorded Data" column 640 nm[%].

The nd2 file can be found here (large file 18.6 GB):
https://hhmionline-my.sharepoint.com/:u:/g/personal/busheyd_hhmi_org/ESt_jvLHeMVIhiXF3VAk4aYB0NZYxLe3HT4OVH6T8s_iZA?e=zwHEbt

What I Did

I've looked in (with f = nd2.ND2File(target_file)):
f.frame_metadata(0)  # looking at the first frame, assuming the 640 nm value would be here per frame in a time series image
Also, I looked in other places, but they do not indicate per-frame state:
f.custom_data - can see 640 nm as tag 6
f.attributes
f.experiment
f.unstructured_metadata() #when running this it hangs
f.metadata

Assistance is greatly appreciated. Although I can export the data via Elements, it takes a long time.
Thanks for nd2
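
For reference, recent nd2 versions expose the same tabular "Recorded Data" view programmatically via ND2File.events(); whether a '640 nm[%]'-like column is present depends on the file, so treat this as a sketch:

import nd2

with nd2.ND2File(target_file) as f:
    events = f.events()      # one record per frame
    print(events[0].keys())  # look for the '640 nm[%]' column here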

ValueError for the big 2d image in the get_frame method

  • nd2 version: 0.3.0
  • Python version: 3.10
  • Operating System: Windows 10 (RAM size is 15 GB)

Description

I would like to slice the big image data.

Code

dask_array = nd2.imread(file_path, dask=True)
dask_array = dask_array[0, 0:100, 0:100, :]
result_ndarray = dask_array.compute()

Error

  File "C:\{my environment}\.venv\lib\site-packages\nd2\nd2file.py", line 510, in _get_frame
    frame.shape = self._raw_frame_shape
ValueError: cannot reshape array of size 50059620352 into shape (26420,37152,1,3)

What I Did

Tried to compute the following nd2 file.

The size information

Attributes(bitsPerComponentInMemory=8, bitsPerComponentSignificant=8, componentCount=3, heightPx=26420, pixelDataType='unsigned', sequenceCount=17, widthBytes=111456, widthPx=37152, compressionLevel=None, compressionType=None, tileHeightPx=None, tileWidthPx=None, channelCount=1)

Note

Another trial

I tried another file and it was working well.

Attributes(bitsPerComponentInMemory=8, bitsPerComponentSignificant=8, componentCount=3, heightPx=5530, pixelDataType='unsigned', 
sequenceCount=16, widthBytes=15984, widthPx=5328, compressionLevel=None, compressionType=None, tileHeightPx=None, tileWidthPx=None, channelCount=1)
5530, 5328, 1, 3

Question

_get_frame in nd2file.py seems to require a lot of memory when the data has a huge width and height, because it converts the frame to an ndarray?

My status

Sorry to say, I'm a beginner at Python. Using the debugger and running a straightforward script is about the most I can do.
Please let me know if there is any further investigation you would like me to do.

Memory usage

  • nd2 version: 0.1.4
  • Python version: 3.7.10
  • Operating System: Windows10

Description

I try to load selected parts of nd2 files but too much memory is allocated for the objects that need to be computed. As a consequence, it fails to load objects that are bigger than ~4 times available memory.

What I Did

Test on a time lapse experiment:
[Image: nd2_ram]

Test on a single time point big image:

[Image: nd2_single_frame]

In the second example the memory allocation is correct when it has to compute the whole file.

It may be related to the problem of calculating object size incorrectly as shown here:

[Image: nd2_measure_size]

Incompatibility with numpy<=1.21

  • nd2 version: 0.2.1
  • Python version: 3.10.2
  • Operating System: macOS 11.4

Description

The package seems to be currently incompatible with numpy<=1.21. On numpy==1.22.3, it works just fine, but downgrading to 1.21.5 for example gives the following error.

ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject

This is problematic because numba currently only supports numpy<=1.21 and it is very commonly used in a lot of packages.

What I Did

import nd2
nd2.imread('some.nd2')

File handle left open when an nd2 file fails to read

  • nd2 version: 0.5.1
  • Python version: 3.9
  • Operating System: Ubuntu 20.04

Description

I try to open the nd2 file downloaded from here: https://downloads.openmicroscopy.org/images/ND2/aryeh/b16_pdtB+y50__crop.nd2
It fails with TypeError: <lambda>() missing 6 required positional arguments: 'bitsPerComponentInMemory', 'bitsPerComponentSignificant', 'componentCount', 'heightPx', 'pixelDataType', and 'sequenceCount'.

If I catch that exception and do something else, I notice that there is an open file handle to the file I tried to load that never closes, even when all my local variables are deleted and gc.collect() is run. If I try to open a number of bad files, eventually a lot of file handles are permanently open (this can be checked via /proc/<pid>/fd, for instance).

What I Did

python test.py

# where test.py is
---
import gc
import time
import nd2

try:
    f = nd2.ND2File("b16_pdtB+y50__crop.nd2")
    print(f.sizes)
except Exception as exc:
    print(exc)
f = None
gc.collect()
time.sleep(1000)
---

ls /proc/<pid of python process>/fd

Naturally, it would be great to read all files, but on failure it would be nice to release the file handle. I tried explicitly calling f.close() and f._rdr.close() in the exception handler, but that didn't release it either.
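
To make the leak easy to demonstrate, the open-handle count can be checked portably with psutil (a sketch; psutil is a third-party package, not an nd2 dependency):

import psutil

import nd2

proc = psutil.Process()
before = len(proc.open_files())
try:
    f = nd2.ND2File("b16_pdtB+y50__crop.nd2")
except Exception:
    pass
# a nonzero difference means the failed open leaked a handle
print("leaked handles:", len(proc.open_files()) - before)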

Position names are 'None'

  • nd2 version: 4.50.00
  • Python version: 3.9.5
  • Operating System: Windows 11 21H2 (Build 22000.739)

Description

I would like to find out the names of the different positions I gave during the experiment. This should be the M-axis in NIS Elements.

What I Did

I guess they should be in

nd2_file.experiment[1].parameters.points[i].name

Where nd2_file.experiment[1] is an XYPosLoop.

However, these values are of type None. In NIS Elements the different positions have names.
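
For anyone debugging the same thing, here is a minimal sketch for dumping whatever point names the file does contain, falling back to the index when name is None:

import nd2

with nd2.ND2File('some_file.nd2') as f:
    for loop in f.experiment:
        if loop.type == "XYPosLoop":
            for i, point in enumerate(loop.parameters.points):
                print(point.name or f"Point {i}")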

ImportError: libnd2readsdk-shared.so: cannot open shared object file (CentOS)

  • nd2 version: 0.1.4
  • Python version: Python 3.8.1
  • Operating System: CentOS-7

Description

I was trying to use the library for the first time to open a .nd2 file.
I got an error that seems to show that the ND2SDK is somehow not correctly installed. I downloaded the SDK and installed it with rpm -i nd2readsdk-static-1.7.2.0-Linux.rpm, but the problem persisted.

Any insight on how this could be solved?

What I Did

import nd2
import numpy as np

my_array = nd2.imread('myfile.nd2') 

and got

---------------------------------------------------------------------------
ImportError                               Traceback (most recent call last)
<ipython-input-5-4b821c59f310> in <module>
      2 import numpy as np
      3 
----> 4 my_array = nd2.imread('myfile.nd2')

~/anaconda3/envs/newly/lib/python3.8/site-packages/nd2/nd2file.py in imread(file, dask, xarray)
    440 
    441 def imread(file: str, dask: bool = False, xarray: bool = False):
--> 442     with ND2File(file) as nd2:
    443         if xarray:
    444             return nd2.to_xarray(delayed=dask)

~/anaconda3/envs/newly/lib/python3.8/site-packages/nd2/nd2file.py in __init__(self, path)
     40     def __init__(self, path: Union[Path, str]) -> None:
     41         self._path = str(path)
---> 42         self._rdr = get_reader(self._path)
     43         self._closed = False
     44         self._is_legacy = "Legacy" in type(self._rdr).__name__

~/anaconda3/envs/newly/lib/python3.8/site-packages/nd2/_util.py in get_reader(path)
     22         magic_num = fh.read(4)
     23         if magic_num == NEW_HEADER_MAGIC:
---> 24             from ._sdk.latest import ND2Reader
     25 
     26             return ND2Reader(path)

~/anaconda3/envs/newly/lib/python3.8/site-packages/nd2/_sdk/__init__.py in <module>
----> 1 from . import latest
      2 
      3 __all__ = ["latest"]

ImportError: libnd2readsdk-shared.so: cannot open shared object file: No such file or directory

Error loading nd2 from N-SIM

  • nd2 version: 0.1.6
  • Python version: 3.7.12
  • Operating System: Ubuntu 18.04

Description

Load N-SIM data stored in an nd2 file. The data is interesting: all of the angles/phases for a channel are in one image when I load it via Bio-Formats. I'm not sure if the nd2 SDK is trying to crop these into individual images?

What I Did

Try to load data using

import nd2
import numpy as np

my_array = nd2.imread('test_sim.nd2')

---------------------------------------------------------------------------
ValueError                                Traceback (most recent call last)
[<ipython-input-14-d50f1da17a57>](https://localhost:8080/#) in <module>()
----> 1 my_array = nd2.imread('test_sim.nd2')

3 frames
[/usr/local/lib/python3.7/dist-packages/nd2/_sdk/__init__.py](https://localhost:8080/#) in <module>()
----> 1 from . import latest
      2 
      3 __all__ = ["latest"]

src/nd2/_sdk/latest.pyx in init nd2._sdk.latest()

ValueError: numpy.ndarray size changed, may indicate binary incompatibility. Expected 88 from C header, got 80 from PyObject

bug: Lim_FileGetAttributes failing in certain linux environments

Originally posted by @manthey in #114 (comment)

Description

I try to open the nd2 file downloaded from here: https://downloads.openmicroscopy.org/images/ND2/aryeh/b16_pdtB+y50__crop.nd2
It fails with TypeError: <lambda>() missing 6 required positional arguments: 'bitsPerComponentInMemory', 'bitsPerComponentSignificant', 'componentCount', 'heightPx', 'pixelDataType', and 'sequenceCount'.

As one more interesting clue to failing to read some nd2 files in some environments and not others: When I use stock Ubuntu python 3.8 or python 3.9 installed from the deadsnakes ppa, I get the failure to read attributes. If I use pyenv to compile python 3.9 locally, that reads the nd2 file successfully. The only difference I have noticed is that using LD_TRACE_LOADED_OBJECTS=1 shows that the system/deadsnakes pythons both pull in libexpat, whereas the pyenv python does not.

Slow creation of ND2File object from large ND2 files

  • nd2 version: 0.2.2
  • Python version: 3.8.13
  • Operating System: Windows 10

Description

Thank you for creating this useful package. However, in my use case, creating ND2File objects from a ~30 GB ND2 file seems quite slow, taking about 30 s, while in comparison creating an ND2Reader object using nd2reader takes ~1 s. Reading subsets of the data once the object is created seems similar in both. I would much prefer the ND2File version for the slicing interface and dask compatibility, but the initial overhead for creating a series of ND2File/dask objects is (surprisingly?) large. Using nd2.imread(filepath, dask=True) gives similar overhead, while running to_dask() on the ND2File once created takes less than a second. File details are below. Both CPU and drive usage are very moderate as the object is created, so there do not seem to be hardware bottlenecks.

What I Did

import nd2
arr = nd2.ND2File(filepath)
arr
--> <ND2File at ...: 'Time00010_Channel555 nm,635 nm_Seq0010.nd2' uint16: {'P': 30, 'Z': 61, 'C': 2, 'Y': 2304, 'X': 2304}>

Difference between NETimeLoop and TimeLoop?

Hi, this is not a bug report, as I think it is actually related to the information that NIS Elements itself saves to the nd2 files. However, I have pretty much the same experimental files but from two different microscopes running with the same settings (I haven't checked the concrete NIS Elements versions yet, though). For all experiments I can extract the timestamps, and they are correct for the experiments from the first microscope. But for the experiments from the second microscope, the timestamps do not change over time.

When I investigated differences between the nd2 files, I found that in the ones where the timestamps are correct there is a NETimeLoop present, while in those where the timestamps are incorrect a TimeLoop is present. I suspect that difference is responsible for the incorrect timestamps. However, I do not understand how this could happen, and whether I can somehow recover the correct timestamps for these experiments. Maybe you can help me with this?

When importing the files with OME in Fiji, the timestamps are also incorrect. So the data really is saved wrongly in the nd2 file from the very beginning. But I would like to find out how I can resolve that problem.
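
A sketch for comparing what the two kinds of files actually store per frame, assuming frame_metadata() exposes a channel-level time with relativeTimeMs as in nd2.structures:

import nd2

with nd2.ND2File('some_file.nd2') as f:
    print(type(f.experiment[0]).__name__)  # TimeLoop vs NETimeLoop
    for i in range(f.attributes.sequenceCount):
        print(i, f.frame_metadata(i).channels[0].time.relativeTimeMs)

If the relativeTimeMs values are constant in the TimeLoop files, the timestamps were most likely never written, and at best they can be reconstructed from the loop's configured interval.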

Hidden issues with DaskArrayProxy no longer hidden: fails to work with NEP18 dispatch mechanism, np forces compute

  • nd2 version: main
  • Python version: 3.9
  • Operating System:

Description

In this #19 (comment), @tlambert03 wrote:

that is, it's a dask array that, whenever you try to call compute or np.asarray, will re-open the underlying file (with self.wrapped.ctx is essentially just with ND2File()....)

It looks a tad bit risky at first, but I haven't run into any issues with it yet. In any case, I suspect the issue of trying to use a dask array after closing the file is far more common than whatever hidden issues there are with this proxy. I'm inclined to try it

The hidden issues are coming out of hiding.
Where the NEP-18 mechanism would normally dispatch to the corresponding dask array method when a dask array is passed to a numpy function, this no longer works with the DaskArrayProxy. Instead, it triggers a compute() on the array underlying the proxy, where no compute() would have happened on a non-proxied array. In my case (a large array) that gets the process killed on Linux.

To reproduce (here I use a 4d nd2-file):

test_nd2.py

from nd2 import ND2File

import numpy as np
import dask.array as da
dataset_nd2 = "/home/hilsenst/Documents/Luisa_Reference_HT/PreMaldi/Seq0000.nd2"


def test_nd2_dask_einsum():
    f = ND2File(dataset_nd2)
    arr = f.to_dask()
    print(f"Array shape {arr.shape}")
    reordered_dask = da.einsum('abcd->abcd', arr)
    print(reordered_dask[:1,:1,:1,:1].compute())


def test_synthetic_dask_einsum_via_nep18():
    arr = da.zeros([1000,1000,100,100])
    print(f"Array shape {arr.shape}")
    reordered_nep18 = np.einsum('abcd->abcd', arr)
    print(type(reordered_nep18))
    print(reordered_nep18[:1,:1,:1,:1].compute())


def test_nd2_dask_einsum_via_nep18_small():
    f = ND2File(dataset_nd2)
    arr = f.to_dask()
    arr = arr[:10,:10,:10,:10]
    print(f"Array shape {arr.shape}")
    print(f"arr has type {type(arr)}")
    reordered_nep18 = np.einsum('abcd->abcd', arr)
    print(type(reordered_nep18))
    print(reordered_nep18[:1,:1,:1,:1].compute())


def test_nd2_dask_einsum_via_nep18():
    f = ND2File(dataset_nd2)
    arr = f.to_dask()
    print(f"Array shape {arr.shape}")
    reordered_nep18 = np.einsum('abcd->abcd', arr)
    print(type(reordered_nep18))
    print(reordered_nep18[:1,:1,:1,:1].compute())

Running these tests shows the problem

(napari_latest) hilsenst@itservices-XPS-15-9500:~/GitlabEMBL/spacem-ht/src/spacem-mosaic$ pytest tests/test_nd2.py  --capture=no
=========================================================================================== test session starts ===========================================================================================
platform linux -- Python 3.9.5, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
PyQt5 5.15.4 -- Qt runtime 5.15.2 -- Qt compiled 5.15.2
rootdir: /home/hilsenst/GitlabEMBL/spacem-ht/src/spacem-mosaic
plugins: order-1.0.0, napari-0.4.11, timeout-1.4.2, anyio-3.3.0, napari-plugin-engine-0.1.9, qt-4.0.2, hypothesis-6.14.4
collected 4 items                                                                                                                                                                                         

tests/test_nd2.py Array shape (734, 2, 2060, 2044)
[[[[96]]]]
.Array shape (1000, 1000, 100, 100)
<class 'dask.array.core.Array'>
[[[[0.]]]]
.Array shape (10, 2, 10, 10)
arr has type <class 'nd2._dask_proxy.DaskArrayProxy'>
<class 'numpy.ndarray'>
FArray shape (734, 2, 2060, 2044)
Killed

For me, the convenience of using NEP-18 dispatch almost outweighs the problem of a few open file handles without the array proxy.
I guess the chances of getting numpy to support ObjectProxies with NEP-18 as well are fairly slim.

issues with dask arrays returned by nd2: depend on ND2File being open, can't be pickled

Hi Talley,

a couple of observations regarding the dask arrays returned by nd2. None of this is terribly surprising (so you are probably aware), and I am not expecting this can really be fixed. But maybe some lines in the readme file could save users some headaches.

keep ND2File object in open() state when working with dask array

For my use case I created a little helper function (pseudo-code):

def provide_tiles_as_daskarray_from_nd2(filepath):
    with nd2.ND2File(filepath) as f:
        arr = f.to_dask()
        # do a few more things on the array: cropping, sorting dimensions
        return arr

I have similar functions for other file types, e.g. something like provide_tiles_as_daskarray_from_folder_of_imgs(filepath).

At first sight, this works. I used this function from a Jupyter notebook and it seemed to return a working dask array,
until you do anything that forces a compute(), in which case the Python kernel crashes without a traceback.
The issue is that f is closed when returning out of the context manager. The same thing happens if you don't use a context manager and call f.close() before returning. Again, not terribly surprising. When running in the VS Code debugger I actually saw a traceback and could see that there is a segmentation fault when the nd2 library tries to read data.

My workaround is to not close f, which is a bit dirty, as there will be open file handles hanging around, possibly causing memory leaks.

cloudpickle

The dask arrays returned by nd2 can't be serialized using cloudpickle due to the ND2File object not being pickleable:

---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
<ipython-input-20-8a76360da592> in <module>
----> 1 cloudpickle.dump(array,f)

~/miniconda3/envs/napari_latest/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py in dump(obj, file, protocol, buffer_callback)
     53         compatibility with older versions of Python.
     54         """
---> 55         CloudPickler(
     56             file, protocol=protocol, buffer_callback=buffer_callback
     57         ).dump(obj)

~/miniconda3/envs/napari_latest/lib/python3.9/site-packages/cloudpickle/cloudpickle_fast.py in dump(self, obj)
    561     def dump(self, obj):
    562         try:
--> 563             return Pickler.dump(self, obj)
    564         except RuntimeError as e:
    565             if "recursion" in e.args[0]:

~/miniconda3/envs/napari_latest/lib/python3.9/site-packages/nd2/_sdk/latest.cpython-39-x86_64-linux-gnu.so in nd2._sdk.latest.ND2Reader.__reduce_cython__()

TypeError: no default __reduce__ due to non-trivial __cinit__

Again, not terribly surprising. However, the implication is (I haven't tried it, but am fairly certain this is the case) that the dask arrays returned by nd2 cannot be used in a dask.distributed context where the pickled array needs to be sent to the various worker processes.

A workaround for both would be to create a temporary .zarr file (pseudo-code):

def provide_tiles_as_daskarray_from_nd2(filepath):
    with nd2.ND2File(filepath) as f:
        arr = f.to_dask()
        # do a few more things on the array: cropping, sorting dimensions
        arr.to_zarr("_tmpfile.zarr")
    return dask.array.from_zarr("_tmpfile.zarr")

with the obvious drawback of duplicating the required storage space.

I don't know whether there is much that can be done about this, but it may be useful to add a note of caution in the Readme.
