Comments (10)
```python
with NWBHDF5IO('test_a.h5', 'r') as base_io:
    nwb_add = base_io.read().copy()
    new_nwb.add_acquisition(TimeSeries(name='series3', unit='V', data=np.arange(300), rate=0.5))
    with NWBHDF5IO('test_b.h5', 'w', manager=base_io.manager) as out_io:
        out_io.write(nwb_add)
```
Could you help me understand this part of the example? It's unclear to me why `add_acquisition` is being called on `new_nwb` but the write is then done for `nwb_add`. I think what may be happening is that because `add_acquisition` is being called on `new_nwb`, but `new_nwb` is not being saved afterwards, this may create a broken link. Also, I don't think that calling `copy` on the container is the right approach here. I think using `export` is probably more appropriate, see https://pynwb.readthedocs.io/en/stable/export.html. I think something along the following lines:
```python
with NWBHDF5IO('test_a.h5', 'r') as base_io:
    nwb_add = base_io.read()
    nwb_add.add_acquisition(TimeSeries(name='series3', unit='V', data=np.arange(300), rate=0.5))
    with NWBHDF5IO('test_b.h5', 'w', manager=base_io.manager) as out_io:
        out_io.export(src_io=base_io, nwbfile=nwb_add)
```
Whoops, that was a mistake. But even removing that line entirely, i.e. writing the copied `NWBFile`, leads to the same result.
And yes, your suggestion does maintain the (or at least an equivalent) timeseries link.
I believe the main issue is `out_io.write(nwb_add)`. The `write` method is intended for writing a new file. When writing an existing file to a new target, you will need to use `export` instead to make sure that links are resolved correctly. Using `write` will try to write only the parts that are new to the file (in your case `series3`), but when writing to a new file you also want to copy (or at least link) the existing content of the file.

Here is an updated version of your script that uses `export` instead. At least on my laptop this works:
```python
from datetime import datetime
from uuid import uuid4

import numpy as np
from dateutil.tz import tzlocal

from pynwb import NWBFile, TimeSeries, NWBHDF5IO

# create the NWBFile
new_nwb = NWBFile(
    session_description="my first synthetic recording",  # required
    identifier=str(uuid4()),  # required
    session_start_time=datetime(2017, 4, 3, 11, tzinfo=tzlocal()),  # required
    experimenter="Baggins, Bilbo",  # optional
    lab="Bag End Laboratory",  # optional
    institution="University of Middle Earth at the Shire",  # optional
    experiment_description="I went on an adventure with thirteen dwarves to reclaim vast treasures.",  # optional
    session_id="LONELYMTN",  # optional
)

# create some example TimeSeries
test_ts = TimeSeries(
    name="series1",
    data=np.arange(1000),
    unit="m",
    timestamps=np.linspace(0.5, 601, 1000),
)
rate_ts = TimeSeries(
    name="series2", data=np.arange(600), unit="V", starting_time=0.0, rate=1.0
)

# add the TimeSeries to the file
new_nwb.add_acquisition(test_ts)
new_nwb.add_acquisition(rate_ts)

# add trials that link to the TimeSeries
new_nwb.add_trial(start_time=1.0, stop_time=2.0, timeseries=[test_ts])
new_nwb.add_trial(start_time=10.0, stop_time=11.0, timeseries=[test_ts, rate_ts])
new_nwb.add_trial(start_time=30.0, stop_time=31.0, timeseries=[test_ts, rate_ts])

# write the original file
with NWBHDF5IO('test_a.h5', 'w') as io:
    io.write(new_nwb)

with NWBHDF5IO('test_a.h5', 'r') as io:
    nwb = io.read()
    print(len(nwb.trials))

# add a new TimeSeries and export to a new file
with NWBHDF5IO('test_a.h5', 'r') as base_io:
    nwb_add = base_io.read()
    nwb_add.add_acquisition(TimeSeries(name='series3', unit='V', data=np.arange(300), rate=0.5))
    with NWBHDF5IO('test_b.h5', 'w', manager=base_io.manager) as out_io:
        out_io.export(src_io=base_io, nwbfile=nwb_add)

with NWBHDF5IO('test_a.h5', 'r') as io:
    nwb = io.read()
    print(len(nwb.trials))

with NWBHDF5IO('test_b.h5', 'r') as io:
    nwb = io.read()
    print(len(nwb.trials))
```
Thank you for the suggestions. I'm glad to know there is a way to make this work.

As for the bug issue, I am using `NWBFile.copy` combined with `NWBHDF5IO.write` in practice for linking elements from a parent file. For example, the original MWE (modulo the typo you pointed out) runs just fine if `add_trial` does not have links to timeseries.
> As for the bug issue, I am using `NWBFile.copy` combined with `NWBHDF5IO.write` in practice for linking elements from a parent file. For example, the original MWE (modulo the typo you pointed out) runs just fine if `add_trial` does not have links to timeseries.

@rly could you take a look? Is using `NWBFile.copy` combined with `NWBHDF5IO.write` in this way intended usage, or should this use `export`?
@oruebel To make a shallow copy, then yes, one should use `NWBFile.copy` combined with `NWBHDF5IO.write`. This will copy the `TimeIntervals` table, but the columns are links to columns in the original file. (Note that these links rely on relative paths, so the files need to stay in the same relative location to each other for the links to work.)
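The relative-path caveat comes from how HDF5 external links behave. A minimal sketch using plain `h5py` (an illustrative stand-in, not the pynwb internals; the file and dataset names here are made up):

```python
import numpy as np
import h5py

# "parent" file holding the actual data
with h5py.File('parent.h5', 'w') as f:
    f.create_dataset('acquisition/series1', data=np.arange(10))

# "child" file that links to it; the link stores the path 'parent.h5'
# relative to child.h5, not an absolute path
with h5py.File('child.h5', 'w') as f:
    f['series1_link'] = h5py.ExternalLink('parent.h5', 'acquisition/series1')

# the link resolves only while parent.h5 stays alongside child.h5;
# moving either file relative to the other breaks dereferencing
with h5py.File('child.h5', 'r') as f:
    print(f['series1_link'][:5])
```

Renaming or relocating `parent.h5` would make reads through `series1_link` raise a `KeyError`, which is the failure mode to watch for with shallow copies.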
@miketrumpis Thanks for the thorough bug report! You are correct about the fix. I created a PR in #1865 with that fix.
Thanks @rly for taking a look at this
@miketrumpis could you please install the latest `dev` branch of this GitHub repo and see if that solves the issue for you?
@rly excuse the delay, but yes, it's all good here:

```
270291ad (HEAD -> dev, tag: latest, origin/dev, origin/HEAD) Update GitHub release checklist (#1872)
```

idk if there is a good way like `__eq__` to compare the timeseries, but they look functionally equal!
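Since pynwb containers don't define `__eq__` for this, one rough way to check "functionally equal" is a field-by-field comparison. A hypothetical helper (not part of pynwb; the field list is an assumption, and it duck-types, so it works on anything with these attributes):

```python
import numpy as np

def timeseries_equal(a, b, fields=('name', 'unit', 'rate', 'starting_time')):
    """Rough equality check for two TimeSeries-like objects (hypothetical
    helper): compare scalar metadata fields, then the data arrays."""
    if any(getattr(a, f, None) != getattr(b, f, None) for f in fields):
        return False
    return np.array_equal(np.asarray(a.data), np.asarray(b.data))
```

Timestamps-based series would additionally need a `np.array_equal` check on `timestamps`, since that field is an array rather than a scalar.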
Great, thank you!