Comments (9)
Hi @NilsNyberg
What version of h5py are you using? If it's >= 3, can you try again after downgrading to 2.10.0? The new version of h5py introduced some API changes at different levels, and many packages still need to be updated.
Let me know if that works.
Alessio
from spikeinterface.
I was using the newest version, but I tried downgrading to 2.10.0 now and confirmed the downgrade via pip show h5py. I also again opened the Anaconda console with administrative privileges. However, when I run the code I get the same error message.
I can also mention that I have tried different NWB files of different sizes (ranging from 0.66 MB to 20+ GB) and I get the same error message on all of them. I haven't had a chance to try another computer yet, but I'm going to try this afternoon.
I have attached the message again, in case you want to have a look:
---------------------------------------------------------------------------
OSError Traceback (most recent call last)
<ipython-input-5-5b1002091086> in <module>
3
4 recording_folder = 'nwb-dataset/'
----> 5 recording = se.NwbRecordingExtractor(recording_folder)
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\spikeextractors\extractors\nwbextractors\nwbextractors.py in __init__(self, file_path, electrical_series_name)
156 se.RecordingExtractor.__init__(self)
157 self._path = str(file_path)
--> 158 with NWBHDF5IO(self._path, 'r') as io:
159 nwbfile = io.read()
160 if electrical_series_name is not None:
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
559 def func_call(*args, **kwargs):
560 pargs = _check_args(args, kwargs)
--> 561 return func(args[0], **pargs)
562 else:
563 def func_call(*args, **kwargs):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\pynwb\__init__.py in __init__(self, **kwargs)
244 elif manager is None:
245 manager = get_manager()
--> 246 super(NWBHDF5IO, self).__init__(path, manager=manager, mode=mode, file=file_obj, comm=comm)
247
248 @docval({'name': 'src_io', 'type': HDMFIO, 'doc': 'the HDMFIO object for reading the data to export'},
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
559 def func_call(*args, **kwargs):
560 pargs = _check_args(args, kwargs)
--> 561 return func(args[0], **pargs)
562 else:
563 def func_call(*args, **kwargs):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in __init__(self, **kwargs)
66 self.__mode = mode
67 self.__file = file_obj
---> 68 super().__init__(manager, source=path)
69 self.__built = dict() # keep track of each builder for each dataset/group/link for each file
70 self.__read = dict() # keep track of which files have been read. Key is the filename value is the builder
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
559 def func_call(*args, **kwargs):
560 pargs = _check_args(args, kwargs)
--> 561 return func(args[0], **pargs)
562 else:
563 def func_call(*args, **kwargs):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\io.py in __init__(self, **kwargs)
15 self.__built = dict()
16 self.__source = getargs('source', kwargs)
---> 17 self.open()
18
19 @property
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in open(self)
682 else:
683 kwargs = {}
--> 684 self.__file = File(self.source, open_flag, **kwargs)
685
686 def close(self):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\h5py\_hl\files.py in __init__(self, name, mode, driver, libver, userblock_size, swmr, rdcc_nslots, rdcc_nbytes, rdcc_w0, track_order, **kwds)
406 fid = make_fid(name, mode, userblock_size,
407 fapl, fcpl=make_fcpl(track_order=track_order),
--> 408 swmr=swmr)
409
410 if isinstance(libver, tuple):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\h5py\_hl\files.py in make_fid(name, mode, userblock_size, fapl, fcpl, swmr)
171 if swmr and swmr_support:
172 flags |= h5f.ACC_SWMR_READ
--> 173 fid = h5f.open(name, flags, fapl=fapl)
174 elif mode == 'r+':
175 fid = h5f.open(name, h5f.ACC_RDWR, fapl=fapl)
h5py\_objects.pyx in h5py._objects.with_phil.wrapper()
h5py\_objects.pyx in h5py._objects.with_phil.wrapper()
h5py\h5f.pyx in h5py.h5f.open()
OSError: Unable to open file (unable to open file: name = 'nwb-dataset/', errno = 13, error message = 'Permission denied', flags = 0, o_flags = 0)
Any other suggestions for what might be causing the permission error?
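One detail in the traceback above: the name in the OSError is the folder 'nwb-dataset/', not a .nwb file, and asking h5py to open a directory raises this kind of OSError (the exact errno and message vary by platform). A minimal self-contained sketch of that failure mode, using a throwaway temporary directory in place of the dataset folder:

```python
import tempfile

import h5py

# Stand-in for a dataset folder; any directory path works for the demo.
folder = tempfile.mkdtemp()

try:
    # h5py expects the path of an HDF5 *file*; a directory cannot be opened.
    h5py.File(folder, 'r')
except OSError as err:
    print('OSError:', err)
```

If that is the cause here, pointing the extractor at the .nwb file itself rather than its parent folder may behave differently.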
Can you try to open it directly with pynwb?
import pynwb
with pynwb.NWBHDF5IO('file-path.nwb', 'r') as io:
    nwbfile = io.read()
If this fails as well, then the problem is somewhere in the NWB file. Maybe you don't have the right file permissions?
I get the "No data_type found for builder root" error (I have attached it below). A quick Google search turned up NeurodataWithoutBorders/pynwb#1077, suggesting this is likely due to the file being in NWB 1.0 rather than 2.0. Does spikeinterface only work with NWB 2.0? If so, is there any way to get spikeinterface to work with NWB 1.0 as well? Afaik that is the only version of NWB that can be outputted by Open Ephys recordings (you can always record in another format and convert to NWB later, I suppose, but it would be great if spikeinterface worked with all the formats directly outputted by Open Ephys).
Using the following code
import pynwb
fpath = 'E:/spiketutorials/NWB_Developer_Breakout_Session_Sep2020/nwb-dataset/experiment_1.nwb'
with pynwb.NWBHDF5IO('experiment_1.nwb', 'r') as io:
    nwbfile = io.read()
gives
---------------------------------------------------------------------------
ValueError Traceback (most recent call last)
<ipython-input-9-d7594c4c141d> in <module>
2 fpath = 'E:/spiketutorials/NWB_Developer_Breakout_Session_Sep2020/nwb-dataset/experiment_1.nwb'
3 with pynwb.NWBHDF5IO('experiment_1.nwb', 'r') as io:
----> 4 nwbfile = io.read()
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\hdf5\h5tools.py in read(self, **kwargs)
412 % (self.source, self.__mode))
413 try:
--> 414 return call_docval_func(super().read, kwargs)
415 except UnsupportedOperation as e:
416 if str(e) == 'Cannot build data. There are no values.': # pragma: no cover
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in call_docval_func(func, kwargs)
403 def call_docval_func(func, kwargs):
404 fargs, fkwargs = fmt_docval_args(func, kwargs)
--> 405 return func(*fargs, **fkwargs)
406
407
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
559 def func_call(*args, **kwargs):
560 pargs = _check_args(args, kwargs)
--> 561 return func(args[0], **pargs)
562 else:
563 def func_call(*args, **kwargs):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\backends\io.py in read(self, **kwargs)
34 # TODO also check that the keys are appropriate. print a better error message
35 raise UnsupportedOperation('Cannot build data. There are no values.')
---> 36 container = self.__manager.construct(f_builder)
37 return container
38
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
559 def func_call(*args, **kwargs):
560 pargs = _check_args(args, kwargs)
--> 561 return func(args[0], **pargs)
562 else:
563 def func_call(*args, **kwargs):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in construct(self, **kwargs)
236 # we are at the top of the hierarchy,
237 # so it must be time to resolve parents
--> 238 result = self.__type_map.construct(builder, self, None)
239 self.__resolve_parents(result)
240 self.prebuilt(result, builder)
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
559 def func_call(*args, **kwargs):
560 pargs = _check_args(args, kwargs)
--> 561 return func(args[0], **pargs)
562 else:
563 def func_call(*args, **kwargs):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in construct(self, **kwargs)
850 if build_manager is None:
851 build_manager = BuildManager(self)
--> 852 obj_mapper = self.get_map(builder)
853 if obj_mapper is None:
854 dt = builder.attributes[self.namespace_catalog.group_spec_cls.type_key()]
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
559 def func_call(*args, **kwargs):
560 pargs = _check_args(args, kwargs)
--> 561 return func(args[0], **pargs)
562 else:
563 def func_call(*args, **kwargs):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in get_map(self, **kwargs)
770 data_type = self.get_builder_dt(obj)
771 namespace = self.get_builder_ns(obj)
--> 772 container_cls = self.get_cls(obj)
773 # now build the ObjectMapper class
774 mapper = self.__mappers.get(container_cls)
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\utils.py in func_call(*args, **kwargs)
559 def func_call(*args, **kwargs):
560 pargs = _check_args(args, kwargs)
--> 561 return func(args[0], **pargs)
562 else:
563 def func_call(*args, **kwargs):
c:\users\julie\miniconda3\envs\spiketutorial\lib\site-packages\hdmf\build\manager.py in get_cls(self, **kwargs)
699 data_type = self.get_builder_dt(builder)
700 if data_type is None:
--> 701 raise ValueError("No data_type found for builder %s" % builder.path)
702 namespace = self.get_builder_ns(builder)
703 if namespace is None:
ValueError: No data_type found for builder root
@bendichter maybe it'd make sense to keep back-compatibility with NWB 1.0? But I guess that would require having an old version of pynwb installed? Is there a way to open 1.0 files with the current pynwb version?
@NilsNyberg at the moment only 2.0 is supported, unfortunately. Sorry about that!
I recommend saving Open Ephys data to binary and using the OpenEphysRecordingExtractor for the moment.
Thanks for the help (again!) - yeah, I think for future-proofing I'd better move to binary instead... It's a shame Open Ephys does not yet give the ability to save directly to NWB 2.0, but I think it's a work in progress at least.
No worries at all :)
@alejoe91, unfortunately there is no way to open NWB 1.0 files with pynwb, and they are unlikely to support it going forward. It would be nice to support Open Ephys data in spikeextractors, particularly if it's in a version of NWB. It would be easiest to do this via h5py. @NilsNyberg, can you share an example file?
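As a starting point for the h5py route mentioned above, raw arrays in any HDF5-based file can be read without pynwb's schema machinery. The group and attribute names below are invented for the demo (the real NWB 1.0 layout differs), so treat this as a sketch only:

```python
import os
import tempfile

import h5py
import numpy as np

path = os.path.join(tempfile.mkdtemp(), 'demo.h5')

# Write a tiny stand-in file; these names are hypothetical, not the NWB 1.0 schema.
with h5py.File(path, 'w') as f:
    f.attrs['nwb_version'] = 'NWB-1.0.6'
    f.create_dataset('acquisition/data', data=np.arange(10, dtype=np.int16))

# Read it back with plain h5py -- no pynwb, so no schema validation to fail on.
with h5py.File(path, 'r') as f:
    version = f.attrs['nwb_version']
    traces = f['acquisition/data'][:]

print(version, traces.shape)
```

An extractor built this way would wrap the same handful of h5py calls around whatever dataset paths the NWB 1.0 files actually use.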
Can we close this?