nipy / nitime
Timeseries analysis for neuroscience data
Home Page: http://nipy.org/nitime
License: BSD 3-Clause "New" or "Revised" License
Right now, sampling_rate and sampling_interval don't know about each other (so you have to set both if you're changing either), and they also don't know about their underlying data.
In [21]: ts.UniformTime(0,10,10)
Out[21]: UniformTime([ 0., 1., 2., 3., 4., 5., 6., 7., 8., 9.], time_unit='s')
In [22]: ut = _
In [23]: ut*10
Out[23]: UniformTime([ 0., 10., 20., 30., 40., 50., 60., 70., 80., 90.], time_unit='s')
In [24]: ut2 = ut*10
In [25]: ut2.sampling_rate
Out[25]: 1.0 Hz
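One way to remove the redundancy would be to store only one of the two quantities and derive the other. A minimal sketch (class and attribute names are hypothetical, not nitime's actual implementation):

```python
class UniformSampling:
    """Sketch: keep rate and interval consistent by storing only the rate."""

    def __init__(self, sampling_rate):
        self._rate = float(sampling_rate)

    @property
    def sampling_rate(self):
        return self._rate

    @sampling_rate.setter
    def sampling_rate(self, value):
        self._rate = float(value)

    @property
    def sampling_interval(self):
        # Derived on the fly, so the two can never disagree.
        return 1.0 / self._rate

    @sampling_interval.setter
    def sampling_interval(self, value):
        self._rate = 1.0 / float(value)


u = UniformSampling(10.0)
print(u.sampling_rate, u.sampling_interval)   # 10.0 0.1
```

Setting either attribute updates the other automatically, which is the behavior the issue is asking for.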
Reported by Yarick:
neurodebian@head2:~/deb/builds/nitime/0.3~dev$ grep Error.*tril_indices *build
nitime_0.2.9-1~nd09.10+1_amd64.build:AttributeError: 'module' object has no attribute 'tril_indices'
nitime_0.2.9-1~nd10.04+1_amd64.build:AttributeError: 'module' object has no attribute 'tril_indices'
nitime_0.2.9-1~nd10.10+1_amd64.build:AttributeError: 'module' object has no attribute 'tril_indices'
We should definitely fix this before 0.3 is out officially.
The ability to draw many matrices side by side in the same figure would be useful
======================================================================
ERROR: Testing the initialization of the uniform time series object
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/python2.7/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/builddir/build/BUILD/nitime-rel-0.5/nitime/tests/test_timeseries.py", line 545, in test_TimeSeries
tseries2 /= 2
File "/builddir/build/BUILD/nitime-rel-0.5/nitime/timeseries.py", line 1021, in __idiv__
self.data.__itruediv__(other)
TypeError: ufunc 'true_divide' output (typecode 'd') could not be coerced to provided output parameter (typecode 'l') according to the casting rule ''same_kind''
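The failure can be reproduced with numpy alone: in-place true division on an integer array has to cast the float result back to int, which the 'same_kind' casting rule forbids. A minimal sketch:

```python
import numpy as np

# In-place true division on an integer array fails: true_divide produces
# float64 (typecode 'd'), which cannot be written back into the int64
# ('l') array under the 'same_kind' casting rule.
a = np.array([2, 4, 6])
try:
    a /= 2
except TypeError as err:
    print("caught:", type(err).__name__)

# The usual fix: start from float data (or use //= for integer division).
b = np.array([2.0, 4.0, 6.0])
b /= 2
print(b)   # [1. 2. 3.]
```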
I am working through http://www.nipy.org/nitime/examples/multi_taper_coh.html with a 4-second sine wave split into 63 channels (512 samples each) and am getting this warning. Should I explore some parameter settings for MTCoherenceAnalyzer?
lib\nitime\utils.py:571: RuntimeWarning: Breaking due to iterative meltdown in nitime.utils.adaptive_weights.
warnings.warn(e_s, RuntimeWarning)
Make a class which receives an analyzer, together with its data, and then produces the jack-knifing product: the methods of the original analyzer, together with their jack-knife confidence intervals.
IOError: [Errno 2] No such file or directory: 'tseries12.txt'
while running:
nosetests nitime/
Running nitime on epd 7.2-2 winx86
Traceback (most recent call last):
File "C:\Python27\lib\site-packages\nose\loader.py", line 390, in loadTestsFromName
addr.filename, addr.module)
File "C:\Python27\lib\site-packages\nose\importer.py", line 39, in importFromPath
return self.importFromDir(dir_path, fqname)
File "C:\Python27\lib\site-packages\nose\importer.py", line 86, in importFromDir
mod = load_module(part_fqname, fh, filename, desc)
File "c:\users\sargun\documents\wildsoet lab files\nitime\nitime\fmri\__init__.py", line 18, in <module>
import hrf
File "c:\users\sargun\documents\wildsoet lab files\nitime\nitime\fmri\hrf.py", line 2, in <module>
from scipy import factorial
ImportError: cannot import name factorial
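The likely cause is that factorial is no longer re-exported at scipy's top level; it lives in scipy.special (scipy.misc.factorial was deprecated and later removed). A sketch of the fix:

```python
# Import factorial from its current home in scipy.special instead of
# the removed top-level scipy re-export.
from scipy.special import factorial

print(factorial(5))   # 120.0
```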
Many doc-tests fail when running nosetests --with-doctest. All of these need to be fixed.
Should work faster:
http://mail.scipy.org/pipermail/nipy-devel/2011-August/006645.html
This is somehow not on our website anymore (got kicked out of the documentation generation process...)
All these old branches really shouldn't be there: cython_coherence, granger, etc.
I have a time series with 166800 samples (raw MEG data).
alg.dpss_windows(166800, 4, 8)
fails. However, the equivalent call works in Matlab.
Any idea how to fix this?
Make it smarter. For example, compare the time-stamps of the .py file and the .rst file, and only generate the .rst file if the .py file has a later time-stamp (i.e., has changed since the last time the .rst was generated).
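The time-stamp check described above can be sketched with the standard library (function and path names are hypothetical):

```python
import os


def needs_regeneration(py_path, rst_path):
    """Return True if the .rst is missing or older than its .py source.

    A sketch of the proposed check; paths are hypothetical.
    """
    if not os.path.exists(rst_path):
        return True
    # Only regenerate when the source was modified after the output.
    return os.path.getmtime(py_path) > os.path.getmtime(rst_path)
```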
When doing :
f, psd, sigma = nia.multi_taper_psd(data[0], Fs=raw.info['sfreq'],
jackknife=True)
or
f, psd, sigma = nia.multi_taper_psd(5e10*data[0], Fs=raw.info['sfreq'],
jackknife=True)
you get the same values for sigma. That is, sigma is blind to the scale of the data.
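If sigma is computed from the variance of log-transformed leave-one-out PSD estimates (a common choice for multi-taper jackknives), scale invariance is actually expected: log(c * psd) = log(c) + log(psd), and the additive constant drops out of the variance. A numpy-only sketch of that property (the estimates and the sigma formula here are illustrative, not nitime's exact implementation):

```python
import numpy as np

rng = np.random.RandomState(0)
psd_estimates = rng.rand(8) + 1.0   # hypothetical leave-one-out PSD values


def jackknife_sigma(est):
    # Jackknife standard error computed on the log scale.
    logs = np.log(est)
    n = len(logs)
    return np.sqrt((n - 1) * np.var(logs))


# Multiplying the data by any constant leaves sigma unchanged.
print(np.allclose(jackknife_sigma(psd_estimates),
                  jackknife_sigma(5e10 * psd_estimates)))   # True
```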
in the following example:
http://nipy.sourceforge.net/nitime/examples/filtering_fmri.html
the first figure shows the FFT as having negative power.
Use the gh-pages system in the same way as the data-array repo, to generate documentation for each tag.
The failure is
======================================================================
ERROR: Do the dpss windows resemble the equivalent matlab result
----------------------------------------------------------------------
Traceback (most recent call last):
File "/usr/lib/pymodules/python2.6/nose/case.py", line 183, in runTest
self.test(*self.arg)
File "/home/fperez/usr/local/lib/python2.6/site-packages/nitime/tests/test_algorithms.py", line 234, in test_DPSS_matlab
b = np.loadtxt('dpss_matlab.txt')
File "/home/fperez/usr/opt/lib/python2.6/site-packages/numpy/lib/npyio.py", line 599, in loadtxt
fh = open(fname, 'U')
IOError: [Errno 2] No such file or directory: 'dpss_matlab.txt'
----------------------------------------------------------------------
Ran 387 tests in 2.883s
It's just a matter of putting some path logic in the test. I may get to it this weekend while I work on the coherence talk for Monday.
The default behavior should be to create a time-series with the dimensions of the original fMRI file, reading the data shape in from the nifti header. Maybe also output something that tells you what shape you are looking at?
The following warning is issued in testing:
/Users/alex/local/lib/python2.7/site-packages/nitime/analysis/coherence.py:152:
ComplexWarning: Casting complex values to real discards the imaginary
part
self.spectrum[j][j])
since self.spectrum is complex-valued and coherence is initialized as float. If coherence is expected to be between 0 and 1, then this might be a bug in coherence_calculate.
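The warning can be reproduced with numpy alone: assigning a complex value into a float array discards the imaginary part and emits a ComplexWarning. A minimal sketch:

```python
import numpy as np
import warnings

spectrum = np.array([1.0 + 2.0j, 3.0 + 0.0j])   # hypothetical cross-spectrum
coherence = np.empty(2)                          # float output array

with warnings.catch_warnings():
    warnings.simplefilter("ignore")   # the ComplexWarning fires on this line
    coherence[0] = spectrum[0]

print(coherence[0])   # 1.0 -- only the real part survives
```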
If possible, it would be good to replace the api-gen script which is currently in tools and instead use the (relatively) new sphinx auto-summary in order to generate the automatic API documentation.
Dear all,
When I run a script like this:
>>>sampling_rate=1000
>>>freq_idx_G
Out[7]: array([40, 41])
>>>G.frequencies.shape[0]
Out[8]: 513
>>>g1 = np.mean(G.causality_xy[:, :, freq_idx_G], -1)
it hits the following memory error (freq_idx_G=):
---------------------------------------------------------------------------
MemoryError Traceback (most recent call last)
<ipython-input-6-b3dd332ebe13> in <module>()
----> 1 g1 = np.mean(G.causality_xy[:, :, freq_idx_G], -1)
/home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/descriptors.pyc in __get__(self, obj, type)
138 # Errors in the following line are errors in setting a
139 # OneTimeProperty
--> 140 val = self.getter(obj)
141
142 setattr(obj, self.name, val)
/home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/analysis/granger.pyc in causality_xy(self)
202 @desc.setattr_on_read
203 def causality_xy(self):
--> 204 return self._dict2arr('gc_xy')
205
206 @desc.setattr_on_read
/home/qdong/Enthought/Canopy_64bit/User/lib/python2.7/site-packages/nitime/analysis/granger.pyc in _dict2arr(self, key)
191 arr = np.empty((self._n_process,
192 self._n_process,
--> 193 self.frequencies.shape[0]))
194
195 arr.fill(np.nan)
MemoryError:
Can anyone give me some tips? Thanks!
I'm having an import issue; looking closer at mpl_toolkits\axes_grid\__init__.py indicates something may have changed with the import of make_axes_locatable.
import nitime.viz
FILE: mpl_toolkits\axes_grid\__init__.py
from .axes_divider import Divider, SubplotDivider, LocatableAxes, \
    make_axes_locatable
from .axes_grid import Grid, ImageGrid, AxesGrid
= ERROR
lib\matplotlib\cbook.py:137: MatplotlibDeprecationWarning: The matplotlib.mpl module was deprecated in version 1.3. Use import matplotlib as mpl
instead.
warnings.warn(message, mplDeprecation, stacklevel=1)
ImportError: 'No module named mpl_toolkits.axes_grid'
Traceback (most recent call last):
File "/Library/Frameworks/Python.framework/Versions/7.0/lib/python2.7/site-packages/nose/case.py", line 187, in runTest
self.test(*self.arg)
File "/Users/alex/work/src/nitime/nitime/tests/test_analysis.py", line 220, in test_FilterAnalyzer
npt.assert_almost_equal(f_slow.fir.data.mean(),
File "/Users/alex/work/src/nitime/nitime/descriptors.py", line 140, in __get__
val = self.getter(obj)
File "/Users/alex/work/src/nitime/nitime/analysis/spectral.py", line 357, in fir
b = signal.firwin(n_taps, ub_frac, self._win)
File "/Library/Frameworks/Python.framework/Versions/7.0/lib/python2.7/site-packages/scipy/signal/fir_filter_design.py", line 255, in firwin
atten = kaiser_atten(numtaps, float(width)/nyq)
ValueError: could not convert string to float: hamming
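A plausible reading of the traceback: scipy.signal.firwin's signature is firwin(numtaps, cutoff, width=None, window='hamming', ...), so a window name passed positionally (as spectral.py does with self._win) lands in the width parameter, and float('hamming') then fails. Passing the window by keyword sidesteps this:

```python
from scipy.signal import firwin

# Correct usage: the window name goes to the `window` keyword, not the
# positional slot after cutoff (which is `width`, a float).
b = firwin(65, 0.25, window='hamming')

# With the default scale=True the filter has unity gain at DC,
# so the coefficients sum to 1.
print(b.shape, round(float(b.sum()), 6))
```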
nx (currently a strong recommendation) is required for building the docs.
The formula in there probably needs to be multiplied by df (the bandwidth of each frequency band in Hz).
What are the naming conventions for functions?
Below is the list of functions in spectral.
algorithms/spectral.py:
20 : def get_spectra(time_series, method=None):
144 : def get_spectra_bi(x, y, method=None):
186 : def periodogram(s, Fs=2 * np.pi, Sk=None, N=None, sides='default',
262 : def periodogram_csd(s, Fs=2 * np.pi, Sk=None, NFFT=None, sides='default',
369 : def dpss_windows(N, NW, Kmax):
449 : def mtm_cross_spectrum(tx, ty, weights, sides='twosided'):
557 : def multi_taper_psd(s, Fs=2 * np.pi, BW=None, adaptive=False,
710 : def multi_taper_csd(s, Fs=2 * np.pi, BW=None, low_bias=True,
834 : def freq_response(b, a=1., n_freqs=1024, sides='onesided'):
run:
import numpy as np
import nitime.algorithms as nia
data = np.random.randn(10, 1000)
f, psd, sigma = nia.multi_taper_psd(data, jackknife=False) # PASS
f, psd, sigma = nia.multi_taper_psd(data, jackknife=True) # FAIL
Running Sphinx v1.1.2
/Library/Frameworks/EPD64.framework/Versions/7.2/lib/python2.7/site-packages/matplotlib/__init__.py:908: UserWarning: This call to matplotlib.use() has no effect
because the backend has already been chosen;
matplotlib.use() must be called before pylab, matplotlib.pyplot,
or matplotlib.backends is imported for the first time.
if warn: warnings.warn(_use_error_msg)
WARNING: extension 'ipython_console_highlighting' has no setup() function; is it really a Sphinx extension module?
loading pickled environment... not yet created
building [html]: targets for 71 source files that are out of date
updating environment: 71 added, 0 changed, 0 removed
/Users/arokem/projects/nitime/doc/sphinxext/docscrape.py:117: UserWarning: Unknown section Note
warn("Unknown section %s" % key)
/Users/arokem/projects/nitime/doc/sphinxext/docscrape.py:117: UserWarning: Unknown section Warning
warn("Unknown section %s" % key)
/Users/arokem/projects/nitime/doc/sphinxext/docscrape.py:117: UserWarning: Unknown section Example
warn("Unknown section %s" % key)
reading sources... [ 29%] api/generated/nitime.lazyimports
Exception occurred:
File "/Library/Frameworks/EPD64.framework/Versions/7.2/lib/python2.7/site-packages/sphinx/environment.py", line 828, in read_doc
pickle.dump(doctree, f, pickle.HIGHEST_PROTOCOL)
PicklingError: Can't pickle <type 'module'>: attribute lookup __builtin__.module failed
The full traceback has been saved in /var/folders/sf/3b6q6p1d7518rpb4882pzsxw0000gn/T/sphinx-err-9UpBlB.log, if you want to report the issue to the developers.
Please also report this if it was a user error, so that a better error message can be provided next time.
Either send bugs to the mailing list at http://groups.google.com/group/sphinx-dev/,
or report them in the tracker at http://bitbucket.org/birkenfeld/sphinx/issues/. Thanks!
make: *** [htmlonly] Error 1
utils.py:964 uses intersect1d_nu, which has been removed from numpy 1.6
1.5 docstring:
Docstring:
`intersect1d_nu` is deprecated!
This function is deprecated. Use intersect1d()
see the release notes:
http://sourceforge.net/projects/numpy/files/NumPy/1.6.1/
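The replacement is a drop-in change, since np.intersect1d handles non-unique inputs directly:

```python
import numpy as np

# intersect1d copes with repeated values in either input, which is what
# the removed intersect1d_nu ("non-unique") was for.
a = np.array([1, 2, 2, 3])
b = np.array([2, 3, 3, 4])
print(np.intersect1d(a, b))   # [2 3]
```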
Dear all,
I have two 500 ms MEG time series; over time, the directional information of the corresponding cortical fields changes dynamically. How can I model this directional change? Any suggestions are welcome, thanks!
....
Arrays are not almost equal
(mismatch 99.4140625%)
x: array([[ -7.09068221e-17, 5.00000000e-01, 5.00000000e-01, ...,
5.83775176e-17, 5.00000000e-01, 5.00000000e-01],
[ -8.56663207e-18, 4.34684287e-17, -2.60671132e-17, ...,...
y: array([[ 0. , 0.01227154, 0.02454123, ..., -0.03680722,
-0.02454123, -0.01227154],
[ 1. , 0.9999247 , 0.99969882, ..., 0.99932238,...
Ran 381 tests in 44.490s
something as simple as :
In []: import nitime.timeseries as ts
In []: t = ts.TimeArray([1,2,3])
In []: ep = ts.Epochs(.3,1.4)
In []: t[ep]
Out[]: TimeArray([], dtype=float64, time_unit='s')
If the NFFT entry in the method dict is longer than the time series itself, all the coherences will be 1, by definition. At the very least, a warning should be raised when that happens.
In commit e8fb5fb (Working on slow tests and testlib) we removed 'exit=False' from the call to TestProgram, so now running nitime.test() in an interactive environment (ipython) results in this backtrace:
/tmp/ in ()
/usr/lib/pymodules/python2.6/nitime/testlib.pyc in test(doctests)
48 # Now nose can run
49 try:
---> 50 TestProgram(argv=argv)#, exit=False)
51 finally:
52 np.set_printoptions(**opt_dict)
/usr/lib/pymodules/python2.6/nose/core.pyc in __init__(self, module, defaultTest, argv, testRunner, testLoader, env, config, suite, exit, plugins, addplugins)
116 self, module=module, defaultTest=defaultTest,
117 argv=argv, testRunner=testRunner, testLoader=testLoader,
--> 118 **extra_args)
119
120 def makeConfig(self, env, plugins=None):
/usr/lib/python2.6/unittest.pyc in __init__(self, module, defaultTest, argv, testRunner, testLoader)
815 self.progName = os.path.basename(argv[0])
816 self.parseArgs(argv)
--> 817 self.runTests()
818
819 def usageExit(self, msg=None):
/usr/lib/pymodules/python2.6/nose/core.pyc in runTests(self)
198 self.success = result.wasSuccessful()
199 if self.exit:
--> 200 sys.exit(not self.success)
201 return self.success
202
SystemExit: False
Type %exit or %quit to exit IPython (%Exit or %Quit do so unconditionally).
Originally, this argument was commented, because nose was ignoring the input 'argv', when this was provided. Maybe there is some way around this?
Make it run together with the examples, to make sure that it reflects something the code actually does.
With a check-list of the process. Just so that we don't forget anything.
Is there a common method for using multi-taper coherence estimation with signals of different lengths? Say I have signal S1 and signal S2, and S2 is 0.1% of the length of S1. Is it possible / practical to use multi-taper coherence estimation in this situation to see if S2 is similar to a subsection of S1?
Thank you.
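One common workaround when the two signals have very different lengths is to slide the short signal along the long one and compute a similarity score per window. A numpy-only sketch using normalized correlation (rather than multi-taper coherence, purely to illustrate the windowing idea):

```python
import numpy as np

rng = np.random.RandomState(0)
s1 = rng.randn(1000)
s2 = s1[300:340].copy()   # a known subsection, planted for illustration


def sliding_corr(long_sig, short_sig):
    """Correlation of short_sig against every window of long_sig."""
    n = len(short_sig)
    scores = np.empty(len(long_sig) - n + 1)
    for i in range(len(scores)):
        window = long_sig[i:i + n]
        scores[i] = np.corrcoef(window, short_sig)[0, 1]
    return scores


scores = sliding_corr(s1, s2)
print(int(np.argmax(scores)))   # 300 -- recovers the embedded location
```

Any per-window similarity measure (including a coherence estimate) could be swapped in for the correlation here.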
I'm seeing this race condition with regards to "_conversion_factor" in the TimeArray. It's popped up under the nose tests (output pasted below), and when trying to build the examples in docs.
Recently updated MPL, so I'll see if reverting makes a difference.
nosetest output:
....................................................Exception RuntimeError: 'maximum recursion depth exceeded while calling a Python object' in <type 'exceptions.AttributeError'> ignored
ERROR: nitime.tests.test_viz.test_plot_xcorr
Traceback (most recent call last):
File "/opt/local/Library/Frameworks/Python.framework/Versions/2.6/lib/python2.6/site-packages/nose/case.py", line 197, in runTest
self.test(*self.arg)
File "/Users/mike/workywork/nitime/nitime/tests/test_viz.py", line 41, in test_plot_xcorr
line_labels=['a', 'b'])
File "/Users/mike/workywork/nitime/nitime/viz.py", line 462, in plot_xcorr
label=this_labels.pop())
File "/Users/mike/usr/lib/python2.6/site-packages/matplotlib-1.3.1rc2-py2.6-macosx-10.6-x86_64.egg/matplotlib/axes.py", line 4137, in plot
for line in self._get_lines(*args, **kwargs):
File "/Users/mike/usr/lib/python2.6/site-packages/matplotlib-1.3.1rc2-py2.6-macosx-10.6-x86_64.egg/matplotlib/axes.py", line 317, in _grab_next_args
for seg in self._plot_args(remaining, kwargs):
File "/Users/mike/usr/lib/python2.6/site-packages/matplotlib-1.3.1rc2-py2.6-macosx-10.6-x86_64.egg/matplotlib/axes.py", line 295, in _plot_args
x, y = self._xy_from_xy(x, y)
File "/Users/mike/usr/lib/python2.6/site-packages/matplotlib-1.3.1rc2-py2.6-macosx-10.6-x86_64.egg/matplotlib/axes.py", line 213, in _xy_from_xy
bx = self.axes.xaxis.update_units(x)
File "/Users/mike/usr/lib/python2.6/site-packages/matplotlib-1.3.1rc2-py2.6-macosx-10.6-x86_64.egg/matplotlib/axis.py", line 1336, in update_units
converter = munits.registry.get_converter(data)
File "/Users/mike/usr/lib/python2.6/site-packages/matplotlib-1.3.1rc2-py2.6-macosx-10.6-x86_64.egg/matplotlib/units.py", line 148, in get_converter
converter = self.get_converter(xravel[0])
<<< This repeats about ~100x >>>
File "/Users/mike/usr/lib/python2.6/site-packages/matplotlib-1.3.1rc2-py2.6-macosx-10.6-x86_64.egg/matplotlib/units.py", line 148, in get_converter
converter = self.get_converter(xravel[0])
File "/Users/mike/workywork/nitime/nitime/timeseries.py", line 233, in __getitem__
return self[[key]].reshape(())
File "/Users/mike/workywork/nitime/nitime/timeseries.py", line 239, in __getitem__
return np.ndarray.__getitem__(self, key)
File "/Users/mike/workywork/nitime/nitime/timeseries.py", line 207, in __array_finalize__
if not hasattr(self, '_conversion_factor'):
AttributeError: 'TimeArray' object has no attribute '_conversion_factor'
ATM, this causes a ValueError to be thrown. Needs to be fixed (probably in the Analyzer).
Some places state that we have "simplified BSD", but the LICENSE file has full BSD. Choose one of those.
In test_algorithms: test_coherence_linear_dependence doesn't behave as expected from William Wei's book (see docstring). Not clear why, though some directions:
http://mail.scipy.org/pipermail/nipy-devel/2010-July/004100.html
Something for @arokem to do this weekend...
a = ts.TimeArray([-1, 1.5,2,3])
a.prod()
Out[69]: -8739516.551889159 s
Hi all,
I'd be interested in making contributions towards adding symbolic aggregate approximation (SAX) representations (along with the other corresponding flavors), a collection of distance measures to calculate similarity between TimeSeries objects, and sliding-window functionality to Nitime. I also have a couple of visualization tools that I'd like to contribute as well.
To get the ball rolling, I threw in a sax method in the TimeSeries object to output the string representation of the TimeSeries object: stefan-pdx/nitime@be33b5c. Is there a preference for where you want representational methods to be implemented? I'd prefer to have it as part of the TimeSeries object, since that would make it a bit more compatible with some of the sliding-window work I'll be doing; however, if it would be more suitable for the utils.py file, that's fine.
Also, for implementing a variety of distance measures between TimeSeries objects, where would you recommend is the best place to place it?
Thanks!
There is no documentation for the input parameters.
The current filtering methods are not very good (I can say that, I wrote them). Integrate filtering methods from scipy into the analysis module.
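A sketch of what leaning on scipy could look like, using a zero-phase Butterworth band-pass (all parameter values here are hypothetical):

```python
import numpy as np
from scipy import signal

fs = 100.0
t = np.arange(0, 10, 1 / fs)
# A 5 Hz component inside the band plus a 30 Hz component outside it.
x = np.sin(2 * np.pi * 5 * t) + np.sin(2 * np.pi * 30 * t)

# Second-order sections are numerically safer than (b, a) for band-pass
# designs; sosfiltfilt runs forward-backward, so there is no phase lag.
sos = signal.butter(4, [1.0, 10.0], btype='band', fs=fs, output='sos')
y = signal.sosfiltfilt(sos, x)

# The 30 Hz component is strongly attenuated, the 5 Hz one kept.
print(round(float(np.std(x)), 2), round(float(np.std(y)), 2))
```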