pywt's People

Contributors

aaren, agriyakhetarpal, ahmetcansolak, alsauve, ankit-maverick, arfon, asnt, cclauss, cgoldberg, cyschneck, dependabot[bot], dmpelt, drs251, eriol, ev-br, frankyu, goertzenator, grlee77, helderc, holgern, jarrodmillman, kwohlfahrt, laszukdawid, matthew-brett, michelp, nigma, rgommers, rthr-rllr, saketkc, sylvainlan

pywt's Issues

Independent test comparison

I think this belongs in a new issue.

Right now, the test suite (roughly) ensures that the results are the same as in the last published version of pywt. I think the values we get should also be compared against completely independent implementations. R has several libraries we could use, and there is also the MATLAB comparison.

I don't speak R and don't have MATLAB installed, but if you could build a Windows release, I can ask people to run it for me.

I will also poke around to see if anyone has MATLAB on Linux.

doc/source/regression should be moved

I'm in the process of converting all those to unit tests. After that the usable examples should be merged with the rest of the documentation, and these files should then be deleted.

wavelet instance vs string

Hi,

In functions.py, intwave and centfrq can receive either a Wavelet instance or a string. Should we keep this, or should I remove strings as a valid type?

The same question applies to multidim/multilevel, actually.

I don't think scipy allows two input types like this. @rgommers Thoughts?

faster 2D transforms

As mentioned in #81, this is a reminder to at some point look into moving dwt2, idwt2 (and swt2, iswt2) either entirely to the Cython side or to at least do the bulk of the looping there for speed. This should provide faster 2D transforms in a manner analogous to the n-dimensional case implemented in #68.

I will not have time to work on this in the immediate future so if anyone else wants to volunteer, feel free. I think it should be relatively straightforward to implement.
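
For reference, the separable structure such an implementation would exploit can be sketched in pure Python by chaining 1D transforms along each axis (dwtn_separable is a name made up here for illustration; this assumes a pywt version where dwt accepts an axis argument):

```python
import pywt

def dwtn_separable(a, wavelet):
    """Reference sketch: build a 2D transform from 1D DWTs along each axis.

    Subband keys follow the dwtn convention: one letter per axis,
    'a' for approximation and 'd' for detail.
    """
    cA0, cD0 = pywt.dwt(a, wavelet, axis=0)
    out = {}
    for key0, arr in (("a", cA0), ("d", cD0)):
        cA1, cD1 = pywt.dwt(arr, wavelet, axis=1)
        out[key0 + "a"] = cA1
        out[key0 + "d"] = cD1
    return out
```

A Cython port would do the same looping in C for speed.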

handling complex dtypes

This is a request for feedback on a potential enhancement to handle complex data for transforms provided by pywt. Personally, I work with MRI images where the data is complex and both the image magnitude and phase are often of interest.

Currently one can perform complex transforms via separate calls on the real and imaginary parts, but this is a bit cumbersome:

cA_r, cD_r = pywt.dwt(data.real, wavelet)
cA_i, cD_i = pywt.dwt(data.imag, wavelet)

reconstructed_data = (pywt.idwt(cA_r, cD_r, wavelet)
                      + 1j * pywt.idwt(cA_i, cD_i, wavelet))

Is this too niche a case to be worth supporting here in pywt? I see three potential approaches in ascending order of the work required, but would be happy for feedback on others:

  1. leave it up to the user to handle as above.
  2. modify functions such as dwt to accept complex inputs, but run the real and imaginary parts independently using the existing C codebase. This would essentially operate like the example above, but the data/coeffs passed in and returned would be complex. The separate real/imag calls would be transparent to users of the functions.
  3. update the C code to support np.complex64 and np.complex128 in addition to the currently supported np.float32 and np.float64.

Even if it is determined not to add complex support, I think we should raise an error on complex input, as the current code just discards the imaginary part. NumPy does issue a warning ("ComplexWarning: Casting complex values to real discards the imaginary part"), but that is easy to miss.
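
A minimal sketch of option 2, layered on top of the existing real-valued transform (dwt_complex is a hypothetical name, not part of pywt):

```python
import numpy as np
import pywt

def dwt_complex(data, wavelet, mode="symmetric"):
    """Hypothetical option-2 wrapper: transform the real and imaginary
    parts separately and recombine them into complex coefficient arrays."""
    data = np.asarray(data)
    if not np.iscomplexobj(data):
        return pywt.dwt(data, wavelet, mode)
    cA_r, cD_r = pywt.dwt(data.real, wavelet, mode)
    cA_i, cD_i = pywt.dwt(data.imag, wavelet, mode)
    return cA_r + 1j * cA_i, cD_r + 1j * cD_i
```

Option 3 would push the same split (or native complex arithmetic) down into the C convolution code.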

Reorganize source tree

To something more standard. The code should be under pywt/, not src/pywt. Tests should be in subdirs of the source, not in tests. util/ should be cleaned up, and renamed to tools/.

BUG: erroneous detail coefficients returned by downcoef with level > 1

As pointed out by @jairongfu in #72, it seems that there is still a problem in downcoef with detail coefficients and level > 1. Unfortunately, the tests I added in #72 only covered the approximation case!

example reproducing the problem:

import pywt
import numpy as np
from numpy.testing import assert_array_almost_equal
r = np.random.randn(16)
a2, d2, d1 = pywt.wavedec(r, 'haar', level=2)

assert_array_almost_equal(a2, pywt.downcoef('a', r, wavelet='haar', level=2))
assert_array_almost_equal(d2, pywt.downcoef('d', r, wavelet='haar', level=2))

The second assert_array_almost_equal call above fails, but the first passes.

Test with Numpy 1.8rc1

The test suite passes with Numpy 1.8rc1.

(I don't know if an issue is the best way to report this.)

idwtn doesn't tolerate unequal shapes in coefficients

The following two snippets of code should be equivalent, I think:

import numpy as np
import pywt

a = np.random.randn(64, 64)
w = pywt.Wavelet('bior3.3')
#w = pywt.Wavelet('db2')
#w = pywt.Wavelet('db1')

coeffs  = pywt.dwtn(a, w)
coeffs2 = pywt.dwtn(coeffs['aa'], w)
coeffs['aa'] = pywt.idwtn(coeffs2, w)
print(coeffs['aa'].shape, coeffs['dd'].shape)
#b1 = pywt.idwtn(coeffs, w)


coeffs = pywt.dwt2(a, w)
coeffs2 = pywt.dwt2(coeffs[0], w)
coeffs = pywt.idwt2(coeffs2, w), coeffs[1]
print(coeffs[0].shape, coeffs[1][0].shape)
b2 = pywt.idwt2(coeffs, w)

#print(np.max(np.fabs(b1 - b2)))

However, with the 'db2' or 'bior3.3' wavelet (the only one I know works is 'db1'), the first version crashes on the commented-out line that generates b1, because not all coefficients have the same shape.

PEP8 issues

PEP8 should run cleanly with the same exceptions as for scipy (see scipy/.travis.yml).

Copy [pep8] section in .travis.yml from scipy to pywt.

handling level=0 in waverec/wavedec functions

I found an inconsistency in all wavedec functions relative to their waverec counterparts. If the signal is too small for the chosen wavelet, level will be set to 0 in wavedec and a length 1 coeffs_list containing only the original data will be returned. If this is then passed in to waverec, an error will occur, because it only allows coefficient lists of length >= 2.

There are two potential solutions here:
1.) raise an error in _check_level if level = 0.
2.) modify waverecn to allow length-1 coefficient lists and just return coeffs[0] (which is the original data) when the coeffs input has length 1.

I would lean toward solution 2, possibly adding a warning in wavedecn and waverecn when level=0.
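
Option 2 could look roughly like the following guard at the top of the reconstruction functions (safe_waverec is a hypothetical stand-in to illustrate the change, not part of pywt):

```python
import warnings

def safe_waverec(coeffs, wavelet):
    """Hypothetical sketch of option 2: treat a length-1 coefficient list
    as the result of a level=0 decomposition and return the data as-is."""
    if len(coeffs) == 1:
        warnings.warn("length-1 coefficient list (level=0): "
                      "returning the data unchanged")
        return coeffs[0]
    import pywt  # deferred import; assumes PyWavelets is installed
    return pywt.waverec(coeffs, wavelet)
```
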

wp.update fails on 2D

From the docs:

x = np.array([[1, 2, 3, 4, 5, 6, 7, 8]] * 8, dtype=np.float64)
wp = pywt.WaveletPacket2D(data=x, wavelet='db1', mode='sym')

new_wp = pywt.WaveletPacket2D(data=None, wavelet='db1', mode='sym')
new_wp['vh'] = wp['vh'].data
new_wp['vv'] = wp['vh'].data
new_wp['vd'] = np.zeros((2, 2), dtype=np.float64)
new_wp['a'] = [[3.0, 7.0, 11.0, 15.0]] * 4
new_wp['d'] = np.zeros((4, 4), dtype=np.float64)
new_wp['h'] = wp['h']       # all zeros

assert_allclose(new_wp.reconstruct(update=False),
                np.array([[1.5, 1.5, 3.5, 3.5, 5.5, 5.5, 7.5, 7.5]] * 8),
                rtol=1e-12)

Produces:
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 667, in reconstruct
    data = super(WaveletPacket2D, self).reconstruct(update)
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 185, in reconstruct
    return self._reconstruct(update)
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 490, in _reconstruct
    data_hl = node_hl.reconstruct()
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 185, in reconstruct
    return self._reconstruct(update)
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 502, in _reconstruct
    rec = idwt2(coeffs, self.wavelet, self.mode)
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/multidim.py", line 161, in idwt2
    L.append(idwt(rowL, rowH, wavelet, mode, 1))
  File "_pywt.pyx", line 786, in _pywt.idwt (pywt/src/_pywt.c:8871)
  File "_pywt.pyx", line 853, in _pywt._idwt (pywt/src/_pywt.c:9687)
IndexError: Out of bounds on buffer access (axis 0)

Interface improvement: consistent return format between `swt` and `swt2`

swt returns its list of coefficient tuples in descending order of wavelet level, while swt2 returns its list in ascending order. This is confusing and could easily lead to programming mistakes for users of both functions.

I realise that changing the interface may be risky for reasons of backward compatibility.
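
Until the two interfaces agree, callers can normalize the ordering themselves by reversing one of the lists (this assumes the orderings described above, which may differ between pywt versions):

```python
import numpy as np
import pywt

x = np.zeros(8)
img = np.zeros((8, 8))

c1 = pywt.swt(x, 'haar', level=2)     # descending: level 2 first, then level 1
c2 = pywt.swt2(img, 'haar', level=2)

# Workaround sketch: reverse swt2's list to match swt's descending order.
c2_desc = list(reversed(c2))
```
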

Compiler support

Which C compilers should we fully support? NumPy and SciPy both recommend GCC and state that other compilers are not supported, but this documentation appears dated (it references Python 2.4, which had its final release 7 years ago).

I had a quick look at what the current packages look like in the table below. Essentially, every platform but CentOS supports C11 in its latest LTS release:

OS                     Compiler  C standard
CentOS 6               gcc-4.4   C99
CentOS 7               gcc-4.8   C99
Debian 7               gcc-4.6   C99
Debian 8               gcc-4.9   C11
Ubuntu 12.04           gcc-4.6   C99
Ubuntu 15.04           gcc-4.9   C11
Fedora 21              gcc-4.9   C11
Fedora 22              gcc-5.1   C11
Windows Python <= 3.2  VS 2008   C89
Windows Python <= 3.4  VS 2010   C89
Windows Python >= 3.5  VS 2015   C99
MinGW-w64              gcc-5.1   C11
Cygwin                 gcc-4.9   C11

I've been fighting to test Visual Studio, since we have no CI support for it. It has been fairly difficult to get a working Visual Studio 2010 installation (the activation button in the installer leads to a dead URL). There is a VS 2008 version out there explicitly for Python 2.7 support, but it is 7 years old! Visual Studio 2013 and up also support using clang as a compiler.

C11 support would get us _Generic, which could be very helpful for avoiding blocks like:

if data.dtype == numpy.float64:
    double_thing(data)
elif data.dtype == numpy.float32:
    float_thing(data)

Personally, given its age, its lag in supporting C standards, and especially the lack of CI support, I would advocate dropping or at least deprecating MSVC. Clang distributes a Windows binary now, and there are always the options of MinGW and Cygwin. We would, however, lose support for creating binary distributions for Windows.

Comments, especially from @rgommers?

This isn't a priority, but testing MSVC is proving difficult, and having to remind myself to stick to C89 all the time is annoying, so I thought I'd put the idea out there.

Add some images that can be loaded via the public API

We now have one image in the repo: demo/data/aero.png. We can't use it in docstring examples, however, because it's not importable. Nicer docstring examples, rather than just the return values for some random numbers, would be very good to have.
We could easily add scipy.misc.face and scipy.misc.ascent next to aero.

Moving pywt forward faster

@aaren @grlee @kwohlfahrt I've been trying to get back to moving pywt forward, but am having trouble spending enough time on it. I don't want to be a bottleneck here. You guys all have PRs open that look quite good, and you seem interested in the development (I also see some of you commenting on each other's PRs). So I'd like to propose the following:

  • I give you commit rights for this repo
  • We work on it together with the same rules of the game as we have for Scipy development (I assume you're familiar with those to some extent, but otherwise see [1, 2]).

What do you think?

Ralf

(note: I'll be offline until next Friday, so I will respond to any replies then)

[1] https://github.com/scipy/scipy/blob/master/doc/MAINTAINERS.rst.txt#some-unwritten-rules-of-scipy-development-written-down
[2] https://github.com/scipy/scipy/blob/master/HACKING.rst.txt

Test on Mac

I have access to a 10.6 machine with Python from MacPorts, I can test it there.

Has anyone tested it already?

Simplify Matlab tests

Decorate check_accuracy in test_matlab_compatibility.py as "slow", simplify it to test far fewer data sizes to speed up testing, and optionally disable the Matlab tests unless the full test suite is requested. See #92.

issues to address in moving towards 0.3.0

I opened this issue as a place for us to add/discuss issues to address for a future 0.3.0 release.

I took a quick look at a diff of current master and @nigma's pywt. It looks like the majority of the code should already be backwards compatible (and it already works with Python 3).

Here are a few issues I think need to be addressed:

  • A copy of six was included as _tools/six.py. It appears to only currently be used to import string_types here:
    https://github.com/rgommers/pywt/blob/496fab3ad1cc6fd35f359d560c9b0e4fa40da8a2/pywt/src/_pywt.pyx#L23
    Should we: 1.) keep this as a local copy, 2.) add six as a dependency, 3.) just define string_types directly ourselves?
  • current master already has one new function (idwtn) in multidim.py. Should this be kept in the initial v0.3.0 release, since it was already committed? We can change to the more efficient implementation I proposed in #52 in a later release.
  • In thresholding.py, the individual functions soft, hard, greater, less have been replaced with a single function called threshold that has a 'mode' argument to determine the threshold type.
    Should we provide wrappers with the old names for backwards compatibility?
    i.e.
def soft(data, value, substitute=0):
    return threshold(data, value, mode='soft', substitute=substitute)

Backwards compatibility would also require adding the following to __init__.py:

from . import thresholding

so that these could be called as pywt.thresholding.soft, as in the old API.

improve handling of odd sized arrays in DWT/IDWT

Currently, if an input to dwt (or dwt2 or dwtn) is odd in size, the resulting coefficient arrays do not reflect this, and calling idwt on the coefficients produced will return an even-sized image.

If the original odd image extent is extracted, the reconstruction is perfect as expected.

I am not certain of the proper solution, but I think dwt_buffer_length should perhaps return different sizes for the approximation and detail coefficients when the input is of odd size. For example, for MODE_PERIODIZATION this function currently returns ceil(input_len / 2.0).

At least from what I read in the JPEG 2000 standard, I think the approximation coefficients should use ceil(input_len / 2.0), while the detail coefficients should use floor(input_len / 2.0).
Reference: Eqns 7a-b of the JPEG 2000 compression standard:
http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.136.3993

As a concrete example, dwtn of a 2D array of size 511 x 255 would give the following coefficient array sizes:
aa : 256 x 128
da : 255 x 128
ad : 256 x 127
dd : 255 x 127

This would require idwtn to allow unequal shapes of coefficients (see #54 )
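
The proposed sizes can be written down as a tiny helper (coeff_len is a hypothetical name mirroring the suggested change to dwt_buffer_length for the periodization mode):

```python
import math

def coeff_len(input_len, kind):
    """Hypothetical per-subband output length for odd-length periodized input:
    approximation ('a') uses ceil(n/2), detail ('d') uses floor(n/2)."""
    if kind == 'a':
        return math.ceil(input_len / 2)
    return input_len // 2

# Reproducing the 511 x 255 example above:
sizes = {k0 + k1: (coeff_len(511, k0), coeff_len(255, k1))
         for k0 in 'ad' for k1 in 'ad'}
```
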

Appveyor builds

Appveyor builds are triggering, but their status is not shown on PRs. The Appveyor URL still has rgommers in it, so I think this might have been missed in the move.

Install issue, no module tools.six

Reported on the mailing list:

"BTW, installing pywt on Linux I get":

>>> import pywt
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/__init__.py", line 15, in <module>
    from ._pywt import *
  File "_pywt.pyx", line 24, in init pywt._pywt (/home/david/gits/pywt/src/_pywt.c:16572)
ImportError: No module named tools.six

"But it is not in the requirements."

PyWavelets vs. nigma Github repo

I noticed that the "fork on Github" button at http://pywavelets.readthedocs.org/ still points to https://github.com/nigma/pywt

So I wanted to change it to point to https://github.com/PyWavelets/pywt and use Github search to find the location in the code / docs where this is set.

But code search in forked repos doesn't work: https://help.github.com/articles/searching-code/#search-by-the-number-of-forks-the-parent-repository-has

I'm not sure ... maybe this could be resolved by deleting https://github.com/nigma/pywt and then forking it from https://github.com/PyWavelets/pywt ?
https://help.github.com/articles/what-happens-to-forks-when-a-repository-is-deleted-or-changes-visibility/

cc @nigma @rgommers

BUG: _pywt.downcoef always returns level=1 result

pywt._pywt.downcoef always returns the level=1 result, regardless of the specified level. I don't believe it is ever used with level > 1 in the codebase, but if the option is provided, it should work properly.

A fix will be submitted shortly.

API issues to discuss or change

This list is meant to keep track of things we come across in the API that don't look right and may need fixing (after completing the refactoring):

  • function names in functions.py need to be longer and clearer (intwave, centfrq, scal2frq, orthfilt).
  • qmf already exists in scipy.signal.
  • the wavelet parameter to functions should be uniform everywhere (it can be a Wavelet instance, a string, and in some cases (intwave) even a third option).
  • functions in thresholding.py are (too?) simple, and in two cases they share names with numpy functions.
  • MODES is all capitals, and its properties aren't very Pythonic.
  • Wavelet has __str__ but not __repr__.

wp.update fails after removal of nodes

See test_wp/test_removing_nodes. The traceback is:

  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 566, in reconstruct
    data = super(WaveletPacket, self).reconstruct(update)
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 185, in reconstruct
    return self._reconstruct(update)
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 425, in _reconstruct
    data_a = node_a.reconstruct() # TODO: (update) ???
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 185, in reconstruct
    return self._reconstruct(update)
  File "/usr/lib64/python2.7/site-packages/PyWavelets-0.2.2-py2.7-linux-x86_64.egg/pywt/wavelet_packets.py", line 434, in _reconstruct
    correct_size=True)
  File "_pywt.pyx", line 787, in _pywt.idwt (pywt/src/_pywt.c:8903)
  File "_pywt.pyx", line 855, in _pywt._idwt (pywt/src/_pywt.c:9741)
IndexError: Out of bounds on buffer access (axis 0)

new feature: contourlets

Is there interest in adding contourlet support to the package?

I was chatting with Minh Do, and he is willing to provide the C source used for his Matlab Contourlet toolbox. They use MEX for the Matlab-to-C interface, but it should be an easy(ish) translation.

Check default option consistency

For example, in class BaseNode, the decompose option of walk and walk_depth is not set to the same default value in both. This might be confusing.

I suggest checking default options everywhere against this remark and fixing where needed.

Wavelet string attributes shouldn't be bytes in Python 3

Failing test:

======================================================================
FAIL: test_wavelet.test_wavelet_properties
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/usr/lib/python3/dist-packages/nose/case.py", line 198, in runTest
    self.test(*self.arg)
  File "/home/rgommers/.local/lib/python3.3/site-packages/PyWavelets-0.2.2-py3.3-linux-i686.egg/pywt/tests/test_wavelet.py", line 15, in test_wavelet_properties
    assert_(w.short_family_name == 'db')
  File "/home/rgommers/.local/lib/python3.3/site-packages/numpy/testing/utils.py", line 44, in assert_
    raise AssertionError(msg)
AssertionError

This is because w.short_family_name is b'db'. It should be fixed in the Cython code, maybe just by decoding to latin-1 in the properties.

Marking as knownfail for now.

demos should be updated and integrated in docs

There's some pretty cool stuff in demo/, much better than what's in the docs. It should be updated (mainly: no pylab, and too many print statements) and integrated into the docs.

MODWT Transform

Hi all,

It seems to me that there's a problem with the pywt.swt function, possibly only with its documentation. The docs say:

"Returns list of approximation and details coefficients pairs in order similar to wavedec function::
[(cAn, cDn), ..., (cA2, cD2), (cA1, cD1)]"

which implies that this is actually a multi-resolution analysis. That should mean that the sum $\sum_{k=1}^m cD_k + cA_m = X$ (where X denotes the original signal) holds for each $m \le n$. This does not happen even for $n = 1$, i.e. cD1 + cA1 != X. Does anyone know what's going on here?

Are the coefficients actually the wavelet and scaling coefficients? FYI, the similar function (mra) in the R "wavelets" package does satisfy those relations.
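
The reported behaviour can be checked directly; at least for the Haar wavelet, the swt outputs behave like undecimated filter-bank coefficients rather than additive MRA components (this snippet assumes PyWavelets is installed and uses its documented return layout):

```python
import numpy as np
import pywt

x = np.random.randn(16)
(cA1, cD1), = pywt.swt(x, 'haar', level=1)

# The additive MRA identity cA1 + cD1 == x does not hold here...
print(np.allclose(cA1 + cD1, x))
# ...but the inverse transform does reconstruct the signal.
print(np.allclose(pywt.iswt([(cA1, cD1)], 'haar'), x))
```
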

Indexing the result of multilevel & multidimensional transforms

There have been a few proposals for alternative representations of the output of a transform in #93, and I think this needs some more discussion. It rapidly gets complex when, for example, wavelet packet transforms are used.

The following are some examples of weird cases that might need to be accessed:

  • A transform that is approximate in the first dimension and detail in the second
  • A transform that is, at the first level, approximate in the first dimension and detail in the second, and then at the second level detail in the first dimension and detail in the second

I can't see any easy solutions to this, but any ideas to discuss would be great.
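
The first case at least already has a natural address in the n-dimensional API: dwtn's subband keys use one letter per axis, in axis order. A quick illustration (assuming PyWavelets is installed):

```python
import numpy as np
import pywt

a = np.random.randn(8, 8)
coeffs = pywt.dwtn(a, 'haar')

# 'ad' = approximation along axis 0, detail along axis 1.
mixed = coeffs['ad']
print(sorted(coeffs))  # ['aa', 'ad', 'da', 'dd']
```

The harder question is a comparable addressing scheme for mixed per-level choices, as in the second case.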
