lumispy's Introduction

[Build, test, coverage, code-quality and documentation status badges]

[Python version, PyPI, Anaconda, GPL v3 license and DOI badges]

LumiSpy

LumiSpy is a Python package extending the functionality for multi-dimensional data analysis provided by the HyperSpy library. It is aimed at helping with the analysis of luminescence spectroscopy data (cathodoluminescence, photoluminescence, electroluminescence, Raman, SNOM).

If analysis using LumiSpy forms a part of published work, please consider recognising the code development by citing the project via its Zenodo DOI.

Go to the documentation for instructions on how to install LumiSpy and start an analysis: Read the docs.

Tutorials and exemplary workflows have been curated as a series of Jupyter notebooks that you can work through and modify to perform many common analyses. These can be either downloaded and run locally or tried out using interactive online sessions.

Everyone is welcome to contribute. Please read our contributing guidelines and get started!

Development of LumiSpy is documented in the changelog.

lumispy's People

Contributors

attolight-ntappy, dependabot[bot], dnjohnstone, ericpre, hakonanes, jlaehne, jordiferrero, lgtm-migrator, lmsc-ntappy


lumispy's Issues

Failure with numpy dev

From hyperspy/hyperspy-extensions-list#28

=================================== FAILURES ===================================
______________________ TestLumiSpectrum.test_errors_raise ______________________

a = [array([ 0.        ,  5.44444444, 10.88888889, 16.33333333, 21.77777778,
       27.22222222, 32.66666667, 38.11111111,...., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.])]

    @array_function_dispatch(_shape_dispatcher)
    def shape(a):
        """
        Return the shape of an array.
    
        Parameters
        ----------
        a : array_like
            Input array.
    
        Returns
        -------
        shape : tuple of ints
            The elements of the shape tuple give the lengths of the
            corresponding array dimensions.
    
        See Also
        --------
        len : ``len(a)`` is equivalent to ``np.shape(a)[0]`` for N-D arrays with
              ``N>=1``.
        ndarray.shape : Equivalent array method.
    
        Examples
        --------
        >>> np.shape(np.eye(3))
        (3, 3)
        >>> np.shape([[1, 3]])
        (1, 2)
        >>> np.shape([0])
        (1,)
        >>> np.shape(0)
        ()
    
        >>> a = np.array([(1, 2), (3, 4), (5, 6)],
        ...              dtype=[('x', 'i4'), ('y', 'i4')])
        >>> np.shape(a)
        (3,)
        >>> a.shape
        (3,)
    
        """
        try:
>           result = a.shape
E           AttributeError: 'list' object has no attribute 'shape'

/usr/share/miniconda3/lib/python3.10/site-packages/numpy/core/fromnumeric.py:2033: AttributeError

During handling of the above exception, another exception occurred:

self = <lumispy.tests.signals.test_luminescence_spectrum.TestLumiSpectrum object at 0x7fc9d6f7e7a0>

    def test_errors_raise(self):
        s = LumiSpectrum(np.ones(50))
        for bkg, error in error_backgrounds:
            with pytest.raises(error):
>               s.remove_background_from_file(bkg)

/usr/share/miniconda3/lib/python3.10/site-packages/lumispy/tests/signals/test_luminescence_spectrum.py:60: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/share/miniconda3/lib/python3.10/site-packages/lumispy/signals/luminescence_spectrum.py:553: in remove_background_from_file
    if np.shape(background_xy)[0] == 1:
<__array_function__ internals>:200: in shape
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

a = [array([ 0.        ,  5.44444444, 10.88888889, 16.33333333, 21.77777778,
       27.22222222, 32.66666667, 38.11111111,...., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.])]

    @array_function_dispatch(_shape_dispatcher)
    def shape(a):
        """
        Return the shape of an array.
    
        Parameters
        ----------
        a : array_like
            Input array.
    
        Returns
        -------
        shape : tuple of ints
            The elements of the shape tuple give the lengths of the
            corresponding array dimensions.
    
        See Also
        --------
        len : ``len(a)`` is equivalent to ``np.shape(a)[0]`` for N-D arrays with
              ``N>=1``.
        ndarray.shape : Equivalent array method.
    
        Examples
        --------
        >>> np.shape(np.eye(3))
        (3, 3)
        >>> np.shape([[1, 3]])
        (1, 2)
        >>> np.shape([0])
        (1,)
        >>> np.shape(0)
        ()
    
        >>> a = np.array([(1, 2), (3, 4), (5, 6)],
        ...              dtype=[('x', 'i4'), ('y', 'i4')])
        >>> np.shape(a)
        (3,)
        >>> a.shape
        (3,)
    
        """
        try:
            result = a.shape
        except AttributeError:
>           result = asarray(a).shape
E           ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (2,) + inhomogeneous part.

/usr/share/miniconda3/lib/python3.10/site-packages/numpy/core/fromnumeric.py:2035: ValueError
=============================== warnings summary ===============================
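The root cause appears to be numpy's stricter handling of ragged nested sequences: `np.shape()` falls back on `np.asarray(a).shape`, which now raises a ValueError for ragged input. A minimal numpy-only sketch of the failure mode and a robust length check (the data here is hypothetical, standing in for the test fixture):

```python
import numpy as np

# Two columns of different lengths form a "ragged" list, similar to the
# background_xy argument in the failing test (hypothetical stand-in data).
background_xy = [np.linspace(0.0, 245.0, 10), np.ones(50)]

# np.shape(background_xy) would try np.asarray(background_xy).shape, which
# raises ValueError for ragged input on recent numpy (older versions only
# emitted a deprecation warning). Checking the outer length avoids the
# array coercion entirely:
n_rows = len(background_xy)
assert n_rows == 2
```

This suggests the `if np.shape(background_xy)[0] == 1:` check in `remove_background_from_file` could be replaced by a plain `len()` check on the outer sequence.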

Luminescence specific metadata

I have recently tried to think about which parameters we should include in s.metadata and which should stay in .original_metadata (#51 made me start this issue).

I normally load .sur files using the HyperSpy IO plugin. That reads the original_metadata (more than 50 parameters), and I then parse some relevant values into the metadata for use in certain functions (e.g. calibration, normalisation...).

  • What parameters should those be? They would add up to the hyperspy specific metadata structure.
  • How would we parse metadata in an automatic way (so from the original_metadata) depending on the initial file extension loaded?

Setting a variance model in the lumispy object after Jacobian transform

After some discussion with @LMSC-NTappy, we realised that the variance model stored in the LumiSpectrum.metadata.Signal.Noise_properties.variance should no longer be homoscedastic.

Instead, the noise variance should also be converted to eV, but with the intensities multiplied by the square of the jacobian (according to Var(aX) = a**2*Var(X)).

I am not sure what the best way to implement this subtlety in our luminescence object is. I understand that, once you have your signal, you can compute the variance manually and then apply the Jacobian transformation. But is there a way to define this noise generically (on the class rather than on the data), so that when the cl.to_eV() function is called, the transformation is also applied to the variance metadata (should there be any)?
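The transformation rule itself can be sketched with plain numpy (constants and array names are illustrative, not LumiSpy code):

```python
import numpy as np

# E = hc / lambda, so the Jacobian is |d(lambda)/dE| = hc / E**2 = lambda**2 / hc
HC_EV_NM = 1239.841984  # h*c in eV*nm

wavelength = np.linspace(400.0, 800.0, 5)   # nm, hypothetical axis
intensity = np.ones_like(wavelength)        # counts
variance = np.full_like(wavelength, 0.5)    # homoscedastic before conversion

jacobian = wavelength**2 / HC_EV_NM
intensity_eV = intensity * jacobian         # I(E) = I(lambda) * lambda**2 / hc
variance_eV = variance * jacobian**2        # Var(aX) = a**2 * Var(X)

# After the transform, the variance is no longer constant (heteroscedastic):
assert not np.allclose(variance_eV, variance_eV[0])
```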

This is the Hyperspy guideline on noise metadata.

Let's use this chat to discuss further @LMSC-NTappy. Any thoughts?

Zenodo DOI for lumispy

Hi all,

I was recently thinking about setting up a Zenodo DOI for any publication that may come from the code we are actively writing.
I have seen that both hyperspy and pyxem use such a system, which keeps track of the versions for referencing.

Before I set up a Zenodo account for this repo, I wanted to discuss with you @jlaehne @ericpre if you have any objections/suggestions before going ahead.

Since we don't have any released version yet (which we should focus on soon, I think), will that be an issue for getting a DOI?

Thank you

Updated map method in HyperSpy v1.6.2

Indeed, the failures are related to changes in the map method:

ValueError: The size of the navigation_shape for the kwarg shift (<(100,)> must be consistent with the size of the mapped signal <(10, 10)>

This points to an argument that doesn't have the right shape. It is fairly likely that ambiguous usages of map such as this one are no longer allowed.

Originally posted by @ericpre in #68 (comment)

Azure pipeline failure

The failures on the Azure pipeline are temporary: the dask-core package is broken on the anaconda defaults channel. The package on conda-forge is fine, but CI is set up with the anaconda defaults channel at higher priority than conda-forge. This should resolve once AnacondaRecipes/dask-core-feedstock#2 is merged and the packages are available on the server.

LumiSpy logo

Any ideas for a LumiSpy logo?

One idea I had was using this map of emission energy around a dislocation in GaN, but any further ideas are welcome.
[image: map of emission energy around a dislocation in GaN]

Slicing TransientSpec

When slicing, summing, or integrating signals of type TransientSpec, the resulting signal is a Signal1D. It would be desirable to obtain signals of type Transient or Luminescence, depending on which axis the function operates on.

Hyperspy signal registration failing

Following @ericpre's PR #25 on fixing the signal registration for the lumispy classes, it is still not working fully.
The tests in test_signal_registration pass, but if I run:

s = hs.load(path, signal_type='CL_SEM',)

The following warning raises:

WARNING:hyperspy.io: signal_type='CL_SEM' not understood. See hs.print_known_signal_types() for a list of known signal types, and the developer guide for details on how to add new signal_types.

And a Signal2D class is loaded.
Moreover, if I run hs.print_known_signal_types(), all I get are the native HyperSpy classes.

Slicing of energy/wavenumber signal with isig fails

Describe the bug

Error occurs when slicing a signal with isig[x:y] that was converted using the lumispy functions to_eV() or to_invcm().

To Reproduce

Steps to reproduce the behavior:

import numpy as np
import hyperspy.api as hs
import lumispy  # needed to register the "CL" signal type

S = hs.signals.Signal1D(np.arange(100), axes=[{'axis': np.arange(100) + 300}])
S.set_signal_type("CL")
S.to_eV(inplace=True)
S.isig[3.251:4.052]

Leads to TypingError:

TypingError: Failed in nopython mode pipeline (step: nopython frontend)
No implementation of function Function(<function diff at 0x7fad54e2c310>) found for signature:
 
 >>> diff(array(float64, 1d, A))
 
There are 2 candidate implementations:
  - Of which 2 did not match due to:
  Overload in function 'np_diff_impl': File: numba/np/arraymath.py: Line 3684.
    With argument(s): '(array(float64, 1d, A))':
   Rejected as the implementation raised a specific error:
     TypingError: Failed in nopython mode pipeline (step: nopython frontend)
   - Resolution failure for literal arguments:
   reshape() supports contiguous array only
   - Resolution failure for non-literal arguments:
   reshape() supports contiguous array only
   
   During: resolving callee type: BoundFunction(array.reshape for array(float64, 1d, A))
   During: typing of call at /usr/lib/python3.10/site-packages/numba/np/arraymath.py (3702)
   
   
   File "../../../../usr/lib/python3.10/site-packages/numba/np/arraymath.py", line 3702:
       def diff_impl(a, n=1):
           <source elided>
           # To make things easier, normalize input and output into 2d arrays
           a2 = a.reshape((-1, size))
           ^

  raised from /usr/lib/python3.10/site-packages/numba/core/typeinfer.py:1086

During: resolving callee type: Function(<function diff at 0x7fad54e2c310>)
During: typing of call at /home/jonas/medien/pdi/git/hyperspy/hyperspy/misc/array_tools.py (428)


File "../../medien/pdi/git/hyperspy/hyperspy/misc/array_tools.py", line 428:
def numba_closest_index_round(axis_array, value_array):
    <source elided>
    rtol = 1e-10
    machineepsilon = np.min(np.abs(np.diff(axis_array))) * rtol
    ^

Expected behavior

If the energy axis is set up manually, it works:

import numpy as np
import hyperspy.api as hs
import lumispy as lum

S1 = hs.signals.Signal1D(np.arange(100), axes=[{'axis': lum.nm2eV((np.arange(100) + 300)[::-1])}])
S1.set_signal_type("CL")
S1.isig[3.251:4.052]

Additional context

A fix will be submitted, forcing the dtype in axis2eV and axis2invcm.

Find maximum and width of a peak

Similar to the centroid function from #168, we should create further functions that return information about a peak without fitting:

  1. Maximum of a peak
    HyperSpy provides the valuemax function to find the position of the maximum of a signal. With smoothing and signal ranges, that can also work on noisier data or data with multiple peaks, respectively. However, an alternative approach I would like to see implemented is taking the first derivative and finding its zero-crossings. This method is particularly useful for identifying the positions of secondary maxima.
    EDIT: This functionality is actually provided by find_peaks1D_ohaver -- it's been a while since I had looked at that function.

  2. Width of a peak
    From fits, you can get the FWHM of a peak, but that is not very helpful for asymmetric peak shapes. I therefore envisage a function that calculates the distance between the half maximum points of a peak along the signal axis.
    EDIT: This is actually provided by estimate_peak_width
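Both model-free measures can be sketched with plain numpy (synthetic data; this is not the HyperSpy API):

```python
import numpy as np

# Synthetic asymmetric signal: a sharp peak plus a broad secondary peak.
x = np.linspace(0.0, 10.0, 2001)
y = np.exp(-((x - 4.0) / 0.5) ** 2) + 0.5 * np.exp(-((x - 5.5) / 1.5) ** 2)

# 1. Peak maxima without fitting: zero-crossings of the first derivative
#    where the slope falls through zero (local maxima only).
dy = np.gradient(y, x)
maxima = x[np.where((dy[:-1] > 0) & (dy[1:] <= 0))[0]]

# 2. Model-free width: distance between the outermost half-maximum
#    crossings, which also captures asymmetric peak shapes.
above = np.where(y >= y.max() / 2)[0]
width = x[above[-1]] - x[above[0]]

assert len(maxima) == 2  # both the main and the secondary maximum are found
```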

Failure with numpy 1.24.dev

From https://github.com/hyperspy/hyperspy-extensions-list/actions/runs/3119041322/jobs/5058746331

=================================== FAILURES ===================================
______________________ TestLumiSpectrum.test_errors_raise ______________________

a = [array([ 0.        ,  5.44444444, 10.88888889, 16.33333333, 21.77777778,
       27.22222222, 32.66666667, 38.11111111,...., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.])]

    @array_function_dispatch(_shape_dispatcher)
    def shape(a):
        """
        Return the shape of an array.
    
        Parameters
        ----------
        a : array_like
            Input array.
    
        Returns
        -------
        shape : tuple of ints
            The elements of the shape tuple give the lengths of the
            corresponding array dimensions.
    
        See Also
        --------
        len : ``len(a)`` is equivalent to ``np.shape(a)[0]`` for N-D arrays with
              ``N>=1``.
        ndarray.shape : Equivalent array method.
    
        Examples
        --------
        >>> np.shape(np.eye(3))
        (3, 3)
        >>> np.shape([[1, 3]])
        (1, 2)
        >>> np.shape([0])
        (1,)
        >>> np.shape(0)
        ()
    
        >>> a = np.array([(1, 2), (3, 4), (5, 6)],
        ...              dtype=[('x', 'i4'), ('y', 'i4')])
        >>> np.shape(a)
        (3,)
        >>> a.shape
        (3,)
    
        """
        try:
>           result = a.shape
E           AttributeError: 'list' object has no attribute 'shape'

/usr/share/miniconda3/lib/python3.10/site-packages/numpy/core/fromnumeric.py:2033: AttributeError

During handling of the above exception, another exception occurred:

self = <lumispy.tests.signals.test_luminescence_spectrum.TestLumiSpectrum object at 0x7f5d5c5a2500>

    def test_errors_raise(self):
        s = LumiSpectrum(np.ones(50))
        for bkg, error in error_backgrounds:
            with pytest.raises(error):
>               s.remove_background_from_file(bkg)

/usr/share/miniconda3/lib/python3.10/site-packages/lumispy/tests/signals/test_luminescence_spectrum.py:60: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/share/miniconda3/lib/python3.10/site-packages/lumispy/signals/luminescence_spectrum.py:553: in remove_background_from_file
    if np.shape(background_xy)[0] == 1:
<__array_function__ internals>:200: in shape
    ???
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

a = [array([ 0.        ,  5.44444444, 10.88888889, 16.33333333, 21.77777778,
       27.22222222, 32.66666667, 38.11111111,...., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.,
       1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1., 1.])]

    @array_function_dispatch(_shape_dispatcher)
    def shape(a):
        """
        Return the shape of an array.
    
        Parameters
        ----------
        a : array_like
            Input array.
    
        Returns
        -------
        shape : tuple of ints
            The elements of the shape tuple give the lengths of the
            corresponding array dimensions.
    
        See Also
        --------
        len : ``len(a)`` is equivalent to ``np.shape(a)[0]`` for N-D arrays with
              ``N>=1``.
        ndarray.shape : Equivalent array method.
    
        Examples
        --------
        >>> np.shape(np.eye(3))
        (3, 3)
        >>> np.shape([[1, 3]])
        (1, 2)
        >>> np.shape([0])
        (1,)
        >>> np.shape(0)
        ()
    
        >>> a = np.array([(1, 2), (3, 4), (5, 6)],
        ...              dtype=[('x', 'i4'), ('y', 'i4')])
        >>> np.shape(a)
        (3,)
        >>> a.shape
        (3,)
    
        """
        try:
            result = a.shape
        except AttributeError:
>           result = asarray(a).shape
E           ValueError: setting an array element with a sequence. The requested array has an inhomogeneous shape after 1 dimensions. The detected shape was (2,) + inhomogeneous part.

/usr/share/miniconda3/lib/python3.10/site-packages/numpy/core/fromnumeric.py:2035: ValueError

Implementation of Spectroscopy File Readers

Describe the functionality you would like to see.

We want to provide readers for different luminescence spectroscopy file formats. In particular,

Describe the context

Is it already possible to register file reader extensions with HyperSpy @ericpre and @francisco-dlp? As these are not electron microscopy related file formats, I would propose to have them within LumiSpy - but it would be great to register them so that they work with hs.load similar to how we extend HyperSpy by additional signal types.

Additional information

@pietsjoh would work on the implementation of the readers

Extend functionality of crop_edges

Describe the context

crop_edges is a wrapper around inav that takes a crop_px parameter and removes that many pixels from every side of a map.

Describe the functionality you would like to see.

To make this function more versatile, it would make sense to alternatively allow the following sets of parameters:

  • crop_x_px and crop_y_px to crop different amounts of pixels from the x and y directions
  • crop_left_px, crop_right_px, crop_top_px, crop_bottom_px to crop different amounts of pixels from each side (check in matplotlib which order is typically used for such a set of edge parameters).
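A rough sketch of the proposed per-side signature on a bare numpy array (navigation axes first, as in a HyperSpy signal map; the parameter names follow the proposal and are not an existing LumiSpy API):

```python
import numpy as np

def crop_edges_px(data, left=0, right=0, top=0, bottom=0):
    """Remove the given number of pixels from each edge of the map axes."""
    ny, nx = data.shape[:2]
    return data[top:ny - bottom, left:nx - right]

maps = np.zeros((10, 12, 50))   # 10x12 navigation grid, 50-channel spectra
cropped = crop_edges_px(maps, left=2, right=1, top=3)
assert cropped.shape == (7, 9, 50)
```

On the ordering question: CSS uses (top, right, bottom, left), while matplotlib's subplots_adjust takes left, bottom, right, top; either convention would work as long as it is documented.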

Additional information

@jordiferrero , you originally implemented this function. Would this extension make sense to you? We have a student that could implement it.

Package structure

@jordiferrero You put the attolight reader under utils/io_utils. A few points concerning that

  1. I think we should make the package structure more consistent with that of HyperSpy; therefore I would propose placing IO functions in a folder io_plugins
  2. io_utils is a very generic name for functionality that is specific to one brand of instrument. As we want to develop the package in a way that can encompass different manufacturers, and on top of CL also PL and at some point maybe Raman, we should have separate io files for different instruments
  3. Maybe we should have a generic load function, similar to HyperSpy's, that automatically chooses the right io_plugin (this should include the plugins provided by HyperSpy, which will be moved to a separate IO package at some point).

Energy conversion

My code for energy conversion relies on the non-linear DataAxis, which is not part of a HyperSpy release yet but will hopefully appear in a pre-release soon. Should I still contribute it here for use with the right HyperSpy development branch?

It contains some additional features, such as a wavelength-dependent refractive index of air based on an analytical function (therefore it cannot be a FunctionalDataAxis).

[AttoLight] Signal x-axis

When an AttoLight file is loaded, the signal x-axis is not calibrated properly.
There are two ways to address this issue:

  • Solve the spectrometer equation
  • Save a background file manually, then use the x-axis from that file.*

*With this second approach, the AttoLight system seems to have a bug: when binning, the x-axis saved in the background file of a HYP map is not correct. The manually saved background file, however, has the correct x-axis values.

This figure illustrates the issue:

[Figure: x-axis comparison]

Here, we show the difference between the correctly saved x-axis, the buggy incorrect x-axis from the HYP-map background, and the current estimated x-axis (without the spectrometer equation solved yet).

To do:

  • Find the solution for the spectrometer equation
  • Edit the load_hypmap() function to account for these small errors.
  • Modify the background_substraction() function so it can take a different x-axis to subtract against.

Release v0.1.1

How about making a v0.1.1 release? A new release would fix the integration test suite, which is failing with lumispy 0.1.0. It would also spare me a daily email notification telling me that the integration test suite is failing!

Fix test for `remove_spikes`

A recent update to HyperSpy or an upstream library breaks the test for automatic spike removal:

>       np.testing.assert_almost_equal(s3.data[0, 2, 29], 1, decimal=5)
E       AssertionError: 
E       Arrays are not almost equal to 5 decimals
E        ACTUAL: 2.000005802611809
E        DESIRED: 1

You can see the full output here: https://github.com/LumiSpy/lumispy/actions/runs/4451990112/jobs/7819241076

The smaller of the two artificial peaks is not caught on every run, if the threshold is auto.

I commented out lines 70 and 85 in test_cl_spectrum.py to make the tests run through in #184.
I also set decimal=4 (instead of 5) in all assertions, as accuracy was being lost on some runs.

So far, I have not found why the numerics suddenly seem to be less reliable.

In principle, the function should still run, but it seems to be less robust.

Any ideas, @jordiferrero or @ericpre?

Azure tests failing

Describe the bug

The Azure tests pipeline is failing, as discussed in #168. The error happens on import:

from numba.np.ufunc import _internal
SystemError: initialization of _internal failed without raising an exception
##[error]Bash exited with code '1'.

To Reproduce

See here for example.


Additional context

It seems it could be a problem caused by numpy 1.24 (see https://github.com/numba/numba/issues/8615).

What I already tried

The following did not work:

  • Setting numpy < 1.24 during installation (in the requirements in setup.py)
  • Removing all dependencies except hyperspy 1.7

signals vs. _signals

@dnjohnstone Do you know why the signal classes in HyperSpy are in _signals, while we have them in signals? Would it make sense to copy that structure here?

Dealing with SEM/HAADF complementary data with their respective luminescence

I have recently been wondering on what the best strategy would be to do the following:

All of my CL-SEM data comes with SE images acquired at the same time as the luminescence (for CL-TEM, I suppose the equivalent would be a HAADF image). However, these are not integrated with HyperSpy at all.
While it is true that you can load the SE image separately (as s) and then use this s object as the navigation axis for the cl object, this all needs to be done manually.
What's worse, when I do some processing on the cl object (e.g. crop or subindex some of the navigation axes), the SE image s is not linked to the cl object in any way.
Ideally, I would like to link these two datasets together.

I have an idea for doing that, but I wanted to quickly discuss the approach before I get into the coding.
One way would be to keep the SE image array in the metadata and, every time an operation modifies the navigation axes, apply those changes to the SE metadata as well (e.g. cropping pixels). Moreover, I would like a plotting option to use the SE image as the navigation image instead of the default "panchromatic" one.
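The idea can be sketched with plain numpy stand-ins (array shapes and the helper name are illustrative; a real version would hook into the signal's slicing machinery):

```python
import numpy as np

# Keep the SE image alongside the CL map and apply any navigation-axis
# crop to both, so the two datasets stay in register.
cl_data = np.random.random((64, 64, 1024))   # (y, x, wavelength)
se_image = np.random.random((64, 64))        # acquired simultaneously

def crop_nav(cl, se, ysl, xsl):
    """Crop the navigation axes of the CL map and the SE image together."""
    return cl[ysl, xsl], se[ysl, xsl]

cl_c, se_c = crop_nav(cl_data, se_image, slice(8, 56), slice(8, 56))
assert cl_c.shape[:2] == se_c.shape == (48, 48)
```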

Can you think of any better way to implement such a feature?

Fix coverage badge

The coveralls badge claims 0% coverage, though when you check the details we are at >90%. Probably something in the coveralls settings?

Or should we use codecov instead of coveralls, like HyperSpy?

Adding an interactive way to slice HS over a wavelength range, and view the result spatially!

Describe the functionality you would like to see.

In the context of taking hyperspectral maps, sometimes you may have a sample where in different regions one would expect peaks at different wavelengths to become dominant.

It's useful to be able to quickly scan over a given wavelength (or energy!) range, and see how the intensity of signals within that range vary spatially.

Describe the context

If implemented correctly, this could be used for a number of different hyperspectral signals.

The Dorset software for Chronos CL systems has this feature, but as far as I know it doesn't exist in LumiSpy/HyperSpy.

Proposed Solution

Perhaps this specific sort of analysis is outside the scope of LumiSpy. However, I thought something like this might fit well into the utils section of the library.

I have put together a provisional implementation here:

https://github.com/0Hughman0/lumispy/blob/0fc6fa18e82069af116acdfdc066fb244da05387/lumispy/utils/analysis.py#L4

I've called the feature 'AutoPeakMap'... not sure if this is the best name!

An example would be:

import hyperspy.api as hs
from lumispy.utils.analysis import AutoPeakMap, RangeChannel, plot_auto_peakmap

cl = hs.load(path, signal_type='CL_SEM')

apm = AutoPeakMap(cl,
                  RangeChannel('red', range_=[730, 760]),
                  RangeChannel('green', range_=[300, 400]))
apm.plot()

Renders two plots, the 'navigator', in the hyperspy sense:

[navigator plot image]

and the 'signal':

[signal plot image]

The ranges in the 'navigator' can be moved around interactively and the signal plot will update accordingly.

For more quick and dirty analysis I implemented a convenience method:

plot_auto_peakmap(cl)

Which chooses sensible (enough) defaults for the RangeChannels.

If this is something you think could slot into LumiSpy, I'll go ahead and write the tests and expand the docstrings etc.

Feedback on the implementation/ naming more than welcome!

Thanks,

Hugh

Background storage

In the original code, the background was stored as a class attribute (cl.background).
In newer versions, the background is stored in the metadata (cl.metadata.Signal.background).

Some background-related functions still need to be adapted to this newer scheme.

Create a `remove_background_signal` function

Describe the functionality you would like to see.

A method called remove_background_signal, in which one Signal1D object is subtracted from another Signal1D object.
If the axes don't match (in size, offset and scale for a UniformDataAxis), the background signal should be rebinned/interpolated onto the main signal axis.
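The interpolation route can be sketched on bare arrays (names are illustrative, not an existing LumiSpy API):

```python
import numpy as np

# Interpolate the background onto the signal axis, then subtract.
def subtract_background(sig_axis, sig_data, bkg_axis, bkg_data):
    bkg_on_sig = np.interp(sig_axis, bkg_axis, bkg_data)
    return sig_data - bkg_on_sig

sig_axis = np.linspace(400.0, 700.0, 301)   # nm
sig_data = np.full_like(sig_axis, 10.0)
bkg_axis = np.linspace(380.0, 720.0, 100)   # different offset and sampling
bkg_data = np.full_like(bkg_axis, 3.0)

corrected = subtract_background(sig_axis, sig_data, bkg_axis, bkg_data)
assert np.allclose(corrected, 7.0)
```

np.interp requires a monotonically increasing background axis and clamps outside its range, so a real implementation would need to validate the axes first (and handle non-uniform axes separately).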

Describe the context

While trying to fix some warning bugs, discussion made us realise the need to rewrite the deprecated remove_background_from_file as a more HyperSpy-like method, as discussed here: #114.

Additional information

Ideally, it should also support non-uniform data axes with interpolation.
Maybe add support for fitting a ScalableFixedPattern, as @ericpre proposed.

Things to do

  • Decide on where to add this method: Should it be done here in LumiSpy or in HyperSpy Signal1D class?
  • Decide on the approach to take: Should we use rebinning/interpolation or ScalableFixedPattern?
  • Write the method for UniformDataAxis
  • Expand the method to also work for non-uniform axes.

Consolidate axis conversion codes

  • As pointed out in #140 (comment), there is quite some duplicate code in the if/else statements of the axis conversion codes that can be consolidated.

  • It would be desirable to allow any cross-conversion between wavelength, wavenumber and energy (documenting the conversions in the metadata, to warn if multiple conversions are done on the same axis). To this end, the main part of the code could be moved into a generalized conversion function.
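A rough sketch of such a generalized helper between wavelength (nm), energy (eV) and wavenumber (1/cm), vacuum values only, with no refractive-index correction (names are illustrative):

```python
import numpy as np

HC_EV_NM = 1239.841984  # h*c in eV*nm; E = hc / lambda, sigma = 1e7 / lambda(nm)

# Each pairing with nm has the involutive c/v form, so a single table of
# functions serves both directions, routing everything through nm.
_via_nm = {
    "nm": lambda v: v,
    "eV": lambda v: HC_EV_NM / v,
    "1/cm": lambda v: 1e7 / v,
}

def convert_axis(values, src, dst):
    """Convert axis values between any two of nm, eV and 1/cm."""
    values = np.asarray(values, dtype=float)
    return _via_nm[dst](_via_nm[src](values))

assert abs(convert_axis([500.0], "nm", "eV")[0] - 2.4797) < 1e-3
assert abs(convert_axis([2.0], "eV", "1/cm")[0] - 16131.1) < 1.0
```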

`to_eV()` function and non-uniform axis

Hi everyone,
first of all thanks for the lumispy package.

I'm working with CL data, and most of the time I'm playing with interactive ROIs or new hyperspy functions like the hyperspy.api.plot.plot_roi_map() but non-uniform axes are not supported yet.

Would it be reasonable to expand the to_eV() function to include a conversion option like in the following code?

import numpy as np
import matplotlib.pyplot as plt
import hyperspy.api as hs
from lumispy.signals import CLSEMSpectrum

n_points = 1024
wl = np.linspace(300,700,n_points)
sigma = 25
wl_center = 500
gaussian = 1/(sigma * np.sqrt(2 * np.pi)) * np.exp( - (wl - wl_center)**2 / (2 * sigma**2))
noise = CLSEMSpectrum(np.random.random((10,10,n_points))/1000)
s = CLSEMSpectrum(np.broadcast_to(gaussian, (10,10,n_points))) + noise

s.axes_manager.signal_axes[0].name = 'Wavelength'
s.axes_manager.signal_axes[0].units = 'nm'
s.axes_manager.signal_axes[0].scale = (700-300)/n_points
s.axes_manager.signal_axes[0].offset = 300

s.axes_manager.navigation_axes[0].name = 'X'
s.axes_manager.navigation_axes[0].units = 'nm'
s.axes_manager.navigation_axes[0].scale = 0.1
s.axes_manager.navigation_axes[0].offset = 100

s.axes_manager.navigation_axes[1].name = 'Y'
s.axes_manager.navigation_axes[1].units = 'nm'
s.axes_manager.navigation_axes[1].scale = 0.1
s.axes_manager.navigation_axes[1].offset = 100


s_ev = s.to_eV(inplace=False)
print(type(s_ev.axes_manager[-1]))

import hyperspy.axes as hsax
old_axis = s_ev.axes_manager[-1].axis
new_axis = hsax.DataAxis(axis=old_axis, name='Energy', units='eV')
new_axis.convert_to_uniform_axis()
print(type(new_axis))
test = s_ev.interpolate_on_axis(new_axis, axis='Energy', inplace=False)
test.plot()

hs.plot.plot_spectra([s_ev.inav[5,5], test.inav[5,5]], legend=['Original data', 'Data after uniform + interpolation'])

fig, ax = plt.subplots()
ax.scatter(old_axis, np.ones(len(old_axis)), label='Non uniform original axis')
ax.scatter(new_axis.axis, 2*np.ones(len(old_axis)), label='Uniform new axis')
fig.legend()

thanks a lot

Class hierarchy

An issue to openly discuss the class hierarchy for lumispy, from an e-mail exchange with @jlaehne.

We want to start out with CL, but need a more general parent class that allows adding PL and EL at some point. You will probably also want to add TR-CL/PL eventually.

In general, I guess something like

-lumispec (inheriting from hyperspy/signal1D)
|-cl
|-cl-sem
|-cl-stem
|-pl
|-el
-trlumi (inheriting from hyperspy/signal2D ?)
|-trcl
|-trpl
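
The tree above could translate into class definitions along these lines. This is a minimal sketch: Signal1D and Signal2D are plain stand-ins for hyperspy.signals.Signal1D/Signal2D so that it runs without HyperSpy installed, and all class names are illustrative, not the final API.

```python
# Minimal sketch of the proposed hierarchy. Signal1D/Signal2D are plain
# stand-ins for the HyperSpy base classes; class names are illustrative.
class Signal1D:
    pass


class Signal2D:
    pass


class LumiSpectrum(Signal1D):
    """Parent class for all luminescence spectra (lumispec)."""


class CLSpectrum(LumiSpectrum):
    """Cathodoluminescence (cl)."""


class CLSEMSpectrum(CLSpectrum):
    """CL acquired in an SEM (cl-sem)."""


class CLSTEMSpectrum(CLSpectrum):
    """CL acquired in a STEM (cl-stem)."""


class PLSpectrum(LumiSpectrum):
    """Photoluminescence (pl)."""


class ELSpectrum(LumiSpectrum):
    """Electroluminescence (el)."""


class LumiTransient(Signal2D):
    """Time-resolved luminescence (trlumi)."""


class TRCLTransient(LumiTransient):
    """Time-resolved CL (trcl)."""


class TRPLTransient(LumiTransient):
    """Time-resolved PL (trpl)."""
```

With this layout, any method added to LumiSpectrum is automatically inherited by all CL/PL/EL subclasses, which is the main argument for a common parent.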

Added Read the Docs support

Hi all,

I have wanted to do this for a very long time...
I have now added support for Read the Docs. This means that we can have explanations of the functions we write and of methodologies in a nicely formatted way.

As far as I am aware, such guides can be added to user_guide/file.rst. Feel free to add content there.
I have a list of pages I would like to add for the analysis of PL data (coming soon).
@jlaehne Maybe you could take care of the Jacobian transform documentation if you feel like explaining in a bit more detail how LumiSpy does this transform. That'd be a great opportunity to test it out too.

I believe having this will help external users to get started, as a nicely explained guide (like the EDX analysis page in HyperSpy docs) can really become "the place to check" on how to do data analysis of a particular data signal. I think the demo notebooks are great, but they can also get a bit messy.

Happy to hear suggestions!

PS: Read the Docs should trigger a build every time there is a merge. It will loop through all the docstrings and update the docs automatically.

Increase coverage

We have some low hanging fruit in increasing the test coverage of LumiSpy:

  • px_to_nm_grating_solver in signals/luminescence_spectrum.py is not covered [See codecov, lines 291-310]

  • For to_eV and to_invcm in signals/luminescence_spectrum.py, the case where the variance is a single int/float is not covered when inplace=True. See codecov, lines 154-161 & 309-316

  • scale_by_exposure in signals/common_luminescence.py is missing coverage for the situation where metadata.Signal.quantity == "Intensity (Counts)". See codecov, lines 154-155

  • remove_spikes in signals/cl_spectrum.py is missing coverage for show_diagnosis_histogram See codecov, line 92

  • solve_grating_equation in utils/axes.py is missing coverage for axis being non-uniform DataAxis See codecov, line 485

  • remove_background_from_file misses some tests for warnings/errors, but as it will be deprecated is not worth implementing.

[Feature] Statistical analysis of model from data

It would be useful to add a quick function that calculates some parameters to estimate the homogeneity in emission in a given area.
What I did in the past was to take the fitted model from a SI and use mean and std dev of the gaussian intensity and position. Also, I had a threshold on the relative intensity of a peak so that one could have an estimate of how many of the pixels are non-emissive.
I'm currently writing that into a function, let me know if anyone has any thoughts on:

  • what statistical parameters would be interesting to evaluate

  • best practice for functions applied to a model (in case they later need to be extended)

Also, note that there is a function in HyperSpy, print_summary_statistics, but I think that a luminescence-oriented function would be better. I suspect that since Gaussian fitting will probably be a core function for luminescence, there will need to be a unified way to fit and process the resulting data.
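
As a starting point, such statistics could be computed from the fitted parameter maps with plain NumPy. The sketch below is not existing LumiSpy API: the function name, the input layout (one fitted value per pixel), and the relative-intensity threshold convention are all assumptions.

```python
import numpy as np


def emission_statistics(intensity_map, centre_map, rel_threshold=0.1):
    """Summarise fitted Gaussian parameters over a map.

    intensity_map / centre_map hold the per-pixel fitted peak intensity
    and position. Pixels whose intensity falls below rel_threshold times
    the maximum intensity count as non-emissive (an assumed convention).
    """
    intensity = np.asarray(intensity_map, dtype=float)
    centre = np.asarray(centre_map, dtype=float)
    emissive = intensity >= rel_threshold * intensity.max()
    return {
        "intensity_mean": intensity[emissive].mean(),
        "intensity_std": intensity[emissive].std(),
        "centre_mean": centre[emissive].mean(),
        "centre_std": centre[emissive].std(),
        "non_emissive_fraction": 1.0 - emissive.mean(),
    }
```

Excluding the non-emissive pixels before taking mean/std keeps dark regions from biasing the homogeneity estimate, which seems to match the workflow described above.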

Test deprecation warning

#44 introduced the following deprecation warning, which we should fix:

lumispy/tests/test_luminescence_spectrum.py::TestLumiSpectrum::test_errors_raise
  /home/jonas/medien/pdi/git/lumispy/lumispy/signals/luminescence_spectrum.py:312: VisibleDeprecationWarning: Creating an ndarray from ragged nested sequences (which is a list-or-tuple of lists-or-tuples-or ndarrays with different lengths or shapes) is deprecated. If you meant to do this, you must specify 'dtype=object' when creating the ndarray.
    background_xy = np.array(background)
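
As the warning message itself suggests, the fix is to pass dtype=object explicitly when building the ragged array. A minimal sketch (the background values are illustrative, not taken from the test):

```python
import numpy as np

# Ragged list of background arrays with different lengths, mimicking the
# situation in the failing test.
background = [np.arange(3), np.arange(5)]

# Being explicit about dtype=object keeps the ragged structure and
# silences the VisibleDeprecationWarning (newer NumPy versions raise an
# error for the implicit variant).
background_xy = np.array(background, dtype=object)
```
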

Transient signal classes

We missed something when introducing the transient signal classes last year (and no one seems to have used them so far? Anyway there are no specific functions). At the moment, we have a set of Signal2D classes for transients:

Class | Signal dimension | signal_type | dtype | Aliases
--- | --- | --- | --- | ---
LumiTransient | 2 | Luminescence | real | TRLumi, TR luminescence, time-resolved luminescence
CLTransient | 2 | TRCL | real | TR cathodoluminescence, time-resolved cathodoluminescence
PLTransient | 2 | TRPL | real | TR photoluminescence, time-resolved photoluminescence

That is correct for a streak camera image <x,y|E,t>, but not for a pure transient <x,y|t> (without spectral information).

The ...Transient classes should in fact be Signal1D, while there should be an additional set of classes ...TransientSpectrum for streak camera images. Any opinions? I would propose to fix it for the v0.2 release.

(see original discussion in #5 (comment) )

loadtxt, savetxt

I want to implement a simple set of wrappers around np.loadtxt and np.savetxt to load and save txt, dat, and csv files; mostly metadata-agnostic, of course. At least for single spectra and linescans. More than one navigation dimension would become more complicated, but 2D navigation with a 0D signal axis (e.g. single parameters from fit results) would also be feasible. It would help to import/transfer data from systems whose native file format is not yet implemented in hyperspy (e.g. Jobin Yvon spectrometer software or systems with self-programmed data collection) but which allow text-format export. On the writer side, I often want to pass data on to colleagues who do not use hyperspy or even Python (the good old Origin crowd).

@ericpre, @dnjohnstone, @jordiferrero would it make more sense to implement that in the io-part of hyperspy for general usage? Otherwise I would implement a specific set of functions just for lumispy?

Backwards incompatible changes made 0.1.0 to 0.1.2

Hi,

It seems some backwards incompatible changes were made between version 0.1.0 and 0.1.2.

My code for 0.1.0 made use of

lumispy.signals.luminescence_spectrum.LumiSpectrum.background_subtraction

in 0.1.2 this is now broken as this method has been removed. It seems this functionality has now been moved to:

lumispy.signals.luminescence_spectrum.LumiSpectrum.remove_background_from_file

However, it seems the implementation of this method is different to the background_subtraction method from 0.1.0 as swapping the two doesn't seem to fix the issue.

In general, my expectation for versioning is that a bump in the patch number (i.e. 0.1.0 to 0.1.2) would only incorporate backwards-compatible bug fixes, as described in the semantic versioning docs:

https://semver.org/

In general, my expectation is that removing a feature like lumispy.signals.luminescence_spectrum.LumiSpectrum.background_subtraction would require a major bump in the version number, ideally preceded by a deprecation warning added during a minor bump.
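
The usual pattern for such a transition is to keep the old name as a thin alias that warns before delegating to the new method. A toy sketch (the class body is a stand-in, not the actual LumiSpy implementation):

```python
import warnings


class LumiSpectrum:
    """Toy stand-in for the real class, just to illustrate the pattern."""

    def remove_background_from_file(self, *args, **kwargs):
        # The new implementation lives here.
        return "background removed"

    def background_subtraction(self, *args, **kwargs):
        """Deprecated alias kept alive for a transition period."""
        warnings.warn(
            "background_subtraction() is deprecated; use "
            "remove_background_from_file() instead",
            DeprecationWarning,
            stacklevel=2,
        )
        return self.remove_background_from_file(*args, **kwargs)
```

With this in place, old user code keeps working for a release cycle while loudly announcing where to migrate.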

That all said, I am very pleased at the decision to change the name from a noun to a verb!

Thanks for your work on this package, it's super cool, hope to add some features at some point 😄.

Hugh
