quakemigrate / quakemigrate

A Python package for automatic earthquake detection and location using waveform migration and stacking.

Home Page: https://quakemigrate.readthedocs.io/

License: GNU General Public License v3.0


quakemigrate's Introduction


QuakeMigrate is a Python package for automatic earthquake detection and location using waveform migration and stacking.

Key Features

QuakeMigrate uses a waveform migration and stacking algorithm to search for coherent seismic phase arrivals across a network of instruments. It produces—from raw data—catalogues of earthquakes with locations, origin times, phase arrival picks, and local magnitude estimates, as well as rigorous estimates of the associated uncertainties.

The package has been built with a modular architecture, providing the potential for extension and adaptation at numerous entry points. This includes, but is not limited to:

  • the calculation or import of traveltime grids
  • the choice of algorithm used to identify phase arrivals (for example by kurtosis, cross-covariance analysis between multiple components, machine learning techniques and more)
  • the stacking function used to combine onset functions
  • the algorithm used to perform phase picking
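As a flavour of the kind of onset function involved, here is a minimal pure-Python classic STA/LTA, the simplest of the phase-arrival identifiers listed above. This is an illustrative sketch only; QuakeMigrate's own onset implementation is compiled C and differs in detail:

```python
def sta_lta(signal, sta_len, lta_len):
    """Classic STA/LTA onset: ratio of the short-term to the long-term
    average of the absolute amplitude, evaluated at each sample."""
    n = len(signal)
    onset = [0.0] * n
    for i in range(lta_len, n):
        sta = sum(abs(x) for x in signal[i - sta_len:i]) / sta_len
        lta = sum(abs(x) for x in signal[i - lta_len:i]) / lta_len
        onset[i] = sta / lta if lta > 0 else 0.0
    return onset

# A step in amplitude (a synthetic "arrival") produces a clear peak
# in the onset function; steady background noise gives a ratio near 1.
noise = [0.1] * 100
arrival = [1.0] * 20
onset = sta_lta(noise + arrival, sta_len=5, lta_len=50)
```

Migrating and stacking such onset functions through a traveltime grid is what lets coherent arrivals across the network reinforce each other at the true source location.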

Documentation

Documentation for QuakeMigrate is hosted here.

Installation

Detailed installation instructions can be found here.

If you're comfortable with virtual environments and just want to get started, QuakeMigrate is available via the Python Package Index, and can be installed via pip:

pip install quakemigrate
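To confirm the install succeeded, you can check that the package is importable. A small, generic check (nothing here is QuakeMigrate-specific; `is_installed` is just an illustrative helper):

```python
import importlib.util

def is_installed(package_name):
    """Return True if the named package can be imported in this environment."""
    return importlib.util.find_spec(package_name) is not None

# After `pip install quakemigrate`, this should print True:
print(is_installed("quakemigrate"))
```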

Usage

We are working on tutorials covering how each individual aspect of the package works, as well as example use cases where we provide substantive reasoning for the parameter choices used. These examples include applications to cryoseismicity and volcano seismology.

This is a work in progress - see our documentation for full details.

Examples you can run in your browser

To quickly get a taste of how the software works, try out the two icequake examples hosted on Binder:

  • Icequakes at the Rutford Ice Stream, Antarctica
  • Icequakes at the Skeiðarárjökull outlet glacier, Iceland

And for a more comprehensive demonstration of the options available, see the template scripts.

Citation

If you use this package in your work, please cite the following conference presentation:

Winder, T., Bacon, C.A., Smith, J.D., Hudson, T., Greenfield, T. and White, R.S., 2020. QuakeMigrate: a Modular, Open-Source Python Package for Automatic Earthquake Detection and Location. In AGU Fall Meeting 2020. AGU.

as well as the relevant version of the source code on Zenodo.

We hope to have a publication coming out soon:

Winder, T., Bacon, C.A., Smith, J.D., Hudson, T.S., Drew, J., and White, R.S. QuakeMigrate: a Python Package for Automatic Earthquake Detection and Location Using Waveform Migration and Stacking. (to be submitted to Seismica).

Contributing to QuakeMigrate

Contributions to QuakeMigrate are welcome. Whether you have identified a bug or would like to request a new feature, your first stop should be to reach out, either directly or (preferably) via the GitHub Issues panel, to discuss the proposed changes. Once we have had a chance to scope them out, you can proceed with making your contribution following the instructions in our contribution guidelines.

Bug reports, suggestions for new features and enhancements, and even links to projects that have made use of QuakeMigrate are most welcome.

Contact

You can contact us directly at: [email protected]


License

This package is written and maintained by the QuakeMigrate developers, Copyright QuakeMigrate developers 2020–2023. It is distributed under the GPLv3 License. Please see the LICENSE file for a complete description of the rights and freedoms that this provides the user.

quakemigrate's People

Contributors

hemmelig · stickler-ci · tmgreenfield1101 · tomwinder


quakemigrate's Issues

macOS / brew / gcc installation issues

When I install QuakeMigrate in a conda env (I have already created the conda env and installed scikit-fmm, NLLoc, and GCC) with `pip install .`, I get the following error:

`gcc -Wno-unused-result -Wsign-compare -Wunreachable-code -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/include -arch x86_64 -I/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/include -arch x86_64 -I/private/var/folders/z7/ts4khj154vx0s9pgf5kzp6sh0000gn/T/pip-req-build-a2n91akv/quakemigrate/core/src -I/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/lib/python3.8/site-packages/numpy/core/include -I/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/include -I/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/include/python3.8 -c quakemigrate/core/src/quakemigrate.c -o build/temp.macosx-10.9-x86_64-3.8/quakemigrate/core/src/quakemigrate.o -fopenmp -fPIC -Ofast
In file included from quakemigrate/core/src/quakemigrate.c:16:0:
quakemigrate/core/src/qmlib.h:17:18: fatal error: math.h: No such file or directory
#include <math.h>
^
compilation terminated.
error: command 'gcc' failed with exit status 1


ERROR: Command errored out with exit status 1: /Users/joaofontiela/opt/anaconda3/envs/quakemigrate/bin/python3.8 -u -c 'import io, os, sys, setuptools, tokenize; sys.argv[0] = '"'"'/private/var/folders/z7/ts4khj154vx0s9pgf5kzp6sh0000gn/T/pip-req-build-a2n91akv/setup.py'"'"'; file='"'"'/private/var/folders/z7/ts4khj154vx0s9pgf5kzp6sh0000gn/T/pip-req-build-a2n91akv/setup.py'"'"';f = getattr(tokenize, '"'"'open'"'"', open)(file) if os.path.exists(file) else io.StringIO('"'"'from setuptools import setup; setup()'"'"');code = f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, file, '"'"'exec'"'"'))' install --record /private/var/folders/z7/ts4khj154vx0s9pgf5kzp6sh0000gn/T/pip-record-rjlkhqzj/install-record.txt --single-version-externally-managed --compile --install-headers /Users/joaofontiela/opt/anaconda3/envs/quakemigrate/include/python3.8/quakemigrate Check the logs for full command output.`

I guess the error is related to GCC rather than QuakeMigrate, but I'm not sure.
Has anyone had a similar error? How did you fix it?

BUG: Unhandled error where no data passes data quality checks within onset

Describe the bug
When data is successfully read from the archive for a timestep, but none of it passes the data quality checks (using the criteria specified in check_availability()), this leads to an unhandled ValueError in calculate_onsets(). (Reported by @tmgreenfield1101)

A secondary consequence is that no station availability data is written for the run, because this is currently all output in a single batch at the end of the run.

To Reproduce
Run detect with the timestep and starttime/endtime chosen such that a timestep includes data, but none of that data passes the quality checks (e.g. none of it covers the whole timespan of data requested by scan).

Traceback (most recent call last):
  File "./detect.py", line 64, in <module>
    scan.detect(starttime, endtime)
  File "/space/tg286/quake_migrate/tim_area/QuakeMigrate/quakemigrate/signal/scan.py", line 309, in detect
    self._continuous_compute(starttime, endtime)
  File "/space/tg286/quake_migrate/tim_area/QuakeMigrate/quakemigrate/signal/scan.py", line 404, in _continuous_compute
    self._compute(data)
  File "/space/tg286/quake_migrate/tim_area/QuakeMigrate/quakemigrate/util.py", line 610, in wrapper
    result = func(*args, **kwargs)
  File "/space/tg286/quake_migrate/tim_area/QuakeMigrate/quakemigrate/signal/scan.py", line 543, in _compute
    onsets, onset_data = self.onset.calculate_onsets(data)
  File "/space/tg286/quake_migrate/tim_area/QuakeMigrate/quakemigrate/signal/onsets/stalta.py", line 387, in calculate_onsets
    onsets = np.stack(onsets, axis=0)
  File "<__array_function__ internals>", line 5, in stack
  File "/home/tg286/miniconda3/envs/quakemigrate_dev/lib/python3.8/site-packages/numpy/core/shape_base.py", line 423, in stack
    raise ValueError('need at least one array to stack')
ValueError: need at least one array to stack

Expected behavior
This should be handled, including printing a descriptive message to the logs, appending scanmseed.empty() and successfully completing the run (inc. writing station availability data for the whole run and scanmseed for the affected day(s)).
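A guard along these lines would avoid the unhandled ValueError. This is a sketch only: the stand-in below uses a plain list and a print in place of the package's actual np.stack call and logging, so names and details are illustrative, not QuakeMigrate's API:

```python
def stack_onsets(onsets):
    """Stand-in for the np.stack(...) call in calculate_onsets(), with a
    guard for the case where no station passed the quality checks."""
    if not onsets:
        # No usable data in this timestep: log a descriptive message and
        # return None so the caller can append an empty block (e.g. via
        # scanmseed.empty()) and complete the run cleanly.
        print("No data passed the availability checks for this timestep; "
              "appending empty coalescence.")
        return None
    # np.stack(onsets, axis=0) in the real code; here, just copy the rows.
    return [list(row) for row in onsets]
```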

Desktop (please complete the following information):

  • Operating System: Ubuntu 18.04
  • Python version: 3.8
  • QuakeMigrate version: 1.0.0

Trace gap tolerance - allow for linear interpolation?

Currently a station is removed if it has a gap of more than 1 sample. This is overly strict: linearly interpolating across a short gap would produce a signal that is perfectly fine to process. Instead, we should define a minimum gap length before QM removes the station. This could be user-defined, though most users are unlikely to need to change it; a default of 10 or 100 samples is probably appropriate.
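A sketch of what that interpolation could look like (pure Python, with a single contiguous gap marked by None; `max_gap_samples` is the proposed user-definable parameter, not an existing QM option):

```python
def fill_short_gap(samples, max_gap_samples=10):
    """Linearly interpolate over a gap (samples set to None) if it is no
    longer than max_gap_samples; otherwise return None to signal that the
    station should be rejected. For simplicity this assumes a single
    contiguous gap that does not touch either end of the trace."""
    gap = [i for i, s in enumerate(samples) if s is None]
    if not gap:
        return samples
    if len(gap) > max_gap_samples or gap[0] == 0 or gap[-1] == len(samples) - 1:
        return None  # gap too long, or at an edge: reject the station
    left, right = gap[0] - 1, gap[-1] + 1
    y0, y1 = samples[left], samples[right]
    out = list(samples)
    for i in gap:
        frac = (i - left) / (right - left)
        out[i] = y0 + frac * (y1 - y0)
    return out
```

In practice this could equally be delegated to ObsPy's merge/interpolation machinery; the point is only that a short gap need not disqualify the whole station.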

[BUG]: All traces rejected where one pair can't be merged due to "Incompatible traces" error.

Describe the bug
When the time period of data to be analysed contains a single pair of traces that can't be merged, but which have overlapping/adjacent data and identical SEED IDs, the whole merge operation is abandoned. This means all data over the day (or hour, etc.) boundary is not merged, leading to zero stations being available for migration.

To Reproduce
Include overlapping data that can't be merged in a way that results in this error (e.g. one file with "FLOAT64" encoding and the other with "STEIM1") in the data archive.

Expected (desired) behaviour
Only the station/stream which is affected by this merge error should be rejected.

Desktop (please complete the following information):

  • Operating System: Ubuntu 16.04
  • Python version: 3.8.4
  • QuakeMigrate version: 1.0.0

Additional context
This is annoying to fix because of the way ObsPy's st.merge() handles the method=-1 option: in that case it performs the operation in place only, and doesn't explicitly return the Stream object (preventing e.g. applying the merge operation on a station-by-station basis). I will raise a linked issue there. For the time being, one could do:

ids = {tr.id for tr in st}
st_merged = Stream()
for tr_id in ids:
    st_merged += st.select(id=tr_id)._cleanup()

(i.e. directly using the hidden st._cleanup() function).

Dealing with different networks

QuakeMigrate currently requires all stations to have unique names. This isn't always the case, however, especially when a region is covered by more than one network code. While renaming the entire local miniSEED archive is possible, it shouldn't be the solution: for some large archives the process would be prohibitively slow.

A quick solution is to allow users to use NET.STATION in their input station .csv files and handle this in the archive object. I propose adding a parameter such as catch_network to the Archive object and adding this catch to a few functions.

For example in the Archive.read_waveform_data function a solution could be:


if self.catch_network and '.' in station:
    network = station.split('.')[0]
    station = station.split('.')[1]
    st_selected += st.select(network=network, station=station)
else:
    st_selected += st.select(station=station)

It would also make sense to include this in PATH_STRUCTURE, i.e.


        elif archive_format == "YEAR/JD/NETWORK.STATION*":
            self.format = "{year}/{jday:03d}/{network}.{station}*"

The attached file includes these additions:
/space/tg286/quake_migrate/QuakeMigrate/quakemigrate/io/data.py

Losing data during time shift

Hi Gents,

Sorry to bother you. I'm trying to run some large-scale OBS data through QM v1 and I seem to be losing data somewhere, but I'm not sure where.

See below for the screen output. Based on my (admittedly brief) perusal of the code, the "Trace" should be the newly re-timestamped data, but it seems to have gone missing!

Do you have any suggestions?

Tim

==============================================================================================================
DETECT - Continuous coalescence scan

Scanning from 2020-01-01T00:00:00.000000Z to 2020-01-01T01:00:00.000000Z

Scan parameters:
	Scan sampling rate = 5 Hz
	Thread count       = 20
	Time step          = 300.0 s

Onset parameters - using the classic STA/LTA onset
	Onset function sampling rate = 5 Hz
	Phase(s) = ['P', 'S']

	P bandpass filter  = [0.1, 2.0, 2] (Hz, Hz, -)
	S bandpass filter  = [0.1, 2.0, 2] (Hz, Hz, -)

	P onset [STA, LTA] = [4.0, 20.0] (s, s)
	S onset [STA, LTA] = [4.0, 20.0] (s, s)

==============================================================================================================

Trace
	GE.BKB..BHZ | 2019-12-31T23:58:54.019538Z - 2019-12-31T23:59:59.969538Z | 20.0 Hz, 1320 samples
has off-sample data. Applying -0.019538 s shift to timing.
Trace
	GE.BKB..BHZ | 2020-01-01T00:00:02.019538Z - 2020-01-01T00:09:17.019538Z | 20.0 Hz, 11101 samples
has off-sample data. Applying -0.019538 s shift to timing.
Trace
	GE.TOLI2..BHZ | 2019-12-31T23:58:54.019538Z - 2019-12-31T23:59:59.969538Z | 20.0 Hz, 1320 samples
has off-sample data. Applying -0.019538 s shift to timing.
Trace
	GE.TOLI2..BHZ | 2020-01-01T00:00:12.419538Z - 2020-01-01T00:09:17.019538Z | 20.0 Hz, 10893 samples
has off-sample data. Applying -0.019538 s shift to timing.
Trace
	SC.C08F.00.BH1 | 2019-12-31T23:58:53.990900Z - 2019-12-31T23:59:59.990900Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.009100 s shift to timing.
Trace
	SC.C08F.00.BH1 | 2020-01-01T00:00:00.005900Z - 2020-01-01T00:09:17.005900Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.005900 s shift to timing.
Trace
	SC.C08F.00.BH2 | 2019-12-31T23:58:53.990900Z - 2019-12-31T23:59:59.990900Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.009100 s shift to timing.
Trace
	SC.C08F.00.BH2 | 2020-01-01T00:00:00.005900Z - 2020-01-01T00:09:17.005900Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.005900 s shift to timing.
Trace
	SC.C08F.00.BHZ | 2019-12-31T23:58:53.990900Z - 2019-12-31T23:59:59.990900Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.009100 s shift to timing.
Trace
	SC.C08F.00.BHZ | 2020-01-01T00:00:00.005900Z - 2020-01-01T00:09:17.005900Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.005900 s shift to timing.
Trace
	SC.C09G.00.BH1 | 2019-12-31T23:58:53.991500Z - 2019-12-31T23:59:59.711500Z | 50.0 Hz, 3287 samples
has off-sample data. Applying +0.008500 s shift to timing.
Trace
	SC.C09G.00.BH1 | 2019-12-31T23:59:59.751600Z - 2020-01-01T00:09:16.991600Z | 50.0 Hz, 27863 samples
has off-sample data. Applying +0.008400 s shift to timing.
Trace
	SC.C09G.00.BH2 | 2019-12-31T23:58:53.991500Z - 2019-12-31T23:59:59.711500Z | 50.0 Hz, 3287 samples
has off-sample data. Applying +0.008500 s shift to timing.
Trace
	SC.C09G.00.BH2 | 2019-12-31T23:59:59.751600Z - 2020-01-01T00:09:16.991600Z | 50.0 Hz, 27863 samples
has off-sample data. Applying +0.008400 s shift to timing.
Trace
	SC.C09G.00.BHZ | 2019-12-31T23:58:53.991500Z - 2019-12-31T23:59:59.711500Z | 50.0 Hz, 3287 samples
has off-sample data. Applying +0.008500 s shift to timing.
Trace
	SC.C09G.00.BHZ | 2019-12-31T23:59:59.751600Z - 2020-01-01T00:09:16.991600Z | 50.0 Hz, 27863 samples
has off-sample data. Applying +0.008400 s shift to timing.
Trace
	SC.C12F.00.BH1 | 2019-12-31T23:58:53.990100Z - 2019-12-31T23:59:59.990100Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.009900 s shift to timing.
Trace
	SC.C12F.00.BH1 | 2020-01-01T00:00:00.007800Z - 2020-01-01T00:09:17.007800Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.007800 s shift to timing.
Trace
	SC.C12F.00.BH2 | 2019-12-31T23:58:53.990100Z - 2019-12-31T23:59:59.990100Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.009900 s shift to timing.
Trace
	SC.C12F.00.BH2 | 2020-01-01T00:00:00.007800Z - 2020-01-01T00:09:17.007800Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.007800 s shift to timing.
Trace
	SC.C12F.00.BHZ | 2019-12-31T23:58:53.990100Z - 2019-12-31T23:59:59.990100Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.009900 s shift to timing.
Trace
	SC.C12F.00.BHZ | 2020-01-01T00:00:00.007800Z - 2020-01-01T00:09:17.007800Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.007800 s shift to timing.
Trace
	SC.C15G.00.BH1 | 2019-12-31T23:58:54.005900Z - 2020-01-01T00:00:00.005900Z | 50.0 Hz, 3301 samples
has off-sample data. Applying -0.005900 s shift to timing.
Trace
	SC.C15G.00.BH1 | 2020-01-01T00:00:00.019300Z - 2020-01-01T00:09:16.999300Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.000700 s shift to timing.
Trace
	SC.C15G.00.BH2 | 2019-12-31T23:58:54.005900Z - 2020-01-01T00:00:00.005900Z | 50.0 Hz, 3301 samples
has off-sample data. Applying -0.005900 s shift to timing.
Trace
	SC.C15G.00.BH2 | 2020-01-01T00:00:00.019300Z - 2020-01-01T00:09:16.999300Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.000700 s shift to timing.
Trace
	SC.C15G.00.BHZ | 2019-12-31T23:58:54.005900Z - 2020-01-01T00:00:00.005900Z | 50.0 Hz, 3301 samples
has off-sample data. Applying -0.005900 s shift to timing.
Trace
	SC.C15G.00.BHZ | 2020-01-01T00:00:00.019300Z - 2020-01-01T00:09:16.999300Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.000700 s shift to timing.
Trace
	SC.C18F.00.BH1 | 2019-12-31T23:58:53.991600Z - 2020-01-01T00:09:16.991600Z | 50.0 Hz, 31151 samples
has off-sample data. Applying +0.008400 s shift to timing.
Trace
	SC.C18F.00.BH2 | 2019-12-31T23:58:53.991600Z - 2020-01-01T00:09:16.991600Z | 50.0 Hz, 31151 samples
has off-sample data. Applying +0.008400 s shift to timing.
Trace
	SC.C18F.00.BHZ | 2019-12-31T23:58:53.991600Z - 2020-01-01T00:09:16.991600Z | 50.0 Hz, 31151 samples
has off-sample data. Applying +0.008400 s shift to timing.
Trace
	SC.C21F.00.BH1 | 2019-12-31T23:58:53.998400Z - 2019-12-31T23:59:59.998400Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.001600 s shift to timing.
Trace
	SC.C21F.00.BH1 | 2020-01-01T00:00:00.017000Z - 2020-01-01T00:09:16.997000Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.003000 s shift to timing.
Trace
	SC.C21F.00.BH2 | 2019-12-31T23:58:53.998400Z - 2019-12-31T23:59:59.998400Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.001600 s shift to timing.
Trace
	SC.C21F.00.BH2 | 2020-01-01T00:00:00.017000Z - 2020-01-01T00:09:16.997000Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.003000 s shift to timing.
Trace
	SC.C21F.00.BHZ | 2019-12-31T23:58:53.998400Z - 2019-12-31T23:59:59.998400Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.001600 s shift to timing.
Trace
	SC.C21F.00.BHZ | 2020-01-01T00:00:00.017000Z - 2020-01-01T00:09:16.997000Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.003000 s shift to timing.
Trace
	SC.C23G.00.BH1 | 2019-12-31T23:58:54.007400Z - 2019-12-31T23:59:59.987400Z | 50.0 Hz, 3300 samples
has off-sample data. Applying -0.007400 s shift to timing.
Trace
	SC.C23G.00.BH1 | 2020-01-01T00:00:00.005200Z - 2020-01-01T00:09:17.005200Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.005200 s shift to timing.
Trace
	SC.C23G.00.BH2 | 2019-12-31T23:58:54.007400Z - 2019-12-31T23:59:59.987400Z | 50.0 Hz, 3300 samples
has off-sample data. Applying -0.007400 s shift to timing.
Trace
	SC.C23G.00.BH2 | 2020-01-01T00:00:00.005200Z - 2020-01-01T00:09:17.005200Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.005200 s shift to timing.
Trace
	SC.C23G.00.BHZ | 2019-12-31T23:58:54.007400Z - 2019-12-31T23:59:59.987400Z | 50.0 Hz, 3300 samples
has off-sample data. Applying -0.007400 s shift to timing.
Trace
	SC.C23G.00.BHZ | 2020-01-01T00:00:00.005200Z - 2020-01-01T00:09:17.005200Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.005200 s shift to timing.
Trace
	SC.C28F.00.BH1 | 2019-12-31T23:58:53.996200Z - 2019-12-31T23:59:59.996200Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.003800 s shift to timing.
Trace
	SC.C28F.00.BH1 | 2020-01-01T00:00:00.015800Z - 2020-01-01T00:09:16.995800Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.004200 s shift to timing.
Trace
	SC.C28F.00.BH2 | 2019-12-31T23:58:53.996200Z - 2019-12-31T23:59:59.996200Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.003800 s shift to timing.
Trace
	SC.C28F.00.BH2 | 2020-01-01T00:00:00.015800Z - 2020-01-01T00:09:16.995800Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.004200 s shift to timing.
Trace
	SC.C28F.00.BHZ | 2019-12-31T23:58:53.996200Z - 2019-12-31T23:59:59.996200Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.003800 s shift to timing.
Trace
	SC.C28F.00.BHZ | 2020-01-01T00:00:00.015800Z - 2020-01-01T00:09:16.995800Z | 50.0 Hz, 27850 samples
has off-sample data. Applying +0.004200 s shift to timing.
Trace
	SC.M01G.00.BH1 | 2019-12-31T23:58:53.991700Z - 2019-12-31T23:59:59.991700Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.008300 s shift to timing.
Trace
	SC.M01G.00.BH1 | 2020-01-01T00:00:00.008400Z - 2020-01-01T00:09:17.008400Z | 50.0 Hz, 27851 samples
has off-sample data. Applying -0.008400 s shift to timing.
Trace
	SC.M01G.00.BH2 | 2019-12-31T23:58:53.991700Z - 2019-12-31T23:59:59.991700Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.008300 s shift to timing.
Trace
	SC.M01G.00.BHZ | 2019-12-31T23:58:53.991700Z - 2019-12-31T23:59:59.991700Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.008300 s shift to timing.
Trace
	SC.M02F.00.BH1 | 2019-12-31T23:58:53.992000Z - 2019-12-31T23:59:59.992000Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.008000 s shift to timing.
Trace
	SC.M02F.00.BH2 | 2019-12-31T23:58:53.992000Z - 2019-12-31T23:59:59.992000Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.008000 s shift to timing.
Trace
	SC.M02F.00.BHZ | 2019-12-31T23:58:53.992000Z - 2019-12-31T23:59:59.992000Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.008000 s shift to timing.
Trace
	SC.M03F.00.BH1 | 2019-12-31T23:58:53.994600Z - 2019-12-31T23:59:59.994600Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.005400 s shift to timing.
Trace
	SC.M03F.00.BH2 | 2019-12-31T23:58:53.994600Z - 2019-12-31T23:59:59.994600Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.005400 s shift to timing.
Trace
	SC.M03F.00.BHZ | 2019-12-31T23:58:53.994600Z - 2019-12-31T23:59:59.994600Z | 50.0 Hz, 3301 samples
has off-sample data. Applying +0.005400 s shift to timing.
		No P onset for M01G.
		No P onset for M02F.
		No P onset for M03F.
		No P onset for C28F.
		No P onset for C15G.
		No P onset for C23G.
		No P onset for C16F.
		No P onset for C12F.
		No P onset for C21F.
		No P onset for C09G.
		No P onset for C08F.
		No P onset for BKB.
		No P onset for TOLI2.
		No S onset for M01G.
		No S onset for M02F.
		No S onset for M03F.
		No S onset for C28F.
		No S onset for C15G.
		No S onset for C23G.
		No S onset for C16F.
		No S onset for C12F.
		No S onset for C21F.
		No S onset for C09G.
		No S onset for C08F.
		No S onset for BKB.
		No S onset for TOLI2.
                     Elapsed time: 0.216390 seconds.

lut.compute_1d_vmodel_skfmm() fatal RuntimeError when using scikit-fmm=2019.1.30

A known bug in scikit-fmm (scikit-fmm/scikit-fmm#18) can cause lut.compute_1d_vmodel_skfmm() to fail with a RuntimeError:

RuntimeError: Negative discriminant in time marcher quadratic.

As of now, this issue is reportedly not well understood. The underlying bug also exists in the previous release, scikit-fmm=0.0.9, but went undetected there; the change made in 2019.1.30 was the addition of the check that raises the RuntimeError.

Batch trigger by Julian day

Issue statement
Trigger is limited to a single core, despite being well suited to multiprocessing. The current workaround is to batch up the time window of interest in an external script. This, however, leads to the files from each trigger run overwriting any existing files.

Proposition
Bake the batching of trigger into Trigger. This will also require dealing with the overwriting of triggered event files, which will be handled by adding the Julian day to the triggered-events filename.

Primary issue

  • Bake batching process into Trigger.trigger()
    • Have TriggeredEvent.csv files write with the relevant Julian day
    • Ensure the process of reading these TriggeredEvent files for locate can handle the separate Julian days

Future

  • Multi-process the batched trigger
  • Allow for multi-processing arbitrary periods of time by breaking into batches with non-Julian day lengths.

Additional tasks

  • Break the _trigger() method into two stages
    • _trigger_candidates() - Identify all of the instances of the (normalised) coalescence exceeding the chosen detection threshold
    • _refine_candidates() - Merge events whose marginal windows overlap within the minimum inter-event time.
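The proposed two-stage split can be sketched as follows. This is pure Python over sample indices for illustration; the names mirror the proposal, but the real implementation would operate on the (normalised) coalescence time series with proper time handling:

```python
def trigger_candidates(coalescence, threshold):
    """Stage 1: find contiguous runs of samples where the (normalised)
    coalescence exceeds the detection threshold."""
    runs, start = [], None
    for i, c in enumerate(coalescence):
        if c > threshold and start is None:
            start = i
        elif c <= threshold and start is not None:
            runs.append((start, i - 1))
            start = None
    if start is not None:
        runs.append((start, len(coalescence) - 1))
    return runs

def refine_candidates(runs, min_gap):
    """Stage 2: merge candidate windows separated by less than the
    minimum inter-event time (here expressed in samples)."""
    merged = []
    for start, end in runs:
        if merged and start - merged[-1][1] < min_gap:
            merged[-1] = (merged[-1][0], end)  # extend the previous event
        else:
            merged.append((start, end))
    return merged
```

Separating the two stages also makes each one independently testable, which should help when multi-processing is added later.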

Result
Cleaner, multi-processed code, leading to faster runs and a more maintainable codebase.

Reach
Trigger files generated using the development branch prior to this change will no longer be compatible with locate(starttime, endtime). However, it is still possible to locate the events in this file using locate(trigger_file).

AttributeError: module 'numpy' has no attribute 'bool'.

I am receiving an AttributeError relating to NumPy when importing from quakemigrate.io / quakemigrate.lut.

Code to reproduce:
from quakemigrate.io import read_stations
from quakemigrate.lut import compute_traveltimes

  • Operating System: Win10
  • Python version: 3.8.16
  • QuakeMigrate version: 1.0.0
  • Numpy version: 1.24.2

Full Error:
AttributeError: module 'numpy' has no attribute 'bool'.
np.bool was a deprecated alias for the builtin bool. To avoid this error in existing code, use bool by itself. Doing this will not modify any behavior and is safe. If you specifically wanted the numpy scalar type, use np.bool_ here.
The aliases was originally deprecated in NumPy 1.20; for more details and guidance see the original release note at:
https://numpy.org/devdocs/release/1.20.0-notes.html#deprecations
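The fix in user code (or in the package, pending a release) is exactly what the error message suggests: replace the removed alias with the builtin, or with NumPy's scalar type where that is specifically intended. For example:

```python
import numpy as np

# was: mask = np.zeros(4, dtype=np.bool)   # np.bool removed in NumPy 1.24
mask = np.zeros(4, dtype=bool)             # use the builtin bool instead

# If the NumPy boolean *scalar type* is specifically needed, np.bool_ remains:
flag = np.bool_(True)
```

Alternatively, pinning numpy<1.24 in the environment works around the error without code changes.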

FEAT: skfmm dike example

Currently the dike example is the only template use case looking at non-ice events, and the only one that features magnitude calculation. As such, we would like to make it available to all users.

To maximise its accessibility, we should add a Python-only version, which uses skfmm to calculate the traveltime lookup table.

coordinate decisions

Depth is positive down, Z is positive up. Decide which one we want to output. My vote is for depth.
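Whichever convention is chosen for output, converting between the two is just a sign flip about a reference elevation. A minimal sketch (function and parameter names are illustrative, not QuakeMigrate API):

```python
def z_to_depth(z, reference_elevation=0.0):
    """Convert elevation z (positive up, metres, same datum as the
    reference) to depth (positive down, metres)."""
    return reference_elevation - z

def depth_to_z(depth, reference_elevation=0.0):
    """Inverse of z_to_depth: depth (positive down) to elevation z."""
    return reference_elevation - depth
```

The main thing is to pick one convention, state it in the output headers, and apply it consistently.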

ValueError converting float NaN to integer in return of time in scan._phase_picker()

An issue with running event relocation in the QMigrate master branch. There appears to be a problem if the variable p_ttime[i] is NaN. I'm not sure how the value became NaN, and I'm happy to look into it if necessary, but if the issue is already known then I guess there must be a fairly easy fix.

Here is the exact error thrown:

========================================================================================================================
EVENT - 1174 of 1393 - 20110720170018600000

Determining event location...

Reading waveform data...
	Elapsed time: 0.439387 seconds.

Computing 4D coalescence grid...
	Elapsed time: 15.360480 seconds.

Making phase picks...

Traceback (most recent call last):
  File "run_QMigrate.py", line 224, in <module>
    run_locate(data_in, station_file, lut_out, out_path, run_name, locate_sampling_rate, p_bp_filter, s_bp_filter, p_onset_win, s_onset_win, time_step, locate_decimation, marginal_window=marginal_window, n_cores=n_cores)
  File "run_QMigrate.py", line 179, in run_locate
    scan.locate(starttime, endtime)
  File "/Users/eart0504/opt/anaconda3/envs/QMigrate/lib/python3.6/site-packages/QMigrate-1.0.20.1.16-py3.6.egg/QMigrate/signal/scan.py", line 622, in locate
    self._locate_events(start_time, end_time)
  File "/Users/eart0504/opt/anaconda3/envs/QMigrate/lib/python3.6/site-packages/QMigrate-1.0.20.1.16-py3.6.egg/QMigrate/signal/scan.py", line 926, in _locate_events
    phase_picks = self._phase_picker(event_max_coa)
  File "/Users/eart0504/opt/anaconda3/envs/QMigrate/lib/python3.6/site-packages/QMigrate-1.0.20.1.16-py3.6.egg/QMigrate/signal/scan.py", line 1471, in _phase_picker
    p_arrival = event["DT"] + p_ttime[i]
  File "/Users/eart0504/opt/anaconda3/envs/QMigrate/lib/python3.6/site-packages/obspy/core/utcdatetime.py", line 966, in __add__
    return UTCDateTime(ns=self._ns + int(round(value * 1e9)))
ValueError: cannot convert float NaN to integer

RuntimeWarning: invalid value encountered in true_divide

This warning comes up if there is no data available for a given station at any point in the detect/locate time window.

I threw in a print statement to confirm:

[ 0.  0.  0. ...,  0.  0.  0.]
RuntimeWarning: invalid value encountered in true_divide
  trace.plot(x, y / np.max(abs(y)) * self.trace_scale + (st_idx + 1)

The simplest solution here is to put a test in the plotting function _plot_coa_trace():

    def _plot_coa_trace(self, trace, x, y, st_idx, color):
        if y.any():
            trace.plot(x, y / np.max(abs(y)) * self.trace_scale + (st_idx + 1),
                       color=color, linewidth=0.5, zorder=1)

Installation error - cannot load extension 'qmlib'

Hello!

I have tried to install QM on my macOS Mojave 10.14.6 machine. At first I think it was trying to use Clang, which didn't work, so I installed gcc with Homebrew (I got gcc 11.2.0), tried again, and the installation seemed successful (there was an error message related to pyproj, but that seems to be installed correctly as far as I can tell). However, when I try to import quakemigrate I get some errors:

(quakemigrate) bagend:QuakeMigrate tmerry$ python
Python 3.8.12 | packaged by conda-forge | (default, Sep 16 2021, 01:59:00) 
[Clang 11.1.0 ] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import quakemigrate
Traceback (most recent call last):
  File "/Users/tmerry/QuakeMigrate/quakemigrate/core/libnames.py", line 39, in _load_cdll
    cdll = ctypes.CDLL(str(lib))
  File "/Applications/anaconda3/envs/quakemigrate/lib/python3.8/ctypes/__init__.py", line 373, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: dlopen(/Users/tmerry/QuakeMigrate/quakemigrate/core/src/qmlib.cpython-38-darwin.so, 6): image not found

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/tmerry/QuakeMigrate/quakemigrate/__init__.py", line 18, in <module>
    from quakemigrate.signal import QuakeScan, Trigger  # NOQA
  File "/Users/tmerry/QuakeMigrate/quakemigrate/signal/__init__.py", line 20, in <module>
    from .scan import QuakeScan  # NOQA
  File "/Users/tmerry/QuakeMigrate/quakemigrate/signal/scan.py", line 23, in <module>
    from quakemigrate.core import find_max_coa, migrate
  File "/Users/tmerry/QuakeMigrate/quakemigrate/core/__init__.py", line 20, in <module>
    from .lib import migrate, find_max_coa  # NOQA
  File "/Users/tmerry/QuakeMigrate/quakemigrate/core/lib.py", line 20, in <module>
    qmlib = _load_cdll("qmlib")
  File "/Users/tmerry/QuakeMigrate/quakemigrate/core/libnames.py", line 45, in _load_cdll
    raise ImportError(msg)
ImportError: Could not load extension library 'qmlib'.

dlopen(/Users/tmerry/QuakeMigrate/quakemigrate/core/src/qmlib.cpython-38-darwin.so, 6): image not found

If you have chosen to install from a clone of the github repository, please ensure you have run 'python setup.py install', which will compile and install the C library. See the installation documentation for more details.

Do you have any idea what might have gone wrong?
Thanks
Tom

libgcc-ng

Hi. I tried installing QuakeMigrate, but failed to do so. I am running the latest macOS.

I removed libgcc-ng from QuakeMigrate.yml and then created the new env. After activating the new env, I tried installing libgcc-ng with the conda command, but I get the following error:

PackagesNotFoundError: The following packages are not available from current channels:

  • libgcc-ng

Any idea how to proceed?

Thanks!

Plotting submodule - deprecation of QuakePlot class

I propose moving the SeisPlot class out into a submodule QMigrate.plot that contains a series of functions that can be called to plot:

  • Summary plot
  • Trace plots
  • Coalescence trace
  • Coalescence video

In theory this is an entirely internal change, so end users should see no difference.

It has the added benefit of reducing complexity of the signal/scan.py file.

BUG: Binder notebooks do not work.

Describe the bug
Binder notebooks (linked in README.md) do not work - ModuleNotFoundError in first cell.


(Thanks to @ThomasLecocq for spotting this.)

To Reproduce
Open any of the binder notebooks for the master branch, and attempt to run.

Expected behavior
quakemigrate should have been successfully built and installed on the binder image, and hence imported successfully in the notebook.

Desktop (please complete the following information):

  • Operating System: binder (some linux distro)
  • Python version: 3.11.3
  • QuakeMigrate version: 1.0.1

Additional context
See https://github.com/TomWinder/QuakeMigrate/tree/fix_binder for a working set of notebooks.

This was fixed in ff205fd by renaming environment.yml to quakemigrate.yml. This then prompts repo2docker to use the PythonBuildPack, and consequently apply the pip install -e . command to build and install quakemigrate.

If repo2docker detects an environment.yml file, it uses the CondaBuildPack and ignores setup.py etc., therefore failing to build the package. See https://repo2docker.readthedocs.io/en/latest/usage.html and https://repo2docker.readthedocs.io/en/latest/config_files.html#setup-py-install-python-packages and (most usefully) https://discourse.jupyter.org/t/an-unfolding-story-of-my-first-contribution-to-repo2docker/839/4 for info & context.

Note also that a temporary workaround is to just run pip install quakemigrate in the notebook before attempting to run the first cell. This installs successfully, allowing the notebook to run.

Data warnings over day lines

Currently getting the following error when performing detect when the start time and end time span a dateline:

Processing : 2018-04-11T23:55:46.900000Z - 2018-04-11T23:59:04.000000Z
Processing : 2018-04-11T23:57:46.900000Z - 2018-04-12T00:01:04.000000Z

RuntimeWarning: invalid value encountered in true_divide
  dsnr = np.exp((dsnr / (len(avail_idx) * 2)) - 1.0)
RuntimeWarning: invalid value encountered in true_divide
  map_ = map_ / sum_coa[np.newaxis, np.newaxis, np.newaxis, :]
RuntimeWarning: invalid value encountered in greater
  dsnr[dsnr > 21474.] = 21474.

I'll do a bit of digging, just wanted to flag it here.

units

Units should be in km not metres.

Issue Import QuakeMigrate in Python (AttributeError)

Hi,

I'm trying to install QuakeMigrate on my Debian GNU/Linux 10 machine, following the installation docs on the website.

I have managed to create an environment and activate it, and the installation from source also seems to have succeeded. However, when I try to import quakemigrate in a Python session, I get the following error:

"AttributeError: 'numpy.int64' object has no attribute 'split'"

Here is the output from my Python session:

(quakemigrate) [peltierp@ige-osugb-s-242]:~/Documents/QuakeMigrate-master$ python
Python 3.8.13 | packaged by conda-forge | (default, Mar 25 2022, 06:04:10)
[GCC 10.3.0] on linux
Type "help", "copyright", "credits" or "license" for more information.

>>> import quakemigrate
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/peltierp/Documents/QuakeMigrate-master/quakemigrate/__init__.py", line 16, in <module>
    from quakemigrate.io.data import Archive  # NOQA
  File "/home/peltierp/Documents/QuakeMigrate-master/quakemigrate/io/__init__.py", line 31, in <module>
    from .availability import read_availability, write_availability  # NOQA
  File "/home/peltierp/Documents/QuakeMigrate-master/quakemigrate/io/availability.py", line 15, in <module>
    from obspy import UTCDateTime
  File "/home/peltierp/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/obspy/__init__.py", line 39, in <module>
    from obspy.core.utcdatetime import UTCDateTime  # NOQA
  File "/home/peltierp/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/obspy/core/__init__.py", line 124, in <module>
    from obspy.core.utcdatetime import UTCDateTime  # NOQA
  File "/home/peltierp/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/obspy/core/utcdatetime.py", line 27, in <module>
    from obspy.core.util.deprecation_helpers import ObsPyDeprecationWarning
  File "/home/peltierp/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/obspy/core/util/__init__.py", line 27, in <module>
    from obspy.core.util.base import (ALL_MODULES, DEFAULT_MODULES,
  File "/home/peltierp/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/obspy/core/util/base.py", line 36, in <module>
    from obspy.core.util.misc import to_int_or_zero, buffered_load_entry_point
  File "/home/peltierp/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/obspy/core/util/misc.py", line 214, in <module>
    loadtxt(np.array([0]), ndmin=1)
  File "/home/peltierp/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/numpy/lib/npyio.py", line 1086, in loadtxt
    ncols = len(usecols or split_line(first_line))
  File "/home/peltierp/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/numpy/lib/npyio.py", line 977, in split_line
    line = line.split(comment, 1)[0]
AttributeError: 'numpy.int64' object has no attribute 'split'

  • Operating System: Debian GNU/ Linux 10
  • Python version: 3.8.13
  • QuakeMigrate version: 1.0.0

Do you have any idea what might have gone wrong?

Philémon

Re-write NonLinLoc class

Current workflow to read 3-D NonLinLoc LUT files:

  • read file ==> nlloc_load_file()
  • define LUT properties & projection ==> nlloc_regrid() OR nlloc_project_grid()
    ==> some redundancy here and nlloc_project_grid() does not currently work.

Solution:

  • make a new function to re-map grid parameters from the NonLinLoc .hdr file to LUT class variables etc. (the job done at the start of nlloc_regrid()), to be called at the end of nlloc_load_file().
  • AND/OR re-structure GRID3D / LUT / NonLinLoc classes, so that class inheritance etc. makes more sense.

np.int32 overflow

Currently, coalescence data are saved as ObsPy Stream objects output to miniSEED. Values are saved to 8 decimal places by first multiplying them by a factor of 1e8 (for coalescence and normalised coalescence) and 1e6 (for the X and Y locations of maximum coalescence) and then converting them to numpy.int32 objects. This data type has the capacity to store integers up to 2^31 − 1 (2,147,483,647), after which they will overflow. As such, if the coalescence value exceeds ~21.47 it will not be recorded correctly.

I propose switching to np.int64 objects. This data type will have no problem recording the true coalescence data.
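A quick demonstration of the overflow described above (illustrative only; the coalescence value is made up):

```python
import numpy as np

# A coalescence value above ~21.47, scaled by 1e8 as described above,
# exceeds the int32 maximum of 2**31 - 1 and cannot be stored correctly.
coa = np.array([25.0])
scaled = coa * 1e8  # 2.5e9 > 2_147_483_647

as_int64 = scaled.astype(np.int64)  # int64 holds the true value
print(int(as_int64[0]))             # 2500000000
print(int(scaled[0]) > np.iinfo(np.int32).max)  # True: would overflow int32
```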

Error when no earthquake in trigger step.

Describe the bug
Trigger code stops with an error when there aren't any earthquakes during a day.

To Reproduce
To reproduce the error, compute the trigger plot for a day without any earthquakes in the area, or use a higher detection threshold so that no earthquakes are detected. The error message says "local variable 'discarded' referenced before assignment" in ~/miniconda3/envs/quakemigrate/lib/python3.8/site-packages/quakemigrate/signal/trigger.py (line 299) when using the function trig.trigger.
I looked at it; the cause is at line 279:

    if candidate_events.empty:
        logging.info("\tNo events triggered at this threshold - try a "
                     "lower detection threshold.")
        events = candidate_events
    else:
        refined_events = self._refine_candidates(candidate_events)
        events = self._filter_events(refined_events, batchstart, batchend,
                                     region)
        discarded = refined_events[~refined_events.isin(events)].dropna()  # this variable is only defined here, not in the first branch
        logging.info(f"\n\t\t{len(events)} event(s) triggered within the "
                     f"specified region between {batchstart} \n\t\tand "
                     f"{batchend}")
        logging.info("\n\tWriting triggered events to file...")
        write_triggered_events(self.run, events, batchstart)

    if self.plot_trigger_summary:
        logging.info("\n\tPlotting trigger summary...")
        trigger_summary(events, batchstart, batchend, self.run,
                        self.marginal_window, self.min_event_interval,
                        threshold, self.normalise_coalescence, self.lut,
                        data, region, discarded,  # here is the problem
                        interactive=interactive_plot,
                        xy_files=self.xy_files,
                        plot_all_stns=self.plot_all_stns)             

If the code takes the first branch (no events triggered) and then reaches the plotting call, discarded is not defined and an error occurs.
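A minimal sketch of one possible fix (the function and column names here are hypothetical, not the actual QuakeMigrate code): bind discarded to an empty DataFrame before branching, so the later plotting call always has a value to pass.

```python
import pandas as pd

def triage_events(candidate_events):
    """Return (events, discarded); both are empty if nothing triggered."""
    discarded = pd.DataFrame()  # default, so it is always bound
    if candidate_events.empty:
        events = candidate_events
    else:
        # Placeholder "refinement": keep rows above some threshold.
        events = candidate_events[candidate_events["COA"] > 1.5]
        discarded = candidate_events.drop(events.index)
    return events, discarded

events, discarded = triage_events(pd.DataFrame(columns=["COA"]))
print(len(events), len(discarded))  # 0 0 -> safe to pass to the plotter
```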

Expected behavior
I would expect either a trigger summary plot without any earthquakes for that day, or for the plot to be skipped if it is not required for the locate step.

Desktop (please complete the following information):

  • Operating System: Ubuntu 20.04.3 LTS
  • Python version: 3.8.12
  • QuakeMigrate version: 1.0.0

Additional context
I am running it on a different computer (uranus), and I previously had a problem with the version of one function too.

BUG: Reading triggered events when using locate with partial days of data

Describe the bug
When running Locate for a window of time with start/end times that are not an integer number of days apart and that cross a dateline, the process of reading in triggered event files will fail (silently) for the final day. It will simply not read in the triggered event file for the last day, and will only locate events for Julian days before this.

For example, running Detect, Trigger, and Locate with the starttime/endtime:

starttime = "2018-121T22:00:00.0"
endtime = "2018-122T02:00:00.0"

Any events triggered between 22:00 and 00:00 will be successfully located, but any between 00:00 and 02:00 will not.

This arises in the `quakemigrate/io/triggered_events.py` file, at lines 53-66:

trigger_files = []
readstart = starttime
while readstart <= endtime:
    fstem = f"{run.name}_{readstart.year}_{readstart.julday:03d}"
    file = (fpath / f"{fstem}_TriggeredEvents").with_suffix(".csv")
    if file.is_file():
        trigger_files.append(file)
    else:
        logging.info(f"\n\t    Cannot find file: {fstem}")
    readstart += 86400
if len(trigger_files) == 0:
    raise util.NoTriggerFilesFound
events = pd.concat((pd.read_csv(f) for f in trigger_files),
                   ignore_index=True)

The readstart <= endtime test returns False for the final day if the hours/minutes of the starttime are later than those of the endtime.

This may also be an issue when reading the availability files and plotting during Trigger - I am investigating and will update this Issue accordingly.
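A sketch of one way the loop could be fixed (a stdlib `datetime` stand-in, not the shipped code): compare calendar dates rather than full timestamps, so the final partial day is still included.

```python
from datetime import datetime, timedelta

starttime = datetime(2018, 5, 1, 22, 0)  # 2018-121T22:00:00
endtime = datetime(2018, 5, 2, 2, 0)     # 2018-122T02:00:00

# Comparing full timestamps (readstart <= endtime) skips day 122, because
# 2018-122T22:00 > 2018-122T02:00. Comparing dates includes it.
days = []
readdate = starttime.date()
while readdate <= endtime.date():
    days.append(readdate.strftime("%Y_%j"))
    readdate += timedelta(days=1)

print(days)  # ['2018_121', '2018_122']: both Julian days are read
```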

To Reproduce
Run Detect and Trigger using a small sample of data over a dateline (start and end times either side of midnight, e.g. those above). Locate will successfully register events triggered before midnight, but not after.

Expected behavior
All events should be located, as triggered event files (which are stored as day files, named by Julian day) exist.

Desktop (please complete the following information):

  • Operating System: macOS Big Sur
  • Python version: 3.8
  • QuakeMigrate version: 1.0.1

Additional context
This was identified when running tests on seismic data from the HVO network.

Cross-platform support

Currently, all major development and usage has been within Linux operating systems. Whilst efforts have been made to ensure the code is cross-platform compatible, full, out-of-the-box support for Windows or macOS is not yet available. It has been possible to install and run QuakeMigrate on both of these platforms, but the process of doing so is not yet streamlined. Below is a brief overview of what is involved and what we need to do:

Windows
Compiling the C code requires a suitable compiler (such as GCC) to be installed. The file format used to link the compiled C code to Python differs from that used in a Unix-based operating system. This should be handled by the setup.py script.

  • Compiled, installed, and tested on Windows.

Linux
The majority of development has been in an Ubuntu environment (16.04 and 18.04).

  • Compiled, installed, and tested on Ubuntu 16.04

  • Compiled, installed, and tested on Ubuntu 18.04

  • Compiled, installed, and tested on Ubuntu 20.04

  • Compiled, installed, and tested on a RedHat system (v??)

Mac
Again, the main issue is installing a compiler. GCC needs to be installed via Homebrew before it can be used to compile the C code. After that, everything is broadly the same.

  • Compiled, installed, and tested on MacOS.

The most important thing, however, is to create a set of unit tests with which we can verify that the code not only runs on each platform, but produces the exact same result.

Fix icequake example.

The .pdf output doesn't look right. Compare to the previous working example on master for what it is expected to look like.

icequake is also misspelled in places, e.g. 'iceqauke.py'.

Continuous Integration

Guys - great-looking piece of code! I haven't tried it yet but will certainly take a look.

Your documentation mentions the lack of support for macOS and Windows platforms. You should consider using continuous-integration tools to automatically test compilation on those systems using virtual machines. There are a number of available CI tools (Travis CI is one of the standards) with really good documentation online, but if you need a place to start with Travis you can look here: https://github.com/nfsi-canada/OrientPy/blob/master/.travis.yml.

Your .travis.yml file will likely be more complicated due to the required installation of NonLinLoc from an external source (i.e., other than through pip or conda - but see before_install: in the Travis file).

PickTimes export to Obspy

I want to export the catalog created with QuakeMigrate locations to ObsPy. So far, so good. The problem is the picks in the exported file. When PickTime is unavailable (recorded as -1 in the picks files), it is replaced with the ModelledTime. I need only the PickTime.

To export the catalog, I use the following code:

from quakemigrate.export import to_obspy

catalogo = to_obspy.read_quakemigrate("/Volumes/SEISMIC/quakemigrate/8H_QNET/outputs/runs/run_name/", units="km", local_mag_ph="S")

catalogo.write('detections.xml', 'QUAKEML')

The following lines are an excerpt of the picks file for a given earthquake:

Station,Phase,ModelledTime,PickTime,PickError,SNR
PPJ,S,2020-01-01T01:45:51.071473Z,2020-01-01T01:45:52.204215Z,0.0805,14
QB03,P,2020-01-01T01:45:45.681704Z,2020-01-01T01:45:44.461308Z,0.0783,7.64

And the following lines are the corresponding output in XML format for the same event:

 <time>
          <value>2020-01-01T01:45:52.204215Z</value>
          <uncertainty>0.0805</uncertainty>
        </time>
        <waveformID networkCode="" stationCode="PPJ"></waveformID>
        <methodID>smi:local/autopick</methodID>
        <phaseHint>S</phaseHint>
        <ns0:snr>14.0</ns0:snr>
      </pick>
      <pick publicID="smi:local/d95eb323-4139-4072-b409-e4f270e85ff8">
        <time>
          <value>2020-01-01T01:45:45.681704Z</value>
        </time>
        <waveformID networkCode="" stationCode="QB03"></waveformID>
        <methodID>smi:local/modelled</methodID>
        <phaseHint>P</phaseHint>
      </pick>

To try to avoid this problem, I removed lines 228 and 229 of the to_obspy.py module; nonetheless, the problem (bug?) remains.
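As a possible workaround (a stdlib sketch, not part of QuakeMigrate; the tag names follow the excerpt above, with QuakeML namespaces omitted for brevity), picks whose methodID marks them as modelled could be stripped from the exported XML after the fact:

```python
import xml.etree.ElementTree as ET

# A toy excerpt mirroring the structure shown above: one autopick, one
# modelled "pick".
xml_text = """<event>
  <pick publicID="smi:local/pick1"><methodID>smi:local/autopick</methodID></pick>
  <pick publicID="smi:local/pick2"><methodID>smi:local/modelled</methodID></pick>
</event>"""

root = ET.fromstring(xml_text)
for pick in list(root.findall("pick")):
    if pick.findtext("methodID") == "smi:local/modelled":
        root.remove(pick)  # drop modelled "picks", keep real autopicks

print([p.get("publicID") for p in root.findall("pick")])  # only pick1 remains
```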

macOS gcc compiler issues during installation (path to `as` -- possibly related to previous macPorts install?)

Describe the bug
After the creation of the conda environment, the command $ pip install . fails to install quakemigrate.
Main error is:
error: command 'gcc' failed with exit status 1
See attachment for the complete error screendump.

To Reproduce

  • Create the conda environment with the prerequisites specified in quakemigrate.yml
    $ conda env create -f quakemigrate.yml
  • Activate environment and run
    $ pip install .

Expected behavior
QuakeMigrate should be correctly installed.

Desktop (please complete the following information):

  • MacOS: 11.5.2
  • Python version: 3.8.12
  • QuakeMigrate version: latest

Additional context
GCC is installed:

$ gcc -v
Using built-in specs.
COLLECT_GCC=gcc
COLLECT_LTO_WRAPPER=/usr/local/libexec/gcc/x86_64-apple-darwin14.4.0/5.1.0/lto-wrapper
Target: x86_64-apple-darwin14.4.0
Configured with: ../gcc-5.1.0/configure --enable-languages=c++,fortran
Thread model: posix
gcc version 5.1.0 (GC

error1.txt

Scan - External Onset Functions

Currently, the internal compute routine determines the onset functions using a classic STA/LTA approach, then back-propagates these values to determine a coalescence value.

The addition of external onset functions would allow the incorporation of external machine-learning packages (e.g. GPD; Ross et al., 2018) and other onset functions, e.g. kurtosis.
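For reference, a classic STA/LTA onset function can be sketched in a few lines of NumPy (illustrative only; QuakeMigrate's compiled implementation differs):

```python
import numpy as np

def sta_lta(signal, nsta, nlta):
    """Ratio of short-term to long-term average of the squared signal."""
    energy = np.asarray(signal, dtype=float) ** 2
    csum = np.concatenate(([0.0], np.cumsum(energy)))
    sta = (csum[nsta:] - csum[:-nsta]) / nsta  # short-term averages
    lta = (csum[nlta:] - csum[:-nlta]) / nlta  # long-term averages
    n = min(len(sta), len(lta))
    # Align both windows at their trailing edges; guard against /0.
    return sta[-n:] / np.maximum(lta[-n:], 1e-12)

# A step in amplitude produces a clear peak in the onset function.
x = np.concatenate([np.ones(100), 10 * np.ones(100)])
onset = sta_lta(x, nsta=5, nlta=50)
print(onset.max() > 5)  # True: the arrival stands out against the noise
```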

error in compute_traveltimes

When I run compute_traveltimes with the "homogeneous" method it works perfectly, but when I use one of the other methods I get an error calculating the traveltimes for the S phase. It's very strange, since it computes the traveltimes for the first station and the error shows up on the second one. The output and error are the following:

Computing 1-D nlloc traveltimes for...
	...phase: P...
		...running Grid2Time - station: ADH   - 1 of 13
		...running Grid2Time - station: CALA  - 2 of 13
		...running Grid2Time - station: HOR   - 3 of 13
		...running Grid2Time - station: PAGU  - 4 of 13
		...running Grid2Time - station: PCAN  - 5 of 13
		...running Grid2Time - station: PCED  - 6 of 13
		...running Grid2Time - station: PGRA  - 7 of 13
		...running Grid2Time - station: PICO  - 8 of 13
		...running Grid2Time - station: PID   - 9 of 13
		...running Grid2Time - station: PMAN  - 10 of 13
		...running Grid2Time - station: PPNO  - 11 of 13
		...running Grid2Time - station: PSCM  - 12 of 13
		...running Grid2Time - station: ROSA  - 13 of 13
	...phase: S...
		...running Grid2Time - station: ADH   - 1 of 13
		...running Grid2Time - station: CALA  - 2 of 13
Traceback (most recent call last):
  File "/Users/joaofontiela/Documents/PycharmProjects/Earthquake_detection/sjo_teste.py", line 172, in <module>
    lut = compute_traveltimes(grid_spec, stations, method="1dnlloc", vmod=vmodel,
  File "/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/lib/python3.8/site-packages/quakemigrate/lut/create_lut.py", line 212, in compute_traveltimes
    _compute_1d_nlloc(lut, phase, vmodel, **kwargs)
  File "/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/lib/python3.8/site-packages/quakemigrate/lut/create_lut.py", line 455, in _compute_1d_nlloc
    out = check_output([str(nlloc_path / mode),
  File "/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/lib/python3.8/subprocess.py", line 415, in check_output
    return run(*popenargs, stdout=PIPE, timeout=timeout, check=True,
  File "/Users/joaofontiela/opt/anaconda3/envs/quakemigrate/lib/python3.8/subprocess.py", line 516, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['Vel2Grid', 'control.in']' returned non-zero exit status 252.

My system is macOS 12.4, with conda version 4.12.0, conda-build version 3.21.4, Python version 3.8.11, and QuakeMigrate 1.0.0.

No files found in archive for this time period: no detection

Hi,

Describe the bug
I want to run the program for a small array of stations, but the detection step doesn't work. It says: "No files found in archive for this time period", despite the fact that I have put my miniSEED files in the correct directory and used the right time window.

To Reproduce
Here is my code :

station_file="/home/peltierp/Documents/Data/QuakeMigrate/Inputs/Astrolabe_stations.txt"
data_in = "/home/peltierp/Documents/Data/QuakeMigrate/Inputs/MSEED"
lut_out = "/home/peltierp/Documents/Data/QuakeMigrate/Outputs/icequake_Astrolabe.LUT"
run_path = "/home/peltierp/Documents/Data/QuakeMigrate/Outputs/runs"
run_name = "icequake_example_Astrolabe"

cproj = Proj(proj="longlat", ellps="WGS84", datum="WGS84", no_defs=True)
gproj = Proj(proj="lcc", lon_0=139.93279, lat_0=-66.73208, lat_1=-66.72, lat_2=-66.74650,
             datum="WGS84", ellps="WGS84", units="km", no_defs=True)

ll_corner = [139.85, -66.75, -0.1]
ur_corner = [140.05, -66.71, 0.8]
node_spacing = [0.02, 0.02, 0.02]

# --- Define the grid specifications ---
grid_spec = AttribDict()
grid_spec.ll_corner = ll_corner
grid_spec.ur_corner = ur_corner
grid_spec.node_spacing = node_spacing
grid_spec.grid_proj = gproj
grid_spec.coord_proj = cproj

# --- Read in the station information file ---
stations = read_stations(station_file)

# --- Homogeneous LUT generation ---
lut = compute_traveltimes(grid_spec, stations, method="homogeneous", phases=["P", "S"],
                          vp=3.6, vs=1.8, log=True,
                          save_file=lut_out)

# --- Create new Archive and set path structure ---
archive = Archive(archive_path=data_in, stations=stations,
                  archive_format="YEAR/JD/*_STATION_*")

# --- Create new Onset ---
onset = STALTAOnset(position="classic", sampling_rate=200)
onset.phases = ["P", "S"]
onset.bandpass_filters = {
    "P": [5, 20, 4],
    "S": [5, 20, 4]}
onset.sta_lta_windows = {
    "P": [0.01, 0.25],
    "S": [0.05, 0.5]}


# --- Create new QuakeScan ---
scan = QuakeScan(archive, lut, onset=onset, run_path=run_path,
                 run_name=run_name, log=True, loglevel="info")

# --- Set detect parameters ---
scan.timestep = 0.75
# NOTE: please increase the thread-count as your system allows; the
# core migration routines are compiled against OpenMP, and using
# multithreading will ~ linearly speed up the compute time!
scan.threads = 1

# --- Set time period over which to run detect ---
starttime = "2022-01-23T05:05:45.0"
endtime = "2022-01-23T05:06:00.0"

# --- Run detect ---
scan.detect(starttime, endtime)

Expected behavior
I would expect the program to be able to find those miniSEED files, which are supposed to be in the archive. They are hourly files with names like 2022.01.23.05.AST04.00.ZR.HHZ.mseed (the HHZ channel of station AST04 on 23/01/2022 at 5 am), located at: /home/peltierp/Documents/Data/QuakeMigrate/Inputs/MSEED/2022/023/.

Desktop (please complete the following information):

  • Operating System: Debian GNU/ Linux 10
  • Python version: 3.8.13
  • QuakeMigrate version: 1.0.0

Do you have an idea of what could be the problem here?

Philémon

Overlapping waveforms in locate plot summary

Hi,

Describe the bug
It is not really a bug. I'm dealing with an array of 4 to 6 stations, so the output PDF file contains a plot with 4 to 6 waveforms corresponding to the event. If a seismic event is close to one station, the amplitude differs greatly from one waveform to another (sometimes 10 times larger), and thus the waveform plot is not really readable (overlapping waveforms); the scaling seems to be based on the smallest-amplitude waveform rather than applied to each waveform individually. I tried to find some info in the quakemigrate.plot docs but couldn't.

Do you have an idea where I could find some parameters that could be changed to fix this issue?

To Reproduce
Just using the different steps of QM and reading the output pdf file of locate.

Expected behavior
I would like to adjust the scales (y values) of each waveform in the output pdf file of locate.

Desktop (please complete the following information):

Operating System: Debian GNU/ Linux 10
Python version: 3.8.13
QuakeMigrate version: 1.0.0

Additional context
Please find attached a pdf file as an example.

Phil

icequake_example_Astrolabe_20220115021309825_EventSummary.pdf

BUG: Add "units" arg to to_mfast.py

Currently to_mfast.py assumes QM outputs event depths and uncertainties in metres, but this has now changed to km.

Add a 'units' arg (see to_obspy.py).

[Feat] Lightweight Event/Timestep objects

Issue statement
Currently, information pertaining to an event or timestep is stored across a range of objects. Some objects, such as the Archive, are being used to perform tasks that (I feel) are not within their intended scope.

Example: the purpose of the Archive class is to provide an interface with the raw data. The Archive can be queried, and the data that satisfy the criteria of this query are served up to the user (perhaps after some pre-processing). The instance of the Archive should remain unchanged by any given query. Currently, however, the queried data are stored on the Archive object to be accessed later, meaning the Archive object has to be passed through the rest of the QuakeMigrate workflow.

Proposition
Build two small, lightweight classes that will create objects designed to capture information about a Timestep (for detect) or an Event (for locate). When the Archive object is queried, it will construct an instance of one of these classes, populate it with the required information, and serve it up. We are avoiding the use of the ObsPy Event, Stream, and Trace classes here as, while they meet our requirements, they come with a large amount of additional functionality that we just do not need.
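The sort of container envisaged might look something like this (a hedged sketch; the class and field names are illustrative, not a final API):

```python
from dataclasses import dataclass, field

@dataclass
class Timestep:
    """Lightweight container served up by an Archive query for detect."""
    starttime: str
    endtime: str
    waveforms: dict = field(default_factory=dict)  # station -> samples

@dataclass
class Event:
    """Lightweight container served up by an Archive query for locate."""
    uid: str
    origin_time: str
    waveforms: dict = field(default_factory=dict)

# The Archive would populate and return one of these per query,
# remaining stateless itself.
ts = Timestep("2020-03-25T00:00:00", "2020-03-25T00:02:00")
ts.waveforms["ST01"] = [0.1, 0.2, 0.3]
print(ts.starttime, sorted(ts.waveforms))
```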

Result
Cleaner, more future-proof code.

Reach
Package-wide, albeit purely internal, changes.

ImportError: Could not load extension library 'qmlib'.

I installed quakemigrate following the installation instructions, but I still encounter a problem:

>>> import quakemigrate
Traceback (most recent call last):
  File "/Users/guanlingpeng/QuakeMigrate/quakemigrate/core/libnames.py", line 39, in _load_cdll
    cdll = ctypes.CDLL(str(lib))
  File "/Users/guanlingpeng/anaconda3/envs/quakemigrate/lib/python3.7/ctypes/__init__.py", line 364, in __init__
    self._handle = _dlopen(self._name, mode)
OSError: dlopen(/Users/guanlingpeng/QuakeMigrate/quakemigrate/core/src/qmlib.cpython-37m-darwin.so, 0x0006): tried: '/Users/guanlingpeng/QuakeMigrate/quakemigrate/core/src/qmlib.cpython-37m-darwin.so' (no such file), '/usr/local/lib/qmlib.cpython-37m-darwin.so' (no such file), '/usr/lib/qmlib.cpython-37m-darwin.so' (no such file)

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/Users/guanlingpeng/QuakeMigrate/quakemigrate/__init__.py", line 18, in <module>
    from quakemigrate.signal import QuakeScan, Trigger  # NOQA
  File "/Users/guanlingpeng/QuakeMigrate/quakemigrate/signal/__init__.py", line 20, in <module>
    from .scan import QuakeScan  # NOQA
  File "/Users/guanlingpeng/QuakeMigrate/quakemigrate/signal/scan.py", line 23, in <module>
    from quakemigrate.core import find_max_coa, migrate
  File "/Users/guanlingpeng/QuakeMigrate/quakemigrate/core/__init__.py", line 20, in <module>
    from .lib import migrate, find_max_coa  # NOQA
  File "/Users/guanlingpeng/QuakeMigrate/quakemigrate/core/lib.py", line 20, in <module>
    qmlib = _load_cdll("qmlib")
  File "/Users/guanlingpeng/QuakeMigrate/quakemigrate/core/libnames.py", line 45, in _load_cdll
    raise ImportError(msg)
ImportError: Could not load extension library 'qmlib'.

dlopen(/Users/guanlingpeng/QuakeMigrate/quakemigrate/core/src/qmlib.cpython-37m-darwin.so, 0x0006): tried: '/Users/guanlingpeng/QuakeMigrate/quakemigrate/core/src/qmlib.cpython-37m-darwin.so' (no such file), '/usr/local/lib/qmlib.cpython-37m-darwin.so' (no such file), '/usr/lib/qmlib.cpython-37m-darwin.so' (no such file)

If you have chosen to install from a clone of the github repository, please ensure you have run 'python setup.py install', which will compile and install the C library. See the installation documentation for more details.

===================================
I don't know how to solve it.

Trigger: Handle case where no events are triggered

Describe the bug
When running Trigger I receive an UnboundLocalError related to the local variable 'discarded'. The only difference between my script and that of the example is that I have removed the optional region argument from the trigger call.

To Reproduce

(quakemigrate_dev) tg286@circinus:/space/tg286/indonesia/ocean_bottom_data/qmigrate$ python ./trigger.py 
==============================================================================================================
==============================================================================================================
	QuakeMigrate RUN - Path: outputs/runs/test_run - Name: test_run
==============================================================================================================
==============================================================================================================

==============================================================================================================
	TRIGGER - Triggering events from .scanmseed
==============================================================================================================

	Triggering events from 2020-03-25T00:00:00.000000Z to 2020-03-26T00:00:00.000000Z

	Trigger parameters:
		Pre/post pad = 120.0 s
		Marginal window = 1.0 s
		Minimum event interval  = 60.0 s

		Triggering from the normalised maximum coalescence trace.

		Trigger threshold method: static
		Static threshold = 1.45

==============================================================================================================
	Reading in .scanmseed...

	    No .scanmseed file found for day 2020_084!

	    No .scanmseed file found for day 2020_086!
	    Warning! No .scanmseed data found for pre-pad!
	    Warning! No .scanmseed data found for post-pad!
	    ...from 2020-03-25T00:00:00.000000Z - 2020-03-25T23:59:59.900000Z.

	Triggering events...
	No events triggered at this threshold - try a lower detection threshold.

	Plotting trigger summary...
Traceback (most recent call last):
  File "./trigger.py", line 58, in <module>
    trig.trigger(starttime, endtime, interactive_plot=False)
  File "/space/tg286/quake_migrate/tim_area/QuakeMigrate/quakemigrate/signal/trigger.py", line 244, in trigger
    self._trigger_batch(batchstart, batchend, region, interactive_plot)
  File "/space/tg286/quake_migrate/tim_area/QuakeMigrate/quakemigrate/signal/trigger.py", line 299, in _trigger_batch
    data, region, discarded,
UnboundLocalError: local variable 'discarded' referenced before assignment

Expected behavior

I would have expected trigger to run

Desktop (please complete the following information):

  • Operating System: Ubuntu
  • Python version: 3
  • QuakeMigrate version: 1.0

Additional context
Add any other context about the problem here.

TEST FAILURE: dike intrusion example

Describe the bug
The benchmark test for the Volcanotectonic_Iceland example fails (specifically test_locate()).

To Reproduce
Run the examples, then run test_benchmarks.py

Expected behavior

======================================================================
FAIL: test_locate (__main__.TestExamples)
Check the outputs of locate.
----------------------------------------------------------------------
Traceback (most recent call last):
  File "test_benchmarks.py", line 146, in test_locate
    pd.testing.assert_frame_equal(pd.read_csv(b_event),
  File "/Applications/anaconda3/envs/quakemigrate/lib/python3.8/site-packages/pandas/_testing.py", line 1372, in assert_frame_equal
    assert_series_equal(
  File "/Applications/anaconda3/envs/quakemigrate/lib/python3.8/site-packages/pandas/_testing.py", line 1186, in assert_series_equal
    _testing.assert_almost_equal(
  File "pandas/_libs/testing.pyx", line 65, in pandas._libs.testing.assert_almost_equal
  File "pandas/_libs/testing.pyx", line 174, in pandas._libs.testing.assert_almost_equal
  File "/Applications/anaconda3/envs/quakemigrate/lib/python3.8/site-packages/pandas/_testing.py", line 915, in raise_assert_detail
    raise AssertionError(msg)
AssertionError: DataFrame.iloc[:, 19] (column name="ML") are different

DataFrame.iloc[:, 19] (column name="ML") values are different (100.0 %)
[left]:  [0.865]
[right]: [0.863]

----------------------------------------------------------------------
Ran 4 tests in 0.692s

FAILED (failures=1)

Desktop (please complete the following information):

  • Operating System: any
  • Python version: any
  • QuakeMigrate version: 1.0.0

Additional context
This is caused by a package update and is an ~8th-decimal-place issue. We will update the test soon; in the meantime, please ignore this error.
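For anyone wanting to run the benchmark locally in the meantime, a comparison with an explicit tolerance would absorb this kind of floating-point drift (sketch using the values from the failure output above; the `atol` parameter is available in pandas >= 1.1):

```python
import pandas as pd

left = pd.DataFrame({"ML": [0.865]})
right = pd.DataFrame({"ML": [0.863]})

# The default comparison (rtol=1e-5) fails on these values; a wider
# absolute tolerance makes the check pass while still catching
# genuinely different magnitudes.
pd.testing.assert_frame_equal(left, right, check_exact=False, atol=5e-3)
```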

FEAT: Make picking optional.

Make it so users can choose whether to make picks (i.e. picker=False or picker=None when instantiating the QuakeScan object).

[BUG]: skfmm LUT generation doesn't work for the southern hemisphere

Describe the bug
Attempted to use the "1dfmm" method to calculate an LUT for a grid in the southern hemisphere (Java) - the traveltimes are clearly wrong. Suspect they may be mirrored along one axis; the y axis seems most likely.

To Reproduce
Specify a grid that is in the southern hemisphere (i.e. a pair of negative latitudes - note the grid in question was also in the eastern hemisphere; not yet tested whether there would be any, or different, problems in the south-west quadrant), and calculate traveltimes using the "1dfmm" method. Manual inspection of traveltimes from various points in the LUT shows obviously incorrect results (e.g. the most distant stations sometimes have the shortest traveltimes).

Expected behavior
Traveltimes increasing with distance from the chosen location. Note that several tests were performed to isolate this to the traveltime generation step, including:

  • using the same gridspec, station file etc. but with 'homogeneous' velocity model --> traveltimes increase with distance
  • inspection of grid coordinates of stations (i.e. do they correspond to expected locations)
  • plotting of velocity model to ensure it wasn't messed up somehow

Desktop (please complete the following information):

  • Operating System: Windows 11
  • Python version: 3.7
  • QuakeMigrate version: 1.0.0

Additional context
Not yet compared to the "1dnlloc" traveltime generation; however, that method has been used before in the same region with no such difficulties, which indicates the problem is likely isolated to the "1dfmm" implementation.
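One of the sanity checks described above (traveltimes increasing with distance) is easy to script. A sketch, assuming a homogeneous medium so traveltime should scale with index-space distance; function and grid names are illustrative, not the QuakeMigrate API:

```python
import numpy as np

def traveltimes_increase_with_distance(tt_grid, source_idx):
    """Check traveltimes are non-decreasing with distance from the source."""
    idx = np.indices(tt_grid.shape)
    dist = np.sqrt(sum((i - s) ** 2 for i, s in zip(idx, source_idx)))
    order = np.argsort(dist.ravel())
    tt_sorted = tt_grid.ravel()[order]
    return bool(np.all(np.diff(tt_sorted) >= -1e-9))

# Homogeneous medium at 3 km/s: traveltime is just distance / velocity
x, y, z = np.meshgrid(*(np.arange(20),) * 3, indexing="ij")
tt = np.sqrt(x**2 + y**2 + z**2) / 3.0
print(traveltimes_increase_with_distance(tt, (0, 0, 0)))          # passes

# A grid mirrored along the y axis (the suspected failure mode) fails
print(traveltimes_increase_with_distance(tt[:, ::-1, :], (0, 0, 0)))
```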

BUG: network fix

There was a small bug in the latest changes that allow networks to be specified; this fixes it. Stay tuned for a more complete fix using the full NSLC codes.

NumPy multi-threading uses all available threads

Issue

In its default state, NumPy will use all available threads for certain functions (those backed by multi-threaded C libraries). Which library NumPy is built against depends on the user's system architecture, amongst other factors, so it is difficult to provide a catch-all solution to control the number of threads used (see numpy/numpy#11826 and numpy/numpy#16990).

The user may notice this while monitoring CPU usage with programs such as htop while running QuakeMigrate; CPU usage may spike above the user-specified number of threads.

Short term solution

For the moment, we have added snippets at the top of the example scripts to set all potentially relevant environment variables before NumPy is imported. This is the only simple, guaranteed way to control the number of threads used. However, it only works if the environment variables are set before NumPy is imported for the first time; for example, if you have already imported an ObsPy function before running the snippet, it will not work.
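A snippet of this kind might look as follows (the exact set of variables in the example scripts may differ; which one takes effect depends on the BLAS backend NumPy was built against):

```python
import os

# These must be set before the first `import numpy` anywhere in the
# process; which variable is honoured depends on whether NumPy was
# compiled against OpenBLAS, MKL, Accelerate, etc.
os.environ["OMP_NUM_THREADS"] = "1"
os.environ["OPENBLAS_NUM_THREADS"] = "1"
os.environ["MKL_NUM_THREADS"] = "1"
os.environ["VECLIB_MAXIMUM_THREADS"] = "1"
os.environ["NUMEXPR_NUM_THREADS"] = "1"

import numpy as np  # noqa: E402 - deliberately imported after os.environ
```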

Permanent fix

In the future, we will look to provide a more elegant solution. The route recommended by NumPy is to use the threadpoolctl package to wrap all relevant functions. This will also provide the added benefit of allowing us to use the user-specified n_threads variable to set the number of threads used, rather than limiting it to 1. However, we note that in our own tests this led to minimal speed-up (due to the nature of the functions being used).
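As a sketch, the threadpoolctl route would look something like this (assuming the threadpoolctl package is installed; n_threads stands in for the user-specified parameter):

```python
import numpy as np
from threadpoolctl import threadpool_limits

n_threads = 4  # stand-in for the user-specified QuakeScan parameter

# Within this context, the BLAS/OpenMP thread pools NumPy uses are
# capped at n_threads; the previous limits are restored on exit.
with threadpool_limits(limits=n_threads):
    a = np.random.rand(200, 200)
    covariance = a @ a.T
```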

Location

Hello, guys:

When I run the locate part of icequake.py, it raises a FileNotFoundError. Do you know what the reason for this problem is, and how can I fix it?

Thank you very much.

The error output follows:

========================================================================================================================
LOCATE - Determining earthquake location and uncertainty

    Parameters specified:
            Start time                = 2014-06-29T18:41:55.000000Z
            End   time                = 2014-06-29T18:42:20.000000Z
            Number of CPUs            = 12

========================================================================================================================

Traceback (most recent call last):
  File "icequake.py", line 142, in <module>
    scan.locate(starttime, endtime)
  File "/home1/yangyg/Tools/Anaconda/anaconda/envs/QuakeMigrate/lib/python3.6/site-packages/QMigrate-1.0.20.6.14-py3.6.egg/QMigrate/signal/scan.py", line 622, in locate
    self._locate_events(start_time, end_time)
  File "/home1/yangyg/Tools/Anaconda/anaconda/envs/QuakeMigrate/lib/python3.6/site-packages/QMigrate-1.0.20.6.14-py3.6.egg/QMigrate/signal/scan.py", line 840, in _locate_events
    trig_events = self.output.read_triggered_events(start_time, end_time)
  File "/home1/yangyg/Tools/Anaconda/anaconda/envs/QuakeMigrate/lib/python3.6/site-packages/QMigrate-1.0.20.6.14-py3.6.egg/QMigrate/io/quakeio.py", line 434, in read_triggered_events
    events = pd.read_csv(fname)
  File "/home1/yangyg/Tools/Anaconda/anaconda/envs/QuakeMigrate/lib/python3.6/site-packages/pandas/io/parsers.py", line 646, in parser_f
    return _read(filepath_or_buffer, kwds)
  File "/home1/yangyg/Tools/Anaconda/anaconda/envs/QuakeMigrate/lib/python3.6/site-packages/pandas/io/parsers.py", line 389, in _read
    parser = TextFileReader(filepath_or_buffer, **kwds)
  File "/home1/yangyg/Tools/Anaconda/anaconda/envs/QuakeMigrate/lib/python3.6/site-packages/pandas/io/parsers.py", line 730, in __init__
    self._make_engine(self.engine)
  File "/home1/yangyg/Tools/Anaconda/anaconda/envs/QuakeMigrate/lib/python3.6/site-packages/pandas/io/parsers.py", line 923, in _make_engine
    self._engine = CParserWrapper(self.f, **self.options)
  File "/home1/yangyg/Tools/Anaconda/anaconda/envs/QuakeMigrate/lib/python3.6/site-packages/pandas/io/parsers.py", line 1390, in __init__
    self._reader = _parser.TextReader(src, **kwds)
  File "pandas/parser.pyx", line 373, in pandas.parser.TextReader.__cinit__ (pandas/parser.c:4184)
  File "pandas/parser.pyx", line 667, in pandas.parser.TextReader._setup_parser_source (pandas/parser.c:8449)
FileNotFoundError: File b'outputs/runs/icequake_example/icequake_example_TriggeredEvents.csv' does not exist
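For what it's worth, this error usually means the trigger stage has not yet been run, or was run under a different run path/name, so the triggered-events CSV locate() expects does not exist. A simple pre-flight check, using the path from the traceback above:

```python
from pathlib import Path

run_dir = Path("outputs/runs/icequake_example")
trig_file = run_dir / "icequake_example_TriggeredEvents.csv"

# locate() reads the triggered-events CSV written by the trigger stage;
# if it is missing, run detect and trigger first, and make sure the run
# path and name passed to locate match those used for trigger.
if not trig_file.is_file():
    print(f"{trig_file} not found - run the trigger stage first, "
          "and check the run path and name match.")
```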

Issue with locate - Sizes of output from _compute do not match

I have detected and triggered events, but for some reason locate is not calculating the coalescence correctly. I've checked the inputs to the line that throws the error, and it is due to the datetime stamp array being one item longer than the other outputs. Has anyone seen this before? I don't want to try to fix it if it has already been fixed.

See error message below.

Many thanks,
Tom

Here is the error message:

========================================================================================================================
QuakeMigrate - Coalescence Scanning - Path: ../outputs/runs - Name: dynamic_trigger_testing

========================================================================================================================

========================================================================================================================
LOCATE - Determining earthquake location and uncertainty

Parameters specified:
	Start time                = 2014-06-29T18:00:00.000000Z
	End   time                = 2014-06-29T19:00:00.000000Z
	Number of CPUs            = 12

========================================================================================================================

========================================================================================================================
EVENT - 802 of 46 - 20140629180029114000

Determining event location...

Reading waveform data...
	Elapsed time: 2.424772 seconds.

Computing 4D coalescence grid...

Traceback (most recent call last):
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/pandas/core/internals.py", line 4247, in create_block_manager_from_blocks
    placement=slice(0, len(axes[0])))]
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/pandas/core/internals.py", line 2685, in make_block
    return klass(values, ndim=ndim, fastpath=fastpath, placement=placement)
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/pandas/core/internals.py", line 1817, in __init__
    placement=placement, **kwargs)
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/pandas/core/internals.py", line 109, in __init__
    len(self.mgr_locs)))
ValueError: Wrong number of items passed 1, placement implies 5

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "dynamic_trigger_testing.py", line 218, in <module>
    run_locate(data_in, station_file, lut_out, out_path, run_name, locate_sampling_rate, p_bp_filter, s_bp_filter, p_onset_win, s_onset_win, time_step, locate_decimation, marginal_window=marginal_window, n_cores=n_cores, locate_onset_centred=locate_onset_centred)
  File "dynamic_trigger_testing.py", line 190, in run_locate
    scan.locate(starttime, endtime)
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/QMigrate-1.0.20.1.16-py3.6.egg/QMigrate/signal/scan.py", line 622, in locate
    self._locate_events(start_time, end_time)
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/QMigrate-1.0.20.1.16-py3.6.egg/QMigrate/signal/scan.py", line 884, in _locate_events
    columns=["DT", "COA", "X", "Y", "Z"])
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/pandas/core/frame.py", line 297, in __init__
    copy=copy)
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/pandas/core/frame.py", line 474, in _init_ndarray
    return create_block_manager_from_blocks([values], [columns, index])
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/pandas/core/internals.py", line 4256, in create_block_manager_from_blocks
    construction_error(tot_items, blocks[0].shape[1:], axes, e)
  File "/Users/eart0504/opt/anaconda3/envs/Qmigrate_tsh_dev/lib/python3.6/site-packages/pandas/core/internals.py", line 4233, in construction_error
    passed, implied))
ValueError: Shape of passed values is (1, 5), indices imply (5, 5)

BUG: 'discarded' events not being plotted in Trigger summary

Describe the bug
'Discarded' events (i.e. those which occur outside the user-specified 'region') are not being plotted on the Trigger summary plot. They should appear as grey points on the map & cross-sections, and as grey bands on the coalescence timeseries panels.

To Reproduce
Run the Volcanotectonic_Iceland example with default settings and look at the trigger summary plot. The final coalescence peak that exceeds the threshold (at ~09:52) is not annotated with grey axvspans, nor shown on the map.

Expected behavior
Grey points and bars illustrating these rejected events.

Desktop (please complete the following information):

  • Operating System: macOS 11.4 Big Sur (x86_64)
  • Python version: 3.9.16
  • QuakeMigrate version: 1.0.1

Additional context
Debugging shows this is due to L288 of quakemigrate/signal/trigger.py no longer working as intended. The events rejected by Trigger._filter_events() are being removed by the .dropna(). I think this is another quirk of the changes to pandas' default dtype strategy for DataFrames.

The comparison (without .dropna()) outputs the following:

    EventID  CoaTime  TRIG_COA  COA_X  COA_Y  COA_Z  MinTime  MaxTime    COA  COA_NORM  EventNum
0     False    False     False  False  False  False    False    False  False     False     False
1     False    False     False  False  False  False    False    False  False     False     False
2     False    False     False  False  False  False    False    False  False     False     False
3     False    False     False  False  False  False    False    False  False     False     False
4     False    False     False  False  False  False    False    False  False     False     False
5     False    False     False  False  False  False    False    False  False     False     False
6     False    False     False  False  False  False    False    False  False     False     False
7     False    False     False  False  False  False    False    False  False     False     False
8     False    False     False  False  False  False    False    False  False     False     False
9     False    False     False  False  False  False    False    False  False     False     False
10    False    False     False  False  False  False    False    False  False     False     False
11    False    False     False  False  False  False    False    False  False     False     False
12    False    False     False  False  False  False    False    False  False     False     False
13    False    False     False  False  False  False    False    False  False     False     False
14    False    False     False  False  False  False    False    False  False     False     False
15    False    False     False  False  False  False    False    False  False     False     False
16    False    False     False  False  False  False    False    False  False     False     False
17    False    False     False  False  False  False    False    False  False     False     False
18    False    False     False  False  False  False    False    False  False     False     False
19    False    False     False  False  False  False    False    False  False     False     False
20    False    False     False  False  False  False    False    False  False     False     False
21    False    False     False  False  False  False    False    False  False     False     False
22    False    False     False  False  False  False    False    False  False     False     False
23    False    False     False  False  False  False    False    False  False     False     False
24    False    False     False  False  False  False    False    False  False     False     False
25    False    False     False  False  False  False    False    False  False     False     False
26     <NA>     True      <NA>   <NA>   <NA>   <NA>     True     True   <NA>      <NA>      <NA>

Which is then dropped, meaning the discarded DataFrame is left empty:

Empty DataFrame
Columns: [EventID, CoaTime, TRIG_COA, COA_X, COA_Y, COA_Z, MinTime, MaxTime, COA, COA_NORM, EventNum]
Index: []

And therefore not plotted.
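The behaviour can be reproduced in isolation with a toy frame (column names borrowed from the output above; this is not the trigger code itself):

```python
import pandas as pd

# Stand-in for the discarded-events frame: with pandas' nullable dtypes,
# the rejected event's fields come through as <NA>.
discarded = pd.DataFrame(
    {"EventID": pd.array([pd.NA], dtype="string"),
     "CoaTime": pd.array([pd.NA], dtype="Float64")}
)

# .dropna() removes every row containing <NA>, leaving an empty frame -
# so no grey points or bands ever reach the plotting routine.
print(discarded.dropna())
```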
