kurtisanstey / internal_waves_barkley_canyon
Ocean Physics MSc thesis at the University of Victoria, under the supervision of Dr. Jody Klymak, characterising the internal wave field at Barkley Canyon.
In most of the Barkley Canyon papers I've read, the authors use the same regional overview from ONC, or an old contour plot with little detail. So I grabbed some public data from GEBCO and made my own, showing more bathymetric contours and the locations of my primary ADCPs.
If you're interested in checking it out, @jklymak , see here:
https://github.com/kurtisanstey/project/blob/master/archive/bathymetry/bathymetry.pdf
Check the process, using Euler, to confirm that a 30-degree positive rotation aligns v with the along-slope direction (NNW) and u with the cross-slope direction (ENE).
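A minimal sketch of what that rotation looks like in code (assuming "positive" means a counter-clockwise rotation of the coordinate frame, and that u, v are the raw east/north components; the sample values are made up):

import numpy as np

u = np.array([0.10, -0.05])    # example east velocities (m/s)
v = np.array([0.20, 0.15])     # example north velocities (m/s)
theta = np.deg2rad(30)         # +30 degree frame rotation

u_cross = u * np.cos(theta) + v * np.sin(theta)    # cross-slope (ENE)
v_along = -u * np.sin(theta) + v * np.cos(theta)   # along-slope (NNW)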
Checking raw .nc files for each instrument.
If you add a README, it could very briefly say what is in each of your subdirectories.
np.datetime64('2018-01-01')
is just as easy as (or easier than) datetime(2018, 1, 1).
I don't think you need to go from pandas and back just to linearly interpolate... np.interp should work.
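e.g., a minimal sketch (t, x are hypothetical sample times and values):

import numpy as np

t = np.array([0.0, 1.0, 2.5, 4.0])      # raw sample times
x = np.array([1.0, 2.0, 2.2, 3.0])      # values at those times
t_new = np.arange(0.0, 4.01, 0.5)       # regular target grid
x_new = np.interp(t_new, t, x)          # linear interpolation, no pandas round-trip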
We already discussed the output of the coords in the NetCDF files.
Not clear why you have an if/else for the saving of the files.
I think it's probably fine to just save your "segments" as separate files. But those time series should concatenate if possible. Another approach would've been to assign a segment number as a variable. Then you could have done
import xarray as xr
ds = xr.load_dataset('boo.nc')   # all segments in one file
d = ds.isel(segment=i)           # pull out segment i
and d would just be the data for the segment you are interested in. The advantage of this is that you have all the data in one place.
I think it's OK to deal with lists of datasets like this, but consider the approach suggested above as well...
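For instance, a sketch of the segment-variable idea (assuming equal-length segments; `segments` here is a hypothetical stand-in for your list):

import numpy as np
import xarray as xr

# hypothetical stand-in for a list of per-segment datasets
segments = [xr.Dataset({'u': ('time', np.zeros(100))}) for _ in range(3)]

# tag each piece with a segment number and combine into one file
pieces = [d.assign_coords(segment=i) for i, d in enumerate(segments)]
ds = xr.concat(pieces, dim='segment')
ds.to_netcdf('boo.nc')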
Building lists of arrays is a bit inelegant. I'd instead do:

um_PSD = np.zeros((nsegs, ndepths, nfs))
f, um_PSD[i, j, :] = sig.welch(...)   # fill one (segment, depth) slice at a time

But it's not terrible to do what you are doing...
OK, you can just save N2 as a variable with just a depth dimension, and multiply when needed during plotting. You don't need to divide by "int_scale" and save that; it's just a multiplicative factor. Regardless, you definitely want to save that factor.
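i.e., something like this (hypothetical shapes):

import numpy as np
import xarray as xr

# PSD with (depth, f) dims plus an N2(depth) profile stored alongside it
ds = xr.Dataset({'PSD': (('depth', 'f'), np.ones((30, 513))),
                 'N2': ('depth', np.full(30, 1e-5))})
scaled = ds.PSD * ds.N2   # apply the multiplicative factor at plot time (broadcasts over depth)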
Again, note that your only dims here are f_PSD and depth.
MAJOR: I'm actually confused here. I thought the plan was for you to make matrices of raw spectral estimates, and then average them as need be. That is, don't average, and hence have many, many segments, and save as 3-D matrices with dimensions (depth, time, f), where time is the beginning or middle of each time slice, and each time slice represents 1024 15-minute averages (or whatever NFFT is). For a year it would be a 220 x 30 x 1024 matrix. Spectra that are gaps are just left as NaN. Then when you want to plot the average, you need only select the time interval to average (ignoring NaN). Or if you are plotting a spectrogram, you simply pcolor the spectrogram at the depth of interest.
Or is that coming later? It really should be here.
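A minimal sketch of that layout with synthetic data (hypothetical names and sizes; note sig.welch returns nfft/2 + 1 frequencies for a real input):

import numpy as np
import scipy.signal as sig
import xarray as xr

fs = 1.0 / (15 * 60)                        # 15-minute samples, in Hz
nfft, ndepths, nsegs = 1024, 30, 220        # samples per slice, depth bins, slices
rng = np.random.default_rng(0)
u = rng.standard_normal((ndepths, nsegs * nfft))   # synthetic velocity record
u[:, 5 * nfft:6 * nfft] = np.nan                   # a data gap

nfs = nfft // 2 + 1
spec = np.full((ndepths, nsegs, nfs), np.nan)      # gaps stay NaN
for i in range(nsegs):
    seg = u[:, i * nfft:(i + 1) * nfft]
    if np.isnan(seg).any():
        continue                                   # leave gappy slices as NaN
    for j in range(ndepths):
        f, spec[j, i, :] = sig.welch(seg[j], fs=fs, nperseg=nfft)

t = np.arange(nsegs) * nfft / fs                   # slice start times (s)
ds = xr.Dataset({'PSD': (('depth', 'time', 'f'), spec)},
                coords={'time': t, 'f': f})

# average any time interval, ignoring the NaN slices:
mean_spec = ds.PSD.isel(time=slice(0, 50)).mean('time', skipna=True)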
Old & new data are the same pre-May 2013. Half-day March 2013:
Old & new data are the same pre-May 2013. Week in March 2013:
Old & new data are the same pre-May 2013. Everything before fix (July 2012 - May 2013):
New data, half-day March 2014 (after fix date):
Old data, half-day March 2014:
New data, week in March 2014 (after fix date):
Old data, week in March 2014:
Semidiurnal example:
Main points of continuum write-up:
Results
Discussion
Code should be reformatted to save NetCDF structures using xarray at each major step. This will allow individual sections of code to work independently without rerunning the entire process each time.
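For example (hypothetical file and variable names):

import numpy as np
import xarray as xr

# checkpoint after one processing step...
ds = xr.Dataset({'u': ('time', np.zeros(10))})
ds.to_netcdf('02_rotated.nc')

# ...so a later section can restart from disk instead of rerunning everything:
ds = xr.open_dataset('02_rotated.nc')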
Hi, Kurtis:
I have read through your thesis proposal and have a number of edits and comments that I will save for our conference call. I think this is a challenging and worthwhile undertaking and you are off to a very good start. I am delighted that you are making good use of the ONC records and I quite like the spectral plots on pages 12 and 13. I am looking forward to seeing what you find with respect to the properties and origins of: (1) baroclinic diurnal shelf waves; (2) near-inertial motions (are these blue-shifted internal waves of inertial period?); (3) semidiurnal (M2) tides; and (4) fM2 internal waves. A comparison of the spatial and temporal variability of the various forms of internal motions over the shelf and in the canyon should be especially interesting. I am also interested to learn if there are internal solibores at the base of the canyon. As you can see from the attached reprint link below (the paper is online but too big to send via DFO webmail), these features can be highly dynamic.
I was hoping you could make use of all the 1-hour current meter data we (DFO) collected in Barkley Canyon in the late 1990s in a collaborative project with UBC, which Allen et al. (2001) used and discussed in some detail. I have the data (current velocity, temperature, and salinity) but suspect that they would add too much to your processing load, although they would provide good information on the seasonal background flows for your MSc.
I suspect that your proposed schedule is optimistic but Jody would know better if it is reasonable, especially during a pandemic. I wish you the best of luck with your course work.
Kindest regards
Rick
Thomson, R. E., & Spear, D. J. (2020). Gravity currents facilitate formation of high-frequency internal solitons and bores at the base of the Fraser Delta in the southern Strait of Georgia. Journal of Geophysical Research: Oceans, 125, e2020JC016589. https://doi.org/10.1029/2020JC016589
Hi Kurtis,
I also read through it and will pass on edits (review) and comments in a Word doc.
And I also think this is a really good start! It is a hard time to be a student.
Since Rick has given such a nice overview, I will pursue some details.
You've listed it as optional, but I think an analysis of critical slope may be a very valuable component. I think we have all the data in hand at ONC to produce a physical slope map based on some sort of smoothed resolution and a range of incoming wave directions. You can then pick a seasonal buoyancy frequency and, probably more importantly, a direction of incoming wave - so yes, this could get messy very quickly. But I think knowing the possible source areas would add a lot of insight into the single deployment site.
I think you should look at the rotary spectrograms with a little more frequency resolution. It will make it easier to confirm that the analysis is correct, and also provide more information on currents that are likely from internal waves. Also, I think you are correct: the flattening of the spectra at high frequency indicates noise, and then you are whitening the white, which makes it hard to see what is going on at the tidal and inertial frequencies.
I think, and this could be discussed at an upcoming meeting, that you may need to focus on certain aspects of your proposal. The best way to pick those aspects is to let the data lead you - follow what shows up most interestingly and clearly in the data. Often a spectrogram can show you that. For example, is there inertial energy? Is it modulated seasonally? As Rick asks, how much of a blue shift is there? Is there a variety of blue shifts, and is there episodicity in these blue shifts, or is it just a barrage of near-inertial energy?
Looking forward to having a meeting and discussing further with your committee, and particularly with your supervisor on how best I can guide you to achieve your goals. You've done really well in getting a handle on some of these techniques.
Talk soon, Steve
Examples:
Velocities (40h low-pass - Slope):
Power spectra (Rotary - Axis, will be side by side with PSD):
Depth-frequency spectra (PSD - Axis):
Characterise the features of K1
For notifying Jody of writing updates as the thesis progresses.
Access the rough thesis document from the readme, documents folder, or using the following link:
https://github.com/kurtisanstey/project/blob/master/documents/Anstey_Thesis_Rough.pdf
Estimate dissipation rates using c/GM amplitude ratios calculated in #40.
Internal wave interaction theory (Gregg, 1989; Polzin et al., 1995; Sun and Kunze, 1999; Althaus et al., 2003)
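For orientation, the finescale scalings these papers build on take roughly this quadratic form (a sketch only; the constants and correction factors differ between papers):

\epsilon \approx \epsilon_0 \, \frac{N^2}{N_0^2} \left( \frac{E}{E_{\mathrm{GM}}} \right)^2

with \epsilon_0 \sim 7 \times 10^{-10}\ \mathrm{W\,kg^{-1}} and N_0 the GM reference buoyancy frequency, so a measured continuum-to-GM amplitude ratio enters squared.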
Semidiurnal outline
Upper Slope - Near-shelf intensification
Describe (what do I see):
Compare (what did others see):
Explain (what may be causing this):
Axis - Canyon-axis intensification
Describe (what do I see):
Compare (what did others see):
Explain (what may be causing this):
Same process as for the diurnal band. Check for trends (or lack thereof) in general plots, create band-specific plots that demonstrate these trends, search the literature for comparisons and evidence, then write.
When we first met with Steve, he mentioned that a student may have been using an instrument in Barkley Canyon for their MSc. I checked around and found nothing, and it never came up in any of my Barkley Canyon research through Google Scholar or the UVic library. Until yesterday.
https://scholarworks.sjsu.edu/cgi/viewcontent.cgi?article=8604&context=etd_theses
It looks as if they only examine 2013, but they do use both Axis and Upper Slope, as I do. However, and this could be a controversial personal opinion, it doesn't seem very 'polished'. What do you think? Is my project a bust, or do I need to shift focus? This is very worrying to me.
#33 Subdiurnal summary
#21 Diurnal summary
#27 Inertial summary
#31 Semidiurnal summary
#32 Continuum summary
#30 Barotropic comparison
Fix scaling issue with GM spectrum, and determine whether there is a rotary version.
Rotary spectra need to have Welch's method applied to them. Get them to work in the time domain; check versus a toy model and hard-wired code.
2-D rotary spectra: get these to work in the t-z domain...
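A minimal sketch for the 1-D (time-domain) case, assuming u, v velocity time series and using scipy's welch on the complex series w = u + 1j*v (positive frequencies correspond to counter-clockwise rotation under this convention; worth double-checking the sign):

import numpy as np
import scipy.signal as sig

def rotary_welch(u, v, fs, nperseg=1024):
    # complex velocity: positive frequencies ~ CCW, negative ~ CW rotation
    w = u + 1j * v
    f, P = sig.welch(w, fs=fs, nperseg=nperseg, return_onesided=False)
    f, P = np.fft.fftshift(f), np.fft.fftshift(P)   # sort to -fN..+fN order
    ccw = P[f > 0]
    cw = P[f < 0][::-1]                             # mirror to match the +f axis
    return f[f > 0], ccw, cw

# example: a purely clockwise-rotating current shows up only in `cw`
t = np.arange(8192)
u, v = np.cos(0.1 * t), -np.sin(0.1 * t)
f, ccw, cw = rotary_welch(u, v, fs=1.0)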
New and worth sharing:
These could be summarised as 'the Barkley Canyon internal wave field is highly dependent on depth and frequency', but that seems far too vague to be the 'main point'.
To do:
I used information from a few papers (primarily https://www.osti.gov/servlets/purl/5688766) to determine the instrument noise floor for a Welch PSD, which should be 2*(standard deviation of noise signal)^2.
RDI Workhorse Long Ranger 75 kHz ADCP (all of Upper Slope data, and early Axis data):
Nortek Signature55 55 kHz ADCP:
Is this how you would go about this? I did not find much on 'noise floor' in terms of using a predefined value (i.e. single-ping uncertainty/accuracy), and some papers talk about using the RMS to estimate it from the data directly.
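One way to sanity-check that formula with synthetic noise (a sketch; note the flat floor is 2*sigma^2/fs in absolute frequency units, which reduces to 2*sigma^2 when fs = 1):

import numpy as np
import scipy.signal as sig

fs, sigma = 1.0, 0.05                      # hypothetical sample rate and noise std
x = sigma * np.random.default_rng(0).standard_normal(2**18)
f, P = sig.welch(x, fs=fs, nperseg=1024)
print(P[1:].mean(), 2 * sigma**2 / fs)     # the two should agree closely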
Using appropriate averaging and weighting. Trying to automate it so I don't have to do it manually every time I want these years or seasons.
Continuum outline
Upper Slope - Near-shelf intensification
Describe (what do I see):
Compare (what did others see):
Explain (what may be causing this):
Axis - Canyon-axis intensification
Describe (what do I see):
Compare (what did others see):
Explain (what may be causing this):
These are useful to get propagation direction....
Main points of NI write-up:
Results
Discussion
Presentation slides with:
I get the following parameters from the device Additional Attributes https://data.oceannetworks.ca/DeviceListing?DeviceId=22925 and the NetCDF metadata:
AverageInterval = 108
MAveragingInterval = 108
NumberPings = 6
adcp_setup_cell_size_meters = 20 (from metadata, and agrees with depth intervals in the data; differs from the Additional Attributes, which say 10, but with no record of any changes this could be the problem - see end of comment)
SoundVelocity = 0
Salinity = 34
PowerLevel = -2
I use these values in the Nortek Signature 55 Deployment software (https://www.nortekgroup.com/software) to get the ping uncertainty (called horizontal precision). The noise floor should be somewhere around 4e-1, and I'm sure the PSD amplitudes are correct as they are similar for the other instruments, which have a confirmed noise floor.
I start by assuming 1 ping every 18 s, since that is the raw time interval, for which the software gives an uncertainty of 3.82 cm/s. This is too low.
Then I try the ONC-given parameters of 6 pings per 108 s averaging interval, which still suggests 18 s for a single ping. This gives the same noise floor as the first case, and I don't understand how the time steps could be 18 s when all the intervals are listed as 108 s. And as the Nyquist frequency gets smaller for the 108 s interval, it's even more off. This isn't right, either.
So that's all wrong. However, the software says the lowest the pings can go is 6 s intervals, so maybe that's the case and ONC hasn't listed it properly. This gives a larger uncertainty from the software, at 6.62 cm/s, but it turns out it's just the same noise floor. The 18 s uncertainty must just be averaged from the 6 s value. This doesn't help, for either 18 s or 108 s averaging intervals.
The noise floor is the same in every case - and too low.
In attempting a depth analysis, it is important to consider the effects of stratification on the observed spectra.
Currently
To do
I believe these are what you suggested, and show the results we were discussing for slide 7. I tightened up the frequency axis limits to better scale colorbars to show depth trends in the tidal range. Any comments?
To check for threshold depths using correlation and intensity
http://cproof.uvic.ca/gliderdata/bathy/british_columbia_3_msl_2013.nc
Small enough area that we can just scale the axes aspect ratio by 1/cos(lat).
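e.g., for a lon/lat map axis (a sketch; ~48.3 N is an assumed central latitude for the region):

import numpy as np
import matplotlib.pyplot as plt

fig, ax = plt.subplots()
# one degree of longitude spans cos(lat) times a degree of latitude, so
# stretching the aspect by 1/cos(lat) keeps map distances roughly true
ax.set_aspect(1.0 / np.cos(np.deg2rad(48.3)))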
Obtained data from DFO Neah Bay (station 46206) for relevant years.