ocean_data_tools's People

Contributors

castelao · kthyng · lnferris


ocean_data_tools's Issues

Input parsing and validation

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

Currently, none of the functions provide any form of input validation; this makes the toolbox very brittle and prone to uninformative errors. Input validation should be added to all functions to ensure that the correct number of inputs were passed, and that each input holds the type of data expected by the function (in terms of variable class, size, and value). When inputs fail the validation, informative error messages should be returned. When inputs point to files and/or folders, the function should check that the path exists and the files are of the expected format.

Many of the functions would be much more convenient to use if they allowed users to skip certain inputs and rely on default values. For example, argo_build should only require one input: the files of interest. The region, start date, and end date inputs could default to the span within those files, and the variables could default to all variables in the files (or perhaps a subset of most commonly-accessed ones).
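A hedged sketch of what combined validation plus defaults could look like (the default values and error identifiers below are illustrative assumptions, not the toolbox's actual behavior):

```matlab
function [argo,matching_files] = argo_build(argo_dir,region,start_date,end_date,variable_list)
% Illustrative validation-and-defaults block; the chosen defaults
% are assumptions, not argo_build's current behavior.
narginchk(1,5)
validateattributes(argo_dir,{'char','string'},{'scalartext'},mfilename,'argo_dir',1)
matching_files = dir(argo_dir);
if isempty(matching_files)
    error('argo_build:noFiles','No files match the search path "%s".',argo_dir)
end
if nargin < 2 || isempty(region)
    region = [-90 90 -180 180];          % default: whole globe
else
    validateattributes(region,{'numeric'},{'numel',4,'real','finite'},mfilename,'region',2)
end
if nargin < 5 || isempty(variable_list)
    variable_list = {'TEMP_ADJUSTED','PSAL_ADJUSTED'};  % assumed common subset
end
argo = struct();                         % ...existing build logic follows...
```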

Additions

  • time-varying build: Let model_build_profiles accept a list of times as an argument; its length must either be 1 (as it is now) or match the length of xcoords and ycoords.

  • integrate cmocean colormaps

  • geovariable movies: Clean up and add the code for making HYCOM, Mercator movies from data in the local directory

  • expand list of data products: Create _build files for RSS, ECMWF, OOI, CMIP5 products

  • slab variables: add slab variables to model_build and modify general_section to accept them

Bathymetry data reloaded for plots

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

Bathymetry data is reloaded from file every time it needs to be plotted; this is a very inefficient way of working with this data, especially across larger regions. It would be better if you separated data extraction from data plotting, allowing reuse of the same data in multiple plots.
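In sketch form (the outputs of bathymetry_extract are assumed here to be a depth matrix plus coordinate vectors, as in the demo code later in this review):

```matlab
% Extract once...
[z,lat,lon] = bathymetry_extract(bathymetry_dir,region);

% ...then reuse the same arrays in as many plots as needed,
% without re-reading the file each time.
figure; pcolor(lon,lat,z'); shading flat;            % map view
figure; contour(lon,lat,z',[-4000 -2000 -1000 0]);   % contour view
```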

Separating the two processes would also allow users to use a different bathymetry dataset; there's no particular reason to favor this one dataset, given that the bathymetry data is typically used only as a reference for plotting. Higher- or lower-resolution bathymetry products may be more useful for a given user. This is particularly true for basin-scale regions, where 1-minute resolution is overkill.

Also, note that Matlab's Mapping Toolbox has built-in support for the Smith & Sandwell bathymetry data via the satbath function, but satbath expects data in the .img format, while ocean_data_tools expects it in netCDF format. Allowing more flexibility regarding bathymetry data would let users like me, who already have the .img data downloaded, avoid unnecessary duplication. (I will note that the satbath function is designed around the older 2-minute dataset and is not compatible with newer versions because it hard-codes the expected matrix dimensions, so there are good reasons to prefer bathymetry_extract over satbath in some use cases... but for many users the lower-resolution data is sufficient.)

Longitude wrapping

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

Functions that use a region box to filter data points need to be robust to varying longitude-wrapping conventions. Right now, some (e.g. bathymetry_extract) are, but others (e.g. argo_build) assume that the user's input box will match the conventions of the underlying dataset. Please check that all are either robust to any wrapping or throw an informative error message on a mismatch. At minimum, I suggest testing [-180, 180], [0, 360], [25, 385] (ocean-focused), and [-25, 335] (land-focused).
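One way to make a box filter wrapping-agnostic, sketched with base MATLAB only (wrapTo360 needs the Mapping Toolbox, so mod is used; the variable names are illustrative):

```matlab
region      = [-10 10 350 20];   % latmin latmax lonmin lonmax; crosses the seam
dataset_lon = -180:179;          % dataset stored in the -180..180 convention

% Normalize everything to [0,360), then handle a box that wraps.
lonmin = mod(region(3),360);
lonmax = mod(region(4),360);
lon    = mod(dataset_lon,360);
if lonmax <= lonmin
    in_lon = lon >= lonmin | lon <= lonmax;   % box crosses 0/360
else
    in_lon = lon >= lonmin & lon <= lonmax;
end
```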

DOI for each release

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

A suggestion. Since you are already using Zenodo, it might be useful to have a DOI for each released version, so that uses of ocean_data_tools can point to the specific version. If you adopt that, you can automate the process with Zenodo, and you can then link the Zenodo DOI with your future JOSS paper (once accepted), with the DOIs of the datasets that you use (such as the Argo dataset or WOA), and with your ORCID. Here is an example of how to configure the Zenodo DOI record.

In my experience, the scientific community is still not used to citing software, so it is best to explicitly suggest what to cite when ocean_data_tools is used; the JOSS DOI will probably be your best option.

In-file documentation

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

The current documentation lists the inputs and outputs for each function. However, the descriptions of the input and output variables are often minimal, and not detailed enough to avoid errors. All requirements for inputs (including class, expected size, and any limits on value) should be clearly stated. For outputs, fully describe the returned variables. In particular, describe the primary data structures, including all fields, their classes, and as much metadata as possible from the underlying data source (original name, units, etc.).

For example, the function argo_build lists the syntax

[argo,matching_files] = argo_build(argo_dir,region,start_date,end_date,variable_list)

The example shows argo_dir as a character-array search path with wildcards; would a folder name work? (Answer: no.) How about a cell array of specific file names? (Also no.) What happens if the specified path points to non-netCDF files? (Error using netcdflib: The NetCDF library encountered an error during execution of 'open' function - 'Unknown file format (NC_ENOTNC)'.)

The region input appears to be a row vector [latmin latmax lonmin lonmax]; will a column vector work? (Yes.) Does longitude have to be in either -180-180 or 0-360? (Neither causes an error, but I didn't have enough Argo data to test further.) And must lonmax be greater than lonmin if the region crosses the dateline? (Yes; otherwise: Index in position 1 exceeds array bounds. Error in argo_build (line 100).)

For the date inputs, the example shows date strings being used. What if I enter datenumbers instead? (Error using datenum (line 190), DATENUM failed.) How about a datetime? (Error using datetime/datenum, Too many input arguments.)

For the variable list, the example shows a cell array of strings. Will a character array work for a single variable? (Yes) How about a string or array of strings? (Yes). What happens if you specify a variable that isn't in the file? (Index in position 1 exceeds array bounds, Error in argo_build (line 100)).

The argo output is only described as a structure, and no description is provided for matching_files.

Documentation

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

Required. @kakearney made some good comments on the documentation in #26, #28 & #29, and I think those should not be treated as mere suggestions, but should be addressed for the JOSS publication.

There is already a lot of information, but I found the documentation a little confusing. My first academic code packages were (I think) quite visionary, but almost no one used them because of my poor documentation at the time. There is really no limit to how much time can be invested in documentation, but this is your best chance to create a good core; it will be hard to make time for it later. I second @kakearney's suggestion to improve the installation procedure, otherwise people get frustrated and give up without even trying your package. Experienced users probably already have their own systems for fetching data, so a good fraction of your future users will be people who are still learning; err on the side of clear, complete, step-by-step information.

Suggestion: have you considered using GitHub's wiki feature? It might be a convenient way to organize and maintain your documentation: it should be easy to move /docs into the wiki, and the wiki itself can be cloned. This is your call; I'm fine if you would rather keep the current /docs. But as your package evolves, your README will turn into a huge file.

Another suggestion: I usually prefer a clean README with only essential information about the package. Someone landing on it should find just enough information to decide either "I'm in the right place; let me learn more. Where are the examples?" or "this is not for me, let me keep searching for what I need".

Take advantage of ERDDAP

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

A suggestion. Take advantage of Ifremer's ERDDAP server to select profiles. It is useful to operate with a local directory but for some situations, it could be convenient to select profiles directly from the ERDDAP server instead. Argopy could be a nice reference on how to implement that.

That would require some work, so maybe a good start is to simply point to the ERDDAP server in the documentation.

general_depth_subset z-axis

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

The general_depth_subset function assumes that depth values do not cross z = 0, i.e. that the input depths are either all positive or all negative. While this is typically the case for in-situ measurements, where z = 0 is set at the water's surface, some datasets, particularly modeled ones, set z = 0 to mean sea level (in which case depths are mostly negative but can exceed zero when sea surface height is higher than average). You may want to consider handling this not-too-uncommon case, given the plan to continue incorporating new data sources into the toolbox.

At the very least, a user trying to subset something across, for example, [-100 2] should be warned that the 2 will be treated as 2 below the surface, not above.
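A minimal guard, assuming the existing positive-down convention is kept (the identifier and wording are illustrative):

```matlab
% Inside general_depth_subset (illustrative):
zmin = -100; zmax = 2;            % example user request spanning z = 0
if zmin < 0 && zmax > 0
    warning('general_depth_subset:signCrossing',...
        ['Requested depth range crosses z = 0; positive values are ',...
         'treated as depths below the surface, not heights above it.'])
end
```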

Remove eval

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

The use of eval is not recommended Matlab syntax. Please replace with dynamic field names (I found similar use in argo_profiles.m, general_depth_subset.m, general_profiles.m, general_section.m):

cvar = eval(['argo.' variable]); % bad
cvar = argo.(variable); % good

Unsuppressed command window output

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

A few functions seem to intentionally not suppress output (i.e. no terminal semicolon on a code line) in order to print certain things to screen. It's not clear to me why these variables are being dumped to the command window. For example, in woa_domain_plot, the names of all variables in the file and the attributes of the selected-to-plot variable are dumped to the screen for no apparent reason. I'd suggest using disp or fprintf, with enough explanatory text and formatting that the user understands the purpose and can clearly see that the printing is intentional (I assumed it was a mistake at first).
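For example, a sketch of intentional, labeled output (the variable names and file name are placeholders):

```matlab
% Instead of leaving a line unsuppressed to dump a struct to screen:
variables = {'TEMP','PSAL','PRES'};            % placeholder list
fprintf('Variables available in %s:\n','profile.nc');
fprintf('  %s\n',variables{:});
```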

Selection popup instructions

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

The region_select and transect_select functions could include more informative instructions in the popup boxes. In region_select, let the users know they should draw a polygon. For transect_select, the use of "stations" to describe both the clicked locations and the interpolated points confused me at first.

Plot versus distance along transect

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

For the section-plotting tools, you offer the option to place either latitude or longitude along the x-axis. Distance-along-transect would be another obvious candidate here, especially when section-plotting circuitous routes that may double back on themselves.
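Cumulative distance is cheap to compute from the station coordinates; this sketch uses the Mapping Toolbox distance and deg2km functions (a hand-rolled haversine would avoid the toolbox dependency):

```matlab
lat = [53.5 53.7 54.0];  lon = [176.2 177.5 179.0];    % example transect
arc = distance(lat(1:end-1),lon(1:end-1),lat(2:end),lon(2:end)); % deg of arc
d   = [0, cumsum(deg2km(arc))];                        % km along transect
% ...then pass d to the section plot as the x-coordinate.
```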

Installation instructions

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

The installation instructions are sufficient, but could be improved. This toolbox is specifically targeted to an audience of users who are new to working with these various oceanographic datasets, and this audience likely skews towards novice Matlab users, including students and early career scientists. Because of this, I recommend a more complete set of instructions for getting up and running with the toolbox. For example, explicitly state which folders need to be added to the path:

  • ocean_data_tools/ocean_data_tools
  • ocean_data_tools/ocean_data_tools/utilities
  • nctoolbox/

and explain how to alter the path (or link to Mathworks documentation: https://www.mathworks.com/help/matlab/matlab_env/what-is-the-matlab-search-path.html)

A more in-depth overview of the various data sources would also be very helpful to this target audience. The "Finding Data" section of the README gives a bare bones overview of where to get data, but a paragraph or two devoted to each supported data source would be nice. The data acquisition phase can be intimidating for new users. What sort of data should they expect to find in the various sources? Which datasets need to be downloaded manually by the user, as opposed to those that can be accessed remotely via OPeNDAP? (You mention that the mocha url is embedded, but users unfamiliar with Matlab's OPeNDAP capabilities may not understand what that means, i.e. that they don't need to download anything.) For the ones that need to be downloaded manually, it may be useful to include a few examples of how to get that data (though I know the various data portals are not guaranteed to maintain their current format). Also note, the following links are broken for me:

The majority of users are probably going to simply download the ocean_data_tools code, rather than clone and maintain via git, but in the case of the latter, you should recommend they store large data files somewhere outside of the main repository. Version control does not play nicely with constantly-evolving collections of netCDF files.

Finally, consider linking the ocean_data_tools repository to the MatlabCentral File Exchange (https://www.mathworks.com/matlabcentral/fileexchange/). Entries in the File Exchange can be downloaded directly from Matlab via the Add-On Explorer (i.e. the APPS > "Get More Apps" button in the Matlab Desktop). The File Exchange can pull directly from GitHub, so linking your existing repo to the FEX should be pretty easy, and the FEX entry will remain synced with any future changes.

remove Mac path separators

[from @kakearney]

Many path names in the demos.m script, and possibly elsewhere, assume *nix/Mac path conventions (i.e. forward slash as a path separator). This may cause things to crash on Windows systems. Path names should be constructed using fullfile, pathsep, etc. to keep them platform-independent.
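For example (the path itself is a placeholder):

```matlab
% Platform-independent path construction:
argo_dir = fullfile('data','argo','*profiles*.nc');
% rather than the hard-coded 'data/argo/*profiles*.nc'
```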

Demo script hard-coded paths

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

The demo script provides a nice overview of the various tools, while also allowing users to make sure they have everything set up properly. However, right now the demo script includes a number of hard-coded file paths that point to the author's personal computer. As a result, the demo script crashes if not modified. This is particularly problematic because the first function to make use of one of these hard-coded paths (netcdf_info) fails silently, and the next (argo_build) does not check that the path exists before using it, leading to a very unintuitive error:

Index in position 1 exceeds array bounds.

Error in argo_build (line 100)
    is_matrix = sz1(1,:)~=1;

Error in demos (line 30)
[argo,~] = argo_build(argo_dir,region,start_date,end_date,variable_list);

This is not an easy-to-diagnose error for a new user.

The demo should be written so it can be run without any modifications, out of the box. If you absolutely cannot avoid hard-coded paths, make sure the script checks the specified paths for the appropriate files and directs the user exactly where to modify the script if the path is not found or does not contain the expected files.
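A minimal version of such a guard (message wording and variable names are illustrative):

```matlab
% Near the top of demos.m:
argo_dir = fullfile('data','argo','*profiles*.nc');   % placeholder path
if isempty(dir(argo_dir))
    error(['No Argo files found matching "%s". Edit argo_dir at the ',...
           'top of demos.m to point at your downloaded data.'],argo_dir)
end
```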

Add function headers

Related to JOSS review.

Opening on behalf of @chadagreene. This comment was originally here.

@lnferris as I mentioned, it will be a few weeks before I can focus on a detailed review, but based on a quick glance through the .m files, one request I will have is that each function be given a header that can be accessed by typing help myfunction into the command line. Ideally, I'd like to see either .mlx or .html files with rich documentation as well so users can see and follow along with examples, but for starters it's important to at least include headers for every function.

If you want to get started on that now, I recommend following the template of official Matlab documentation, listing Syntax, Description, and Examples. The Syntax is generally a list of all the different ways users can specify inputs. The Description section describes in words what each type of syntax does and when you might want to use it. And provide a couple of Examples, starting with the simplest case scenario and then getting more complex. The Examples are good not only to help users learn how to use your functions, but the Examples also serve as reliable tests to ensure your function is working as expected. Here's a dummy template:

function b = myfunction(a,varargin)
% myfunction magically transforms a into b. 
% 
%% Syntax
% 
%  b = myfunction(a) 
%  b = myfunction(a,'Option',Value)
% 
%% Description 
% 
% b = myfunction(a) converts a into b. Input a can be any MxN matrix
% containing numerical values. 
%  
% b = myfunction(a,'Option',Value) specifies a value of an optional
% input 'Option'. By default, Value is 0, but may be set to any finite
% scalar value. 
% 
%% Example 1
% Determine the value of b if a equals five: 
% 
%  a=5; 
%  b=myfunction(a)
%
%% Example 2
% Convert a 5x4 matrix of values of a into b, and specify an Option  
% value of 16: 
% 
%  a=rand(5,4); 
%  b=myfunction(a,'Option',16)
%
%% Citation Info 
% Link to your GitHub here. 
% 
% See also myotherfunction and myotherotherfunction.

transect_select point density

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

The transect_select function allows users to specify the density of points along each individual line segment. It would be nice to allow users to specify density in terms of distance rather than number of points, or to specify the number of points along the entire transect rather than along each segment. As it is, a long segment followed by a short segment results in very uneven station spacing along the entirety of the polyline.
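One way to get roughly even spacing, sketched with Mapping Toolbox functions (interpm densifies a polyline so that no gap exceeds a given separation in degrees; the km-per-degree constant is approximate, and all names here are illustrative):

```matlab
vlat = [53.5 53.7 55.0];  vlon = [176.2 177.0 179.7];  % clicked vertices
spacing_km  = 10;                                      % desired station spacing
[slat,slon] = interpm(vlat,vlon,spacing_km/111.2);     % ~111.2 km per degree
```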

GSW compatibility

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

I think the statements regarding the compatibility of the ocean_data_tools structures with the Gibbs Seawater Toolbox are misleading. Yes, certain fields from the ODT structures can be used as input to GSW functions, in the same way that any temperature, salinity, or pressure data can be used. However, the structure format does not add any additional value to GSW. In fact, it may cause problems, given that the ODT functions do not import the variable metadata that explains exactly what type of data is in each dataset, e.g. that the TEMP_ADJUSTED field from an Argo float is ITS-90 in-situ temperature (t in GSW syntax), while the Mercator model thetao field is potential temperature (pt in GSW syntax).

I excitedly hoped that ODT would provide some nice wrapper functions to help users navigate the often-tricky nuances of applying the GSW equation-of-state calculations when starting from different temperature/salinity/pressure combinations, but it does not. I would suggest removing this compatibility claim unless a more concrete connection (either new functions, or extended documentation matching ODT structure fields to the expected GSW inputs) is added.

input parsing - verify files exist

[from @kakearney]

The input parsing and checking is much improved! But there are still a few places where things fall through the cracks, notably with file input. It would be nice if the various functions verified that files exist before they try to open them. At the very least, some checks should be added to the demos.m file to make sure the user-modified bathymetry_dir path points to a valid file, and to error out gracefully if it does not. For example, if a user were to ignore the instructions to modify the path (and I guarantee you most of them will!), this is the error they will get:
Error using netcdf.open (line 52)
Could not open file
'/Users/lnferris/Documents/data/bathymetry/topo_20.1.nc'.

Error in bathymetry_extract (line 51)
nc = netcdf.open(bathymetry_dir, 'NOWRITE'); % open the file as netcdf datasource.

Error in demos (line 58)
bathymetry_plot(bathymetry_extract(bathymetry_dir,bounding_region(argo)),'2Dcontour')

Community guidelines

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

One of the requirements for JOSS is: "Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support"

If that already exists, could you point me to it, please? And maybe make it more visible? That would help people interested in contributing to your package. You don't need to teach the basics; it's OK to assume that contributors know Matlab and git, but tell us how you would like things to be done, in more detail than simply "open an issue". What kinds of functionality would you consider merging into your package? If I create a new function, what would be the basic requirements for you to consider merging it? This can save you time in the long run.

Plotting function customization

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

Currently, many of the plotting functions are very heavy-handed. They do things like create new figures and axes, change the hold state of existing axes, update colormaps and color order, set ticks and gridlines, etc. I would reconsider whether many of these changes are necessary. In particular, consider whether new axes need to be created, or whether it makes sense to allow the user to target an existing axis. If using an existing axis, consider whether aesthetic changes are necessary, or whether they may unintentionally alter preexisting plotted objects. At minimum, the documentation for the plotting functions should be updated to inform users of any changes that will be applied.

Also, it's nice to have the option to customize aesthetic properties, either on creation (as input variables) or after the fact (by modifying existing graphics handles). Currently, users must do some graphics handle excavation (via findobj) to make any of these types of changes. I recommend updating all the plotting functions to save and return graphics handles associated with all newly-created graphics objects, including figures, axes, lines, contours, pcolors, etc.
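A sketch of the handle-returning pattern with an optional target axis (the function name and structure fields here are illustrative, not an actual ODT function):

```matlab
function h = general_profiles_sketch(object,variable,ax)
% Plot profiles into a user-supplied axis if given, else a new figure,
% and return all created graphics handles for later customization.
if nargin < 3 || isempty(ax)
    h.fig = figure;
    h.ax  = axes(h.fig);
else
    h.ax = ax;                    % target the caller's axis
end
h.line = plot(h.ax,object.(variable),object.depth);
end
```

A user could then write h = general_profiles_sketch(argo,'TEMP_ADJUSTED'); set(h.line,'LineWidth',2) with no findobj excavation.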

Revise or remove netcdf_info

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

I couldn't figure out the purpose of this function. I can see that it's being used to store ncdisp output, but why? No functions seem to access this info. The only benefit I can see over simply calling ncdisp directly is that it accepts a file path rather than a single file name, but there is no checking to see whether the first file found on the indicated search path is representative of all files matching the search pattern. The function assumes that it will always have write permission in the current directory, which is not always true. Finally, the function messes with the root diary state within a try/catch block, which is asking for trouble. For example, if you pass a nonexistent path to the function, the diary state is never toggled off. Also, it fails to check the diary state before making any changes, so it will not properly reset things if the user had the diary on at the time netcdf_info was called.

I recommend seriously reconsidering the logic behind this function. Users can already simply call ncdisp to get a quick peek at the netCDF header info. Alternatively, ncinfo can return the info in a storeable format with far fewer opportunities for failure and unintended side effects.
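For comparison (files(1).folder requires R2016b or newer; argo_dir is a placeholder search path):

```matlab
% ncinfo returns the header as a struct, with no diary manipulation:
argo_dir = fullfile('data','argo','*.nc');   % placeholder
files    = dir(argo_dir);
info     = ncinfo(fullfile(files(1).folder,files(1).name));
names    = {info.Variables.Name};            % variable names, ready for reuse
```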

Automated tests

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

While explicit unit tests are not necessary, I do think each function needs to be subjected to a much wider variety of input tests to make sure it either runs properly or throws an informative error.

Add overview page

Related to JOSS review.

Comment originally here from @chadagreene.

Sorry to bread-crumb the suggestions here, but you might also want to provide some sort of overview page, where all the functions are organized so that users can quickly find the function they need based on the task they're trying to accomplish. I'm thinking something along these lines, and it could take the form of a documentation page accessible in the Matlab Documentation Viewer, or it could just be detailed in the README.md. Or perhaps you'll come up with some other way to do it, but you know, just something to help users figure out which functions they need for a given task.

argo_build confuses pressure and depth

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

Pressure is not the same thing as depth! Currently, the argo_build function places the PRES_ADJUSTED data from an Argo float in a field called depth, despite it actually holding sea pressure (in decibars), which will be very misleading for users. The mismatch between pressure (dbar) and depth (m) is small at the surface, but far from insignificant at depth. Especially with Deep Argo floats coming online now, treating them equivalently could lead to major miscalculations.

If you want this function to return data along a depth axis, you need to properly convert, not just rename the field. Otherwise, please rename the output field to properly reflect its contents.
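If conversion is the chosen route, the GSW toolbox discussed elsewhere in this review already provides it, assuming per-profile latitude is available (the example values are illustrative):

```matlab
p   = [0 500 1000 2000];       % sea pressure (dbar), e.g. from PRES_ADJUSTED
lat = 45;                      % profile latitude
z     = gsw_z_from_p(p,lat);   % height in m, negative below the surface
depth = -z;                    % positive-down depth in meters
```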

DOIs of the datasets

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

Requirement: The paper and the documentation should include, when available, the DOI of the datasets used, such as Argo, WOA, WOD, ... I'm sorry that I missed this point before.

Suggestion: Include that in the DOI record created by Zenodo too. It could be done manually on every release or automatically by defining a .zenodo.json file.

Markdown documentation vs Matlab docs

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

You've written some nice markdown documentation for most of the functions, which can be viewed online (or with a markdown editor/viewer, but I would not assume users have access to the latter). However, these are not compatible with Matlab's own documentation viewer. It would be very beneficial to users if you added .html versions of these documentation files and made them accessible via Matlab's doc command (see https://www.mathworks.com/help/matlab/matlab_prog/display-custom-documentation.html).

Revise or remove more_colors

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

The more_colors function should be eliminated. It changes the user's default color order at the Matlab root level. This is very undesirable behavior; functions should never alter user preferences like this, especially without warning (this function is undocumented and called from within other plotting functions). There are far better ways to change the color of plotted lines, e.g. setting colors of specific line objects manually via set(), or altering the color order of a single specific axis.
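For example, scoping the change to a single axis (colororder(ax,...) exists in R2019b+; earlier releases can set the ColorOrder property directly, as here):

```matlab
ax = axes;
ax.ColorOrder = lines(12);     % affects only this axis
% never: set(0,'DefaultAxesColorOrder',...)  % alters the user's session
```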

Remove external dependency on nctoolbox

This issue is part of the JOSS review (openjournals/joss-reviews#2497)

This toolbox often relies on the Java-based nctoolbox functions to read netCDF files, rather than using Matlab's native netCDF utilities (ncread, ncinfo, netcdf class methods, etc.). The extra external dependency will be a barrier to use for some novice users, and Java dependencies add an additional area of potential incompatibility.

Bathymetry plotting along a transect

Both bathymetry_section and bathymetry_chord suffer from issues regarding how they order the output points when plotting against a transect.

For bathymetry_section, there are two issues. The first is that it uses the coordinates of the closest-matching grid cell, rather than the input coordinates, along the x-axis. Unless the input line is parallel to the specified axis (i.e. an east/west line paired with lon, or a north/south line paired with lat), this may cause a strong reduction in the plotted x-coordinate resolution relative to the input coordinates. The second is that the points are plotted in the order they appear in the input, rather than in increasing order of longitude and/or latitude, which is a problem for any set of points whose x-coordinate (x for xref='lon' or y for xref='lat') is not monotonically increasing.

For bathymetry_chord, the output can get even messier. The general idea behind this function appears to be to plot the bathymetry along a transect line. However, it only does so in a very limited context: when the line is parallel to the specified x-axis, and the specified width is less than or approximately equal to the resolution of the bathymetry dataset. Anything else results in a mess of lines doubling back on themselves, due to the way grid cells within the polygon are collected.

A demonstration:

% A few lines near the Aleutian Islands

latlon = [...
       53.51       176.23
       53.74         -177
       55.055       179.68
       50.623       -179.8
       53.165       171.64
       51.687       178.25];
latlon(:,2) = wrapTo360(latlon(:,2));
   
latlim = minmax(latlon(:,1), 'expand');
lonlim = minmax(latlon(:,2), 'expand');

% Read bathymetry

bdir = '/Volumes/LacieKK2019/SmithSandwell/topo_20.1.nc'; % user-specific
[z,lt,ln] = bathymetry_extract(bdir, [latlim lonlim]);

cmap = get(0, 'defaultAxesColorOrder');

% Plot map of bathymetry

figure; axes;
pcolor(ln,lt,z');
shading flat;
colormap('gray');
hold on;
plot(latlon(1:2,2), latlon(1:2,1), 'color', cmap(1,:));
plot(latlon(3:4,2), latlon(3:4,1), 'color', cmap(2,:));
plot(latlon(5:6,2), latlon(5:6,1), 'color', cmap(3,:));
set(gcf, 'color', 'w');

d = {'lat', 'lon'};

% Plot transects

for ii = 1:3
    
    idx = (ii-1)*2+[1 2];
    [y, x] = interpm(latlon(idx,1), latlon(idx,2), 1/60);
    
    zline = interp2(ln,lt,z',x,y, 'nearest');
    
    h = plotgrid('size', [4 2], 'sp', 0.05);
    
    for id = 1:2
        % Wider chord
        axes(h.ax(1,id));
        bathymetry_chord(bdir, x(1), y(1), x(end), y(end), d{id},  0, 1/20);
        % Narrower chord
        axes(h.ax(2,id));
        bathymetry_chord(bdir, x(1), y(1), x(end), y(end), d{id},  0, 1/60);
        % Section
        axes(h.ax(3,id));
        bathymetry_section(bdir, x, y, d{id}, 0);
        % Section (my way)
        switch id
            case 1
                plot(h.ax(4,id), y, zline, 'color', 'k');
            case 2
                plot(h.ax(4,id), x, zline, 'color', 'k');
        end
    end
    
    set(findall(h.ax, 'type', 'line'), 'linewidth', 1, 'color', cmap(ii,:));
    set(h.ax(:,1), 'xlim', minmax(y));
    set(h.ax(:,2), 'xlim', minmax(x), 'yaxisloc', 'right');
    set(h.ax, 'ylim', minmax(zline));
    xlabel(h.ax(end,1), 'Latitude');
    xlabel(h.ax(end,2), 'Longitude');
    set(h.fig, 'color', 'w');
    set(h.ax(1:end-1,:), 'xticklabel', '');
    
    h.lbl = multitextloc(h.ax(:,1), {...
        'bathymetry_chord (width=1/20)'
        'bathymetry_chord (width=1/60)'
        'bathymetry_section'
        'the transect'}, 'northwestoutsideabove');
    set(h.lbl, 'interpreter', 'none');
    
end

(Four figures, bathybug1 through bathybug4: the map of the three test lines, then the bathymetry_chord, bathymetry_section, and reference-transect panels for each line.)

(Note: this code uses some of my personal toolbox: plotgrid, minmax, multitextloc)
