dod's People

Contributors: dankelley

dod's Issues

dod.met() is broken

See dankelley/oce#2151, but note that I plan to fix it here in dod, porting the fix back to oce only if I find it works well.

Another approach, which I find quite appealing, is to deprecate the oce version. I just don't like having a function that requires major changes on a yearly or even sub-yearly basis, and that is what happens with met data from Environment Canada (or whatever it names itself this week).

build-check errors

I get the error shown below. I am busy trying to fix other things (this is just a warning promoted to an error -- the package can build; it just cannot pass checks). The following can likely be alleviated the same way I handle it in my other packages, though I can never remember the details offhand.

* checking package dependencies ... ERROR
Namespace dependencies missing from DESCRIPTION Imports/Depends entries:
  'methods', 'oce', 'rjson'
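The usual remedy is to declare these namespaces in the DESCRIPTION file; a sketch of the relevant stanza is below (whether version constraints are needed is a separate question, so none are shown):

```
Imports:
    methods,
    oce,
    rjson
```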

add ability to search NCEI database

I've just spent about 2 hours trying to search for some data. Why must this be so hard? I ran across some things that looked useful (e.g. ref 1), but that interface is really quite confusing. When I finally got to a spot where I could request data -- which did not give a fixed URL, but instead sent an email with a temporary URL, making it useless for reproducible work -- I got a file in a format I've never seen, one that did not match the documentation. It was just a string of numbers and spaces. But I would not make much of that, because there were so many choices to make in the UI that maybe I got exactly what I asked for. (It might be the format that ODV uses?)

Then I found ref 2, which cannot be viewed in Safari, but is OK in Firefox. It is very nice because

  1. it has a map you can click to get data in Marsden squares
  2. clicking on that map gives URLs, which we can likely reverse engineer. See for example ref 3.

Plan: I will be exploring the data for a while, but if things look promising, I may return here and write code so dod can download such data. I don't see any further UI to e.g. select by year, but frankly, my dear, I don't give a damn, because anybody can write code to do such things in 10 minutes, and 10 minutes is a lot faster than the 2 hours I've spent so far today trying to find some bloody data.

PS. @richardsc might be interested in these links.

References

  1. https://www.ncei.noaa.gov/products/world-ocean-database
  2. https://www.ncei.noaa.gov/access/world-ocean-database/datawodgeo.html
3. https://www.ncei.noaa.gov/data/oceans/woa/WOD/GEOGRAPHIC/CTD/OBS/CTDO7306.gz This is high-resolution CTD data in the box from 60 to 70W and 30 to 40N. The first 3 letters designate the sampler, the "O" means at observed levels (I think), and the number, 7306, is a Marsden Square code.
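The URL in ref 3 suggests a constructible pattern. A minimal sketch of how dod might build such URLs is below; note that this is reverse-engineered from a single example, so the scheme (sampler directory, "OBS" subdirectory, "O" code, Marsden-square suffix) is a guess, and the function name is hypothetical:

```r
# Build a World Ocean Database URL following the pattern suggested by
# ref 3.  This constructs a string only; no downloading is done, and
# the pattern is inferred from one example URL.
wodURL <- function(sampler = "CTD", marsden = 7306) {
    base <- "https://www.ncei.noaa.gov/data/oceans/woa/WOD/GEOGRAPHIC"
    sprintf("%s/%s/OBS/%sO%d.gz", base, sampler, sampler, marsden)
}
wodURL("CTD", 7306) # should reproduce the URL in ref 3
```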

support for XBT data from Hadley Centre?

(Transferred from AnnieHow/dod#5)

I've not read enough of the source materials to really know what this database holds, but it seems worth exploring. Perhaps @AnnieHow can look into it and we could talk about this sometime. (As is often the case at this time of year, my interest is partly because I'm keen to find cool things to show students.)

PS. XBT data from the 1900s seems curious to me, but maybe they just took some ship data and put it into the format.

https://www.metoffice.gov.uk/hadobs/en4/download-en4-2-2.html

add ITP datasets?

(transferred from AnnieHow/dod#22)

@AnnieHow I think it would be good to add dod.ctd(program="ITP") to get ice-tethered profilers, which are great for Arctic data. This is not a stop-other-work sort of idea, just something that might be nice to get done over the next few weeks or so.

I've put some notes about the ITP program, with links that might be used in dod.ctd(), at the wiki page

https://github.com/AnnieHow/dod/wiki/dod.ctd(program=%22ITP%22)

PS. I also put wiki stubs for the other two CTD things we have. I think those wiki pages might be a good way to document where data are. I find it a bit hard to make sense of issues, which are more like diaries than coherent essays.

copy oce::download_* to dod::dod_*

(transferred from AnnieHow/dod#41)

@AnnieHow and @j-harbin -- I plan to start work on this after noon today, so if you object, please post something here (or let me know over email) soon.

At the moment, some dod::dod_X() functions work by calling oce::download_X(). This is not ideal in terms of keeping up with changes to data servers, because oce is large and cannot be re-released on short timescales. We've seen an example of this in the past month, actually, as oce::download_topo() had to be modified to handle a new server API structure (to provide access to the 1/4-minute data now available).

I see the plan as follows. Stage 1: copy oce::download_X() to dod::download_X(). Stage 2: submit dod to CRAN. If it gets accepted, then alter oce to call dod. This ought to provide oce users with a faster response to changing data servers.

Note that some dod functions have lately been getting a read argument. This will not be added to the oce functions, because oce users are expected to know to use read.oce(), or a more specialized function such as read.ctd(), as required. Besides, those oce reading functions have a lot of extra arguments that are likely to be used in many applications.

Below is a checklist of things that should be ported into the paired dod functions.

  • oce::download.amsr()
  • oce::download.coastline()
  • oce::download.met()
  • oce::download.topo()

support for NOAA water-level buoys

This may already be in the package, but if not, here is what I'll do after I check.

# Download and plot NOAA CO-OPS water-level observations and predictions.
station <- "8727520"
tStart <- "20230801"
tEnd <- "20230830"
variables <- c("water_level", "predictions")
for (variable in variables) {
    url <- sprintf("https://api.tidesandcurrents.noaa.gov/api/prod/datagetter?product=%s&application=NOS.COOPS.TAC.WL&begin_date=%s&end_date=%s&datum=MLLW&station=%s&time_zone=GMT&units=metric&interval=&format=CSV",
        variable, tStart, tEnd, station)
    file <- paste0(variable, ".csv")
    download.file(url, file)
    d <- read.csv(file)
    print(head(d, 1)) # show the column layout (column 2 holds the values)
    plot(as.POSIXct(d$Date.Time, tz = "UTC"), d[, 2],
        xlab = "", ylab = variable, type = "l")
    mtext(paste("Station", station), side = 3)
}

Created on 2023-08-30 with reprex v2.0.2

request: dod.section() to get section data

(transferred from AnnieHow/dod#28)

I've been working this afternoon with section data (collections of CTD files) that I got from
https://cchdo.ucsd.edu/data/22008/a22_2021_ct1.zip
which is a link I got to from
https://cchdo.ucsd.edu/products/goship-easyocean

I am not clear whether this should be handled in something called dod.section() or dod.ctd(). I think the latter, actually (despite the subject line), but I'm not sure. Anyway, if you go to the second link and click around, you'll maybe discover the pattern. I think it is basically like below.

library(dod)

#' Download an Oceanographic Section
#'
#' Download a file that contains CTD files corresponding to stations
#' within an oceanographic section.  At the moment, this works only
#' for files listed at Reference 1.  Filenames are constructed
#' by [dod.section()] based on a few instances on that website,
#' and so if there is a variation from the inferred pattern of
#' URL names, this function will fail.
#'
#' FIXME: document parameters; add a (commented-out) example
#'
#' @references
#' 1. https://cchdo.ucsd.edu/products/goship-easyocean
#'
#' @author Dan Kelley
#'
#' @export
# https://cchdo.ucsd.edu/data/16031/a03_1993_ct1.zip
dod.section <- function(program="cchdo", code="22008", section="a22", year=2021,
    age=0, destdir=".", debug=0)
{
    if (identical(program, "cchdo")) {
        server <- "https://cchdo.ucsd.edu/data"
        # the example URLs end in "_ct1.zip", e.g. a22_2021_ct1.zip
        filename <- paste0(section, "_", year, "_ct1.zip")
        dodDebug(debug, "using file=\"", filename, "\"\n", sep="")
        url <- paste0(server, "/", code, "/", filename)
        destfile <- file.path(destdir, filename)
        download.file(url, destfile)
        return(destfile)
    } else {
        stop("program=\"", program, "\" not understood.  Try \"cchdo\"")
    }
}
a <- dod.section(debug=1)

or something like that.
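The inferred URL pattern can be checked against the two examples quoted above (a22/2021 under code 22008, and a03/1993 under code 16031) without downloading anything:

```r
# Check the inferred CCHDO URL pattern against the two example URLs
# quoted above.  This builds strings only; no network access.
cchdoURL <- function(code, section, year) {
    paste0("https://cchdo.ucsd.edu/data/", code, "/",
        section, "_", year, "_ct1.zip")
}
cchdoURL("22008", "a22", 2021)
cchdoURL("16031", "a03", 1993)
```

If further examples break this pattern, dod.section() would need a lookup rather than construction, which is the risk the roxygen note warns about.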

I don't actually know whether this should be dod.section() or dod.ctd(). Maybe the latter because that's what the data actually are. As for the program name, I don't know a good answer for that. Maybe like I have above.

Pretty much every year at this time I scour the web looking for sources of section data. I think the URLs change quite frequently, which is a pain. Having this handled by dod would be very time-saving because it's boring doing the same search over and over!
