This repository compares provisional OPERA DSWx-HLS products with validation datasets.
In your `~/.netrc`, place Earthdata login credentials:

```
machine urs.earthdata.nasa.gov
login <username>
password <password>
```
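As a quick sanity check that the credentials are readable, Python's standard-library `netrc` module can parse this file. This is a hedged sketch; `check_earthdata_creds` is a helper name introduced here for illustration, not part of this repository:

```python
import netrc

def check_earthdata_creds(path=None):
    """Return (login, password) for urs.earthdata.nasa.gov.

    With path=None, the stdlib netrc module reads ~/.netrc,
    i.e. the file configured above.
    """
    auth = netrc.netrc(path).authenticators("urs.earthdata.nasa.gov")
    if auth is None:
        raise RuntimeError("No urs.earthdata.nasa.gov entry found in .netrc")
    login, _account, password = auth
    return login, password
```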
It is recommended to install mamba in the user's base environment to speed up the installation process:

```
conda install -c conda-forge mamba
```
From this repo:

```
mamba env create -f environment.yml
conda activate dswx_val
```
After activating your environment (i.e. `conda activate dswx_val`), register the Jupyter kernel:

```
python -m ipykernel install --user --name dswx_val
```
Run the papermill script with:

```
python verify_all.py
```
See `sample_runs.zsh` for some additional ways of parametrizing the tests.
This mirrors the current validation clone. To generate this table, one must additionally:

1. Have JPL VPN access and be connected to the VPN.
2. Have group access to the validation clone (this requires coordination with HySDS to be added to the appropriate LDAP group).
3. Create a `.env` file with JPL credentials.
Specifically, for 3., the `.env` file should look like:

```
ES_USERNAME='<JPL USERNAME>'
ES_PASSWORD='<JPL PASSWORD>'
```
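The notebook presumably reads these values out of the `.env` file; below is a minimal standard-library sketch of such a loader (the actual code may use a package like `python-dotenv` instead, and `load_env_file` is a name introduced here for illustration):

```python
def load_env_file(path=".env"):
    """Parse simple KEY='value' lines from a .env file into a dict.

    Blank lines, comments, and lines without '=' are skipped; single or
    double quotes around the value are stripped.
    """
    env = {}
    with open(path) as f:
        for raw in f:
            line = raw.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            env[key.strip()] = value.strip().strip("'\"")
    return env
```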
After that is done, run the notebook `_create_validation_table.ipynb` to create this table.
- Create a branch from `dev` and create a pull request.
- Do your development.
- For local `git diff`, use `nbdiff --ignore-id`, as cell ids are required and updated on each change in newer versions of nbformat. GitHub will provide a prettier way of viewing notebook differences.
- Run `pytest .` in this repository to ensure the notebooks work. We do not use GitHub Actions (yet).
- Have another member review.
- Make sure you don't commit to the `out/` directory unless you want to share your results with the larger PST team. You can manually add/commit files with git.
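If you want git to stop tracking `out/` by default, one option (assuming such an entry is not already present in the repository) is a `.gitignore` line like:

```
out/
```

Files under an ignored directory can still be shared deliberately with `git add -f`.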