Comments (10)
@brhillman you are really the best. Would it be too much to ask you to fork the repo, make the changes, and open a PR? I'd favor the 1._wp form of the fix.
from cospv2.0.
@brhillman
Thanks for this. I second Robert's opinion.
Also @RobertPincus, since this change doesn't impact the core COSP code and instead lives in subsample_and_optics_example/, do we need to go through the formal review process?
@dustinswales I would think not but we could ask the PMC. This fix is very specific and won't change answers under most circumstances.
@RobertPincus no problem, are there tests I need to run? Or do y'all run those after I submit the PR?
@dustinswales , @RobertPincus , why wouldn't we want to follow the standard review process? The code review for this change is trivial, so the only thing we need to check is that the change doesn't modify the standard outputs. If it does, then we need to update the known good outputs (KGOs). I think the issue of whether the code in subsample_and_optics_example is part of COSP or not is not relevant, since modifications to subsample_and_optics_example have the potential of changing the KGOs, which would create problems when testing future modifications.
@alejandrobodas Yes, someone (Dustin, I guess, or we could ask @brhillman) should run the regression tests to be sure the answers don't change. They won't, I'm confident. Do we then want to require further review? It seems like a lot of process for something so trivial.
@RobertPincus That sounds good to me. I just wanted to highlight that, no matter how confident we are that a modification won't change results, we must run the regression tests and paste the results into the PR or issue so that they get properly documented.
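For readers unfamiliar with the test harness, the core of such a field-by-field regression check can be sketched as follows. This is a minimal NumPy illustration, not the actual test_cosp2Imp.py script; the `compare_fields` helper, the field names, and the tolerance are all hypothetical:

```python
import numpy as np

def compare_fields(ref, new, rel_tol=1e-5):
    """Return per-field stats for values whose relative difference exceeds rel_tol."""
    results = {}
    for name in ref:
        r = np.asarray(ref[name], dtype=float)
        n = np.asarray(new[name], dtype=float)
        # Relative difference, guarding against division by zero in the reference.
        rel = (n - r) / np.where(r != 0.0, r, 1.0)
        differ = np.abs(rel) > rel_tol
        if differ.any():
            results[name] = (100.0 * differ.mean(),
                             rel[differ].min(), rel[differ].max())
    return results

# Hypothetical fields: one matches exactly, one differs slightly.
ref = {"atb532": [1.0, 2.0, 3.0], "dbze94": [10.0, 20.0]}
new = {"atb532": [1.0, 2.0, 3.0], "dbze94": [10.0, 20.001]}
diffs = compare_fields(ref, new)
for name, (pct, lo, hi) in diffs.items():
    print(f"{name}: {pct:.2f} % of values differ, relative range: {lo:.2e} to {hi:.2e}")
```

Fields with no differences above the tolerance are simply omitted from the report, which is why a clean run prints only "All fields match".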
@RobertPincus @alejandrobodas I ran the regression tests on the bugfix branch, but I am getting diffs in unrelated fields. Thinking maybe the reference data was stale, I tried again using master and got the same result. Are the references for master maybe old? Here are the diffs (and the command used to compare):
(python2) [bhillma@ghost-login2 driver (master)]$ python test_cosp2Imp.py data/outputs/UKMO/cosp2_output_um.ref.nc data/outputs/UKMO/cosp2_output_um.nc
############################################################################################
Treating relative differences less than 0.0010000000% as insignificant
atb532_perp: 16.30 % of values differ, relative range: -5.90e-05 to 7.03e-05
atb532: 17.08 % of values differ, relative range: -5.89e-05 to 7.02e-05
cfadLidarsr532: 0.00 % of values differ, relative range: -1.00e+00 to 1.00e+00
lidarBetaMol532: 19.45 % of values differ, relative range: -5.91e-05 to 7.02e-05
clthinemis: 0.65 % of values differ, relative range: -1.18e-05 to -1.18e-05
lidarBetaMol532gr: 18.56 % of values differ, relative range: -5.89e-05 to 6.97e-05
atb532gr: 13.87 % of values differ, relative range: -5.89e-05 to 6.97e-05
lidarBetaMol355: 3.99 % of values differ, relative range: -1.53e-05 to 2.19e-05
atb355: 3.74 % of values differ, relative range: -1.53e-05 to 2.22e-05
dbze94: 0.24 % of values differ, relative range: -2.78e-04 to 1.05e-04
boxtauisccp: 1.93 % of values differ, relative range: -2.37e-05 to 2.40e-05
boxptopisccp: 2.22 % of values differ, relative range: -4.67e-05 to 4.10e-05
All other fields match.
Most of these look small, but some do not (e.g., cfadLidarsr532, which shows a relative range of -1 to 1 even though it reports 0.00% of values differing?). Again, this happens with master too.
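The apparent contradiction for cfadLidarsr532 is consistent with a handful of histogram bins flipping between zero and nonzero: each such flip is a relative difference of exactly ±1, while the count of differing values rounds to 0.00% of the field. A sketch of that arithmetic, using made-up numbers rather than the actual CFAD data:

```python
import numpy as np

# Hypothetical field: 100000 bins, only two of which changed.
ref = np.zeros(100000)
new = np.zeros(100000)
ref[0], new[0] = 1.0, 0.0   # a count lost: relative difference (0 - 1) / 1 = -1
ref[1], new[1] = 1.0, 2.0   # a count doubled: relative difference (2 - 1) / 1 = +1

rel = (new - ref) / np.where(ref != 0.0, ref, 1.0)
differ = np.abs(rel) > 1e-5
pct = 100.0 * differ.mean()  # 0.002, which prints as "0.00 %" at two decimals
print(f"{pct:.2f} % of values differ, "
      f"relative range: {rel[differ].min():.2e} to {rel[differ].max():.2e}")
```

So a "0.00 % ... range -1 to 1" line most likely means a tiny number of bins moved into or out of zero, not that the field is unchanged.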
Hi @brhillman, I think this is similar to the issue we had in #15. I suspect that your compiler/options/architecture introduces some differences. Could you please update your reference files with ones produced from master, and then run the regression tests with your modifications?
Thanks @alejandrobodas, that's exactly it. I had done this experiment but neglected to return the compiler flags to their defaults first (I had inserted the flag to promote reals to r8 precision in one of the tests but not the other). Rebuilding both the new branch and master with identical flags and comparing produces identical results:
(python2) [bhillma@ghost-login2 driver (fix-quickbeam-optics-types)]$ python test_cosp2Imp.py ../../master/driver/data/outputs/UKMO/cosp2_output_um.nc data/outputs/UKMO/cosp2_output_um.nc
############################################################################################
Treating relative differences less than 0.0010000000% as insignificant
All fields match
(python2) [bhillma@ghost-login2 driver (fix-quickbeam-optics-types)]$
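The effect of the precision-promotion flag can be illustrated in miniature: the same reduction carried out in single vs. double precision typically agrees only to within roughly 1e-7 to 1e-5 relative, the same order as the spurious diffs reported against the stale references above. A hedged Python sketch (illustrative only; COSP itself is Fortran, where the analogous switch is the compiler's promote-to-r8 flag):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.random(10000)

# Accumulate the same data once in single and once in double precision.
sum32 = np.float32(0.0)
for v in x.astype(np.float32):
    sum32 += v
sum64 = x.astype(np.float64).sum()

# The relative disagreement is small but nonzero, purely from precision.
rel = abs(float(sum32) - sum64) / sum64
print(f"single vs. double precision relative difference: {rel:.2e}")
```

This is why comparing a run built with the r8 flag against a reference built without it produces diffs in "unrelated" fields: every floating-point reduction in the code shifts slightly.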