
gsmwam-ipe's Issues

V & V for WAM-IPE

Explore whether the UCAR MET package can be applied for V&V. The package appears to be accessible from Hera and WCOSS.

Customer Interactions for the Operational WAM-IPE

There are a few collaborators and customers currently looking into our operational outputs. Some of them are also working on model-data validation. I am keeping the records here so that we don't lose track.

  1. James Secan ([email protected]), NorthWest Research Associates, Inc
    https://spawx.nwra.com/spawx/tec/wam-ipe/
    Real-time comparisons of foF2 and TEC with observations at two locations.

  2. Bruce Ward ([email protected]), the University of Adelaide in Australia
    Validating WAM-IPE electron density profiles in the bottom-side ionosphere with over-the-horizon (OTH) surveillance radars. This system is supported by a network of HF vertical, oblique, and backscatter sounders all observing the ionosphere in real-time.

  3. Eelco Doornbos ([email protected]), Royal Netherlands Meteorological Institute (KNMI)
    Looking into WAM-IPE validation in thermosphere and ionosphere.

  4. Catherine Olsen ([email protected]), SAIC
    Trying to understand how MUF is calculated in our outputs. This is to support a real-time radar "bug splat" depiction showing where and why over-the-horizon high-frequency (HF) radar reflections change through time and who can pick them up; what applications their customer builds with such information is up to them.

  5. NASA CCMC

WAM Neutral Density:
SpaceX Starlink
MIT Aerospace, Richard Linares, for reentry analysis and thermospheric density
MIT Lincoln Lab
Kayhan Space
LeoLabs
Space Environmental Technologies, Kent Tobiska ([email protected])
Industrial Sciences Group in Australia, David Shteinman
Space Force, 18th Space Defense Squadron
STK + Comspoc
Celestrak.org on solar and geomagnetic drivers
Aerospace Corp. for reentry analysis

Ionosphere and Irregularities
Rocket Lab on GPS outage
WAAS

Rapid-Update, Near-Realtime Input Parameters

Currently, the scripts in scripts/interpolate_input_parameters are run at initialization during the execution of exglobal_fcst_nems.sh to create minute-cadence input parameter data for ~a dozen geomagnetic and solar values, typically global values or split by hemisphere. In pursuit of a rapid-update near-realtime forecast system for WAM-IPE, we seek to introduce these input parameters with a short cadence during the forecast cycle without reinitialization of the model.

We first need to create a layer in the Python scripts that parses the incoming realtime input parameters (once NOAA-SWPC/WAM-IPE#285 is completed) at a specified interval, creating a new input parameter file each time that can be linked to a static name in the RUNDIR. This can run as a background process (simply append & to the Python script call), or as a single-processor service job if operations requires it. After a verification step ensures the input file is consistent with expectations, a log file is updated with a new timestamp, as sketched below.
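
A minimal sketch of that layer, with parse_realtime and verify as hypothetical stand-ins for code that does not exist yet (names, paths, and the polling interval are illustrative only):

    import shutil, time
    from pathlib import Path

    RUNDIR = Path("RUNDIR")      # illustrative path
    INTERVAL = 60                # illustrative polling interval, seconds

    def parse_realtime() -> Path:
        """Hypothetical stand-in for the realtime parameter parser (see #285)."""
        ...

    def verify(path: Path) -> bool:
        """Hypothetical consistency check of the new input parameter file."""
        ...

    while True:
        new_file = parse_realtime()
        if verify(new_file):
            tmp = RUNDIR / "input_parameters.tmp"
            shutil.copy(new_file, tmp)
            # rename is atomic on POSIX, so the model never reads a partial file
            tmp.rename(RUNDIR / "input_parameters.txt")
            # update the log with a new timestamp for the NUOPC layer to detect
            (RUNDIR / "input_parameters.log").write_text(time.strftime("%Y%m%d%H%M%S\n"))
        time.sleep(INTERVAL)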

A NUOPC layer that checks for changes to that log file will then allow the model integration to proceed up to the end of the new input parameter file. WAM (and IPE) will need to know to update their input parameter arrays with values from the new file.

This process will repeat until the end of the forecast period.

WAM-IPE debug mode issue due to a small overflow problem in WAM

The WAM-IPE debug mode runs into an overflow problem in WAM when ./gsm/phys/read_fix.f is executed. The issue has been traced to the reading code being called by a subroutine associated with total column soil moisture; some simple cutting and adjustment of the numerical criteria for certain physics variables in that subroutine should resolve it.

txt write of historical drivers broken

Indirectly reported by Dibyendu Sur via email: there is a typo on lines 269 and 287 of scripts/interpolate_input_parameters/interpolate_input_parameters.py in which, during a text file write, an integer (the hemispheric power index) receives precision formatting, which is disallowed. While parse_realtime.py in the same directory has the exact same issue, that script is unaffected, as the hemispheric power index is stored there as a string rather than an integer.
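
For illustration only (the actual format strings in the script are not reproduced here): Python's format-spec mini-language rejects a precision on integer presentation types, while a float presentation of an integer is fine, which is presumably the failure mode. Note that printf-style % formatting does accept a precision on %d, so the error is specific to format-spec strings:

    >>> "{:5.1f}".format(3)    # float presentation of an int: fine
    '  3.0'
    >>> "{:5.1d}".format(3)    # precision on an integer presentation type
    ValueError: Precision not allowed in integer format specifier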

stdout cleanup

Per @gmillward, stdout is excessive during a normal model run with no configurable option to contain it. While we can/will disable the ESMF PET logs, it's useful for the model stdout to be approachable as well, and it's not at the moment.

@ZhuxiaoLi66 started a GSMWAM-IPE branch, which I refactored as printout, that comments out many write/print statements; there is still somewhat substantial initialization information printed, but the amount during model integration is reduced significantly. I would like feedback on my approach in the mediator, though, @rmontuoro.

I tested a 1-hour run and got bitwise-identical results while dropping the fcst log size by more than 50%; the stdout reduction should be 90% or more depending on run length.

New Feature : IPE Field Projection for WAM

  • @IonospherePlasmasphereElectrodynamics N: create a baseline standalone IPE run, 2013-03-16 (1x80)
  • @IonospherePlasmasphereElectrodynamics N: insert interface routines
  • Tim: insert the code that transforms the IPE field projection for WAM
  • @IonospherePlasmasphereElectrodynamics N: make sure the test run is b4b identical to the baseline
  • Tim: validate the IPE fields
  • @joeschoonover: make sure the new branch works in the wam-ipe environment
  • Get documentation on projections from Tim and put it in the Technical Documentation

transition to COMIO/NetCDF for input drivers

On WCOSS we were encountering more (presumed) filesystem issues when reading the wam_input_f107_kp.txt file repeatedly for CONOPS2, eventually producing a NaN in one of the input parameter arrays and leading to immediate NaNs in the model. We decided to transition to NetCDF using the COMIO library, and we may still need a simple MPI broadcast implementation to fully get a grip on the IO issue.

This work is mostly complete.
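
For the record, the broadcast pattern under consideration is the usual read-once-and-distribute idiom, sketched here in Python with mpi4py (the real implementation would sit in the Fortran driver via COMIO/MPI; the parse below is a placeholder):

    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD

    params = None
    if comm.Get_rank() == 0:
        # only rank 0 touches the filesystem, instead of N ranks reading the
        # same file repeatedly (the presumed source of the NaNs above)
        params = np.loadtxt("wam_input_f107_kp.txt")  # placeholder parse
    params = comm.bcast(params, root=0)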

The io4/io8 conversion issue in the debug-mode coupling run caused by read_fix.f in WAM

There is an overflow error in the debug-mode coupling run caused directly by the read_fix.f code in WAM.
My current attempt is to set a parameter (real (kind=kind_io4), parameter :: c4to8=1.0) in subroutine uninterprez and then, on the problem line, use buff_mult_piecea(i,j) = f(i,j)*c4to8 to convert the io8 kind of f(i,j) to the io4 kind of buff_mult_piecea; this does not seem to work yet.

new GloTEC data merged with COSMIC2 data

The new GloTEC data, which merged in COSMIC-2 data as of this March, has been checked and compared with the pre-merge GloTEC. The results indicate that merging COSMIC-2 significantly increased the effective GloTEC data coverage over both ocean and continental areas, with the largest increases in the tropical and subtropical regions where the TEC peaks are always located.

Notice emails for the updates of the IC files for WAM-IPE & WAM & IPE on WCOSS

Adam has moved the IC files on WCOSS into the current IC file folder: /gpfs/dell2/swpc/noscrub/WAM-IPE_DATA/ICS
He has also kindly granted write access on both WCOSS and Hera to all the group members (Tzu-Wei, George, Raffaele, himself, and me).
We think it would be better to send an email notice to the group members (ideally including Tim for his validation work, and maybe Naomi as well?) right after the IC files on WCOSS have been updated. Unfortunately, Hera can't do this due to the firewall issue; but since our policy is to always update the IC files on both machines at the same time, this should not be a big concern.
Any different thoughts?

Change Fixed Height Grid To Match IPE levels

The group would like to experiment with changing the fixed height grid that is used in the mediator to regrid from WAM to IPE. The idea is to switch the heights of that grid to match the IPE levels to see how that affects the behavior of the coupled system.

output drivers and bring them back to SWPC

The driver values should be brought back to SWPC along with other outputs. It can be a file containing all the inputs or somehow folded into other outputs.

WAM-IPE: Make a consolidated database for initial conditions

In order to improve our ability to move between systems, we need to consolidate the locations that initial condition/model setup data is pulled from. Along with this, we should write a concise document that lists the files that are needed and documents a concise directory structure for storing that data. We should start by building this database with a few key test cases that we are already running, and update the compset scripts to make use of the new structure. We'll need a new variable indicating where that database directory is installed.

When porting to a new system, the database can simply be tarballed and carried over, or we can look into using git-lfs to track versions of this initial condition database.

  • Document and plan database directory structure
  • Check if git-lfs is a possible solution for handling our initial condition database.
  • Consolidate the March 16, 2013 and November initial conditions into this new structure
  • Update compset scripts under scripts/compsets/ to use this new database structure
  • Check for bit-for-bit correctness with the most recent version of development

Workflow (SWIO-specific) refresh

branch workflow_refresh has been created to help address lingering issues from #21.

There has been a minor overhaul in namelist and resource file writes, extracting their contents from exglobal and putting them into scripts/compsets/parm (PARMDIR) for either linking into the RUNDIR or rewriting into the RUNDIR via envsubst.

Use of SWIO is now configurable by setting SWIO=.true. or .false.

There is a default configuration with SWIO=.true. for both standalone WAM and WAM-IPE. If the user wishes to change this configuration, they can alter PARMDIR/nems.configure.${MODE}_io and create or alter the appropriate resource file in the parm directory. Additionally, the user should set SWIO_MODELS (by default for WAM-IPE, SWIO_MODELS="IO AIO"), and then, for each IO model, MODEL_PREFIX and MODEL_CADENCE (e.g. IO_PREFIX=ipe and AIO_PREFIX=wam, and IO_CADENCE=AIO_CADENCE=180). These should match the runSeq and the output prefix listed in nems.configure and the SWIO resource file, respectively, for purposes of linking the output from RUNDIR to ROTDIR. I debated a few ways to extract these values automatically from nems.configure and the resource files, but couldn't settle on anything that was appropriately simple.

Further tasks that need to be completed for resolution of this issue:

  1. Determine what needs to be exported from each model and at what cadence; identify variables that should be calculated from the export state prior to output by SWIO. Establish how to add model fields to the export state.
  2. Extend the med.rc above 800km. [from @twfang]
  3. Determine an appropriate default runSeq for each mode (multiple components per model?).
  4. Have SWIO components output at the final model timestep (i.e. a one-hour run with AIO_CADENCE=3600 should produce output).

Anything else, please add.

lower atmosphere variables change right away after perturbing geomagnetic drivers

Clayton is working with Tomoko on the WAM ensembles for data assimilation. They are getting some curious results in the lower atmosphere right away (after one timestep) from just perturbing the geomagnetic drivers.
This issue has been investigated by running experiments to check the code in idea_phys.f and idea_ion.f. Based on the results, we found that the change in the physics state (temperature) in the lower atmosphere (below WAM level 90) happens outside of idea_phys.f (the upper atmosphere physics).
In gsm/phys/gloopb.f, where the physics variables are integrated, gbphys (the usual lower atmosphere physics) is called after idea_phys.f. There, a temperature tendency for the whole column is calculated in gbphys to adjust mean radiation fluxes and heating rates to the faster model timestep, and that adjustment overwrites the physics state set by idea_phys in gloopb.f, in addition to some SPPT stochastic perturbation adjustments in gloopb.f.
The physics state adjustments in phys/gbphys.f and gsm/phys/gloopb.f are considered the cause of the issue.

storage estimation

To estimate how much storage we need for the WAM-IPE operation, we have arrived at a figure of 50 GB per day. Here is the breakdown for now. For the current estimate we neglect the size of File 1, since those files are small; we can easily reduce the size of Files 2 and 3 to make room for them if necessary.

https://docs.google.com/spreadsheets/d/1oh5kaNg_ThCTwX1bfFSQqquPbg8kTC7OyYQ5lLdY6Sk/edit#gid=0

File 1: The smallest output, containing only TEC/NmF2/hmF2 and neutral density at 400 km (or O/N2).

File 2: The smaller set of ipe* and gsm* files containing the parameters necessary for products, roughly 55 MB per time step at single precision. These files contain all heights in IPE and all levels in WAM.

File 3: The large files containing all IPE and WAM output parameters, about 110 MB per time step at single precision. These files also contain all heights in IPE and all levels in WAM.

For CONOPS I, we do four 2-day forecasts per day. If File 1 is output every 1 (or 3) minutes, File 2 every 15 minutes on the first day, and File 3 every hour on both days, we get 3 (every 15 mins) × 24 (hrs) × 1 (day) × 4 (cycles per day) × 55 MB + 24 (hrs) × 2 (days) × 4 (cycles per day) × 110 MB ≈ 37 GB.

For CONOPS II, with 1 (or 3) minute output for File 1 and hourly output for File 3, we have 24 × 110 MB = 2.64 GB.

Therefore, 50GB per day should be sufficient for the whole operation.
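
A quick check of the arithmetic above (MB converted to GB at 1000 MB/GB):

    conops1 = 3 * 24 * 1 * 4 * 55 + 24 * 2 * 4 * 110   # File 2 + File 3, MB/day
    conops2 = 24 * 110                                  # hourly File 3, MB/day
    print(conops1 / 1000, conops2 / 1000)               # -> 36.96 and 2.64 (GB)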

Validation on swpc_products branch

New features have been added to SWIO, including output of O/N2.
The O/N2 output from SWIO has been verified against an offline O/N2 calculation, based on the other output variables (temperature, height, O density, and N2 density) from the same run, using the corresponding NCL code.
The NetCDF format of the output and the other output variables also need to be double-checked, aiming for a more ready-to-go status.

Validations: WAM-IPE operational outputs

It appears that we need to create an issue to track all the WAM-IPE validation activities. Below is the list. Each item will have an issue associated with it and be linked back to this issue.

WAM:

  1. Neutral density
  2. O/N2, Temp

IPE:

  1. NmF2/hmF2 from ionosondes
  2. MUF from ionosonde
  3. TEC from GloTEC, MIT-TEC, and LISN
    (#55, #54)
  4. the bottomside ionosphere
  5. Plasma drift velocity from ISR

NOTE: JRO ISR campaign will take place between 9/21-9/26. We should plan to have all the ionospheric parameters validated.

tidal analysis

This project presents the main results of the WAM tidal analysis based on WAM output from 3 years ago, and compares them with the main tidal analysis results of the Rashid & Tim paper (2007, Geophysical Research Letters), 'Tidal variability in the lower thermosphere: Comparison of Whole Atmosphere Model (WAM) simulations with observations from TIMED'.
The comparison indicates that the WAM (2017 version) performance in representing tidal features is consistent with its 2007 performance, which was validated against the TIMED observational dataset.
The output of the latest operational WAM-IPE will be used for a similar tidal analysis.
We feel confident in the results given that we haven't made any large physics changes in WAM during the past 3 years.

CCMC neutral density product validation

The neutral density product data files for the Feb. 3, 2022 storm delivered by CCMC have been checked. Comparison plots against our original product files are posted for further discussion and for the record.

List for WAM and IPE parameters and factors

Let's start building a list of parameters or factors that need to be kept in namelist/input files, so that they can be easily changed once the model goes operational. Once we have a complete list, we can see how to get them out and how to organize them in a reasonable way.

Comparisons of WAM-IPE performance on WCOSS Phase 3 and WCOSS2

In this issue, we document the differences in model results from two different machines as part of transition validation.

The branches ops/wcoss2 and develop get exactly the same results on Phase 3, while significant differences are found between the two machines. These differences come from different compilers, differences in libraries (ESMF), and the WAM driver broadcast through the updated COMIO. To isolate where the differences come from, several tests were carried out with ICs from 2015031600 and fixed drivers (F10.7=120 and Kp=3).

1. Standalone WAM (no ESMF and no COMIO) sigfiles, 62 days (hr 1489 to 1508) after 2015/03/16. The global mean temperature is included on top of each subplot.

WAM Tn (level 135) on WCOSS 1 Phase 3: [figure standalone_Tn_wcoss_fixdrivers]

WAM Tn (level 135) on WCOSS2: [figure standalone_Tn_wcoss2_fixdrivers]

WAM Tn difference (WCOSS2 - Phase 3): [figure standalone_Tn_diff_fixdrivers]

In summary, the temperature differences from the standalone WAM range between -100 K and 100 K. The larger differences are often associated with the polar cap boundary; smaller differences are more random and scattered around the globe. The global means from the two runs are very close, within 4 K of each other. The differences mainly come from the machine change and are within the acceptable range.

2. WAM-IPE with the same modulefiles and libraries. Results are from NetCDF outputs (SWIO). TEC and Tn from WCOSS2 (1st column), WCOSS Phase 3 (2nd column), and their differences (3rd column, WCOSS2 - Phase 3) are shown in these plots. Global mean TEC and Tn (level 135) are also marked in some of the plots.

2015/03/16: TEC [figure 20150316], Tn [figure Tn_20150316]

2015/04/15: TEC [figure 20150415_fixdrivers], Tn [figure Tn_20150415_fixdrivers]

2015/05/25: TEC [figure 20150525_fixdrivers], Tn [figure Tn_20150525_fixdrivers]

In summary, the Tn changes are roughly ±100 K and the TEC changes are around 20-30 TECu, depending on location. Stronger changes are found in the low-latitude ionosphere. However, the global means of Tn and TEC are very similar on both machines. The changes do grow with a longer run (2 months in this test case) but don't seem to keep growing. With these results, we conclude that the differences in model results between the two machines are acceptable, and that the WAM-IPE ops/wcoss2 branch performs consistently between WCOSS Phase 3 and WCOSS2.

Python 2 EOL

I've had a very bad habit of writing Python 2 for the last two years when the EOL is in a few months. The transition to Python 3 should be mostly pretty painless, but I'll need to update all of the repository .py files to Python 3 and adjust the module loads for Anaconda accordingly.

V&V on Ops. WAM-IPE Aug. 2020

The Ops. WAM-IPE storm validations will verify the output of 3D neutral fields (U, V, W, T, O, O2, N2, mass density, H, He; plasma fields may be included later) and 2D neutral and plasma fields (TEC, O/N2) from the operational WAM-IPE against the corresponding observational datasets, and also against the corresponding former WAM-IPE runs.
The storm case will be the St. Patrick's Day storm of 2015 (Mar. 16-21, 2015).
The validations will include (but are not limited to):
a) The vertically integrated O/N2 (at pressure levels) against track-by-track 24-hr GUVI data during Mar. 16-21, 2015, and against the corresponding former runs.
b) The thermospheric total neutral mass density (at about 400 km), point-to-point along the GRACE satellite track.
c) TEC against the output and data of GloTEC and MIT.
d) Tidal analysis based on the hourly output of an annual WAM run.
...

Minimize the seasonal TEC bias by revising the eddy mixing parameterization

This project is mainly aiming to minimize the operational seasonal TEC bias through the revision of the eddy mixing parameterization in WAM.
At the same time, we will also keep a close eye on the influence of parameterization revision on the change of the neutral density with some validation.
The bias improvement also includes estimating the TEC change due to the day-number bug fix in WAM.
The output of the operational WAM-IPE and that of several designed free WAM-IPE coupling annual runs for the period about 20210715 - 20220705, also the output of the operational GloTEC during the same year-long period will be applied to do the diagnostics and mathematic analysis for parameterization change.

Products: Complete Documentation on MUF

We have opened a document to begin filling in details on this product.

Ideally, we should have some background information that explains what this product is and why it would be useful. We should indicate intended customers that would benefit from SWPC supporting this product.

Additionally, we should document, with references, how we are calculating MUF and report on other diagnostics/indexes that we can derive from forecasts of MUF.

The document can be found here

output O/N2 product from WAM

This work will calculate O/N2 directly in get_w_z.f of WAM and deliver it to the mediator and SWIO. At the same time, the reference integral N2 will be made a namelist parameter of WAM.
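
For reference, the usual column-ratio definition, consistent with a reference integral N2 parameter, integrates both species downward from the top of the atmosphere until the N2 column reaches the reference value. A sketch under that assumption (the 1e17 cm^-2 default and the profile layout are illustrative, not the model's actual values):

    def column_o_n2(o_den, n2_den, z, n2_ref=1.0e17):
        """O/N2 column-density ratio. o_den, n2_den: number densities (cm^-3)
        ordered top-down; z: heights (cm) ordered top-down; n2_ref: reference
        N2 column (cm^-2), the would-be namelist parameter."""
        o_col, n2_col = 0.0, 0.0
        for k in range(len(z) - 1):
            dz = z[k] - z[k + 1]    # layer thickness, integrating downward
            n2_layer = 0.5 * (n2_den[k] + n2_den[k + 1]) * dz
            o_layer = 0.5 * (o_den[k] + o_den[k + 1]) * dz
            if n2_col + n2_layer >= n2_ref:
                # take only the fraction of this layer needed to reach n2_ref
                frac = (n2_ref - n2_col) / n2_layer
                return (o_col + frac * o_layer) / n2_ref
            n2_col += n2_layer
            o_col += o_layer
        return o_col / n2_col       # reference column never reached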

Improvement on Kp derived Solar Wind Parameters

Our current forecast uses Kp to derive solar wind parameters. Calculations for the forecasted values of Bz, By, angle, and velocity are based on the following relationships (code can be found here).

SW_Velocity = 317.0 + 55.84 * Kp - 2.71 * Kp_squared
SW_Bz = -0.085 * Kp_squared - 0.081 * Kp + 0.434 + 0.0079 * F10.7(daily) - 0.0022 * Kp * F10.7(daily)
SW_By = 0.0
SW_Bt = SW_Bz
If (SW_Bz .ge. 0.0) SW_Angle = 0.0
If (SW_Bz .lt. 0.0) SW_Angle = 180.0

The SW_Bz is really wimpy, especially at low F10.7:
-0.4 for Kp 3 at F10.7=100
-3.6 for Kp 6 at F10.7=100
-8.4 for Kp 9 at F10.7=100

Dan Weimer developed a relationship between the solar wind electric field and Kp:
Esw = 0.1455 + 0.4675 * Kp - 0.1446 * Kp_squared + 0.0276 * Kp_cubed

If we assume no information on By in the forecast, the solar wind electric field becomes the product of the solar wind velocity and Bz, and we can get Bz from Esw and Vsw:
Esw = - Vsw * Bz
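
A small sketch combining the relationships above; the unit handling is an assumption (Esw in mV/m and Vsw in km/s, so the 1e3 factor yields Bz in nT):

    def kp_solar_wind(kp, f107):
        """Solar wind speed, current Bz, and Weimer-Esw-derived Bz from Kp."""
        vsw = 317.0 + 55.84 * kp - 2.71 * kp**2             # km/s (current expression)
        bz_old = (-0.085 * kp**2 - 0.081 * kp + 0.434
                  + 0.0079 * f107 - 0.0022 * kp * f107)     # nT (current expression)
        esw = 0.1455 + 0.4675 * kp - 0.1446 * kp**2 + 0.0276 * kp**3  # Weimer
        bz_new = -esw / vsw * 1.0e3  # Esw = -Vsw * Bz; (mV/m)/(km/s) * 1e3 -> nT
        return vsw, bz_old, bz_new

    print(kp_solar_wind(6.0, 100.0))  # ~554 km/s, Bz_old ~ -3.6 nT, Bz_new ~ -6.7 nT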

The Goncharov paper has a relationship between Kp and Bz:
Bz = - Kp * 2.27 (or -Kp / 0.44 as in equation 5, but only above Kp 2). This is hard to interpret given Figure 1a in their paper; it gives the values in the second column below. The values from Weimer's solar wind electric field formula (3rd column), divided by the Vsw (5th column) from our expression, are fairly consistent with Goncharov at the high Kp levels, and lower for the mid-range Kp levels.

[Screenshot: table of Kp with Goncharov Bz (2nd column), Weimer Esw (3rd column), derived Bz, and Vsw (5th column)]

We need to try out the new estimation in the WAM-IPE to see the impact of these values.

Neutral Density Product of WFS

Total Neutral Density products are based on operational WAM-IPE (WFS) output, with helium at the higher levels calculated from MSIS 2.0. The products are on fixed heights from 100 km to 1000 km at 10 km intervals.

Fixes for NmF2 and hmF2 in SWIO

For NmF2 and hmF2, we need to search from the top of the profile to find the density peak in the F region. The current method simply takes the maximum value, which yields an E-region peak at some locations. Let's discuss how to correct the calculation here; one candidate fix is sketched below.
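
One candidate fix, under the assumption that profiles are stored bottom-up with height: scan downward from the top and take the first local maximum, which selects the F2 peak even where the E-region peak is globally larger:

    import numpy as np

    def f2_peak(ne, z):
        """NmF2 and hmF2 as the first local maximum scanning down from the
        top of the profile. ne: electron density, z: height, both ordered
        bottom-up (assumed layout)."""
        for k in range(len(ne) - 2, 0, -1):   # top -> bottom, skip endpoints
            if ne[k] >= ne[k + 1] and ne[k] > ne[k - 1]:
                return ne[k], z[k]
        kmax = int(np.argmax(ne))             # fallback: global maximum
        return ne[kmax], z[kmax]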

scaling tests on forecasted neutral density with the new solar wind (new Bz, etc.) algorithm in WAM-IPE

The new algorithm for deriving solar wind from Kp in WAM-IPE has been verified to improve the neutral density forecast substantially over the old algorithm, especially during storm time, although it still comes in lower than the WAM-IPE results driven by observed solar wind input. We plan to investigate the simulation results further and compare them to assess the feasibility and rationale of scaling the neutral density forecasted with the new algorithm.
#61
#65

Products: Single Frequency Delay in Post Processing

We should add the Single Frequency Delay (SFD) diagnostic to the post-processing suite (IPE_To_Height_Grid.f90) and provide documentation similar to what is done for MUF.

This work should be done on a feature branch off of development and merged in after this issue is completed.
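
This issue does not spell out the calculation, but the standard first-order relation would presumably apply: range delay = 40.3 * TEC / f^2 meters, with TEC in electrons/m^2 and f in Hz. A Python sketch for illustration (the actual implementation belongs in IPE_To_Height_Grid.f90):

    def single_freq_delay(tec_tecu, freq_hz):
        """First-order ionospheric range delay in meters.
        tec_tecu: slant TEC in TEC units (1 TECU = 1e16 el/m^2)."""
        return 40.3 * tec_tecu * 1.0e16 / freq_hz**2

    print(single_freq_delay(50.0, 1575.42e6))  # ~8.1 m at GPS L1 for 50 TECU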

Tasks for resolving this issue include:

  • Document SFD intended customers, background, calculation, and any derived indices
  • Implement SFD in IPE_To_Height_Grid.f90
  • Compile and run-check
  • Open Pull Request to merge into development
