
fmscoupler's People

Contributors

abrooks1085, andrew-c-ross, baoqiang80, bcc2761, bensonr, colingladuenoaa, fabienpaulot, gbw-gfdl, gfdl-eric, gitleviglenn, hallberg-noaa, jgjgfdl, jwdgfdl, kaiyuan-cheng, laurenchilutti, mcallic2, menzel-gfdl, mlee03, nikizadehgfdl, raphaeldussin, rem1776, scitech777, slm7826, spencerkclark, thomas-robinson, underwoo, wfcooke, wrongkindofdoctor, zhi-liang, zhihong-tan


fmscoupler's Issues

omp_set_nested deprecated

omp_set_nested has been deprecated. The following warning occurs when the coupler is compiled:

OMP: Info #276: omp_set_nested routine deprecated, please use omp_set_max_active_levels instead.

We currently have
https://github.com/NOAA-GFDL/FMScoupler/blob/main/full/coupler_main.F90#L1359-L1375

    !--- dynamic threading turned off when affinity placement is in use
!$  call omp_set_dynamic(.FALSE.)
    !--- nested OpenMP enabled for OpenMP concurrent components
!$  call omp_set_nested(.TRUE.)

    if (Atm%pe) then
      call mpp_set_current_pelist( Atm%pelist )
!$    if (.not.do_concurrent_radiation) radiation_nthreads=atmos_nthreads
!$    if (do_concurrent_radiation) conc_nthreads=2
      !--- setting affinity
      if (do_concurrent_radiation) then
!$      call fms_affinity_set('ATMOS', use_hyper_thread, atmos_nthreads + radiation_nthreads)
      else
!$      call fms_affinity_set('ATMOS', use_hyper_thread, atmos_nthreads)
      endif
!$    call omp_set_num_threads(atmos_nthreads)
    endif

The coupler currently calls omp_set_nested(.TRUE.). Per the OpenMP specification, if the argument to omp_set_nested evaluates to true, the value of the max-active-levels-var ICV is set to the number of active levels of parallelism that the implementation supports; otherwise, if the value of max-active-levels-var is greater than 1, it is set to 1.

So, what do we set omp_set_max_active_levels to be? The specification says: "The effect of this routine is to set the value of the max-active-levels-var ICV to the value specified in the argument. If the number of active levels requested exceeds the number of active levels of parallelism supported by the implementation, the value of the max-active-levels-var ICV will be set to the number of active levels supported by the implementation." If we set it to some large value, the runtime should maximize the number of active levels, matching the behavior of omp_set_nested(.TRUE.).
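A minimal sketch of a drop-in replacement for the block above, under the assumption (per the quoted spec text) that an over-large request is simply clamped to what the implementation supports:

```fortran
    !--- dynamic threading turned off when affinity placement is in use
!$  call omp_set_dynamic(.FALSE.)
    !--- omp_set_nested is deprecated; request nesting via the max number of
    !--- active levels instead.  Two levels suffice for the concurrent
    !--- atmosphere/radiation case (conc_nthreads=2); a larger value would
    !--- be clamped by the runtime to what it supports.
!$  call omp_set_max_active_levels(2)
```

Using 2 rather than an arbitrarily large value also documents the nesting depth the coupler actually intends to use.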

Remove redundant units='none' arguments in register_*_field calls

In some register_*_field calls in the full/ and simple/ versions of atm_land_ice_flux_exchange.F90, the optional units argument is passed in as "none":

id_land_mask = &
    register_static_field ( mod_name, 'land_mask', atmos_axes, &
    'fractional amount of land', 'none', &
    range=frange, interp_method = "conserve_order1" )
!--------- initialize diagnostic fields --------------------
id_ice_mask = &
    register_diag_field ( mod_name, 'ice_mask', atmos_axes, Time, &
    'fractional amount of sea ice', 'none', &
    range=frange, interp_method = "conserve_order1" )
id_wind = &
    register_diag_field ( mod_name, 'wind', atmos_axes, Time, &
    'wind speed for flux calculations', 'm/s', &
    range=(/0.,vrange(2)/) )
id_drag_moist = &
    register_diag_field ( mod_name, 'drag_moist', atmos_axes, Time, &
    'drag coeff for moisture', 'none' )
id_drag_heat = &
    register_diag_field ( mod_name, 'drag_heat', atmos_axes, Time, &
    'drag coeff for heat', 'none' )
id_drag_mom = &
    register_diag_field ( mod_name, 'drag_mom', atmos_axes, Time, &
    'drag coeff for momentum', 'none' )

This is redundant because the diag manager does not write it anyway:
https://github.com/NOAA-GFDL/FMS/blob/203c8bf464ff26fe0fe39b1451caedd026bbce55/diag_manager/diag_output.F90#L651-L652

Dimensionless variables are not required to have units to be CF compliant. I think these units='none' arguments should be removed from the register_*_field calls. It won't change any behavior.
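Since units is an optional argument in these interfaces, the proposed cleanup is a pure deletion; a sketch using one of the calls above:

```fortran
! before: dimensionless units passed explicitly, but never written
id_drag_moist = &
    register_diag_field ( mod_name, 'drag_moist', atmos_axes, Time, &
    'drag coeff for moisture', 'none' )

! after: omit the optional units argument entirely
id_drag_moist = &
    register_diag_field ( mod_name, 'drag_moist', atmos_axes, Time, &
    'drag coeff for moisture' )
```

Calls that pass further arguments (range=, interp_method=) already use keywords, so dropping the positional units string does not disturb them.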

refactoring: coupler_clock object

The derived type coupler_clock_type consists of (integer) clock ids and was introduced as part of the full coupler_main refactoring. Because of the frequency with which these ids are retrieved in coupler_main, all components of coupler_clock_type were initially made public.

However, to be consistent with current FMS development practice of objectifying derived types - making their components private and introducing "getter" and "setter" type-bound procedures - the following changes should be made:

  1. All ids should be made private.
  2. Type-bound functions to retrieve clock ids should be introduced.
  3. Existing clock-related subroutine coupler_set_clock_ids should be made into a type-bound subroutine.
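A hedged sketch of what the objectified type could look like; the module, component, and procedure names here are illustrative, not the actual FMScoupler code:

```fortran
module coupler_clocks_mod
  implicit none
  private

  type, public :: coupler_clock_type
    private
    integer :: atmos = -1  ! illustrative clock-id components
    integer :: ocean = -1
  contains
    procedure :: atmos_clock_id  ! getter replaces direct component access
  end type coupler_clock_type

contains

  !> Callers retrieve the id without touching the private component
  integer function atmos_clock_id(this)
    class(coupler_clock_type), intent(in) :: this
    atmos_clock_id = this%atmos
  end function atmos_clock_id

end module coupler_clocks_mod
```

The existing coupler_set_clock_ids subroutine would be bound to the type the same way, with a `procedure :: set_ids => coupler_set_clock_ids` entry in the contains block.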

div_by_zero in atmos_ocean_fluxes_calc

I'm attempting a run with Lori's XML and hitting a div_by_zero in this section of code, which calculates gas fluxes.

& 101325./(rdgas*wtmair*1e-3*tsurf(i)*gas_fields_ice%bc(n)%field(ind_alpha)%values(i)),&
It would appear that the incoming ice gas fields are zero (this is the calling routine in atm_land_ice_flux_exchange):
call atmos_ocean_fluxes_calc(ex_gas_fields_atm, ex_gas_fields_ice, ex_gas_fluxes, ex_seawater, ex_t_surf)
I realize these values may be overridden later in the code (and hence the error probably does not even show up in a production run), but should there be something here that at least warns of the division by zero, or perhaps provides a default flux value in the event of zero fields?
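One possible defensive pattern, sketched with illustrative names (flux_term and the WARNING message are hypothetical; a default flux could be substituted instead of zero):

```fortran
! hypothetical guard around the division shown above
alpha = gas_fields_ice%bc(n)%field(ind_alpha)%values(i)
if (alpha <= 0.0) then
  ! warn once rather than dividing by zero silently
  call mpp_error(WARNING, &
       'atmos_ocean_fluxes_calc: zero alpha field; flux set to default')
  flux_term = 0.0
else
  flux_term = 101325./(rdgas*wtmair*1e-3*tsurf(i)*alpha)
endif
```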

dependency on land_model_mod.mod from AM4

I have an autotools-based build system for FMScoupler, but unfortunately FMScoupler depends on land_model_mod.mod from AM4, while AM4 in turn depends on FMScoupler.

The home-rolled build system and build methods (the "workflow") are hiding these inter-dependencies. I suggest sorting the dependencies out and deciding which packages are going to depend on which other packages.

2020.04-beta1 data override issue with doubly periodic

Here is the error message:

FATAL from PE     1: data_override_mod: Error in opening file INPUT/grid_spec.nc
fms_aquaplanet_xa  000000000249D8C3  mpp_mod_mp_mpp_er          68  mpp_util_mpi.inc
fms_aquaplanet_xa  000000000212AE21  data_override_mod         377  data_override.F90
fms_aquaplanet_xa  0000000000403784  coupler_main_IP_c         476  coupler_main.F90
fms_aquaplanet_xa  0000000000401CA2  MAIN__                    158  coupler_main.F90

There is no grid_spec.nc file for the doubly periodic.  The grid is set up without it.  Is there a way to initialize data override without a grid spec file?

Need additional argument added to update_slow_ice_and_ocean

For the recent corrections to the combined ice-ocean driver to be used, the ocean_ice_boundary information needs to be passed to the combined ice-ocean driver through the update_slow_ice_and_ocean call. The ocean_ice_boundary type was added as an optional argument in SIS2 in commit 0006cd2, so adding this argument requires that version of SIS2 (or newer); otherwise there is an error at compile time.

coupler to have travis CI testing

The FMS coupler library needs to have automated testing. One method is to use the null component libraries on GitHub with Travis CI.

Ice fraction in tracer flux calculations

I'm very likely to be misunderstanding/missing something, but the calculation of gas and tracer fluxes in atmos_ocean_fluxes_calc_mod and atmos_ocean_dep_fluxes_calc_mod does not appear to account for fractional ice coverage. Unweighted fluxes (and related fields) are calculated for open water exchange cells, otherwise they are set to zero. E.g.

if (seawater(i) == 1.) then
  gas_fluxes%bc(n)%field(fms_coupler_ind_kw)%values(i) = &
      & gas_fluxes%bc(n)%param(1) * gas_fields_atm%bc(n)%field(fms_coupler_ind_u10)%values(i)**2
  cair(i) = &
      gas_fields_ice%bc(n)%field(fms_coupler_ind_alpha)%values(i) * &
      gas_fields_atm%bc(n)%field(fms_coupler_ind_pCair)%values(i) * &
      gas_fields_atm%bc(n)%field(fms_coupler_ind_psurf)%values(i) * gas_fluxes%bc(n)%param(2)
  gas_fluxes%bc(n)%field(fms_coupler_ind_flux)%values(i) = &
      & gas_fluxes%bc(n)%field(fms_coupler_ind_kw)%values(i) * &
      & sqrt(660. / (gas_fields_ice%bc(n)%field(fms_coupler_ind_sc_no)%values(i) + epsln)) * &
      & (gas_fields_ice%bc(n)%field(fms_coupler_ind_csurf)%values(i) - cair(i))
  gas_fluxes%bc(n)%field(fms_coupler_ind_flux0)%values(i) = &
      & gas_fluxes%bc(n)%field(fms_coupler_ind_kw)%values(i) * &
      & sqrt(660. / (gas_fields_ice%bc(n)%field(fms_coupler_ind_sc_no)%values(i) + epsln)) * &
      & gas_fields_ice%bc(n)%field(fms_coupler_ind_csurf)%values(i)
  gas_fluxes%bc(n)%field(fms_coupler_ind_deltap)%values(i) = &
      & (gas_fields_ice%bc(n)%field(fms_coupler_ind_csurf)%values(i) - cair(i)) / &
      (gas_fields_ice%bc(n)%field(fms_coupler_ind_alpha)%values(i) * permeg + epsln)
else
  gas_fluxes%bc(n)%field(fms_coupler_ind_kw)%values(i) = 0.0
  gas_fluxes%bc(n)%field(fms_coupler_ind_flux)%values(i) = 0.0
  gas_fluxes%bc(n)%field(fms_coupler_ind_flux0)%values(i) = 0.0
  gas_fluxes%bc(n)%field(fms_coupler_ind_deltap)%values(i) = 0.0
  cair(i) = 0.0
endif

Am I understanding correctly?
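If fractional weighting were intended, a sketch (assuming an ice-fraction field were available in this scope, which it is not today) would scale the open-water flux, e.g.:

```fortran
! hypothetical: weight the flux by the open-water fraction of the cell
! instead of the all-or-nothing seawater(i) == 1. test
open_frac = 1.0 - ice_fraction(i)   ! ice_fraction is not currently passed in
gas_fluxes%bc(n)%field(fms_coupler_ind_flux)%values(i) = open_frac * &
   & gas_fluxes%bc(n)%field(fms_coupler_ind_kw)%values(i) * &
   & sqrt(660. / (gas_fields_ice%bc(n)%field(fms_coupler_ind_sc_no)%values(i) + epsln)) * &
   & (gas_fields_ice%bc(n)%field(fms_coupler_ind_csurf)%values(i) - cair(i))
```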

FMScoupler calls to ice_model_restart require time argument for new IO

An optional restart_time argument needs to be passed to ice_model_restart to support the implementation of the new FMS IO in ice_SIS and, eventually, SIS2. This change allows the user to override the default time value of 1.0 days set in ice_model_restart, for example when writing intermediate restart files.
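A hedged sketch of the caller-side change; only the restart_time keyword comes from the issue text, and the other argument is a placeholder:

```fortran
! when writing an intermediate restart, override the 1.0-day default
! time stamp (Ice and Time_restart are illustrative names)
call ice_model_restart(Ice, restart_time=Time_restart)
```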

unnecessary/unused line should be removed

This line appears unnecessary, and the returned contents of "date" are unused:

call fms_time_manager_get_date (Time_atmos, date(1), date(2), date(3),  &
                                 date(4), date(5), date(6))

https://github.com/NOAA-GFDL/FMScoupler/blob/77618869f48507c8629f28457cb701e25e1ea4fc/SHiELD/coupler_main.F90#L447C12-L447C37

Note that in FMS, get_date takes the first argument Time_atmos as an input argument:
https://github.com/NOAA-GFDL/FMS/blob/be1856c45accfe2fb15953c5f51e0d58a8816882/time_manager/time_manager.F90#L1145

The date() array is not used after being returned by get_date, so this line is unnecessary and adds confusion. (For example, I originally assumed from the context that Time_atmos was being returned and used below for subsequent checks, but it is an input argument defined at a higher scope.)

Uninitialized variables from `surface_flux_nml`

In shared/surface_flux.F90, all variables from the surface_flux_nml namelist are initialized except bulk_zu, bulk_zt, and bulk_zq. If these values are absent from the namelist, this can result in a crash because the variables may be left initialized to NaN.
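The straightforward fix is to give the three variables explicit defaults at declaration, so that absent namelist entries are harmless; the values below are illustrative, not the intended physical defaults:

```fortran
! shared/surface_flux.F90: default the heights read from surface_flux_nml
real :: bulk_zu = 10.0  ! height of wind input        [m] (illustrative default)
real :: bulk_zt =  2.0  ! height of temperature input [m] (illustrative default)
real :: bulk_zq =  2.0  ! height of humidity input    [m] (illustrative default)
```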

FMS2io crash when reading gridspec

The SCM model crashes when using the 2021.02 tag with the following error:
FATAL: incorrect size of dim_sizes array.

This is the traceback:

fms_SCM_am4_xanad  00000000030FB631  mpp_mod_mp_mpp_er          71  mpp_util_mpi.inc
fms_SCM_am4_xanad  0000000002AE2DD0  fms_io_utils_mod_         212  fms_io_utils.F90
fms_SCM_am4_xanad  0000000002961145  netcdf_io_mod_mp_        1553  netcdf_io.F90
fms_SCM_am4_xanad  000000000045375D  flux_exchange_mod         916  flux_exchange.F90
fms_SCM_am4_xanad  00000000004565E7  flux_exchange_mod         753  flux_exchange.F90
fms_SCM_am4_xanad  000000000040C69F  coupler_main_IP_c        1782  coupler_main.F90
fms_SCM_am4_xanad  0000000000402614  MAIN__                    568  coupler_main.F90

call get_variable_size(grid_file_obj, "AREA_ATM", siz)

Here siz is an array of size 4.
The variable AREA_ATM in the grid_spec has two dimensions, so the code crashes because 2 does not equal 4.

double AREA_ATM(yta, xta) ;

The call to get_variable_size should be:

call get_variable_size(grid_file_obj, "AREA_ATM", siz(1:2))

Source lines too long to compile using PGI compiler

PGF90-S-0285-Source line too long (src/coupler/surface_flux.F90: 39)
PGF90-S-0285-Source line too long (src/coupler/flux_exchange.F90: 198)
PGF90-S-0285-Source line too long (src/coupler/coupler_main.F90: 105)
PGF90-S-0285-Source line too long (src/coupler/coupler_main.F90: 106)
PGF90-S-0285-Source line too long (src/coupler/coupler_main.F90: 114)
PGF90-S-0285-Source line too long (coupler/coupler_main.F90: 115)
PGF90-S-0285-Source line too long (src/coupler/coupler_main.F90: 117)
PGF90-S-0285-Source line too long (src/coupler/coupler_main.F90: 120)

These comment lines are far in excess of the 132 characters allowed in Fortran 95.

Update CI to Github Actions

The Travis CI with the null_model build has not been displayed in the repo since the name was changed from coupler (due to the URL change). It is still running silently and currently failing; it should be migrated to GitHub Actions to mirror the same change in the FMS repo and make the results visible again.

Implement FMS2io in FMScoupler

FMS has a "new" IO, fms2_io, that should be used instead of fms_io:
save_restart and restore_state are called in coupler_main.F90.
coupler_type_register_restarts and coupler_type_restore_state, which still use fms_io, are also called in coupler_main.F90.

Requires NOAA-GFDL/FMS#677 to be solved.

refactoring: Remove redundant clocks in flux exchange modules

This issue proposes to remove the clocks in atm_land_ice_flux_exchange_mod in the full coupler.

In the full coupler, clocks are set both in the coupler_main program and in the *_flux_exchange modules, and some of them overlap in functionality.

For example, in atm_land_ice_flux_exchange_mod, the clocks sfcClock, fluxAtmDnClock, regenClock, and fluxAtmUpClock are initialized in subroutine atm_land_ice_flux_exchange_init. Each starts at the beginning and stops at the end of its respective subroutine - sfc_boundary_layer, flux_down_from_atmos, generate_sfc_xgrid, flux_up_to_atmos - to gather runtime information.

Simultaneously, the clocks newClockb, newClockd, newClock1, and newClockg in the coupler_main program are initialized in subroutine coupler_init; each starts just before the call to its respective subroutine - sfc_boundary_layer, flux_down_from_atmos, generate_sfc_xgrid, flux_up_to_atmos - and stops just after.

The two sets of clocks therefore appear to measure the same processes; the only apparent difference is whether the timing is done internally or externally. Thus, the redundant set of clocks in atm_land_ice_flux_exchange_mod could be removed without any loss of diagnostic information.

coupler repo is missing the xanadu tag

xanadu xmls won't work because of the missing tag:

 git clone -q --recursive -b xanadu https://github.com/NOAA-GFDL/coupler.git
fatal: Remote branch xanadu not found in upstream origin

@menzel-gfdl I cannot assign the issues on this repo!

string length errors from SPEAR runs

SPEAR errors have been reported due to a string length being exceeded in the code below:

CHARACTER(len=256) :: executable_name, arg, fredb_id
#ifdef FREDB_ID
#define xstr(s) str(s)
#define str(s) #s
fredb_id = xstr(FREDB_ID)
#else
#warning "FREDB_ID not defined. Continuing as normal."
fredb_id = 'FREDB_ID was not defined (e.g. -DFREDB_ID=...) during preprocessing'
#endif
arg_count = command_argument_count()
DO i=0, arg_count
CALL get_command_argument(i, arg, status=status)
if (status .ne. 0) then
write (error_unit,*) 'get_command_argument failed: status = ', status, ' arg = ', i
stop 1
end if
if (i .eq. 0) then
executable_name = arg
else if (arg == '--fredb_id') then
write (output_unit,*) TRIM(fredb_id)
stop
end if
END DO
if (arg_count .ge. 1) then
write (error_unit,*) 'Usage: '//TRIM(executable_name)//' [--fredb_id]'
stop 1
end if

This fredb_id code is unused and can be removed.

Passing additional surface variables into the atmospheric model

A new boundary layer scheme (NCEP TKE-based eddy-diffusivity mass-flux scheme) is being implemented in the prototype-AM5 model, and it requires five additional surface variables (shflx, lhflx, wind, thv_atm, and thv_surf) as inputs. Thus, I would like to add a few lines in the surface coupler (full/atm_land_ice_flux_exchange.F90 and shared/surface_flux.F90) to pass these variables into the atmospheric model.

This issue is associated with the issue I just opened for the atmos_drivers repository.

How to start learning FMScoupler?

I need to learn how to couple different ESM component models. Where do I start learning FMScoupler? Are there any sample examples?

Simple coupler missing argument to diag_manager_init that results in missing prepended date in associated_files attribute

Users of the simple coupler (e.g. aquaplanet) find that their history files are missing a prepended date in the associated_files attribute.

For example, this is a typical associated_files global attribute:
:associated_files = "land_area: 20980101.land_static.nc" ;

If one uses the simple coupler, they get this instead:
:associated_files = "land_area: land_static.nc" ;

The result is that fregrid fails to find the expected associated file (that has the prepended date) and post-processing fails.

While poking around dev/master with @menzel-gfdl, we found that diag_manager_init expects a certain argument to activate the date-prepending behavior. The full coupler passes it in but the simple coupler does not; the simple coupler also prints a distinctive error message as it fails to use the prepended date.

The full coupler (https://github.com/NOAA-GFDL/coupler/blob/dev/master/full/coupler_main.F90 line 1638) has:
call diag_manager_init(DIAG_MODEL_SUBSET=diag_model_subset, TIME_INIT=date)

The TIME_INIT argument is the one the simple coupler needs in order for the date-prepending code to be activated.
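So the likely one-line fix is to mirror the full coupler's call in the simple coupler's initialization:

```fortran
! simple coupler: pass TIME_INIT so history file names and the
! associated_files attribute get the prepended date
call diag_manager_init(TIME_INIT=date)
```

(Whether the simple coupler also needs DIAG_MODEL_SUBSET is a separate question; only TIME_INIT is required for the date-prepending behavior described here.)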

refactoring: subroutine coupler_chksum removal

This issue proposes to remove the subroutine coupler_chksum. In the full coupler_main, coupler_chksum is called before or at the end of the grouped calls to atmos_ice_land_chksum and ocean_chksum. As a result, the checksums of tracers and the fields below are computed twice and written to stdout twice, back to back.

Atm%t_bot
Atm%z_bot
Atm%p_bot
Atm%u_bot
Atm%v_bot
Atm%p_surf
Atm%gust
Land%t_surf
Land%t_ca
Land%rough_mom
Land%rough_heat
Land%rough_scale
Ice%t_surf
Ice%rough_mom
Ice%rough_heat
Ice%rough_moist
Ice%ocean_fields

Unless it is useful to group the checksums of these fields in stdout (for example, for easier analysis), coupler_chksum should be removed to avoid unnecessary redundancy.

Should SHiELD be using a Gregorian calendar?

Disclaimer

This is a bit in the weeds and is not a blocking issue, but I have often wondered whether SHiELD should be using a Gregorian calendar instead of a Julian calendar. I just wanted to put down some thoughts as some investigation provided me a little more clarity on this question than I had previously. I am increasingly thinking the answer should be "yes," but I would be curious to know what the rationale was behind using a Julian calendar initially.

Issue

In thinking about doing extended simulations using SHiELD, NOAA-GFDL/atmos_drivers#36, it occurred to me to look into how time deltas were computed in its physics, e.g. in this line or when computing the Julian day in the astronomy module, as I wanted to make sure that they were safe from the same integer overflow problem described in NOAA-GFDL/atmos_drivers#35. I believe that they are, because 32-bit integer values are only used for computing the Julian day (and differences between Julian days), and time deltas in units of seconds are computed using 64-bit floats.

But in investigating these functions / subroutines I also wanted to see what kind of calendar they assumed the dates provided used. This is relevant for longer simulations, since the Julian (what the SHiELD coupler currently uses) and Gregorian calendars, for example, handle whether year 2100 is a leap year differently. In the Julian calendar it is a leap year, but in the Gregorian calendar it is not.

To compute these time deltas, SHiELD uses functions / subroutines defined in the NCEPLIBS-w3emc library, like iw3jdn, which computes the Julian day. This function is also called in downstream subroutines like w3difdat. While the comment in iw3jdn does not explicitly say which calendar it expects the input date to use, it refers to an old ACM letter to the editor:

Henry F. Fliegel and Thomas C. van Flandern. 1968. Letters to the editor: a machine algorithm for processing calendar dates. Commun. ACM 11, 10 (Oct. 1968), 657. https://doi.org/10.1145/364096.364097

This letter makes it clear that this algorithm assumes the date comes from a Gregorian calendar. We can also assess this empirically by comparing the result of w3difdat when called to determine the time difference between two dates straddling the potential leap year in 2100:

program test
  integer :: jdat(8), idat(8)
  real(kind=8) :: rinc(5)

  jdat = (/ 2100, 3, 1, 0, 0, 0, 0, 0 /)
  idat = (/ 2100, 2, 1, 0, 0, 0, 0, 0 /)
  call w3difdat(jdat, idat, 1, rinc)
  print *, jdat
  print *, idat
  print *, rinc
end program test

If we run this program, we obtain the following output:

        2100           3           1           0           0           0           0           0
        2100           2           1           0           0           0           0           0
   28.000000000000000        0.0000000000000000        0.0000000000000000        0.0000000000000000        0.0000000000000000

indicating that the difference between March 1st, 2100 and February 1st, 2100 is 28 days. This suggests 2100 does not have a leap year, and therefore that the algorithm expects the dates come from a Gregorian calendar (perhaps not surprisingly, since this should help the physics produce the most accurate weather forecasts).

For typical date ranges of SHiELD simulations, this is not important—the Julian and Gregorian calendars are effectively the same between 1901 and 2099—but in extended simulations, this mismatch in calendars starts to become relevant, and there will be an inconsistency between time deltas computed using the FMS time manager and time deltas computed using the physics.

Why might we care about this in the present day?

While the Julian calendar is still used in some settings, the Gregorian calendar is the standard calendar we use in everyday life, and therefore makes the most sense to use for a weather model. While a calendar change should not affect answers in the present day, it would show up in the metadata of data output by SHiELD, which could make it easier to interact with in downstream libraries (e.g. in Python it would allow dates to be safely represented using standard library datetime.datetime objects or numpy.datetime64 values).

Anyway some food for thought, but again I want to emphasize that this is not a blocking issue.

cc: @lharris4

refactoring: simplify setting of Time_start and Time at nc=0

In full coupler_main, Time at nc=0 and Time_init are set through a convoluted process in subroutine coupler_init.

For Time at nc=0,

  1. Time = date, where date is specified on the third line of the coupler.res file.
  2. If coupler.res does not exist, or if the user specifies force_date_from_namelist=.true., then Time = current_date, where current_date is specified in the namelist.
  3. If coupler_init falls back to using current_date and current_date does not hold an appropriate value, mpp_error is called.

Initially, Time_init = date_init, where date_init is specified on the second line of coupler.res. This value is eventually overwritten with Time_init = base_date, where base_date is specified in the diag_table. If base_year = 0 in the diag_table, then date_init = date, where date is determined as described above.

Setting Time and Time_init should be straightforward, and only one route should be available to the user.
Refining the setting of Time and Time_init also opens up a chance to discuss whether the coupler.res file is needed at all.

Unable to run with the combined ice-ocean driver

I have been testing the combined ice-ocean driver that was originally implemented in a fully-coupled model with a data atmosphere. A test case of the fully coupled version of this setup can be found here: MOM6-examples/tree/dev/gfdl/coupled_AM2_LM3_SIS2/Intersperse_ice_1deg

But when trying to run a configuration without an active atmosphere, such as the Baltic test case, initialization fails with the error "GET_PESET: pelist is not monotonically increasing".

This is because the sea-ice PEs are separated into slow and fast PEs when the combined ice-ocean driver is used. Fast ice PEs are a subset of the atmosphere PEs, and slow ice PEs use the ocean PE list (lines 1364-1373 of coupler_main.F90). However, when using a data atmosphere (do_atmos=false), the atmosphere PE list and the ocean PE list are the same (for example [0-15]). This makes the ice PE list, which is the union of the fast and slow PE lists, non-monotonic (it would be [0-15 0-15]). To correct this, when the combined ice-ocean driver is used and the atmosphere and ocean share the same PE list, the ice PE list should be only the fast (or slow) PE list (just [0-15]).
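A sketch of the proposed guard when assembling the ice PE list; the variable names are illustrative, not the actual coupler_main code:

```fortran
! hypothetical: with a data atmosphere the fast (atmosphere) and slow
! (ocean) ice PE lists coincide, so taking their union would duplicate
! ranks and violate the monotonic-pelist requirement
if (size(fast_ice_pelist) == size(slow_ice_pelist) .and. &
    all(fast_ice_pelist == slow_ice_pelist)) then
  ice_pelist = slow_ice_pelist
else
  ice_pelist = [fast_ice_pelist, slow_ice_pelist]
endif
```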

To reproduce this error, the combined ice-ocean driver (CIOD) should be turned on in a test case with do_atmos=false. The CIOD_input file can be found in the fully coupled interspersed case, and the input.nml file should have the following additions:

 &ice_ocean_driver_nml
    output_directory = './',
    parameter_filename = 'CIOD_input',
                         'CIOD_override'
/

and in the coupler_nml portion:

            do_atmos = .false.,
            do_ice = .true.,
            do_ocean = .true.,
            use_lag_fluxes=.false.
            concurrent = .false.
            concurrent_ice = .true.,
            slow_ice_with_ocean = .true.,
            combined_ice_and_ocean = .true.

Support for Gregorian calendar

FMScoupler does not currently support the Gregorian calendar.

Gregorian does appear to be supported by FMS, so are there any historical reasons for this lack of support or other issues with the Gregorian calendar?

I am running some MOM6 simulations with realistic tidal forcing, and the Gregorian calendar is particularly useful for this case.

As far as I can tell supporting the Gregorian calendar would only require importing it from time_manager_mod in coupler_main and then adding a case statement for it. With these changes my MOM6 simulations run fine. I'd be happy to send a PR.
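The change described would look roughly like this in coupler_main; the surrounding select construct is paraphrased, though GREGORIAN is the existing time_manager_mod calendar parameter:

```fortran
use time_manager_mod, only: JULIAN, GREGORIAN, NOLEAP, set_calendar_type
! ... in coupler_init, when interpreting the calendar namelist string:
select case (trim(calendar))
case ('JULIAN', 'julian')
  call set_calendar_type(JULIAN)
case ('GREGORIAN', 'gregorian')   ! new case forwarding FMS support
  call set_calendar_type(GREGORIAN)
case ('NOLEAP', 'noleap')
  call set_calendar_type(NOLEAP)
end select
```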
