Execute GEOS via calling gcm_run.j + rename marine suites -- part 1, 2, & 3 #677
* some changes for geosv12
* changes in geos class utility for geos v12
* changes for 3dvar_cycle to test get, prep and run
* rename suites, take out obsolete parts and tasks
* breaking down PR/660 into smaller pieces -- part3, adding hofx_cf experiment (#694)
* first commit
* fix failing tests
* clean up and handling NO
* fix coding norm
* fail if obs is not listed in observation_ioda_names.yaml
* add tier 1 tests
* fix coding norm
* Update src/swell/configuration/jedi/interfaces/geos_cf/task_questions.yaml (Co-authored-by: Doruk Ardağ <38666458+Dooruk@users.noreply.github.com>)
* Update src/swell/configuration/jedi/interfaces/geos_cf/task_questions.yaml (Co-authored-by: Doruk Ardağ <38666458+Dooruk@users.noreply.github.com>)
* remove channels from eva yaml

---------

Co-authored-by: Doruk Ardağ <38666458+Dooruk@users.noreply.github.com>

* add docstrings and proper naming for tasks as forecast directory is not erased
* version bump
* suite changes with new task names
* code improvements
* Implement R2D2 Ingest Suite (#675)
* Script to setup new r2d2 credentials
* Create a new swell task to test new r2d2
* Adapt get_observations to new R2D2
* Remove exit()
* Add r2d2 configs (#318)
* Update swell tasks to new R2D2 (#318)
* Update r2d2 version of save obs diagnostics #318
* Remove unused files #318
* Create r2d2 file register script #318
* Add scripts for manual setup for R2D2 #318
* Clean up files (#318)
* Clean up the files (#318)
* Update Python coding norms (#318)
* Fix pycode styles
* Remove redundant lines
* Load R2D2 credentials under TaskBase (#318)
* Load credentials under create R2D2 config (#318)
* make R2D2 host/compiler detection support dynamic (#318)
* Add docs for credential setup (#318)
* Update r2d2_config for cascade (#318)
* Move credentials under create_task (#318)
* Move scripts under utilities (#318)
* Fix pylint errors
* Fix AttributeError when fetching bias correction files (#318)
* Fix bias correction arguments (#318)
* Fix bias correction argument (#318)
* Add file type argument (#318)
* Fix bias correction ingest
* Improve file extension support
* Add logging
* Fix bias coefficient ingestion
* Go back to existing bias naming convention
* Use JCSDA enums for bias files
* Improve logging
* Fix code style
* extend exclude list to ignore venv and build directories
* Change the script name register_files with ingest_files
* Create a test suite
* add ingest question default
* Fix Slurm qos
* Add observation yaml for ingest config
* Fix datetime with string
* suite config
* Add defaults and override yaml for ingest
* Switch to new cylc
* Fix suite config
* Add file pattern
* Make ingest more modular
* Move obs configs under JEDI config directory.
* Add ingest config yamls for obs
* Implement ingest background suite. (#646)
* Use provider name from the provider list (#646)
* Clean up unused parts in yaml files (#646)
* Fetch model name from experiment.yaml (#646)
* Create searh for already ingested files and skip (#646)
* Remove window offset (#646)
* Delete an old doc
* Delete an old script
* Add detailed docstrings. (#646)
* Add type hints. (#646)
* End the task if no source pattern is found (#646)
* Made R2D2 related exceptions explicit. (#646)
* Update background yaml files (#646)
* Fix the glob pattern (#646)
* Add search for already ingested files (#646)
* Fix pycode style tests (#646)
* Fix code tests (#646)
* Remove check function #646
* Create a standalone script to delete obs within range
* Fix pycode style
* Add step for restart
* Removing ingest background suite for another PR (#646)
* Create a README (#646)
* Fix window_start calculation. (#646)
* Fix pycodestyle
* Add docs to Swell website (#646)
* Fix doc path for the website (#646)
* Change path for config files with experiments directory. (#646)
* Update README (#646)
* make mom6_iau model dependent
* cycle times hack
* make experiment.yaml non-alphabetical again by using default ruamel
* relevant for experiment.yaml
* minor fix for MOM6 IAU
* cycle times and overrride fixes
* add tier2 cycling run
* fix platform defaults

---------

Co-authored-by: Maryam Abdi-Oskouei <mary.abdi@gmail.com>
Co-authored-by: Furkan Goktas <ftgoktas@gmail.com>
This is ready to go in and/or for final testing. CI-workflows need to be modified to handle the new suite names, though.
@Dooruk For the sake of clarity and simplicity, would it be possible to break down the PR into two, one for each of Main change 1 and 2? They are two distinct feature improvements. That would also speed up and facilitate the reviews. Thanks a lot!
I already did this. I'm running out of time before I go on FMLA, so I won't be able to do that again. Main change 1 is just renaming suites anyway. I created this PR after your similar comment; it was only part 1 for a couple of months or so. @mer-a-o reviewed it and decided she will implement the Skylab approach to execute GEOS for compo, and we will collaborate on running NWP afterwards. This is not the end-all be-all in terms of executing GEOS, but it is needed urgently for the marine group to run some experiments while I'm gone.
This is ready to go in. CI-workflows need to be modified to handle new suite names like so:
GEOS-ESM/CI-workflows#32
Main change 1:
SOCA (marine) suites are renamed:

* `3dvar` -> `3dvar_marine`
* `3dvar_cycle` -> `3dvar_marine_cycle`
* `3dfgat_cycle` -> `3dfgat_marine_cycle`
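Since the old suite names still appear in CI workflows and may appear in user scripts, a bulk rename could help migration. This is only a sketch of the idea, not part of the PR; the helper names and file patterns are made up:

```python
import re
from pathlib import Path

# Old -> new SOCA (marine) suite names from this PR.
RENAMES = {
    "3dvar": "3dvar_marine",
    "3dvar_cycle": "3dvar_marine_cycle",
    "3dfgat_cycle": "3dfgat_marine_cycle",
}

def rename_suites(text: str) -> str:
    """Replace old suite names with new ones, longest names first so
    that '3dvar' does not clobber '3dvar_cycle'."""
    for old, new in sorted(RENAMES.items(), key=lambda kv: -len(kv[0])):
        # Word boundaries so already-renamed '3dvar_marine' is left alone
        # (underscore counts as a word character, so \b3dvar\b won't match it).
        text = re.sub(rf"\b{re.escape(old)}\b", new, text)
    return text

def rename_in_files(root: Path, pattern: str = "*.yml") -> None:
    """Apply the rename to every matching file under root."""
    for path in root.rglob(pattern):
        path.write_text(rename_suites(path.read_text()))
```

The longest-first ordering matters: replacing `3dvar` before `3dvar_cycle` would produce `3dvar_marine_cycle_cycle`-style corruption.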
Main change 2:
Cylc calls `gcm_run.j` directly in `flow.cylc`. With this new approach, SWELL can point to an existing GEOS experiment folder (the `experiment.yaml` key for that is `geos_homdir`), and the `forecast` folder is now located under the experiment `GEOSgcm/forecast` directory. It is possible to hotstart. With this new approach, the forecast directory is not erased and MAPL history outputs can accumulate under there. I updated the docs a bit but might add more on GEOSgcm execution.

For those who stumbled upon this PR, more details on change 2 below:
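For illustration, pointing SWELL at an existing GEOS experiment might look like this in `experiment.yaml`. The `geos_homdir` key is the one introduced by this PR; the path value is made up:

```yaml
# Hypothetical excerpt from experiment.yaml
geos_homdir: /discover/nobackup/someuser/geos_experiments/my_geos_exp
```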
The main thing happening here is that Cylc (`flow.cylc`) now calls `gcm_run.j` directly. To facilitate this, a `forecast` directory was created under `{swell_exp_dir}/GEOSgcm/forecast`. This `forecast` folder is a replication of a GEOS experiment folder, with only a few changes regarding where HOMDIR and EXPDIR are defined. Model execution happens under `{swell_exp_dir}/GEOSgcm/forecast/scratch`, similar to typical GEOS model runs.

Why was this change necessary:
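To make the "replicate the experiment folder, then repoint HOMDIR/EXPDIR" step concrete, here is a rough sketch of the idea. Every name here (the function, the regexes, the `setenv` lines it expects) is hypothetical and not the actual `PrepCoupledGeosRundir` task:

```python
import re
import shutil
from pathlib import Path

def prep_forecast_dir(geos_homdir: Path, swell_exp_dir: Path) -> Path:
    """Hypothetical sketch: copy an existing GEOS experiment directory
    into {swell_exp_dir}/GEOSgcm/forecast and repoint HOMDIR/EXPDIR
    inside the copied gcm_run.j."""
    forecast = swell_exp_dir / "GEOSgcm" / "forecast"
    shutil.copytree(geos_homdir, forecast, dirs_exist_ok=True)

    run_script = forecast / "gcm_run.j"
    text = run_script.read_text()
    # gcm_run.j is a csh script; HOMDIR/EXPDIR are assumed set via `setenv`.
    text = re.sub(r"(?m)^setenv\s+HOMDIR\s+.*$", f"setenv HOMDIR {forecast}", text)
    text = re.sub(r"(?m)^setenv\s+EXPDIR\s+.*$", f"setenv EXPDIR {forecast}", text)
    run_script.write_text(text)
    return forecast
```

Because the copy is done with `dirs_exist_ok=True` and the directory is never deleted, re-running the preparation step leaves previously accumulated MAPL history output in place, which matches the "forecast directory is not erased" behavior described above.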
* … (`/RC` files) in the forecast directory. This creates incompatibility while running/testing different GEOSgcm versions. Between multiple products and update frequencies, this is an important requirement.
* The `forecast` dir can't be updated in `flow.cylc` if it is templated in a time dependent way.
* `subprocess` simply couldn't run GEOSv12 on Milan nodes. I tried many combinations; it didn't pass beyond the initialization stage.
* … `gcm_run.j`. If users make a mistake in terms of requesting sufficient SLURM nodes, GEOS tries submitting hundreds of instances to compensate for the lack of compute resources, and then NCCS will yell at you.
* The `gcm_run.j` and `gcm_setup.j` scripts are being or will be modernized. This is work underway but might take a long time (especially `gcm_run.j`).
* … `gcm_run.j` in SWELL, some parts should be erased or commented out. Or, my idea is that there could be conditional sections in `gcm_run.j`, say `SWELL_active`; then `gcm_run.j` can skip those sections, which are mainly postprocessing anyway.

More details in the comment below: #677 (comment)
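The conditional-section idea could look roughly like this inside `gcm_run.j` (which is csh). The `SWELL_ACTIVE` variable name and the guard itself are my own sketch, not existing code:

```csh
# Hypothetical guard inside gcm_run.j: skip sections SWELL handles itself.
if ( ! $?SWELL_ACTIVE ) then
    # postprocessing, archiving, re-submission, etc.
endif
```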
Finally, a little primer on `gcm_run.j`. Let's consider `gcm_run.j` in 4 stages:

In the current implementation, SWELL handles 2 & 3 via Python and `subprocess`, and 1 is assumed to be set properly by the user, which caused trouble with the NCCS. For DA purposes, 4 (postprocessing) is explicitly handled by SWELL, but that is not the focus of this PR.

In this proposed implementation, the main difference is that we rely on `gcm_run.j` for 2 and 3 by conducting surgical edits via `PrepCoupledGeosRundir` at a few locations and running `gcm_run.j` directly from Cylc (which doesn't capture a failed exit status):

I created the `3dfgat_coupled_cycle` suite for testing; it should work by default if anyone has time to check it out.
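For readers unfamiliar with Cylc, a task that runs `gcm_run.j` directly might look roughly like this in `flow.cylc`. The task name, paths, and template variable are illustrative, not the actual SWELL graph:

```ini
# Hypothetical flow.cylc excerpt
[runtime]
    [[RunGeosExecutable]]
        script = """
            cd {{ swell_exp_dir }}/GEOSgcm/forecast
            ./gcm_run.j
        """
```

Note that, as mentioned above, running the script this way means a failed `gcm_run.j` exit status is not captured by Cylc, so failures have to be detected some other way (e.g. by a downstream task checking for expected output).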