Merged
43 changes: 38 additions & 5 deletions .github/workflows/bash_code_analysis.yaml
@@ -3,7 +3,7 @@ on:
push:
branches:
- develop
pull_request:
pull_request_target: # safe as long as code is not being run

workflow_dispatch:

@@ -19,19 +19,35 @@ jobs:
checks: write

steps:
- name: determine hash
uses: haya14busa/action-cond@v1
id: hash
with:
cond: ${{ github.event_name == 'pull_request' }}
if_true: ${{ github.event.pull_request.head.sha }}
if_false: ''

- name: checkout code
uses: actions/checkout@v6
with:
ref: ${{ steps.hash.outputs.value }}
submodules: false

- name: determine reporter
uses: haya14busa/action-cond@v1
id: reporter
with:
cond: ${{ github.event_name == 'pull_request' }}
if_true: 'github-pr-review'
if_false: 'github-check'

- name: shfmt scan
uses: reviewdog/action-shfmt@v1
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
filter_mode: nofilter
fail_level: any
level: any
reviewdog_flags: '-reporter=github-pr-review'
reviewdog_flags: '-reporter=${{ steps.reporter.outputs.value }} -fail-level=any'
shfmt_flags: ''

shellcheck:
@@ -44,16 +60,33 @@ jobs:
checks: write

steps:
- name: determine hash
uses: haya14busa/action-cond@v1
id: hash
with:
cond: ${{ github.event_name == 'pull_request' }}
if_true: ${{ github.event.pull_request.head.sha }}
if_false: ''

- name: checkout code
uses: actions/checkout@v6
with:
ref: ${{ steps.hash.outputs.value }}
submodules: false


- name: determine reporter
uses: haya14busa/action-cond@v1
id: reporter
with:
cond: ${{ github.event_name == 'pull_request' }}
if_true: 'github-pr-review'
if_false: 'github-check'

- name: shellcheck scan
uses: reviewdog/action-shellcheck@v1
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
reporter: github-pr-review
reporter: ${{ steps.reporter.outputs.value }}
filter_mode: nofilter
fail_level: any
level: any
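The hunks above switch the trigger to `pull_request_target` and use `haya14busa/action-cond` twice: once to pick the checkout ref (the PR head SHA on pull requests, the default ref otherwise) and once to pick the reviewdog reporter (inline PR review vs. a GitHub check). A minimal sketch of that selection logic, using a hypothetical helper name (this is an illustration of the two `action-cond` steps, not part of the workflow itself):

```python
def select_ci_settings(event_name: str, pr_head_sha: str = "") -> dict:
    """Mirror the two action-cond steps in the workflow diff.

    On pull-request events, check out the PR head SHA and report
    findings inline on the PR; on every other event (push,
    workflow_dispatch), use the default ref and a GitHub check.
    """
    is_pr = event_name == "pull_request"
    return {
        # 'determine hash' step: PR head SHA, or '' (checkout default)
        "ref": pr_head_sha if is_pr else "",
        # 'determine reporter' step: inline review vs. check run
        "reporter": "github-pr-review" if is_pr else "github-check",
    }


print(select_ci_settings("pull_request", "abc123"))
print(select_ci_settings("push"))
```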
12 changes: 6 additions & 6 deletions README.md
@@ -1,6 +1,6 @@
[![Read The Docs Status](https://readthedocs.org/projects/global-workflow/badge/?badge=latest)](http://global-workflow.readthedocs.io/)
[![shellnorms](https://github.com/NOAA-EMC/global-workflow/actions/workflows/linters.yaml/badge.svg)](https://github.com/NOAA-EMC/global-workflow/actions/workflows/linters.yaml)
[![pynorms](https://github.com/NOAA-EMC/global-workflow/actions/workflows/pynorms.yaml/badge.svg)](https://github.com/NOAA-EMC/global-workflow/actions/workflows/pynorms.yaml)
[![bash code analysis](https://github.com/NOAA-EMC/global-workflow/workflows/bash_code_analysis/badge.svg?branch=develop&event=push)](https://github.com/NOAA-EMC/global-workflow/actions?query=workflow%3Abash_code_analysis+event%3Apush+branch%3Adevelop)
[![python code analysis](https://github.com/NOAA-EMC/global-workflow/workflows/python_code_analysis/badge.svg)](https://github.com/NOAA-EMC/global-workflow/actions?query=workflow%3Apython_code_analysis+event%3Apush+branch%3Adevelop)
![Custom badge](https://img.shields.io/endpoint?url=https://gist.githubusercontent.com/emcbot/e35aa2904a54deae6bbb1fdc2d960c71/raw/wcoss2.json)

![Custom badge](https://gist.githubusercontent.com/emcbot/66059582886cb5c2485ff64bf24e7f93/raw/ursa_pipeline_badge.svg)
@@ -28,12 +28,12 @@ The Global Workflow currently supports the following machines at the indicated t

| HPC | Tier | Notes |
| --------------------------------------- |:----:|:--------------------------------------------------------------------------:|
| WCOSS2<br>NCO | 1 | GEFS testing is not regularly performed. |
| Ursa<br>NOAA RDHPCS | 1 | METplus verification and vminmon GSI-monitor jobs and GCAFS system not supported yet. |
| Hercules<br>MSU | 1 | Currently does not support the TC Tracker. |
| WCOSS2<br>NCO | 1 | |
| Ursa<br>NOAA RDHPCS | 1 | |
| Hercules<br>MSU | 1 | |
| Gaea C6<br>RDHPCS | 1 | |
| Hera<br>NOAA RDHPCS | 2 | |
| Orion<br>MSU | 2 | The GSI runs very slowly on Orion and the TC tracker is not supported. |
| Orion<br>MSU | 2 | The GSI runs very slowly. |
| AWS, GCP, Azure <br>NOAA Parallel Works | 3 | Supported by EPIC. |

<ins>**Tier Definitions**</ins>
1 change: 0 additions & 1 deletion dev/ci/cases/pr/C96_gcafs_cycled.yaml
@@ -17,7 +17,6 @@ experiment:
skip_ci_on_hosts:
- gaeac5
- awsepicglobalworkflow
- ursa
Contributor: should we run this on Ursa now that it's running well?

Contributor Author: That's what this does, removes Ursa from the 'skip' list

workflow:
engine: rocoto
1 change: 0 additions & 1 deletion dev/ci/cases/pr/C96_gcafs_cycled_noDA.yaml
@@ -19,7 +19,6 @@ skip_ci_on_hosts:
- gaeac5
- hercules
- awsepicglobalworkflow
- ursa
Contributor: same


workflow:
engine: rocoto
12 changes: 6 additions & 6 deletions dev/ci/gitlab-ci-hosts.yml
@@ -22,19 +22,19 @@

# Template matrices for case lists
.hera_cases_matrix: &hera_cases
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C48mx500_3DVarAOWCDA", "C48mx500_hybAOWCDA", "C96C48_hybatmDA", "C96C48_hybatmsnowDA", "C96C48_hybatmsoilDA", "C96C48_ufsgsi_hybatmDA", "C96C48_ufs_hybatmDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96_gcafs_cycled", "C96_gcafs_cycled_noDA", "C96mx100_S2S", "C96mx025_S2S"]
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C48mx500_3DVarAOWCDA", "C48mx500_hybAOWCDA", "C96C48_hybatmDA", "C96C48_hybatmsnowDA", "C96C48_hybatmsoilDA", "C96C48_ufsgsi_hybatmDA", "C96C48_ufs_hybatmDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96_gcafs_cycled", "C96_gcafs_cycled_noDA", "C96mx100_S2S"]

.gaeac6_cases_matrix: &gaeac6_cases
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C48mx500_3DVarAOWCDA", "C48mx500_hybAOWCDA", "C96C48_hybatmDA", "C96C48_hybatmsnowDA", "C96C48_hybatmsoilDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96_gcafs_cycled", "C96_gcafs_cycled_noDA", "C96mx100_S2S", "C96mx025_S2S"]
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C48mx500_3DVarAOWCDA", "C48mx500_hybAOWCDA", "C96C48_hybatmDA", "C96C48_hybatmsnowDA", "C96C48_hybatmsoilDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96_gcafs_cycled", "C96_gcafs_cycled_noDA", "C96mx100_S2S"]

.orion_cases_matrix: &orion_cases
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C96C48_hybatmDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96mx100_S2S", "C96_gcafs_cycled", "C96mx025_S2S"]
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C96C48_hybatmDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96mx100_S2S", "C96_gcafs_cycled"]

.hercules_cases_matrix: &hercules_cases
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C48mx500_3DVarAOWCDA", "C48mx500_hybAOWCDA", "C96C48_hybatmDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96mx100_S2S", "C96_gcafs_cycled", "C96mx025_S2S"]
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C48mx500_3DVarAOWCDA", "C48mx500_hybAOWCDA", "C96C48_hybatmDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96mx100_S2S", "C96_gcafs_cycled"]

.ursa_cases_matrix: &ursa_cases
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C48mx500_3DVarAOWCDA", "C48mx500_hybAOWCDA", "C96C48_hybatmDA", "C96C48_hybatmsnowDA", "C96C48_hybatmsoilDA", "C96C48_ufsgsi_hybatmDA", "C96C48_ufs_hybatmDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96mx100_S2S", "C96mx025_S2S"]
- caseName: ["C48_ATM", "C48_S2SW", "C48_S2SWA_gefs", "C48mx500_3DVarAOWCDA", "C48mx500_hybAOWCDA", "C96C48_hybatmDA", "C96C48_hybatmsnowDA", "C96C48_hybatmsoilDA", "C96C48_ufsgsi_hybatmDA", "C96C48_ufs_hybatmDA", "C96C48mx500_S2SW_cyc_gfs", "C96_atm3DVar", "C96mx100_S2S", "C96_gcafs_cycled", "C96_gcafs_cycled_noDA"]

# Host: Hera - Standard Cases
setup_experiments-hera:
@@ -277,7 +277,7 @@ setup_ctests-hera:

setup_ctests-gaeac6:
extends: .setup_ctests_template
stage: setup_tests
stage: setup_tests
tags:
- gaeac6
variables:
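The gitlab-ci-hosts.yml hunk edits several host case matrices at once (dropping `C96mx025_S2S` from a few hosts and adding the gcafs cases to Ursa). When reviewing this kind of change, a throwaway set-difference check makes the per-host additions and removals explicit; `matrix_diff` below is a hypothetical helper, not part of the repo:

```python
def matrix_diff(old: list[str], new: list[str]) -> tuple[set, set]:
    """Return (added, removed) case names between two matrix lists."""
    return set(new) - set(old), set(old) - set(new)


# Illustrative excerpt of the Ursa matrix change from the hunk above
added, removed = matrix_diff(
    ["C96_atm3DVar", "C96mx100_S2S", "C96mx025_S2S"],
    ["C96_atm3DVar", "C96mx100_S2S", "C96_gcafs_cycled", "C96_gcafs_cycled_noDA"],
)
print(sorted(added), sorted(removed))
```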
2 changes: 1 addition & 1 deletion docs/source/hpc.rst
@@ -97,7 +97,7 @@ The Global Workflow provides capabilities for deterministic and ensemble forecas
- 1
- X
- X
-
- x
Contributor: should this be uppercase?

Contributor Author: no idea, it's whatever is in develop now

Contributor: It should be. I'll fix this in develop on my next PR.
- X
- X
- X
6 changes: 5 additions & 1 deletion ush/python/pygfs/task/atmens_analysis.py
@@ -88,7 +88,6 @@ def initialize(self) -> None:
# initialize JEDI applications
logger.info(f"Initializing JEDI LETKF observer application")
self.jedi_dict['atmensanlobs'].initialize(clean_empty_obsspaces=True)
self.jedi_dict['atmensanlsol'].initialize()
self.jedi_dict['atmensanlfv3inc'].initialize()

@logit(logger)
@@ -123,6 +122,11 @@ def execute(self, jedi_dict_key: str) -> None:
None
"""

# Initialize solver immediately before execution so that obs space files are
# available for cleaning after running the observer
if jedi_dict_key == 'atmensanlsol':
self.jedi_dict['atmensanlsol'].initialize(clean_empty_obsspaces=True)

self.jedi_dict[jedi_dict_key].execute()

@logit(logger)
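The atmens_analysis.py hunks move the `atmensanlsol` initialization out of `initialize()` and into `execute()`, so the solver is set up (with `clean_empty_obsspaces=True`) only immediately before it runs, after the observer has produced the obs space files it needs. A stand-in sketch of that deferred-initialization pattern follows; the `FakeApp` class and the module-level dict are illustrative stand-ins, not the pygfs/JEDI API:

```python
class FakeApp:
    """Illustrative stand-in for a JEDI application wrapper."""

    def __init__(self) -> None:
        self.initialized = False
        self.clean = False

    def initialize(self, clean_empty_obsspaces: bool = False) -> None:
        self.initialized = True
        self.clean = clean_empty_obsspaces

    def execute(self) -> None:
        if not self.initialized:
            raise RuntimeError("app executed before initialize()")


jedi_dict = {
    "atmensanlobs": FakeApp(),
    "atmensanlsol": FakeApp(),
    "atmensanlfv3inc": FakeApp(),
}


def initialize_task() -> None:
    # Mirrors the diffed initialize(): the solver is deliberately
    # NOT initialized here anymore.
    jedi_dict["atmensanlobs"].initialize(clean_empty_obsspaces=True)
    jedi_dict["atmensanlfv3inc"].initialize()


def execute_task(jedi_dict_key: str) -> None:
    # Deferred init: the solver's obs space files only exist once the
    # observer has run, so initialize it just before execution.
    if jedi_dict_key == "atmensanlsol":
        jedi_dict["atmensanlsol"].initialize(clean_empty_obsspaces=True)
    jedi_dict[jedi_dict_key].execute()
```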