Hello,
I’m working on simulating ATMS radiances using CRTM’s forward model, specifically the test_ClearSky test code (test/mains/regression/forward/test_ClearSky). My input data consists of the standard gridded ECMWF model profiles of atmospheric variables in .le file format. My goal is to produce simulated ATMS radiances without manually creating the .inc files (Load_Atm_Data.inc, Load_Sfc_Data.inc), as that approach seems cumbersome and requires reformatting the .le data.
I’ve reviewed the CRTM User Guide (especially section 4.6) and found it helpful for understanding CRTM’s input structures, but it doesn’t cover converting ECMWF data into the test code’s inputs. The .le files likely need parsing and interpolation, and I’m hoping to avoid writing custom Fortran or Python scripts from scratch to handle this (since this problem has likely been solved dozens of times already).
So my question is: are there existing tools, scripts, or community-shared examples that simplify ingesting ECMWF model data (especially data in .le format) into CRTM’s test codes for radiance simulation? I’m looking for something lightweight that parses the ECMWF profiles and populates CRTM’s Atmosphere and Surface structures programmatically, bypassing the .inc files if that is the preferred route.
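To illustrate the kind of thing I have in mind, here is a rough Python sketch of the parsing side. The record layout, level count, file name, and unit conversions below are only placeholders; I have not yet decoded the actual .le format, and the CRTM absorber units should be checked against the User Guide:

```python
import numpy as np

N_LEVELS = 91  # placeholder: whatever the ECMWF .le profiles actually contain

# Hypothetical fixed-length little-endian record; adjust to the real .le layout.
record = np.dtype([
    ("p",  "<f4", N_LEVELS),   # level pressure [hPa]
    ("t",  "<f4", N_LEVELS),   # temperature [K]
    ("q",  "<f4", N_LEVELS),   # specific humidity [kg/kg]
    ("o3", "<f4", N_LEVELS),   # ozone mass mixing ratio [kg/kg]
])

def read_profiles(path):
    """Read every profile record from a (hypothetical) .le binary file."""
    return np.fromfile(path, dtype=record)

def to_crtm_layers(prof):
    """Convert one level-based profile to the layer quantities a CRTM
    Atmosphere structure needs: level pressure (n_layers+1 values), layer
    pressure/temperature, H2O in g/kg and O3 in ppmv."""
    p_lev = prof["p"]
    p_lay = 0.5 * (p_lev[:-1] + p_lev[1:])          # simple mid-point layers
    t_lay = 0.5 * (prof["t"][:-1] + prof["t"][1:])
    q_lay = 0.5 * (prof["q"][:-1] + prof["q"][1:])
    o3_lay = 0.5 * (prof["o3"][:-1] + prof["o3"][1:])
    h2o_gkg = 1000.0 * q_lay / (1.0 - q_lay)        # specific humidity -> mixing ratio
    o3_ppmv = 1.0e6 * o3_lay * (28.9644 / 47.998)   # mass mixing ratio -> volume (ppmv)
    return p_lev, p_lay, t_lay, h2o_gkg, o3_ppmv

if __name__ == "__main__":
    for prof in read_profiles("ecmwf_profiles.le"):  # hypothetical file name
        p_lev, p_lay, t_lay, h2o, o3 = to_crtm_layers(prof)
        # ...write these out (plain text, netCDF, etc.) in whatever form a
        # modified test main reads them, instead of the hard-coded .inc files.
```

If something along these lines (or a Fortran equivalent that reads the profiles directly inside the test main) already exists, that is exactly what I’m after.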
I’d prefer solutions simpler than full frameworks like JEDI, GSI or WRF, as I only need radiances for Cal/Val, not data assimilation. I’m open to custom tools or modified test code examples. If anyone has handled .le files or similar ECMWF formats with CRTM, I’d greatly appreciate pointers to scripts, GitHub repos, or advice on parsing and integration.
Thank you in advance for your help,
Peter
You could look at UFO HofX nomodel (there’s a ctest in UFO that demonstrates how to use it).
Thank you very much for your suggestion, BenjaminTJohnson. I’ve been attempting to build UFO on the local RHEL9 server, but it has proven significantly more challenging and time-intensive than building other packages such as CRTM.
Are there any cloud-based resources, containers, or pre-built binaries for UFO that could streamline this process? Any suggestions would be greatly appreciated.
To provide feedback on the build challenges:
In general, the UFO build has been daunting due to its complex dependency chain (eckit, atlas, oops, ioda, etc.), its high sensitivity to compiler and library versions, and the need for iterative error fixes. On my environment, which has an older glibc (2.22) and no module system, this has significantly prolonged the process.
To briefly get into the gory details: early on, the required ecbuild commands and flag adjustments produced cryptic CMake errors that took extensive debugging. Linker errors, such as a missing libpthread_nonshared.a when linking atlas_f, demanded multiple flag adjustments. Mid-process, the system gfortran (GCC 11.5.0) triggered glibc mismatches (e.g., requiring GLIBC_2.29), necessitating a switch to conda’s GCC 14.2.0. Most recently, atlas’s FieldImpl.h failed to compile due to std::find errors, likely from a compiler mismatch between atlas (built with GCC 11.5.0) and oops (built with GCC 14.2.0). Despite these efforts, we haven’t reached the UFO build itself yet; we’re still trying to complete the oops/ioda builds.
Thank you for your time; your guidance is greatly appreciated.
Peter
You probably won’t succeed on the RHEL servers; at least in my experience they’re too outdated. Do you have access to any of these HPC systems:
https://jointcenterforsatellitedataassimilation-jedi-docs.readthedocs-hosted.com/en/1.3.0/using/jedi_environment/modules.html
There are detailed instructions for each system.