Failing to run "make test" for CRTM

Thanks for your help!

Does "update the Sensor_Zenith_Angle member" that you mentioned above mean that I only need to modify the components in the Geometry Info, as in the example in the figure below? For instance, can I delete SCAN_ANGLE or add Flux_Zenith_Angle, and can the other declared angles be given arbitrary values, as long as they satisfy the equation mentioned in the User Guide?
I want to compare the simulated brightness temperatures with the satellite-measured brightness temperatures, and each point in the satellite data has a different angle, so I was wondering which values would be best to use.
Thanks a lot!

@yunzhiya no problem.

Yes, what you are attempting is a very common task for the CRTM.
You should specify the Sensor_Zenith_Angle from your satellite observation data and derive the scan angle from it based on the equation in the user guide.
The solar source angles should be specified based on the time and date of the measurement.
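The zenith-to-scan-angle conversion described above can be sketched numerically. This is a minimal Python sketch, assuming the common spherical-Earth relation sin(scan) = sin(zenith) * Re / (Re + h); verify it against the exact equation in the CRTM User Guide before relying on it, and note that the Earth radius and satellite altitude below are approximate placeholder values:

```python
import math

EARTH_RADIUS_KM = 6371.0  # mean Earth radius (approximate, assumption)

def scan_angle_deg(sensor_zenith_deg, sat_altitude_km):
    """Scan angle implied by a sensor zenith angle for a satellite at
    the given altitude: sin(scan) = sin(zenith) * Re / (Re + h)."""
    ratio = EARTH_RADIUS_KM / (EARTH_RADIUS_KM + sat_altitude_km)
    return math.degrees(math.asin(math.sin(math.radians(sensor_zenith_deg)) * ratio))

# Example with an assumed ~817 km orbit altitude (roughly MetOp-A):
# a 45-degree sensor zenith angle maps to a smaller scan angle.
print(scan_angle_deg(45.0, 817.0))
```

The scan angle is always smaller than the zenith angle, since the satellite sees the surface point at a narrower angle than the local vertical at that point.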

Okay, Got it!
Thanks a lot for your help!

Good afternoon,
I was able to build libcrtm.a for v2.4.0, but I would like to go back to yunzhiya's initial question about "make test". I, too, am having trouble finding exactly where to run that command. I also couldn't find the file amsua-metop-a.output mentioned in the User Guide. Maybe I didn't pull down everything I need. I used "git clone --branch=v2.4.0 https://github.com/JCSDA/crtm".
One other thing: One of the reasons I would like to run the additional tests is that although the library compiled ok, there were several warnings (this is with the make -j4 command). Most had to do with unused variables and routines, but two stated “Integer division truncated to constant ‘1’ at (1) [-Winteger-division]”. The warning occurred for lines 756 and 781 of CRTM_Interpolation.f90.
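For context on that warning: Fortran, like C, truncates integer division toward zero, so a constant expression such as 3/2 silently evaluates to 1, which is exactly what -Winteger-division flags. A short Python sketch of that truncation rule (note that Python's own // operator rounds toward negative infinity instead, hence math.trunc here):

```python
import math

def fortran_int_div(a, b):
    # Fortran/C-style integer division: divide, then discard the
    # fractional part (truncate toward zero).
    return math.trunc(a / b)

print(fortran_int_div(3, 2))    # 1: the 0.5 is discarded
print(fortran_int_div(-3, 2))   # -1 (Python's -3 // 2 would give -2)
```

Whether the truncation is a bug or intentional depends on what those two lines of CRTM_Interpolation.f90 are computing; the warning itself does not stop the build.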
Any help would be appreciated.

Hi Jack,

I think it might be more efficient to just set up a meeting with me, and we can work through it together over video chat.
Google Meet or Zoom, Eastern time zone (USA).

I’m free this Friday and anytime next week more or less.



Thanks Ben, I’ll email you to find a time.


Hello! I have a similar problem to the original question. I can see my netcdf.mod in the include folder of the NC4_DIR that I set up in the configuration/gfortran.setup file. Yet, I still get the following error:

gfortran -fconvert=big-endian -g -O2 -c CloudCoeff_netCDF_IO.f90

32 | USE netcdf
| 1
Fatal Error: Cannot open module file ‘netcdf.mod’ for reading at (1): No such file or directory
compilation terminated.

I am working in a conda environment, and from my understanding the gfortran.setup file is correctly indicating where the .mod file is. Any ideas? I've tried a few things and I can't seem to solve it.

This is typically a path problem for the include or library directory.
Make sure your -I path includes the location of this .mod file.
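As a quick sanity check, you can confirm that the module file really sits where the -I flag points. A minimal sketch; NC4_DIR is the variable name used in the thread's gfortran.setup, and the fallback path is just a placeholder:

```python
import os

def netcdf_mod_path(nc4_dir):
    # gfortran searches each directory passed via -I for netcdf.mod
    return os.path.join(nc4_dir, "include", "netcdf.mod")

nc4_dir = os.environ.get("NC4_DIR", "/usr/local")  # placeholder fallback
path = netcdf_mod_path(nc4_dir)
print(path, "->", "found" if os.path.isfile(path) else "MISSING")
```

If this reports MISSING for the directory you pass to the compiler, the -I path and the actual file location disagree (watch for case differences in the path on case-sensitive filesystems).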

Thanks Ben. I know, but I'm 100% sure this is correct.

my LIBS is

-L /Users/vito.galligani/Work/anaconda3/envs/crtm_main/lib -lnetcdf -lnetcdff -L /Users/vito.galligani/Work/anaconda3/envs/crtm_main/lib -lhdf5 -I /Users/vito.galligani/Work/anaconda3/envs/crtm_main/include -I /Users/vito.galligani/Work/anaconda3/envs/crtm_main/include

and the .mod file is correctly in the include folder.


On my Linux machine I had no problems, but in the conda environment I'm running out of ideas.

Hello @vgalligani , in case you weren’t able to resolve the problem yet, can you provide steps for us to reproduce your error?

Actually I haven’t. I have been using the server.

This is how I create my conda environment:

conda create -n CRTM_fix python=3.7 numpy scipy clang_osx-64 clangxx_osx-64 gfortran_osx-64

conda activate CRTM_fix
conda install -c conda-forge netcdf-fortran

and then I edited gfortran.setup in the configuration directory to include at the top:

export FC="gfortran"
export NC4_DIR="/Users/vito.galligani/Work/anaconda3/envs/crtm_fix"

This points to the include and lib folders.

/Users/vito.galligani/Work/anaconda3/envs/CRTM_fix/include for example has the netcdf.mod file that my compilation cannot find.

thank you!