Description
spack-stack@2.1.0 fails to install netcdf-c plugins -- used when writing compressed output files with zstandard -- when netcdf-c uses spack's build_system=cmake setting.
To Reproduce
Install spack-stack@2.1.0.
Build / run code that relies on writing netcdf files with zstandard compression at a compression level > 0, e.g.
+ srun --label --distribution=block:block -n 150 ./fv3.exe
144: file: module_write_netcdf.F90 line: 500
144: NetCDF: Filter error: undefined filter encountered
144: ESMF_Finalize: Error closing trace stream
144: Abort(1) on node 144 (rank 144 in comm 496): application called MPI_Abort(comm=0x84000002, 1) - process 144
Expected behavior
netcdf-c plugins are built and installed so that code relying on the zstandard plugin for compression can successfully create netcdf files.
System, compiler, code, ...
The failure was reproducible across all EPIC / RDHPCS hosts when running UFS WM regression tests against spack-stack@2.1.0:
- derecho / gaea-c6 / hercules / orion / ursa
- oneapi@2025.{3,2}.1
- gcc@{13,12}
Additional context
Referencing the above error example, in module_write_netcdf.F90:
ncerr = nf90_def_var_zstandard(ncid, varids(i), zstandard_level(grid_id)) ; NC_ERR_STOP(ncerr)
For regression testing, the default compression level is zero (export ZSTANDARD_LEVEL=0) -- no compression, see:
https://github.com/ufs-community/ufs-weather-model/blob/develop/tests/default_vars.sh
Each test that requires zstd compression sets the level explicitly, e.g. export ZSTANDARD_LEVEL=5; see for example
https://github.com/ufs-community/ufs-weather-model/blob/develop/tests/tests/control_p8
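The gating behavior can be sketched as follows; the helper names are hypothetical (not taken from the UFS code), but the logic matches the test setup above: the zstd filter plugin is only exercised when a nonzero level is exported, which is why only tests setting ZSTANDARD_LEVEL > 0 hit the failure.

```python
import os

def zstandard_level(default=0):
    # Hypothetical helper: read the compression level the regression
    # tests export; the default (0) means no zstd compression.
    return int(os.environ.get("ZSTANDARD_LEVEL", default))

def needs_zstd_plugin(level):
    # The zstd filter plugin is only required for levels > 0, so
    # only tests exporting a nonzero level trigger the filter error.
    return level > 0
```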
Zstandard compression support depends on the following:
... writes NetCDF files using ParallelIO (PIO), which supports HDF5-based compression filters, including Zstandard (zstd). Zstd offers better compression ratios and faster I/O than zlib. Implementing zstd requires NetCDF-4.9.0+ and the zstd filter plugin installed in the HDF5_PLUGIN_PATH
Prior to spack-stack@2.x -- e.g. spack-stack@1.9.2, netcdf-c was built with build_system=autotools:
^netcdf-c@4.9.2%oneapi@2024.2.1+blosc~byterange+dap~fsync~hdf4~jna~logging+mpi~nczarr_zip+optimize~parallel-netcdf+pic+shared~szip+zstd build_system=autotools patches=0161eb8 arch=linux-rocky9-zen3
and the configure step reports a plugin directory:
NetCDF Version: 4.9.2
Dispatch Version: 5
Install Prefix: /contrib/spack-stack/spack-stack-1.9.2/envs/ue-oneapi-2024.2.1/install/oneapi/2024.2.1/netcdf-c-4.9.2-idnwcbr
Plugin Install Prefix: /contrib/spack-stack/spack-stack-1.9.2/envs/ue-oneapi-2024.2.1/install/oneapi/2024.2.1/netcdf-c-4.9.2-idnwcbr/plugins
which is populated with plugins:
$ ls /contrib/spack-stack/spack-stack-1.9.2/envs/ue-oneapi-2024.2.1/install/oneapi/2024.2.1/netcdf-c-4.9.2-idnwcbr/plugins
lib__nch5blosc.so lib__nch5deflate.so.0.0.0 lib__nch5zstd.so.0
lib__nch5blosc.so.0 lib__nch5fletcher32.so lib__nch5zstd.so.0.0.0
lib__nch5blosc.so.0.0.0 lib__nch5fletcher32.so.0 lib__nczhdf5filters.so
lib__nch5bzip2.so lib__nch5fletcher32.so.0.0.0 lib__nczhdf5filters.so.0
lib__nch5bzip2.so.0 lib__nch5shuffle.so lib__nczhdf5filters.so.0.0.0
lib__nch5bzip2.so.0.0.0 lib__nch5shuffle.so.0 lib__nczstdfilters.so
lib__nch5deflate.so lib__nch5shuffle.so.0.0.0 lib__nczstdfilters.so.0
lib__nch5deflate.so.0 lib__nch5zstd.so lib__nczstdfilters.so.0.0.0
spack-stack-2.1.0 does NOT pass --enable-plugins or --with-plugin-dir during the configure step, so no plugins are configured or built:
NetCDF Version: 4.9.2
Dispatch Version: 5
Install Prefix: /contrib/spack-stack/spack-stack-2.1.0/envs/ue-oneapi-2025.3.1/install/intel-oneapi-compilers/2025.3.1/netcdf-c-4.9.2-322lw3t
Plugin Install Prefix: N.A.
$ ls /contrib/spack-stack/spack-stack-2.1.0/envs/ue-oneapi-2025.3.1/install/intel-oneapi-compilers/2025.3.1/netcdf-c-4.9.2-322lw3t
bin include lib64 share
Note the absence of the plugins directory.
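The difference between the two installs can be captured in a small diagnostic; this is a hypothetical helper (the prefix argument is whatever netcdf-c install prefix you want to probe), mirroring the directory listings above: a healthy prefix (as in spack-stack-1.9.2) carries a plugins/ directory containing lib__nch5zstd.so*, while the spack-stack-2.1.0 prefix does not.

```python
import os

def zstd_plugin_installed(prefix):
    # Hypothetical diagnostic: check a netcdf-c install prefix for
    # the plugins/ directory and the zstd filter shared library.
    plugins = os.path.join(prefix, "plugins")
    if not os.path.isdir(plugins):
        return False  # no plugins directory at all (the 2.1.0 case)
    return any(name.startswith("lib__nch5zstd") for name in os.listdir(plugins))
```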
In the spack package.py, netcdf-c@4.9.2 is retrieved from the same commit in both stacks, so the netcdf-c source code is identical.
Because shared libraries are requested (variant +shared) along with zstd compression (variant +zstd), this code in package.py implies that the plugins should be built / enabled:
# The plugins are not built when the shared libraries are disabled:
with when("~shared"):
    conflicts("+szip")
    conflicts("+blosc")
    conflicts("+zstd")
and
def setup_run_environment(self, env: EnvironmentModifications) -> None:
    if self.spec.satisfies("@4.9.0:+shared"):
        # Both HDF5 and NCZarr backends honor the same environment variable:
        env.append_path("HDF5_PLUGIN_PATH", self.prefix.plugins)
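The run-time failure mode follows from how HDF5 resolves dynamic filter plugins: it scans the directories in HDF5_PLUGIN_PATH for the filter's shared library. A simplified model (not HDF5's actual loader code) shows why a missing plugins directory surfaces as the "undefined filter" error above:

```python
import os

def find_filter_plugin(libname, plugin_path):
    # Simplified model of HDF5's dynamic-plugin lookup: scan each
    # directory in the (colon-separated) plugin path for a shared
    # library whose name starts with the filter's library name.
    for d in filter(None, plugin_path.split(os.pathsep)):
        for entry in (os.listdir(d) if os.path.isdir(d) else []):
            if entry.startswith(libname):
                return os.path.join(d, entry)
    # A failed lookup is what netcdf-c reports as
    # "NetCDF: Filter error: undefined filter encountered".
    return None
```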
For spack-stack@2.1.0, netcdf-c requires building with cmake:
require:
- '@4.9.2'
[...]
- build_system=cmake
Upstream spack-packages contains a merged PR that solves this particular issue: Logic to install plugins for cmake netcdf-c builds (#2887).
JCSDA/spack-packages netcdf_c/package.py does not yet carry this upstream commit / fix for spack-stack@2.1.0.