# Compiling Code on DIaL3

## Compiler Suites

We provide the following compiler suites on DIaL3:
Language | AMD | Cray | GNU | Intel | LLVM | OneAPI |
---|---|---|---|---|---|---|
C | clang | cc | gcc | icc | clang | icx |
C++ | clang++ | CC | g++ | icpc | clang++ | icpx |
Fortran | flang | ftn | gfortran | ifort | flang | ifx |
## Hierarchical Modules
We use hierarchical modules on DIaL3. This means that not all modules are visible initially; additional modules become visible once you load certain prerequisite modules.

Specifically, we use compiler and MPI modules as the basis of the module hierarchy. Loading a compiler module makes accessible the additional modules built with that compiler; it also unloads all other compiler modules, in order to avoid clashes.
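As a sketch of how this looks in practice (the module version below is illustrative, and `module spider` assumes the site's module system is Lmod):

```shell
# Before a compiler is loaded, only the core modules are visible:
module avail

# Loading a compiler exposes the modules built with it, and unloads
# any other compiler module that was previously loaded:
module load aocc/3.0.0
module avail

# "module spider <name>" searches the whole hierarchy, including
# modules that are not currently visible:
module spider openmpi
```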
For example, if you load the AOCC compiler:

```
module load aocc/3.0.0
```

this unlocks an additional set of sub-modules that were built with that compiler, e.g.

```
module avail
...
amdblis/3.0/4yxmag      amdlibflame/3.0/jqi4a5  cmake/3.21.3/ufpze2
hwloc/2.5.0/ty4ojn      numactl/2.0.14/qagahc   openmpi/4.0.5/wpfljq (L)
python/3.8.11/yz4244
```
If you then load OpenMPI:

```
module load openmpi/4.0.5/wpfljq
```

this unlocks a further set of sub-modules built with aocc v3.0.0 and openmpi v4.0.5, e.g.

```
module avail
...
amdfftw/3.0  amdscalapack/3.0
```
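At this point you can confirm the loaded stack, and see the hierarchy's clash protection in action (a sketch, using module versions that appear elsewhere on this page):

```shell
# List the currently loaded modules; the compiler, MPI library and
# any maths libraries loaded so far should all appear:
module list

# Loading a different compiler automatically unloads the AOCC-built
# modules, preventing inconsistent combinations:
module load gcc/10.3.0
```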
## Compiler, Library and MPI Combinations
The following table shows the supported compiler/library/MPI combinations available to users. Each column shows a supported combination of tools: that particular combination is known to work and is part of our regression testing.

You may find that other combinations of compilers and libraries also work and give good performance (e.g. the Intel compilers with Cray LibSci, or the GNU compilers with the AOCL libraries). However, we only test the combinations shown.
Component | AMD | Cray | GNU | Intel | OneAPI |
---|---|---|---|---|---|
Compiler | aocc | PrgEnv-cray | gcc | intel-parallel-studio | intel-oneapi-compilers |
BLAS | amdblis | " | openblas | " | intel-oneapi-mkl |
LAPACK | amdlibflame | " | " | " | " |
MPI | openmpi | cray-mpich-ucx cray-pmi | openmpi | " | intel-oneapi-mpi |
ScaLAPACK | amdscalapack | " | " | " | " |
FFTW | amdfftw | cray-fftw | " | " | " |

A ditto mark (`"`) indicates that the component is provided by the module listed above it in the same column (for example, MKL supplies BLAS and LAPACK and is included with intel-parallel-studio).
The table shows the generic name of the module that you need to load in order to bring a specific component into your environment. Please note that you also need to specify the version number when loading the modules; examples are given in the following section.
## How to Load a Specific Compiler/Library/MPI Combination
The following examples show how to load specific Compiler/Library/MPI combinations, including version numbers. The hierarchical module system also helps to guide users into loading supported combinations of modules, and generally prevents users from loading incompatible ones.
AMD:

Compiler version 3.0:

```
module load aocc/3.0.0
module load amdblis/3.0
module load amdlibm/3.0
module load openmpi/4.0.5
module load amdfftw/3.0
module load amdscalapack/3.0
```
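With this stack loaded, compiling and linking an MPI code against the AMD maths libraries might look like the following (a sketch only: `my_app.c` is a placeholder, and the exact link flags your code needs may differ):

```shell
# mpicc wraps the AOCC clang compiler once the modules above are loaded.
# -lblis links AMD BLIS (BLAS), -lflame links libFLAME (LAPACK) and
# -lfftw3 links the AMD-optimised FFTW.
mpicc -O2 -o my_app my_app.c -lflame -lblis -lfftw3 -lm
```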
Compiler version 3.1:

```
module load aocc/3.1.0
module load openmpi/4.0.5
module load amdblis/3.0
module load amdlibm/3.0
module load amdfftw/3.0
module load amdscalapack/3.0
```
Compiler version 4.1:

```
module load aocc/4.1
module load amdblis/4.1
module load amdlibm/4.1
module load openmpi/4.1.6
module load amdfftw/4.1
module load amdscalapack/4.1
```
Cray:

```
module load PrgEnv-cray/8.0.0
module swap cray-mpich cray-mpich-ucx
module swap craype-network-ofi craype-network-ucx
module load cray-pmi
module load cray-fftw/3.3.8.8
```
GNU:

```
module load gcc/10.3.0
module load openmpi/4.0.5
module load openblas/0.3.15
```
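With the GNU stack loaded, a compile-and-link line might look like this (a sketch; `my_app.f90` is a placeholder):

```shell
# mpif90 wraps gfortran once the modules above are loaded;
# OpenBLAS provides both BLAS and LAPACK in a single library.
mpif90 -O2 -o my_app my_app.f90 -lopenblas
```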
Intel Classic:

```
module load intel-parallel-studio
```
Intel OneAPI:

Compiler version 2021.2.0:

```
module load intel-oneapi-compilers/2021.2.0
module load intel-oneapi-mkl/2021.4.0
module load intel-oneapi-mpi/2021.4.0
```

Compiler version 2023.2.1:

```
module load intel-oneapi-compilers/2023.2.1
module load intel-oneapi-mkl/2023.2.0
module load intel-oneapi-mpi/2021.10.0
```
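With a OneAPI stack loaded, a compile line might look like the following (a sketch; `my_app.c` is a placeholder, and wrapper and flag names can vary between OneAPI releases):

```shell
# mpiicc invokes the classic icc compiler by default; -cc=icx selects
# the newer LLVM-based oneAPI compiler. -qmkl links against Intel MKL.
mpiicc -cc=icx -O2 -qmkl -o my_app my_app.c
```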