Hi all,
I am posting to ask about the Intel MPI 2019 options that correspond to the ones I used with 2018.
I have tested various versions of the Intel compiler and MPI library to evaluate performance for a weather forecasting model.
The tested versions are below (this part is not important):
Intel Compiler: 18u5, 19u2, 19u4, 19u5, 20u0, 20u1
Intel MPI library: 17u4, 18u4, 19u6, 19u7
The best-performing pair was [C:19u5, M:18u4].
I think one reason is that the MPI options differ between v18 and v19, and I am setting far fewer options with 2019 than with 2018.
Here are my options. There are many differences between v18 and v19.
Since many of the options used in 2018 disappeared in 2019, the number of options I set decreased a lot.
Could you tell me if I removed anything incorrectly?
(That is, when an option has a new equivalent, I may not know it and may have simply removed the old one.)
2018 version
export I_MPI_FALLBACK=0
export I_MPI_JOB_FAST_STARTUP=enable
export I_MPI_SCALABLE_OPTIMIZATION=enable
export I_MPI_TIMER_KIND=rdtsc
export I_MPI_PLATFORM_CHECK=0
export I_MPI_HYDRA_PMI_CONNECT=alltoall
export I_MPI_THREAD_LEVEL_DEFAULT=FUNNELED
export I_MPI_EXTRA_FILESYSTEM=on
export I_MPI_EXTRA_FILESYSTEM_LIST=gpfs
export I_MPI_FABRICS=shm:dapl
export I_MPI_DAPL_UD=on
export I_MPI_DAPL_UD_RDMA_MIXED=on
export DAPL_IB_MTU=4096
export I_MPI_DAPL_TRANSLATION_CACHE=1
export I_MPI_DAPL_TRANSLATION_CACHE_AVL_TREE=1
export I_MPI_DAPL_UD_TRANSLATION_CACHE=1
export I_MPI_DAPL_UD_TRANSLATION_CACHE_AVL_TREE=1
export I_MPI_DAPL_UD_EAGER_DYNAMIC_CONNECTION=off
export I_MPI_DAPL_UD_MAX_MSG_SIZE=4096
2019 version
export I_MPI_EXTRA_FILESYSTEM=on
export I_MPI_EXTRA_FILESYSTEM_FORCE=gpfs
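For what it's worth, here is my best guess at a 2019 starting point. As far as I can tell, the DAPL fabric was removed in Intel MPI 2019 and transport now goes through OFI/libfabric, so the I_MPI_DAPL_* and DAPL_* variables have no direct equivalents. The provider name below is an assumption that depends on the interconnect, so please correct me if this mapping is wrong:

```shell
# Hedged sketch, not a verified equivalent of the 2018 settings.
# Intel MPI 2019 dropped DAPL; the transport layer is OFI/libfabric.
export I_MPI_FABRICS=shm:ofi    # replaces shm:dapl from 2018
export FI_PROVIDER=verbs        # ASSUMPTION: pick the libfabric provider
                                # matching your fabric (verbs, mlx, psm2, ...)

# Filesystem hints carried over from my current 2019 settings:
export I_MPI_EXTRA_FILESYSTEM=on
export I_MPI_EXTRA_FILESYSTEM_FORCE=gpfs
```

If any of the remaining 2018 variables (e.g. I_MPI_HYDRA_PMI_CONNECT, I_MPI_THREAD_LEVEL_DEFAULT) still have 2019 counterparts or OFI-level replacements, I would appreciate pointers to them.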
Thank you in advance.