
Intel MPI: MPI_Comm_connect with I_MPI_FABRICS tmi results in an error


Hello,

I have two programs which are connected at runtime via MPI_Comm_connect.
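For context, the connection uses the standard MPI_Open_port / MPI_Comm_accept / MPI_Comm_connect pattern. The following is only a simplified sketch of what ping.exe and pong.exe do (here the port string is assumed to be passed to ping.exe on the command line; the real programs may exchange it differently):

/* pong.exe -- "server" side: opens a port and accepts the connection */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    char port[MPI_MAX_PORT_NAME];
    MPI_Comm inter;
    int x = 0;

    MPI_Init(&argc, &argv);
    MPI_Open_port(MPI_INFO_NULL, port);        /* obtain a port string     */
    printf("port: %s\n", port);                /* hand this to ping.exe    */
    MPI_Comm_accept(port, MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
    MPI_Recv(&x, 1, MPI_INT, 0, 0, inter, MPI_STATUS_IGNORE);
    MPI_Comm_disconnect(&inter);
    MPI_Close_port(port);
    MPI_Finalize();
    return 0;
}

/* ping.exe -- "client" side: connects to the port printed by pong.exe */
#include <mpi.h>

int main(int argc, char **argv)
{
    MPI_Comm inter;
    int x = 42;

    MPI_Init(&argc, &argv);
    /* argv[1] is assumed to carry the port name obtained from pong.exe */
    MPI_Comm_connect(argv[1], MPI_INFO_NULL, 0, MPI_COMM_WORLD, &inter);
    MPI_Send(&x, 1, MPI_INT, 0, 0, inter);
    MPI_Comm_disconnect(&inter);
    MPI_Finalize();
    return 0;
}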

If I use the dapl fabric, everything works fine:

mpirun -genv I_MPI_FABRICS dapl -np 1 ping.exe

mpirun -genv I_MPI_FABRICS dapl -np 1 pong.exe

If I use tmi instead of dapl, the programs crash with the following error:

MPID_nem_tmi_vc_connect: tmi_connect returns 45
Fatal error in MPI_Comm_connect: Other MPI error, error stack:
MPI_Comm_connect(131)............................: MPI_Comm_connect(port="tag#0$epaddr_size#16$epaddr#02000000000000000305020000000000$", MPI_INFO_NULL, root=0, MPI_COMM_WORLD, newcomm=0x7f1db04115f8) failed
MPID_Comm_connect(206)...........................:
MPIDI_Comm_connect(393)..........................:
MPIDI_Create_inter_root_communicator_connect(134):
MPIDI_CH3_Connect_to_root(274)...................:
MPID_nem_tmi_connect_to_root(813)................:
(unknown)(): Other MPI error

However, tmi works fine for regular MPI-1 calls, e.g. MPI_Send.

Is there any way to debug this case?

