Yambo 4.2.1, ompi_mpi_comm_world error

Having trouble compiling the Yambo source? Using an unusual architecture? Problems with the "configure" script? Problems in GPU architectures? This is the place to look.

Moderators: Davide Sangalli, andrea.ferretti, myrta gruning, andrea marini, Daniele Varsano, Conor Hogan, Nicola Spallanzani

Forum rules
If you have trouble compiling Yambo, please make sure to list:
(1) the compiler (vendor and release: e.g. intel 10.1)
(2) the architecture (e.g. 64-bit IBM SP5)
(3) if the problems occur compiling in serial/in parallel
(4) the version of Yambo (revision number/major release version)
(5) the relevant compiler error message
Fabiof
Posts: 68
Joined: Wed Feb 11, 2015 10:52 am

Yambo 4.2.1, ompi_mpi_comm_world error

Post by Fabiof » Mon Mar 05, 2018 1:49 pm

Dear all,

I am having some problems installing Yambo 4.2.1.

This is my configure command:

Code: Select all

./configure --enable-openmpi \
  --with-iotk-path="/home/fabiof/bin/espresso-5.4.0/iotk" \
  --with-p2y-version=5.4 \
  --with-fft-libdir="/usr/local/lib/" \
  --with-fft-includedir="/usr/local/include/" \
  --with-blas-libs="-L/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 -lmkl_blas95_lp64 -lmkl_core -lmkl_intel_lp64 -lmkl_sequential" \
  --with-lapack-libs="-L/opt/intel/composer_xe_2013.1.117/mkl/lib/intel64 -lmkl_lapack95_lp64 -lmkl_core -lmkl_intel_lp64 -lmkl_sequential" \
  --with-blacs-libs="-lblacs"


After running make yambo, I get the following error:

>>>[Linking yambo]<<<
make[1]: Entering directory '/home/fabiof/bin/yambo-4.2.1/driver'
cd /home/fabiof/bin/yambo-4.2.1/driver; /home/fabiof/bin/yambo-4.2.1/sbin/moduledep.sh yambo_driver.o > /home/fabiof/bin/yambo-4.2.1/driver/make.dep
yambo_driver.F
driver.o: In function `main':
/home/fabiof/bin/yambo-4.2.1/driver/driver.c:267: undefined reference to `ompi_mpi_comm_world'
/home/fabiof/bin/yambo-4.2.1/driver/driver.c:268: undefined reference to `ompi_mpi_comm_world'
/home/fabiof/bin/yambo-4.2.1/driver/driver.c:380: undefined reference to `ompi_mpi_comm_world'
/home/fabiof/bin/yambo-4.2.1/driver/driver.c:365: undefined reference to `ompi_mpi_comm_world'


I have tried make clean and make clean_all, but I still get the error. I don't know what to do.
Do you have any suggestions?
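(Editor's note: an undefined reference to ompi_mpi_comm_world at link time usually means the final link used a plain compiler driver instead of the MPI wrapper, so Open MPI's libmpi is never pulled in. A minimal sketch of that diagnosis, assuming a POSIX shell; the link lines below are illustrative, not Yambo's actual ones:)

```shell
#!/bin/sh
# Heuristic: does a link command pull in Open MPI?  A link line that uses
# neither the mpicc/mpif90 wrappers nor -lmpi leaves MPI symbols such as
# ompi_mpi_comm_world unresolved, which is exactly the error above.
has_mpi_link() {
    case " $1 " in
        *" -lmpi "*|*mpicc*|*mpif90*) echo yes ;;
        *) echo no ;;
    esac
}

# Illustrative link lines (not Yambo's actual ones):
has_mpi_link "icc -o yambo driver.o -lmkl_core"     # prints: no
has_mpi_link "mpicc -o yambo driver.o -lmkl_core"   # prints: yes
```

On the real machine, `mpicc --showme` / `mpif90 --showme` (Open MPI) print the underlying compiler and the MPI flags the wrappers add.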

The config file is attached.


Thank you,
Fabio
You do not have the required permissions to view the files attached to this post.
Fábio Ferreira, Graduate Student
University of Minho, Portugal

Daniele Varsano
Posts: 3816
Joined: Tue Mar 17, 2009 2:23 pm

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Daniele Varsano » Mon Mar 05, 2018 4:23 pm

Dear Fabio,
please note that the option --enable-openmpi is not correct; it should be --enable-openmp if you want to activate OpenMP parallelism.
Next, I suggest you define in the configure line the compilers you want to use. In particular, you are currently using GNU for the preprocessor and Intel for the Fortran/C compilers, and this mix can cause problems. You can add the following to the configure line:

CPP='icc -E' CC='icc' FC='ifort' F77='ifort' MPICC='mpicc' MPIFC='mpiifort'

Finally, check that your MPI libraries are installed correctly: does other software compiled with MPI work well?

Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/

Fabiof
Posts: 68
Joined: Wed Feb 11, 2015 10:52 am

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Fabiof » Mon Mar 05, 2018 5:23 pm

Dear Daniele.

Adding the flags CPP='icc -E' CC='icc' MPICC='mpiicc' F77='ifort' FC='mpiifort' to the ./configure command solved the problem.

Thank you!

Fabio
Fábio Ferreira, Graduate Student
University of Minho, Portugal

Fabiof
Posts: 68
Joined: Wed Feb 11, 2015 10:52 am

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Fabiof » Tue Mar 06, 2018 1:24 pm

Dear Daniele,

I am trying to run yambo in parallel mode, but it runs in serial mode.

My command is

Code: Select all

/usr/local/openmpi/bin/mpirun -np 24  yambo   -F yambo.in -J PPA

The Yambo file:

Code: Select all

ppa                            # [R Xp] Plasmon Pole Approximation
gw0                            # [R GW] GoWo Quasiparticle energy levels
HF_and_locXC                   # [R XX] Hartree-Fock Self-energy and Vxc
em1d                           # [R Xd] Dynamical Inverse Dielectric Matrix
NLogCPUs=0                     # [PARALLEL] Live-timing CPU`s (0 for all)
PAR_def_mode= "balanced"       # [PARALLEL] Default distribution mode ("balanced"/"memory"/"workload")
X_all_q_CPU= "4 3 2 1"                # [PARALLEL] CPUs for each role
X_all_q_ROLEs= "q k c v"              # [PARALLEL] CPUs roles (q,k,c,v)
X_all_q_nCPU_LinAlg_INV= 1     # [PARALLEL] CPUs for Linear Algebra
SE_CPU= "4 3 1"                     # [PARALLEL] CPUs for each role
SE_ROLEs= "q qp b"                   # [PARALLEL] CPUs roles (q,qp,b)
EXXRLvcs= 50       Ry      # [XX] Exchange RL components
Chimod= ""                     # [X] IP/Hartree/ALDA/LRC/BSfxc
% BndsRnXp
   1 | 200 |                   # [Xp] Polarization function bands
%
NGsBlkXp= 5            Ry      # [Xp] Response block size
% LongDrXp
 1.000000 | 0.000000 | 0.000000 |        # [Xp] [cc] Electric Field
%
PPAPntXp= 27.21138     eV      # [Xp] PPA imaginary energy
% GbndRnge
   1 | 200 |                   # [GW] G[W] bands range
%
GDamping=  0.10000     eV      # [GW] G[W] damping
dScStep=  0.10000      eV      # [GW] Energy step to evaluate Z factors
GTermEn= 40.81708      eV      # [GW] GW terminator energy (only for kind="BG")
DysSolver= "n"                 # [GW] Dyson Equation solver ("n","s","g")
%QPkrange                    # [GW] QP generalized Kpoint/Band indices
  1| 36|  15|22|
%

Could the problem be that I am using mpirun from Open MPI to run the jobs, while I compiled yambo with a different MPI (mpiifort)?
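(Editor's note: such a mismatch is a common cause of this symptom: a binary linked against one MPI but launched with another MPI's mpirun typically starts N independent ranks that each see a communicator of size 1, so every process runs the whole job serially. A minimal sketch of the check; the launcher/library names are illustrative inputs, on a real machine they would come from `mpirun --version` and `ldd yambo`:)

```shell
#!/bin/sh
# Warn when the MPI runtime used to launch differs from the MPI library
# the binary was linked against (names are illustrative inputs).
check_mpi_match() {
    launcher="$1"   # e.g. from: mpirun --version
    library="$2"    # e.g. from: ldd yambo | grep -i mpi
    if [ "$launcher" = "$library" ]; then
        echo "match"
    else
        echo "mismatch: launched with $launcher, linked against $library"
    fi
}

check_mpi_match "Open MPI" "Intel MPI"   # prints the mismatch warning
```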
I attached the config.log

Thank you,

Fabio
Fábio Ferreira, Graduate Student
University of Minho, Portugal

Daniele Varsano
Posts: 3816
Joined: Tue Mar 17, 2009 2:23 pm

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Daniele Varsano » Tue Mar 06, 2018 1:36 pm

Dear Fabio,
from the config.log I cannot see anything wrong; can you upload the ./config/report file?
What kind of machine are you running on? Is there a batch system? If so, can you provide the launch script?
Please also post the report and log files.

Unrelated to your problem, but note that in your input SE_CPU is not consistent with 24 CPUs (4x3x1 = 12).
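(Editor's note: in a Yambo input, the product of the CPU counts in each role string should equal the number of MPI tasks passed to mpirun: here X_all_q_CPU= "4 3 2 1" gives 4x3x2x1 = 24, but SE_CPU= "4 3 1" gives only 4x3x1 = 12. A small sanity check for such role strings; this is a sketch, not part of Yambo:)

```shell
#!/bin/sh
# Multiply out a Yambo role string (e.g. "4 3 1") and compare it with the
# task count given to mpirun.
check_roles() {
    total=1
    for n in $1; do
        total=$((total * n))
    done
    if [ "$total" -eq "$2" ]; then
        echo "OK: $1 -> $total tasks"
    else
        echo "MISMATCH: $1 -> $total tasks, mpirun launched $2"
    fi
}

check_roles "4 3 2 1" 24   # X_all_q_CPU: 4*3*2*1 = 24, consistent
check_roles "4 3 1" 24     # SE_CPU: 4*3*1 = 12, inconsistent
```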


Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/

Fabiof
Posts: 68
Joined: Wed Feb 11, 2015 10:52 am

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Fabiof » Tue Mar 06, 2018 4:05 pm

The machine is a regular computer (not a cluster or anything like that) running Linux Mint 18.1.
It has the Intel 2017 compilers installed.

I tried this batch file.

Code: Select all

#! /bin/sh

MPIRUN=/usr/local/openmpi/bin/mpirun
QE=/home/fabiof/bin/espresso-5.4.0/bin
YAMBO=/home/fabiof/bin/yambo-4.2.1/bin

for i in 5
do
    sed 's/yy/'$i'/g' PPA.in > yambo.in
    $MPIRUN -np 24 $YAMBO/yambo -F yambo.in -J PPAW$i
done
The yy placeholder is the NGsBlkXp energy.

The report file is attached (I renamed it to report.log).
Fábio Ferreira, Graduate Student
University of Minho, Portugal

Daniele Varsano
Posts: 3816
Joined: Tue Mar 17, 2009 2:23 pm

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Daniele Varsano » Tue Mar 06, 2018 4:09 pm

OK, the code seems to be correctly compiled in parallel.
Can you provide a report and a log file of your calculation?

Does QE or any other code work in parallel? Does the "regular" computer have 24 cores?

Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/

Fabiof
Posts: 68
Joined: Wed Feb 11, 2015 10:52 am

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Fabiof » Tue Mar 06, 2018 4:25 pm

Yes, it worked with QE, and the regular computer has more than 24 cores.
I called it regular because it is just a desktop computer with many cores and a significant amount of RAM.

I tried to run the calculation with 16 processors, but I still have the same problem.
I attached one of the log and report files; they are not complete.
Fábio Ferreira, Graduate Student
University of Minho, Portugal

Daniele Varsano
Posts: 3816
Joined: Tue Mar 17, 2009 2:23 pm

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Daniele Varsano » Tue Mar 06, 2018 4:45 pm

Dear Fabio,
yambo seems compiled correctly as you can see from the header:

GPL Version 4.2.1 Revision 110. (Based on r.14778 h.7b4dc3c)
MPI Build

the problem looks to come from the MPI wrapper.
According to this link (Open MPI with the Intel compilers):
https://software.intel.com/en-us/articl ... -compilers

you possibly need to recompile yambo with the mpif90 wrapper instead of mpiifort, i.e. substituting MPIFC='mpif90'.
Remember to do a make clean_all before recompiling.
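(Editor's note: to see which compiler a wrapper really invokes, Open MPI's wrappers accept --showme, which prints the underlying command line without running it; MPICH-based wrappers, including Intel MPI's, use -show. The helper below just extracts the first word of such output; the sample strings are illustrative, not taken from Fabio's machine:)

```shell
#!/bin/sh
# First word of an MPI wrapper's --showme expansion = the real compiler.
# On the target machine: underlying_compiler "$(mpif90 --showme)"
underlying_compiler() {
    set -- $1          # word-split the wrapper's expansion
    echo "$1"
}

# An Open MPI built with GNU typically wraps gfortran:
underlying_compiler "gfortran -I/usr/local/openmpi/include -lmpi"   # prints: gfortran
# For the Intel setup above you want it to report ifort:
underlying_compiler "ifort -I/usr/local/openmpi/include -lmpi"      # prints: ifort
```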

Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/

Fabiof
Posts: 68
Joined: Wed Feb 11, 2015 10:52 am

Re: Yambo 4.2.1, ompi_mpi_comm_world error

Post by Fabiof » Tue Mar 06, 2018 5:06 pm

Dear Daniele,

With the flags

Code: Select all

MPIFC='mpif90' CPP='icc -E' CC='icc' MPICC='mpiicc'  F77='ifort' FC='mpiifort'
I get the error:

configure: error: in `/home/fabiof/bin/yambo-4.2.1':
configure: error: linking to Fortran libraries from C fails

If I change FC to mpif90, I get the error:

checking if FC precompiler works on FC source... no
configure: error: Found FC precompiler problems in processing FC source.
Fábio Ferreira, Graduate Student
University of Minho, Portugal
