Issues compiling yambo 5.0 on HPC

Having trouble compiling the Yambo source? Using an unusual architecture? Problems with the "configure" script? Problems on GPU architectures? This is the place to look.

Moderators: Davide Sangalli, andrea.ferretti, myrta gruning, andrea marini, Daniele Varsano, Conor Hogan, Nicola Spallanzani

Forum rules
If you have trouble compiling Yambo, please make sure to list:
(1) the compiler (vendor and release: e.g. intel 10.1)
(2) the architecture (e.g. 64-bit IBM SP5)
(3) if the problems occur compiling in serial/in parallel
(4) the version of Yambo (revision number/major release version)
(5) the relevant compiler error message
pyadav
Posts: 86
Joined: Thu Nov 26, 2020 2:56 pm

Re: Issues compiling yambo 5.0 on HPC

Post by pyadav » Thu Mar 18, 2021 4:11 pm

Dear Andrea,

Thank you for your reply. Yes, it seems it is not able to extract the tar file, as I got the following message:

Code: Select all

tar xvfz libxc-2.2.3.tar.gz

gzip: stdin: unexpected end of file
tar: Child returned status 1
tar: Error is not recoverable: exiting now
-------------------------------------
tar -zxvf libxc-2.2.3.tar.gz
gzip: stdin: unexpected end of file
tar: Child returned status 1
tar: Error is not recoverable: exiting now
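
A quick way to check whether the archive itself is truncated or corrupted (rather than it being a problem with tar) is to test the gzip stream directly, for example:

Code: Select all

gzip -t libxc-2.2.3.tar.gz && echo "archive OK" || echo "archive corrupted or truncated"
ls -l libxc-2.2.3.tar.gz    # a partial download is usually much smaller than the full tarball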

How can I get rid of it? Should I replace the tar files with freshly downloaded ones, and where can I get these files?

best,
pushpendra
Pushpendra Yadav
Ph.D. Research Scholar
Quantum Transport and Theory Group
Department of Physics
Indian Institute of Technology Kanpur, India.

https://sites.google.com/site/amitkag1/

andrea.ferretti
Posts: 206
Joined: Fri Jan 31, 2014 11:13 am

Re: Issues compiling yambo 5.0 on HPC

Post by andrea.ferretti » Thu Mar 18, 2021 5:05 pm

Dear Marzieh,

unless I am mistaken, it all looks good to me.
In terms of possible causes to be investigated further:
* the MPI issue does not look promising. Can you check with your sys-admin?
* otherwise, I would try to use an external hdf5 and recompile the internal netcdf and netcdff,
* or simply compile the whole I/O stack (netcdf, netcdff, hdf5) internally (just drop the related lines from the configure invocation, as sketched below).
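
For reference, a rough sketch of the fully internal variant: it keeps the compilers and MKL settings from your configure line and simply drops all the --with-netcdf*/--with-netcdff*/--with-hdf5* options, so that configure builds the I/O libraries itself:

Code: Select all

./configure FC="ifort" PFC="mpiifort" F77="ifort" CC="mpiicc" CXX="mpiicpc" \
--enable-hdf5-par-io \
--enable-mpi \
--enable-open-mp \
--enable-dp \
--with-blas-libs="-lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm" \
--with-lapack-libs="-lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm" \
--with-fft-includedir="-I${MKLROOT}/include/fftw" \
--with-fft-libs="-lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm" \
--with-blacs-libs="-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -lmkl_core -liomp5 -lpthread -lm" \
--with-scalapack-libs="-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -lmkl_core -liomp5 -lpthread -lm"
# no netcdf/netcdff/hdf5 options given: the I/O stack is compiled internally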

take care
Andrea
Andrea Ferretti, PhD
CNR-NANO-S3 and MaX Centre
via Campi 213/A, 41125, Modena, Italy
Tel: +39 059 2055322; Skype: andrea_ferretti
URL: http://www.nano.cnr.it

pyadav
Posts: 86
Joined: Thu Nov 26, 2020 2:56 pm

Re: Issues compiling yambo 5.0 on HPC

Post by pyadav » Thu Mar 18, 2021 6:37 pm

Dear Andrea,

When I download a fresh libxc-2.2.3.tar.gz file, it extracts successfully, but when I put it in the archive folder and try to build the code, it fails with the error "libxc build failed". Later on, when I try to extract that same "libxc-2.2.3.tar.gz" from the archive folder, it shows the error:

Code: Select all

gzip: stdin: unexpected end of file
tar: Child returned status 1
tar: Error is not recoverable: exiting now

How can I fix this?
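
One possible remedy (just a sketch: it assumes the standard yambo source layout, that the fresh tarball is intact, and that this yambo version provides the clean_all target) is to wipe the failed build state and replace the tarball before reconfiguring:

Code: Select all

cd /path/to/yambo-5.0.0                            # top of the yambo source tree (path is a placeholder)
make clean_all                                     # wipe previous build state, including internal libraries
cp /path/to/fresh/libxc-2.2.3.tar.gz lib/archive/  # replace the corrupted tarball (source path is a placeholder)
gzip -t lib/archive/libxc-2.2.3.tar.gz             # verify it is still intact after copying
# then re-run ./configure with your usual options and make yambo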

Thanks,
Pushpendra
Pushpendra Yadav
Ph.D. Research Scholar
Quantum Transport and Theory Group
Department of Physics
Indian Institute of Technology Kanpur, India.

https://sites.google.com/site/amitkag1/

Marzieh
Posts: 30
Joined: Sun Feb 28, 2021 8:46 pm

Re: Issues compiling yambo 5.0 on HPC

Post by Marzieh » Fri Mar 19, 2021 1:29 pm

Dear Andrea

I recompiled yambo with these modules and this configuration:

Code: Select all

module --force  purge
module load releases/2019b
module load intel/2019b
module load libxc/4.3.4-iccifort-2019.5.281
module load HDF5/1.10.5-iimpi-2019b
module load netCDF/4.7.1-iimpi-2019b
module load netCDF-Fortran/4.5.2-iimpi-2019b

Code: Select all

./configure FC="ifort" PFC="mpiifort" F77="ifort" CC="mpiicc" CXX="mpiicpc" \
--enable-hdf5-par-io \
--enable-mpi \
--enable-open-mp \
--enable-dp \
--with-blas-libs="-lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm" \
--with-lapack-libs="-lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm" \
--with-fft-includedir="-I${MKLROOT}/include/fftw" \
--with-fft-libs="-lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm" \
--with-blacs-libs="-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -lmkl_core -liomp5 -lpthread -lm" \
--with-scalapack-libs="-lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -lmkl_core -liomp5 -lpthread -lm" \
--with-netcdf-libs="-L$EBROOTNETCDF/lib64/" \
--with-netcdf-path="$EBROOTNETCDF/" \
--with-netcdf-libdir="$EBROOTNETCDF/lib64/" \
--with-netcdf-includedir="$EBROOTNETCDF/include/" \
--with-netcdff-libs="-L$EBROOTNETCDFMINFORTRAN/lib/" \
--with-netcdff-path="$EBROOTNETCDFMINFORTRAN/" \
--with-netcdff-libdir="$EBROOTNETCDFMINFORTRAN/lib/" \
--with-netcdff-includedir="$EBROOTNETCDFMINFORTRAN/include/" \
--with-hdf5-libs="-L$EBROOTHDF5/lib/" \
--with-hdf5-path="$EBROOTHDF5/" \
--with-hdf5-libdir="$EBROOTHDF5/lib/" \
--with-hdf5-includedir="$EBROOTHDF5/include/"

The output of the configuration was:

Code: Select all

# [VER] 5.0.0 r.19466
#
# - GENERAL CONFIGURATIONS -
#
# [SYS] linux@x86_64
# [SRC] /home/ucl/modl/afekete/src/yambo-5.0.0
# [BRN]
# [CMP] /home/ucl/modl/afekete/src/yambo-5.0.0
# [TGT] /home/ucl/modl/afekete/src/yambo-5.0.0
# [BIN] /home/ucl/modl/afekete/src/yambo-5.0.0/bin
# [LIB] /home/ucl/modl/afekete/src/yambo-5.0.0/lib/external
#
# [EDITOR] vim
# [ MAKE ] make
#
# [X] Double precision
# [X] Keep object files
# [X] Run-Time timing profile
# [-] Run-Time memory profile
#
# - SCRIPTS -
#
# [-] YDB: Yambo DataBase
# [-] YAMBOpy: Yambo Python scripts
#
# - PARALLEL SUPPORT -
#
# [X] MPI
# [X] OpenMP
#
# - LIBRARIES [E=external library; I?=internal library (c=to be compiled / f=found already compiled); X=system default; -=not used;] -
#
# > I/O: (NETCDF with large files support)
#
# [ - ] FUTILE  :
#
# [ - ] YAML    :
#
# [ If] IOTK    : /home/ucl/modl/afekete/src/yambo-5.0.0/lib/external/intel/mpiifort/lib/libiotk.a (QE hdf5-support)
#                 -I/home/ucl/modl/afekete/src/yambo-5.0.0/lib/external/intel/mpiifort/include/
# [ - ] ETSF_IO :
#
# [ E ] NETCDF  : -L/opt/sw/arch/easybuild/2019b/software/netCDF-Fortran/4.5.2-iimpi-2019b/lib/ -lnetcdff -L/opt/sw/arch/easybuild/2019b/software/netCDF/4.7.1-iimpi-2019b/lib64/ -lnetcdf
#                 -I/opt/sw/arch/easybuild/2019b/software/netCDF-Fortran/4.5.2-iimpi-2019b/include/ -I/opt/sw/arch/easybuild/2019b/software/netCDF/4.7.1-iimpi-2019b/include/
# [ E ] HDF5    : -L/opt/sw/arch/easybuild/2019b/software/HDF5/1.10.5-iimpi-2019b/lib/ -lhdf5hl_fortran -lhdf5_fortran -lhdf5_hl -lhdf5 -lz -lm -ldl -lcurl
#                 -I/opt/sw/arch/easybuild/2019b/software/HDF5/1.10.5-iimpi-2019b/include/
#
# > MATH: (FFTW MKL)
#
# [ E ] FFT       : -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm
#
# [ E ] BLAS      : -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm
# [ E ] LAPACK    : -lmkl_intel_lp64 -lmkl_intel_thread -lmkl_core -liomp5 -lpthread -lm
# [ E ] SCALAPACK : -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -lmkl_core -liomp5 -lpthread -lm
# [ E ] BLACS     : -lmkl_scalapack_lp64 -lmkl_blacs_intelmpi_lp64 -lmkl_core -liomp5 -lpthread -lm
# [ - ] PETSC     :
#
# [ - ] SLEPC     :
#
#
# > OTHERs
#
# [ If] LibXC     : /home/ucl/modl/afekete/src/yambo-5.0.0/lib/external/intel/mpiifort/lib/libxcf90.a /home/ucl/modl/afekete/src/yambo-5.0.0/lib/external/intel/mpiifort/lib/libxc.a
#                   -I/home/ucl/modl/afekete/src/yambo-5.0.0/lib/external/intel/mpiifort/include
# [ E ] MPI       : -L/opt/sw/arch/easybuild/2019b/software/impi/2018.5.288-iccifort-2019.5.281/intel64/lib -Xlinker --enable-new-dtags -Xlinker -rpath -Xlinker /opt/sw/arch/easybuild/2019b/software/impi/2018.5.288-iccifort-2019.5.281/intel64/lib/release_mt -Xlinker -rpath -Xlinker /opt/sw/arch/easybuild/2019b/software/impi/2018.5.288-iccifort-2019.5.281/intel64/lib -Xlinker -rpath -Xlinker /opt/intel/mpi-rt/2017.0.0/intel64/lib/release_mt -Xlinker -rpath -Xlinker /opt/intel/mpi-rt/2017.0.0/intel64/lib -lmpifort -lmpi -lmpigi -ldl -lrt -lpthread
#                   -I/opt/sw/arch/easybuild/2019b/software/impi/2018.5.288-iccifort-2019.5.281/intel64/include
# [ Ic] Ydriver   : 1.0.0
#
# - COMPILERS -
#
# FC kind = intel ifort version 19.0.5.281
# MPI kind= Intel(R) MPI Library 2018 Update 5 for Linux* OS
#
# [ CPP ] mpiicc -E -ansi -D_HDF5_LIB -D_HDF5_IO -D_PAR_IO -D_MPI -D_FFTW -D_FFTW_OMP  -D_SCALAPACK  -D_DOUBLE -D_OPENMP -D_TIMING     -D_P2Y_QEXSD_HDF5
# [ FPP ] fpp -free -P -D_HDF5_LIB -D_HDF5_IO -D_PAR_IO -D_MPI -D_FFTW -D_FFTW_OMP  -D_SCALAPACK  -D_DOUBLE -D_OPENMP -D_TIMING
# [ CC  ] mpiicc -O2 -D_C_US -D_FORTRAN_US
# [ FC  ] mpiifort -assume bscc -O3 -g -ip    -qopenmp
# [ FCUF] -assume bscc -O0 -g
# [ F77 ] mpiifort -assume bscc -O3 -g -ip
# [ F77U] -assume bscc -O0 -g
# [Cmain] -nofor_main

The previous error was solved, but I do not have any values for E-Eo [eV] and Sc|Eo [eV] in o.qp:

Code: Select all

#    K-point            Band               Eo [eV]            E-Eo [eV]          Sc|Eo [eV]
#
        1                  1                 -39.73587                   NaN                NaN
        1                  2                 -19.57856                   NaN                NaN
        1                  3                  -1.099598                  NaN                NaN
        1                  4                  -0.251880                  NaN                NaN
        1                  5                  -0.205098                  NaN                NaN
....
Could you please tell me where I made a mistake?

Best,
Marzieh
Marzieh Ghoohestani, PhD
Institute of Condensed Matter and Nanosciences,
Université Catholique de Louvain,
8 Chemin des étoiles, 1348 Louvain-la-Neuve, Belgium
https://uclouvain.be/en/directories/marzieh.ghoohestani

Daniele Varsano
Posts: 3816
Joined: Tue Mar 17, 2009 2:23 pm

Re: Issues compiling yambo 5.0 on HPC

Post by Daniele Varsano » Fri Mar 19, 2021 2:53 pm

Dear Marzieh,
can you check in the report file whether you have NaN also in the HF part of the calculation, or only in the Sc?
Otherwise, if you attach your report file, we can have a look.
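
A quick way to locate them is to grep the report for NaN, for instance (the report file name depends on your run levels and job string):

Code: Select all

grep -n "NaN" r*    # the line numbers help to see whether the NaNs appear in the HF or in the Sc section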

Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/

Marzieh
Posts: 30
Joined: Sun Feb 28, 2021 8:46 pm

Re: Issues compiling yambo 5.0 on HPC

Post by Marzieh » Fri Mar 19, 2021 3:09 pm

Dear Daniele,
I have attached the generated files.

Best,
Marzieh
Marzieh Ghoohestani, PhD
Institute of Condensed Matter and Nanosciences,
Université Catholique de Louvain,
8 Chemin des étoiles, 1348 Louvain-la-Neuve, Belgium
https://uclouvain.be/en/directories/marzieh.ghoohestani

Daniele Varsano
Posts: 3816
Joined: Tue Mar 17, 2009 2:23 pm

Re: Issues compiling yambo 5.0 on HPC

Post by Daniele Varsano » Fri Mar 19, 2021 3:23 pm

Dear Marzieh,

the problem is in the correlation part of the self-energy. It is not easy to spot, but note that the parallelism is incomplete.

Can you remove the ./SAVE/ndb.QP and ./SAVE/ndb.pp* files,
and run the code again after setting:

SE_CPU= "1 1 2" # [PARALLEL] CPUs for each role
SE_ROLEs= "q qp b" # [PARALLEL] CPUs roles (q,qp,b)

X_and_IO_CPU= "1 1 1 2 1" # [PARALLEL] CPUs for each role
X_and_IO_ROLEs= "q g k c v" # [PARALLEL] CPUs roles (q,g,k,c,v)
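
For example, something along these lines (the input file and job names here are just placeholders):

Code: Select all

rm ./SAVE/ndb.QP ./SAVE/ndb.pp*              # remove the old QP and screening databases
mpirun -np 2 yambo -F yambo.in -J GW_test    # 2 MPI tasks, matching SE_CPU="1 1 2"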

Note that this is just a test; in your input file you have "NGsBlkXp= 1 RL", and this makes your screening unreliable.

Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/

Marzieh
Posts: 30
Joined: Sun Feb 28, 2021 8:46 pm

Re: Issues compiling yambo 5.0 on HPC

Post by Marzieh » Fri Mar 19, 2021 7:14 pm

Dear Daniele,
I did what you suggested, but I still have the same problem.
Best,
Marzieh
Marzieh Ghoohestani, PhD
Institute of Condensed Matter and Nanosciences,
Université Catholique de Louvain,
8 Chemin des étoiles, 1348 Louvain-la-Neuve, Belgium
https://uclouvain.be/en/directories/marzieh.ghoohestani

Marzieh
Posts: 30
Joined: Sun Feb 28, 2021 8:46 pm

Re: Issues compiling yambo 5.0 on HPC

Post by Marzieh » Mon Mar 22, 2021 12:44 pm

Dear Daniele,

Since I did not see these lines during the run with yambo 5.0.0 (both in parallel and in serial):

Code: Select all

[WF-HF/Rho] Performing Wave-Functions I/O from ./SAVE
[WF-GW] Performing Wave-Functions I/O from ./SAVE
it seems yambo 5.0.0 cannot read the files from ./SAVE.
(As I mentioned before, I use Abinit, and of course a2y, to generate the files.)

So I had to compile yambo 4.5.3 on the cluster instead, and it works.
I would like to ask whether there are major differences between yambo 5.0.0 and yambo 4.5.3, especially in nonlinear optics, because I am going to calculate the nonlinear optical response.

Best,
Marzieh
Marzieh Ghoohestani, PhD
Institute of Condensed Matter and Nanosciences,
Université Catholique de Louvain,
8 Chemin des étoiles, 1348 Louvain-la-Neuve, Belgium
https://uclouvain.be/en/directories/marzieh.ghoohestani

claudio
Posts: 458
Joined: Tue Mar 31, 2009 11:33 pm
Location: Marseille

Re: Issues compiling yambo 5.0 on HPC

Post by claudio » Sun Mar 28, 2021 5:46 pm

Dear Marzieh

there are no particular differences in the non-linear optics part between 4.5.3 and 5.0, as far as I remember,
so you can continue with 4.5.3.
In the meantime, we are trying to solve all the compilation problems of 5.0.

best
Claudio

NB: as you have seen, the databases of 4.5.3 are not compatible with 5.0.
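
If you later switch to 5.0, the SAVE databases should therefore be regenerated with the a2y executable built together with that version; a rough sketch (the Abinit run directory and output file name are placeholders, and you should check a2y -h for the exact options of your build):

Code: Select all

cd /path/to/abinit/run            # directory containing the Abinit output
a2y -F <abinit_output_file>       # writes a fresh ./SAVE compatible with that yambo build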
Claudio Attaccalite
CNRS / Aix-Marseille Université / CINaM laboratory / TSN department
Campus de Luminy – Case 913
13288 MARSEILLE Cedex 09
web site: http://www.attaccalite.com
