Multiple output files for HF calculation

Concerns issues with computing quasiparticle corrections to the DFT eigenvalues - i.e., the self-energy within the GW approximation (-g n), or considering the Hartree-Fock exchange only (-x)

Moderators: Davide Sangalli, andrea.ferretti, myrta gruning, andrea marini, Daniele Varsano

el16yz
Posts: 23
Joined: Mon Sep 23, 2019 4:34 pm
Location: United Kingdom

Multiple output files for HF calculation

Post by el16yz » Thu Feb 06, 2020 12:10 pm

Dear Yambo users,

I am using Yambo 4.5.0 and trying to run the HF calculation in parallel. The input file is as follows:
HF_and_locXC # [R XX] Hartree-Fock Self-energy and Vxc
NLogCPUs=0 # [PARALLEL] Live-timing CPU`s (0 for all)
FFTGvecs= 14 Ry # [FFT] Plane-waves
PAR_def_mode= "balanced" # [PARALLEL] Default distribution mode ("balanced"/"memory"/"workload")
SE_CPU= "1 1 8" # [PARALLEL] CPUs for each role
SE_ROLEs= "q qp b" # [PARALLEL] CPUs roles (q,qp,b)
EXXRLvcs= 20 Ry # [XX] Exchange RL components
VXCRLvcs= 10 Ry # [XC] XCpotential RL components
%QPkrange # [GW] QP generalized Kpoint/Band indices
1|1|256|257|
%
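I submit the job with a launch line along these lines (the scheduler directives of my job script are omitted here):

mpirun -np 8 yambo -F hf.in -J HF_20Ry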
Several output files and report files are produced rather than a single output file and a single report file. All the output files are identical. The CPU allocation information in one of the report files reads:
[01] CPU structure, Files & I/O Directories
===========================================

* CPU : 1
* THREADS (max): 1
* THREADS TOT(max): 1
* I/O NODES : 1
* NODES(computing): 1
* (I/O): 1
* Fragmented WFs : yes

CORE databases in .
Additional I/O in .
Communications in .
Input file is hf.in
Report file is ./r-HF_20Ry_HF_and_locXC_40
Precision is SINGLE
Log files in ./LOG

Job string(s)-dir(s) (main): HF_20Ry
Does this mean that although I asked for 8 cores (also in the HPC scheduler), the code still runs a serial calculation on each of the 8 cores separately? In other words, am I running the same calculation 8 times, each time using only one core? Is my guess correct? I attach the 'config.log' file in the hope that it helps in solving my problem. Thank you.
Yang Zhou
PhD student
University of Leeds
United Kingdom

Daniele Varsano
Posts: 4198
Joined: Tue Mar 17, 2009 2:23 pm

Re: Multiple output files for HF calculation

Post by Daniele Varsano » Thu Feb 06, 2020 3:23 pm

Dear Yang Zhou,
as you argued, the code is running in serial. From the config.log it seems that a parallel compiler has been used; can you anyway check in the config/report file whether the MPI checkbox is marked?
Most probably it is a problem with the launch script: are you using mpirun? Is it working for other applications?
Check the syntax of your launch script, and if needed ask your system administrator for help.
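As a quick sanity check (generic commands, to be adapted to your scheduler), first verify that mpirun really spawns several tasks:

mpirun -np 4 hostname

This should print the host name four times. When Yambo itself is launched in parallel, e.g.

mpirun -np 8 yambo -F hf.in -J HF_20Ry

the report should show "* CPU : 8" and only one report file should be written.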
Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/

el16yz
Posts: 23
Joined: Mon Sep 23, 2019 4:34 pm
Location: United Kingdom

Re: Multiple output file for HF calculation

Post by el16yz » Fri Feb 07, 2020 11:42 am

Dear Daniele,

Thank you for your advice. MPI is marked in the config/report file as:
# - PARALLEL SUPPORT -
#
# [X] MPI
# [-] OpenMP
Daniele Varsano wrote:
Most probably it is a problem with the launch script: are you using mpirun? Is it working for other applications?
Yes, it works fine when I run the SCF calculation with QE. By the way, I use Intel MPI rather than OpenMPI as the library. Could that be the reason for this problem? Thank you.
Yang Zhou
PhD student
University of Leeds
United Kingdom

Daniele Varsano
Posts: 4198
Joined: Tue Mar 17, 2009 2:23 pm

Re: Multiple output files for HF calculation

Post by Daniele Varsano » Fri Feb 07, 2020 3:27 pm

Dear Yang Zhou,
el16yz wrote:
I use Intel MPI rather than OpenMPI as the library. Could that be the reason for this problem?
Yes, it could be: check that you are using MPI libraries and launch wrappers (mpirun, mpiexec) compatible with the compilers used to build Yambo.
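For example, some generic checks (to be adapted to your environment):

which mpirun
mpirun --version
mpiifort -show

The wrapper, the runtime and the compiler reported in Yambo's config.log should all come from the same Intel MPI installation.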
Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/
