GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 8:13 am
by Bramhachari Khamari
Dear Developer,
I am facing an error while calculating the GW correction using yambo 5.0. The error message reads:
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 33173 RUNNING AT hpc872
= EXIT CODE: 11
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
APPLICATION TERMINATED WITH THE EXIT STRING: Segmentation fault (signal 11)
I do not know why it appears. I am generating the input following the same procedure as in the earlier version, which was working fine. Any help in rectifying this issue would be highly appreciated.
Regards,
Bramhachari Khamari
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 9:43 am
by Daniele Varsano
Dear Bramhachari,
the error you reported comes from the system and not from yambo, so it is impossible to help you with so little information.
Can you post the report and log files, if any, at least to see in which step of the calculation yambo crashes?
Best,
Daniele
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 11:20 am
by Bramhachari Khamari
Dear Sir,
Thanks a lot for the quick reply. The following steps were followed:
i) p2y
ii) yambo, which produces the r_setup
iii) yambo -g n -p p -V par, which gives the following input:
dyson # [R] Dyson Equation solver
gw0 # [R] GW approximation
HF_and_locXC # [R] Hartree-Fock
em1d # [R][X] Dynamically Screened Interaction
StdoHash= 40 # [IO] Live-timing Hashes
Nelectro= 26.00000 # Electrons number
ElecTemp= 0.000000 eV # Electronic Temperature
BoseTemp=-1.000000 eV # Bosonic Temperature
OccTresh= 0.100000E-4 # Occupation treshold (metallic bands)
NLogCPUs=0 # [PARALLEL] Live-timing CPU`s (0 for all)
DBsIOoff= "none" # [IO] Space-separated list of DB with NO I/O. DB=(DIP,X,HF,COLLs,J,GF,CARRIERs,OBS,W,SC,BS,ALL)
DBsFRAGpm= "none" # [IO] Space-separated list of +DB to FRAG and -DB to NOT FRAG. DB=(DIP,X,W,HF,COLLS,K,BS,QINDX,RT,ELP
FFTGvecs= 11485 RL # [FFT] Plane-waves
#WFbuffIO # [IO] Wave-functions buffered I/O
PAR_def_mode= "balanced" # [PARALLEL] Default distribution mode ("balanced"/"memory"/"workload")
X_and_IO_CPU= "1 1 10 4" # [PARALLEL] CPUs for each role
X_and_IO_ROLEs= "q,k,c,v" # [PARALLEL] CPUs roles (q,g,k,c,v)
X_and_IO_nCPU_LinAlg_INV=-1 # [PARALLEL] CPUs for Linear Algebra (if -1 it is automatically set)
DIP_CPU= "" # [PARALLEL] CPUs for each role
DIP_ROLEs= "" # [PARALLEL] CPUs roles (k,c,v)
SE_CPU= "1 10 4" # [PARALLEL] CPUs for each role
SE_ROLEs= "q,qp,b" # [PARALLEL] CPUs roles (q,qp,b)
EXXRLvcs= 69697 RL # [XX] Exchange RL components
VXCRLvcs= 69697 RL # [XC] XCpotential RL components
#UseNLCC # [XC] If present, add NLCC contributions to the charge density
Chimod= "HARTREE" # [X] IP/Hartree/ALDA/LRC/PF/BSfxc
ChiLinAlgMod= "LIN_SYS" # [X] inversion/lin_sys,cpu/gpu
XfnQPdb= "none" # [EXTQP Xd] Database action
XfnQP_INTERP_NN= 1 # [EXTQP Xd] Interpolation neighbours (NN mode)
XfnQP_INTERP_shells= 20.00000 # [EXTQP Xd] Interpolation shells (BOLTZ mode)
XfnQP_DbGd_INTERP_mode= "NN" # [EXTQP Xd] Interpolation DbGd mode
% XfnQP_E
0.000000 | 1.000000 | 1.000000 | # [EXTQP Xd] E parameters (c/v) eV|adim|adim
%
XfnQP_Z= ( 1.000000 , 0.000000 ) # [EXTQP Xd] Z factor (c/v)
XfnQP_Wv_E= 0.000000 eV # [EXTQP Xd] W Energy reference (valence)
% XfnQP_Wv
0.000000 | 0.000000 | 0.000000 | # [EXTQP Xd] W parameters (valence) eV| 1|eV^-1
%
XfnQP_Wv_dos= 0.000000 eV # [EXTQP Xd] W dos pre-factor (valence)
XfnQP_Wc_E= 0.000000 eV # [EXTQP Xd] W Energy reference (conduction)
% XfnQP_Wc
0.000000 | 0.000000 | 0.000000 | # [EXTQP Xd] W parameters (conduction) eV| 1 |eV^-1
%
XfnQP_Wc_dos= 0.000000 eV # [EXTQP Xd] W dos pre-factor (conduction)
GfnQPdb= "none" # [EXTQP G] Database action
GfnQP_INTERP_NN= 1 # [EXTQP G] Interpolation neighbours (NN mode)
The report file from the run is attached.
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 11:24 am
by Daniele Varsano
Dear Bramhachari,
the input file seems incomplete: I can't see the number of bands.
Unfortunately the report does not help much. Can you also post one of the log files (and check whether there is an error message at the end of one of them)?
Assigning a parallel strategy for the dipoles could also help; see the sketch below.
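For example, since the X_and_IO parallelization in your input uses 1x1x10x4 = 40 tasks, the dipole block could be filled along these lines (the split over k, c and v below is only an illustrative sketch and must be adapted so that the product matches your number of MPI tasks):
DIP_CPU= "2 5 4" # [PARALLEL] CPUs for each role
DIP_ROLEs= "k,c,v" # [PARALLEL] CPUs roles (k,c,v)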
Best,
Daniele
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 11:48 am
by Bramhachari Khamari
Dear Sir,
The number of bands is there in the input file; please see the attachment. The LOG file is also attached. If all the required executables are created in the bin directory, then the installation is fine, am I correct? As per your suggestion I will set the parallel strategy for the dipoles and see whether the problem persists.
Regards,
Bramhachari Khamari
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 11:59 am
by Daniele Varsano
Dear Bramhachari,
the LOG dir is empty, is this actually the case?
Anyway, please note that your input file is not a plasmon-pole approximation calculation, but a full-frequency one (ppa is missing in the run level).
Some command lines have changed in v5.0, even if the old ones should work (see yambo -h).
yambo -X p -g n -V all
should produce the right input.
Best,
Daniele
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 12:14 pm
by Bramhachari Khamari
Dear Sir,
I have used the parallel strategy for the dipoles as well, but the problem remains exactly as before. I also checked all 40 log files generated (l_HF_and_locXC_gw0_dyson_em1d_CPU_1 to l_HF_and_locXC_gw0_dyson_em1d_CPU_40), as I have used 40 CPUs, but no error is reported. One of the log files contains the following lines:
<---> P40: [01] MPI/OPENMP structure, Files & I/O Directories
<---> P40-hpc860: MPI Cores-Threads : 40(CPU)-1(threads)-1(threads@X)-1(threads@DIP)-1(threads@SE)-1(threads@RT)-1(threads@K)-1(threads@NL)
<---> P40-hpc860: MPI Cores-Threads : DIP(environment)-2 5 4(CPUs)-k,c,v(ROLEs)
<---> P40-hpc860: MPI Cores-Threads : X_and_IO(environment)-1 1 10 4(CPUs)-q,k,c,v(ROLEs)
<---> P40-hpc860: MPI Cores-Threads : SE(environment)-1 10 4(CPUs)-q,qp,b(ROLEs)
<---> P40-hpc860: [02] CORE Variables Setup
<---> P40-hpc860: [02.01] Unit cells
<---> P40-hpc860: [02.02] Symmetries
<---> P40-hpc860: [02.03] Reciprocal space
<---> P40-hpc860: [02.04] K-grid lattice
<---> P40-hpc860: Grid dimensions : 12 12
<---> P40-hpc860: [02.05] Energies & Occupations
Regards,
Bramhachari Khamari
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 12:25 pm
by Daniele Varsano
Dear Bramhachari,
first of all, I suggest you prepare a PP (plasmon-pole) input file, as a full-frequency calculation for your system could be prohibitive.
Unfortunately the log file does not help.
The problem could anyway be the wrong input, as the dipoles keyword is also missing, even if in this case the code should recognise it and calculate the dipoles anyway.
I suggest the following:
1) prepare a new input file from scratch as indicated in the previous post (be sure you have the gw0/ppa/dyson/HF_and_locXC/em1d keywords at the beginning of the input file; see the sketch after this list);
2) if the problem persists, run a serial interactive job; since the code stops after a few seconds, this way you can see the error message and in which part the code is complaining.
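As a reference, the run level at the top of a GW plasmon-pole input should look roughly like the lines below (a sketch only: the descriptive comments are indicative and the remaining variables follow as generated by yambo):
dyson # [R] Dyson Equation solver
gw0 # [R] GW approximation
ppa # [R][Xp] Plasmon Pole Approximation
HF_and_locXC # [R] Hartree-Fock
em1d # [R][X] Dynamically Screened Interaction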
Best,
Daniele
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 1:07 pm
by Bramhachari Khamari
Dear Sir,
Using the suggested command line yambo -X p -g n -V all, I created the input file, whose zip is attached. What I found is that several tags are missing, such as:
% GbndRnge
| | # [GW] G[W] bands range
%
%QPkrange # # [GW] QP generalized Kpoint/Band indices
| | | |
Can you suggest a command that generates all the tags for the GW input?
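For reference, once generated these tags would be filled along the following lines (the band and k-point values below are only indicative placeholders, not my actual ones):
% GbndRnge
1 | 100 | # [GW] G[W] bands range
%
%QPkrange # [GW] QP generalized Kpoint/Band indices
1 | 19 | 12 | 14 |
%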
Regards,
Bramhachari Khamari
Re: GW calculation using 5.0 fails with segmentation fault (signal 11)
Posted: Sat Jan 09, 2021 1:13 pm
by Daniele Varsano
Dear Bramhachari,
should do the job, or simply:
Best,
Daniele