GW quits unexpectedly with yambo-5.3

Concerns issues with computing quasiparticle corrections to the DFT eigenvalues - i.e., the self-energy within the GW approximation (-g n), or considering the Hartree-Fock exchange only (-x)

Moderators: Davide Sangalli, andrea.ferretti, myrta gruning, andrea marini, Daniele Varsano


GW quits unexpectedly with yambo-5.3

Post by jasonhan0710 » Tue Oct 14, 2025 4:07 am

Dear yambo developers,

I have run GW calculations with both yambo-5.2 and yambo-5.3. However, an error occurs with yambo-5.3: after the EXS step, yambo-5.3 quits without any error information.

Code: Select all

  <21m-20s> P1-c54: EXS |########################################| [100%] 20m-46s(E) 20m-46s(X)
 <26m-34s> P1-c54: [xc] Functional : Perdew, Burke & Ernzerhof SOL(X)+Perdew, Burke & Ernzerhof SOL(C)
But yambo-5.2 can finish the calculation.

Here is the input file used with both versions of yambo:

Code: Select all

rim_cut                          # [R] Coulomb potential
gw0                              # [R] GW approximation
ppa                              # [R][Xp] Plasmon Pole Approximation for the Screened Interaction
el_el_corr                       # [R] Electron-Electron Correlation
dyson                            # [R] Dyson Equation solver
HF_and_locXC                     # [R] Hartree-Fock
em1d                             # [R][X] Dynamically Screened Interaction
NLogCPUs=1                       # [PARALLEL] Live-timing CPU`s (0 for all)
PAR_def_mode= "balanced"         # [PARALLEL] Default distribution mode ("balanced"/"memory"/"workload"/"KQmemory")
X_and_IO_CPU= "1 1 4 8 4"                 # [PARALLEL] CPUs for each role
X_and_IO_ROLEs= "q g k c v"               # [PARALLEL] CPUs roles (q,g,k,c,v)
X_and_IO_nCPU_LinAlg_INV=-1      # [PARALLEL] CPUs for Linear Algebra (if -1 it is automatically set)
X_Threads=4                      # [OPENMP/X] Number of threads for response functions
DIP_CPU= "4 8 4"                      # [PARALLEL] CPUs for each role
DIP_ROLEs= "k c v"                    # [PARALLEL] CPUs roles (k,c,v)
DIP_Threads=4                    # [OPENMP/X] Number of threads for dipoles
SE_CPU= "1 16 8"                       # [PARALLEL] CPUs for each role
SE_ROLEs= "q qp b"                     # [PARALLEL] CPUs roles (q,qp,b)
SE_Threads=4                     # [OPENMP/GW] Number of threads for self-energy
RandQpts= 1000000                # [RIM] Number of random q-points in the BZ
RandGvec= 101              RL    # [RIM] Coulomb interaction RS components
CUTGeo= "box z"                  # [CUT] Coulomb Cutoff geometry: box/cylinder/sphere/ws/slab X/Y/Z/XY..
% CUTBox
  0.00000 |  0.00000 | 47.00000 |        # [CUT] [au] Box sides
%
CUTRadius= 0.000000              # [CUT] [au] Sphere/Cylinder radius
CUTCylLen= 0.000000              # [CUT] [au] Cylinder length
CUTwsGvec= 0.700000              # [CUT] WS cutoff: number of G to be modified
EXXRLvcs=  50          Ry    # [XX] Exchange    RL components
VXCRLvcs=  587995          RL    # [XC] XCpotential RL components
Chimod= "HARTREE"                # [X] IP/Hartree/ALDA/LRC/PF/BSfxc
% BndsRnXp
    1 |  330 |                       # [Xp] Polarization function bands
%
NGsBlkXp= 9000                mRy    # [Xp] Response block size
% LongDrXp
 1.000000 | 1.000000 | 0.000000 |        # [Xp] [cc] Electric Field
%
PPAPntXp= 27.21138         eV    # [Xp] PPA imaginary energy
% GbndRnge
    1 |  330 |                       # [GW] G[W] bands range
%
GTermKind= "none"                # [GW] GW terminator ("none","BG" Bruneval-Gonze,"BRS" Berger-Reining-Sottile)
DysSolver= "n"                   # [GW] Dyson Equation solver ("n","s","g","q")
%QPkrange                        # [GW] QP generalized Kpoint/Band indices
1|116|25|41|
%
Could you please tell me how to figure this out?

Best,
Jason
Jason Han

Assistant Professor
Department of Physics
National University of Defense Technology
Hunan, China


Re: GW quits unexpectedly with yambo-5.3

Post by Daniele Varsano » Wed Oct 15, 2025 11:34 am

Dear Jason,

Yambo 5.3 should not need more memory than 5.2.

Can you check whether there is any error message in the job submission output file, in the report, or in any of the log files?
Some more info is needed to spot the problem.

Best,

Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/


Re: GW quits unexpectedly with yambo-5.3

Post by jasonhan0710 » Mon Oct 20, 2025 10:40 am

Hi Daniele,

There is no error information in the log file or the report file. However, the SLURM error file says that the job quit due to a "segmentation fault". Do you have any idea about this error? The yambo code was compiled with Intel oneAPI 2021 using the following configure line:

Code: Select all

./configure --enable-mpi --enable-openmp --enable-time-profile --enable-memory-profile --enable-slepc-linalg
Best,
Jason
Jason Han

Assistant Professor
Department of Physics
National University of Defense Technology
Hunan, China


Re: GW quits unexpectedly with yambo-5.3

Post by Daniele Varsano » Tue Oct 21, 2025 8:26 am

Dear Jason,

it is not easy to understand what is going on without any error message.

Could you try to compile the 5.4 branch? It is a release candidate, so it is quite stable.
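
For reference, here is a minimal sketch of how the 5.4 branch could be fetched and built with the same configure options reported above. The repository URL is the official yambo GitHub repository, while the branch name and the make target are placeholders to be checked against the repository and the build documentation.

Code: Select all

# Fetch the yambo sources and switch to the 5.4 release-candidate branch
# (replace <5.4-branch> with the actual branch name listed on GitHub)
git clone https://github.com/yambo-code/yambo.git
cd yambo
git checkout <5.4-branch>
# Configure as for the 5.3 build, then compile
./configure --enable-mpi --enable-openmp --enable-time-profile --enable-memory-profile --enable-slepc-linalg
make core   # or the make target appropriate for your installation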

Best,

Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/


Re: GW quits unexpectedly with yambo-5.3

Post by jasonhan0710 » Wed Oct 22, 2025 8:08 am

Dear Daniele,

I have tried the yambo-5.4 branch. However, the error persists, now reported as "KILLED BY SIGNAL: 11 (Segmentation fault)". It seems something is wrong with the memory allocation.

Do you have any suggestions to solve the problem?

Best,
Jason
Jason Han

Assistant Professor
Department of Physics
National University of Defense Technology
Hunan, China


Re: GW quits unexpectedly with yambo-5.3

Post by Daniele Varsano » Wed Oct 22, 2025 10:17 am

Dear Jason,

you can try to distribute your MPI tasks differently, something like:

Code: Select all

SE_CPU= "1 2 64"                       # [PARALLEL] CPUs for each role
SE_ROLEs= "q qp b"                     # [PARALLEL] CPUs roles (q,qp,b)

In this way you will have a better memory distribution over the CPUs.
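
As a minimal sketch of a matching launch line, assuming 128 MPI tasks (the product 1 x 2 x 64, consistent with the parallel structure in the original input) and 4 OpenMP threads per task; the input file name, job name and SLURM launcher are placeholders:

Code: Select all

# The product of the SE_CPU entries (1 x 2 x 64 = 128) must match the number of MPI tasks
export OMP_NUM_THREADS=4
srun -n 128 yambo -F gw_ppa.in -J gw_run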

Best,

Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/


Re: GW quits unexpectedly with yambo-5.3

Post by jasonhan0710 » Thu Nov 27, 2025 8:57 am

Dear Daniele,

I have tried different parallel strategies; however, the problem still appears. Even when I compile yambo on another machine, I get the same problem. The GW run stops at the "W" calculation.

Yambo-5.2 doesn't have such a problem.

Best,
Jason
Jason Han

Assistant Professor
Department of Physics
National University of Defense Technology
Hunan, China


Re: GW quits unexpectedly with yambo-5.3

Post by Daniele Varsano » Fri Nov 28, 2025 4:04 pm

Dear Jason,

can you please post your input file, the report file, and one of the log files?

Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/


Re: GW quits unexpectedly with yambo-5.3

Post by jasonhan0710 » Fri Dec 05, 2025 4:51 am

Hello Daniele,

Thank you for your reply!

Attached are the input file, the report file, and part of the log file for the GW calculation. SLURM reports the following errors:

Code: Select all

...
[s192:249482:0:249959] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x2b208ac37000)
[s196:477490:0:477948] Caught signal 11 (Segmentation fault: address not mapped to object at address 0x2b579f835000)
[s192:249469:0:249957] Caught signal 11 (Segmentation fault: invalid permissions for mapped object at address 0x2b4ccb7da000)
*** stack smashing detected ***: yambo terminated
*** stack smashing detected ***: yambo terminated
...


I hope this information is sufficient to solve the problem.

Best,
Jason
Jason Han

Assistant Professor
Department of Physics
National University of Defense Technology
Hunan, China


Re: GW quits unexpectedly with yambo-5.3

Post by Daniele Varsano » Fri Dec 05, 2025 3:16 pm

Dear Jason,

it is possible that you are experiencing a memory issue when calculating the correlation part of the self-energy. You are actually computing many QP corrections at the same time.

Here below are two possible solutions to distribute/minimize the memory load.

You can try:
1. Distribute the MPI tasks over bands; this will distribute the memory among the processors:

Code: Select all

SE_CPU= "1 1 1 128"                       # [PARALLEL] CPUs for each role
SE_ROLEs= "q g qp b"                     # [PARALLEL] CPUs roles (q,g,qp,b)
2. If this does not work, you can split your calculation into different runs, e.g.
Run n.1

Code: Select all

%QPkrange                        # [GW] QP generalized Kpoint/Band indices
1|116|26|28|
%
Run n.2

Code: Select all

%QPkrange                        # [GW] QP generalized Kpoint/Band indices
1|116|29|31|
%
etc....

When doing so, in order not to overwrite the QP database, you can use the -J option:
yambo -F GW1.in -J run1
yambo -F GW2.in -J run2
etc...

Once the runs have finished, you can merge the ndb.QP databases present in the run* directories using yambopy or ypp.
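
A minimal sketch of the split-run workflow, assuming the input names, band ranges and job names of the examples above, 128 MPI tasks, and a SLURM launcher; the exact merging command depends on the ypp/yambopy version:

Code: Select all

# Each run computes a subset of the QP corrections and writes its ndb.QP in its own run*/ directory
srun -n 128 yambo -F GW1.in -J run1     # QPkrange 1|116|26|28|
srun -n 128 yambo -F GW2.in -J run2     # QPkrange 1|116|29|31|
# ... further runs covering the remaining bands of the original QPkrange ...
# Finally, merge the run*/ndb.QP databases with ypp or yambopy
# (check the documentation of your version for the exact merging command)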

Best,

Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/
