Segmentation fault (core dumped) at the step of computing dipoles

Deals with issues related to computation of optical spectra in reciprocal space: RPA, TDDFT, local field effects.

Moderators: Davide Sangalli, andrea.ferretti, myrta gruning, andrea marini, Daniele Varsano, Conor Hogan

clin
Posts: 27
Joined: Fri Sep 17, 2021 10:27 am

Segmentation fault (core dumped) at the step of computing dipoles

Post by clin » Mon Feb 10, 2025 8:42 pm

Dear developers,

Recently, I encountered a segmentation fault when running a static screening calculation. It crashes at the step of computing dipoles, and the report and CPU log files show no obvious problem. Do you have any idea why this happens?

I am not sure whether it is related to a compilation issue or to a problem with one of the linked libraries.

The input file is

Code: Select all

screen                           # [R] Inverse Dielectric/Response Matrix
em1s                             # [R][Xs] Statically Screened Interaction
rim_cut                          # [R] Coulomb potential
infver                           # [R] Input file variables verbosity
dipoles                          # [R] Oscillator strenghts (or dipoles)
ElecTemp= 0.000000         eV    # Electronic Temperature
BoseTemp= 0.000000         eV    # Bosonic Temperature
NLogCPUs=8                       # [PARALLEL] Live-timing CPU`s (0 for all)
PAR_def_mode= "balanced"         # [PARALLEL] Default distribution mode ("balanced"/"memory"/"workload"/"KQmemory")
DIP_CPU= "4 32 1"                # [PARALLEL] CPUs for each role
DIP_ROLEs= "k c v"               # [PARALLEL] CPUs roles (k,c,v)
DIP_Threads=  8                  # [OPENMP/X] Number of threads for dipoles
X_and_IO_CPU= "4 1 1 32 1"       # [PARALLEL] CPUs for each role
X_and_IO_ROLEs= "q g k c v"      # [PARALLEL] CPUs roles (q,g,k,c,v)
X_and_IO_nCPU_LinAlg_INV=-1      # [PARALLEL] CPUs for Linear Algebra (if -1 it is automatically set)
X_Threads=  8                    # [OPENMP/X] Number of threads for response functions
RandQpts= 3000000                # [RIM] Number of random q-points in the BZ
RandGvec= 109              RL    # [RIM] Coulomb interaction RS components
CUTGeo= "slab z"                 # [CUT] Coulomb Cutoff geometry: box/cylinder/sphere/ws/slab X/Y/Z/XY..
Chimod= "HARTREE"                # [X] IP/Hartree/ALDA/LRC/PF/BSfxc
% BndsRnXs
   1 | 200 |                         # [Xs] Polarization function bands
%
NGsBlkXs= 10               Ry    # [Xs] Response block size
% LongDrXs
 1.000000 | 1.000000 | 0.000000 |        # [Xs] [cc] Electric Field
%
Many thanks,
Changpeng
Changpeng Lin
Doctoral Assistant, EPFL

Daniele Varsano
Posts: 4198
Joined: Tue Mar 17, 2009 2:23 pm
Contact:

Re: Segmentation fault (core dumped) at the step of computing dipoles

Post by Daniele Varsano » Tue Feb 11, 2025 8:32 am

Dear Changpeng,
you have a lot of k-points, so it could be a memory issue.
My suggestion is the following:
1. Try to run a calculation with very few bands and see if it works (see the sketch below).

2a. If it works, rerun the large calculation with fewer MPI tasks, so that more memory is available per task.
2b. If it does not work, try a simple exchange self-energy calculation (yambo -x) to check whether the problem lies in the linked linear algebra libraries.

If 2b also gives a segmentation fault, please report back here and attach your config.log file.
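For step 1, the test could be as simple as shrinking the polarization bands and the response block size in your input; the reduced values below are only illustrative, not tuned:

Code: Select all

% BndsRnXs
   1 | 10 |                       # [Xs] Polarization function bands (reduced for the test)
%
NGsBlkXs= 1                Ry    # [Xs] Response block size (reduced for the test)

For step 2b, something like yambo -x -F hf.in writes the exchange self-energy input to hf.in (the file name is arbitrary), which you can then run as usual.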

Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/

clin
Posts: 27
Joined: Fri Sep 17, 2021 10:27 am

Re: Segmentation fault (core dumped) at the step of computing dipoles

Post by clin » Thu Feb 20, 2025 4:46 pm

Dear Daniele,

Thanks a lot. Following your suggestions, I found that the problem was due to the compilation. The calculation runs successfully if I force the build to use the internal linear algebra libraries via --enable-int-linalg. It's solved.
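For anyone hitting the same issue, the rebuild is essentially as follows (a sketch; any configure options besides --enable-int-linalg depend on your own setup):

Code: Select all

./configure --enable-int-linalg   # use Yambo's internal BLAS/LAPACK instead of external libraries
make yambo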

Best - Changpeng
Changpeng Lin
Doctoral Assistant, EPFL
