PETSC error in the main yambo code during SLEPC in BSE



malwi
Posts: 48
Joined: Mon Feb 29, 2016 1:00 pm

PETSC error in the main yambo code during SLEPC in BSE

Post by malwi » Fri Jan 02, 2026 12:43 pm

Dear YAMBO TEAM, Happy New 2026!

Some of the SLEPc calculations run fine, while others fail with an obscure PETSc error.
The cases differ only in the number of oxygen vacancies.
The calculations read the kernel, and the SLEPc directory contains only ndb.dipoles; the second, large file is missing.

I attach one of the SLURM output files.

Best regards,
Gosia
Attachment: error-petsc.tar.gz
dr hab. Małgorzata Wierzbowska, Prof. IHPP PAS
Institute of High Pressure Physics Polish Academy of Sciences
Warsaw, Poland

Davide Sangalli
Posts: 659
Joined: Tue May 29, 2012 4:49 pm
Location: Via Salaria Km 29.3, CP 10, 00016, Monterotondo Stazione, Italy

Re: PETSC error in the main yambo code during SLEPC in BSE

Post by Davide Sangalli » Fri Jan 02, 2026 2:40 pm

Dear Gosia,
you are considering a rather large BSE matrix.

Some comments/suggestions.

a) The step at which the log stops,

Code:

 <03m-30s> P1-nid002200: Folding BSE Kernel |                                        | [000%] --(E) --(X)
seems to be the one where some arrays are transferred into memory (the message "Folding BSE Kernel" is not correct; it should read "Unfolding BSE components").
However, the PETSc error suggests that the problem might occur later.
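As a rough, back-of-the-envelope check of the memory hypothesis (the numbers below are only illustrative, since the actual dimension of your BSE matrix is not given in the thread): an explicitly stored BSE matrix of dimension N in double-complex precision takes about 16·N² bytes, so for example N = 10^5 already corresponds to roughly 160 GB, which has to fit into the memory distributed over the job.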

b) From this message

Code:

[SLEPC] Slower algorithm but BSE matrix distributed over MPI tasks
your run is using the "shell matrix" approach of SLEPc, likely with yambo 5.2 (it might be the default in your version).
We ran some performance tests, and this approach does not perform well over many MPI tasks. You can find some details here:
https://arxiv.org/html/2504.10096v1
(see figure 1, upper panels, comparing Shell vs Explicit)
Note that this change would not affect the previous point, i.e. if the problem is due to memory usage, it will persist.

To switch to the explicit approach, you can either add the option (up to yambo 5.2)

Code:

BSSSlepcMatrix
or use (from yambo 5.3 and more recent versions)

Code:

BSSSlepcMatrixFormat="explicit"
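Putting point b) together, here is a minimal sketch of how the SLEPc part of the BSE input could look with the explicit matrix format. Only BSSSlepcMatrixFormat is taken from this thread; BSSmod and BSSNEig are shown as illustrative solver variables, and their exact names and values should be checked against the input generated by your own yambo version.

Code:

# BSE solver mode: "s" selects the SLEPc solver (illustrative, check your generated input)
BSSmod= "s"
# Number of excitonic eigenvalues requested from SLEPc (illustrative variable name)
BSSNEig= 100
# Store the BSE matrix explicitly instead of using the shell-matrix approach (yambo >= 5.3)
BSSSlepcMatrixFormat= "explicit"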
c) Can you also try yambo 5.3 or, even better, this release: https://gitlab.com/lumen-code/lumen/-/releases/2.0.0
The name of the release is "lumen 2.0", but for now it is just a fork of yambo where new developments land more quickly.
There we have also integrated some changes related to SLEPc and PETSc which might help.

Best,
D.
Davide Sangalli, PhD
Piazza Leonardo Da Vinci, 32, 20133 – Milano
CNR, Istituto di Struttura della Materia (ISM)
https://sites.google.com/view/davidesangalli
http://www.max-centre.eu/

malwi
Posts: 48
Joined: Mon Feb 29, 2016 1:00 pm

Re: PETSC error in the main yambo code during SLEPC in BSE

Post by malwi » Fri Jan 02, 2026 7:43 pm

Dear Davide,
thank you very much.

I used BSSSlepcMatrix with 5.2.3 and got the same error.
Now I will recompute the kernel with the same version; maybe something was corrupted on disk.

Would it be possible to use the kernel or epsilon databases from 5.2.3 as input for the SLEPc solver in 5.3? Would that work?

Best regards,
Gosia
dr hab. Małgorzata Wierzbowska, Prof. IHPP PAS
Institute of High Pressure Physics Polish Academy of Sciences
Warsaw, Poland

malwi
Posts: 48
Joined: Mon Feb 29, 2016 1:00 pm

Re: PETSC error in the main yambo code during SLEPC in BSE

Post by malwi » Fri Jan 02, 2026 10:32 pm

Davide, I made one more check with the parallelization: I added more memory and threads, and two cases ran without error :-)
The strange thing is that the SLURM output reported a "PETSC error", while in fact the job had run out of memory.
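For reference, below is a sketch of the kind of SLURM resource request that helps in such cases; the node, task, thread and memory numbers are placeholders to be adapted to the cluster and to the size of the BSE matrix, and the yambo command line is only illustrative.

Code:

#!/bin/bash
#SBATCH --nodes=4                 # placeholder: more nodes means more total memory for the BSE matrix
#SBATCH --ntasks-per-node=8       # fewer MPI tasks per node leaves more memory per task
#SBATCH --cpus-per-task=16        # use OpenMP threads instead of additional MPI tasks
#SBATCH --mem=0                   # request all memory of each node (behaviour is cluster dependent)

export OMP_NUM_THREADS=${SLURM_CPUS_PER_TASK}
srun yambo -F yambo_bse.in -J slepc_run   # input file and job string are placeholders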

I will switch to 5.3 in future calculations.
Thank you again,
Gosia
dr hab. Małgorzata Wierzbowska, Prof. IHPP PAS
Institute of High Pressure Physics Polish Academy of Sciences
Warsaw, Poland

Davide Sangalli
Posts: 659
Joined: Tue May 29, 2012 4:49 pm
Location: Via Salaria Km 29.3, CP 10, 00016, Monterotondo Stazione, Italy

Re: PETSC error in the main yambo code during SLEPC in BSE

Post by Davide Sangalli » Sat Jan 03, 2026 1:57 pm

Probably it was a memory issue.

The PETSc error is probably there because the solver initializes some PETSc/SLEPc objects before the crash; accordingly, when the job is killed, the PETSc library reports an uncompleted procedure.

Best,
D.
Davide Sangalli, PhD
Piazza Leonardo Da Vinci, 32, 20133 – Milano
CNR, Istituto di Struttura della Materia (ISM)
https://sites.google.com/view/davidesangalli
http://www.max-centre.eu/
