
PETSC error in the main yambo code during SLEPC in BSE

Posted: Fri Jan 02, 2026 12:43 pm
by malwi
Dear YAMBO TEAM, Happy New 2026!

Some of the SLEPC calculations go well, while others fail with an obscure PETSC error.
The cases differ only by the number of oxygen vacancies.
The calculations read the kernel, and the slepc directory contains only ndb.dipoles; the second large file is missing.

I attach one of the slurm output files.

Best regards,
Gosia
error-petsc.tar.gz

Re: PETSC error in the main yambo code during SLEPC in BSE

Posted: Fri Jan 02, 2026 2:40 pm
by Davide Sangalli
Dear Gosia,
you are considering a rather large BSE matrix.

Some comments/suggestions.

a) The step at which the log stops,

Code: Select all

 <03m-30s> P1-nid002200: Folding BSE Kernel |                                        | [000%] --(E) --(X)
seems to be the one where some arrays are loaded into memory (the message "Folding BSE Kernel" is misleading; it should read "Unfolding BSE components").
However, the PETSC error suggests that the problem might be later.

b) From this message

Code: Select all

[SLEPC] Slower algorithm but BSE matrix distributed over MPI tasks
your run is using the "Shell Matrix" approach of SLEPc, likely with yambo 5.2 (it might be the default in your version).
We ran some performance tests, and this approach does not perform well over many MPI tasks. You can find some details here:
https://arxiv.org/html/2504.10096v1
(see figure 1, upper panels, comparing Shell vs Explicit)
Notice that this change would not address the previous point: if the problem is due to memory usage, it will persist.

To switch to the Explicit approach, you can either add the option (up to yambo 5.2)

Code: Select all

BSSSlepcMatrix
or use (from yambo 5.3 and more recent versions)

Code: Select all

BSSSlepcMatrixFormat="explicit"
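As a rough sanity check on whether the explicit matrix can fit in memory (my own back-of-the-envelope estimate, not a yambo utility): a dense complex double-precision BSE matrix of dimension N, with N the number of electron-hole transitions, takes about 16*N^2 bytes, before any solver workspace or MPI duplication.

```python
# Rough memory estimate for an explicit (dense) complex double-precision
# BSE matrix: 16 bytes per element, N = number of e-h transitions.
# This is an illustrative sketch only, not part of yambo.

def bse_matrix_gib(n_transitions: int) -> float:
    """Memory in GiB for a dense N x N complex128 matrix."""
    return 16.0 * n_transitions**2 / 1024**3

# Example: 100,000 transitions -> roughly 149 GiB for the matrix alone.
print(f"{bse_matrix_gib(100_000):.1f} GiB")
```

If the estimate exceeds the memory available per node, more nodes (or fewer transitions via a tighter BSE energy window) are needed regardless of the matrix format.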
c) Can you also try yambo 5.3 or, even better, this release: https://gitlab.com/lumen-code/lumen/-/releases/2.0.0
The name of the release is "lumen 2.0", but, for now, it is just a fork of yambo where new developments enter more quickly.
There we also integrated some changes related to SLEPc and PETSc that might help.

Best,
D.

Re: PETSC error in the main yambo code during SLEPC in BSE

Posted: Fri Jan 02, 2026 7:43 pm
by malwi
Dear Davide,
thank you very much.

I used BSSSlepcMatrix with 5.2.3 and got the same error.
Now I will rerun the kernel calculation with the same version - maybe something was corrupted on disk.

Would it be possible to use the kernel or epsilon databases from 5.2.3 as input for the SLEPC solver in 5.3? Would that work?

Best regards,
Gosia

Re: PETSC error in the main yambo code during SLEPC in BSE

Posted: Fri Jan 02, 2026 10:32 pm
by malwi
Davide, I made one more check with the parallelization: I added more memory and threads, and two cases ran without error :-)))
The strange thing is that the slurm log reported a "PETSC error", but in fact the job ran out of memory.
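For anyone hitting the same confusion: the out-of-memory kill is often visible in the slurm output next to the PETSC traceback. A minimal sketch for scanning a log (the filename slurm-1234.out is a placeholder, not from this thread):

```python
# Scan a slurm output file for out-of-memory markers that can hide
# behind a generic PETSC error. Illustrative sketch only.
import re
from pathlib import Path

OOM_RE = re.compile(r"out of memory|oom-kill|killed|exceeded .* memory limit",
                    re.IGNORECASE)

def find_oom_lines(path):
    """Return log lines that look like memory kills (empty if file absent)."""
    p = Path(path)
    if not p.exists():
        return []
    return [line.rstrip() for line in p.open() if OOM_RE.search(line)]

# Point this at your own slurm-<jobid>.out file.
for hit in find_oom_lines("slurm-1234.out"):
    print(hit)
```

Checking `sacct` or the end of the slurm file for an OOM message before debugging the PETSc side can save a lot of time.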

I will switch to 5.3 in future calculations.
Thank you again,
Gosia

Re: PETSC error in the main yambo code during SLEPC in BSE

Posted: Sat Jan 03, 2026 1:57 pm
by Davide Sangalli
Probably it was a memory issue.

Perhaps the PETSc error appears because the solver initializes some PETSc/SLEPc variables before the crash; PETSc then reports an incomplete procedure.

Best,
D.