application called MPI_Abort
Posted: Mon Mar 10, 2025 2:10 pm
by xjxiao
Dear all,
What is causing this error, and how can I resolve it? According to the backend logs, there is no memory shortage.
Code:
application called MPI_Abort(MPI_COMM_WORLD, 1) - process 61
l-HF_and_locXC_gw0_dyson_em1d_ppa_CPU_61.txt
Thanks!
Xiao
Re: application called MPI_Abort
Posted: Mon Mar 10, 2025 2:57 pm
by Daniele Varsano
Dear Xiao,
In order to inspect the problem, you should provide more information, such as the input and report files.
Best,
Daniele
Re: application called MPI_Abort
Posted: Mon Mar 10, 2025 4:19 pm
by xjxiao
Daniele Varsano wrote: ↑Mon Mar 10, 2025 2:57 pm
Dear Xiao,
In order to inspect the problem, you should provide more information, such as the input and report files.
Best,
Daniele
Dear Daniele,
I have attached the files. Could you please help me? Thank you!
Yours,
Xiao
Re: application called MPI_Abort
Posted: Mon Mar 10, 2025 4:28 pm
by Daniele Varsano
Dear Xiangjun,
I have the impression you have a memory problem. A cutoff of 30 Ry in the response function can be rather large.
My suggestion is first to verify whether you really need such a large cutoff in the dielectric matrix.
Then you can try to optimize the memory distribution using:
Code:
X_and_IO_CPU= "1 1 16 4" # [PARALLEL] CPUs for each role
X_and_IO_ROLEs= "q k c v"
or try:
Code:
X_and_IO_CPU= "1 4 1 8 2" # [PARALLEL] CPUs for each role
X_and_IO_ROLEs= "q g k c v"
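Note that in Yambo the product of the per-role counts in X_and_IO_CPU has to match the total number of MPI tasks the job is launched with. As a quick sanity-check sketch (assuming both layouts above target a 64-task run, which is consistent with the "process 61" in the error message), one can verify the products like this:

```python
# Sanity check: the product of the per-role CPU counts in X_and_IO_CPU
# must equal the total number of MPI tasks the job is launched with.
def roles_product(cpu_string):
    """Multiply the space-separated CPU counts, e.g. "1 4 1 8 2" -> 64."""
    product = 1
    for n in cpu_string.split():
        product *= int(n)
    return product

print(roles_product("1 1 16 4"))   # 64  (roles "q k c v")
print(roles_product("1 4 1 8 2"))  # 64  (roles "q g k c v")
```

Both layouts distribute the same 64 tasks, just along different roles; moving tasks onto the "c" and "v" (bands) roles is what spreads the response-function memory across processes.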
Best,
Daniele