Segmentation fault (core dumped) in HF calculation
Posted: Tue Nov 28, 2023 11:01 pm
Hello,
I'm trying to converge a Hartree-Fock calculation, but I keep running into a segmentation fault. I've attached the Quantum ESPRESSO output files from the input I used, as well as the report from the failed yambo run. I've tried several different yambo compilations, varying the linear algebra libraries as suggested in this post viewtopic.php?t=1834, but the error remains the same.
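For reference, this is the style of configure line I've been varying between builds; the compiler and library paths below are placeholders, and the exact flags are from memory, so they may not match what's recommended:

    # sketch of the kind of build variations tried (paths are placeholders)
    ./configure FC=gfortran \
        --enable-open-mp \
        --with-blas-libs="-L/opt/openblas/lib -lopenblas" \
        --with-lapack-libs="-L/opt/openblas/lib -lopenblas"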
I hadn't seen this error before in any of my BSE or GW calculations, but for this run I increased ecutwfc in my Quantum ESPRESSO input. Presumably it's some kind of memory issue, but I've watched the nodes while the calculation runs and they never come anywhere near their memory limit (they don't even reach 1/5 of the total RAM on the node). My guess is that there's a bug in how OpenMP is used, but I'd be interested in your opinion. I'm using yambo-5.1.1.
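One thing I still want to rule out is an OpenMP thread-stack overflow, which can segfault long before the node's total RAM is exhausted and wouldn't show up as high memory usage. This is roughly what I plan to try in my job script; the stack size, thread count, and launch line are illustrative guesses, not my actual settings:

    # rule out a per-thread stack overflow (a common cause of OpenMP segfaults)
    ulimit -s unlimited             # remove the process stack limit
    export OMP_STACKSIZE=512M       # generous per-thread stack (size is a guess)
    export OMP_NUM_THREADS=4        # threads per MPI task (placeholder value)
    mpirun -np 8 yambo -F yambo.in -J HF_run   # illustrative launch line

If the run still crashes with the stack limits lifted, that would point away from a stack issue and back toward the build or the code itself.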
Thanks,
Miles