[ERROR][NetCDF]

Run-time issues concerning Yambo that are not covered in the above forums.

Moderators: myrta gruning, andrea marini, Daniele Varsano, Conor Hogan

cudazzo

[ERROR][NetCDF]

Post by cudazzo » Fri May 15, 2009 11:52 am

Dear all,
I'm running Yambo to solve the Bethe-Salpeter equation on a 30x30x1 k-point mesh (nkpt=452), but the code stops with the following error message:

"P01: [ERROR] STOP signal received while in :[08.02] Main loop
P01: [ERROR][NetCDF] No such file or directory"

However, with a smaller 16x16x1 mesh the code runs without problems.

Is the problem related to the size of the files, or to the available disk space?
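
Both are easy to check from the run directory, e.g. (a minimal sketch, assuming the databases live in the default SAVE directory):

    # size of the Yambo databases
    ls -lh SAVE/ndb.*
    # free space on the current filesystem
    df -h .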

myrta gruning
Posts: 240
Joined: Tue Mar 17, 2009 11:38 am

Re: [ERROR][NetCDF]

Post by myrta gruning » Fri May 15, 2009 12:19 pm

cudazzo wrote:Dear all,
I'm running Yambo to solve the Bethe-Salpeter equation on a 30x30x1 k-point mesh (nkpt=452), but the code stops with the following error message:

"P01: [ERROR] STOP signal received while in :[08.02] Main loop
P01: [ERROR][NetCDF] No such file or directory"

However, with a smaller 16x16x1 mesh the code runs without problems.

Is the problem related to the size of the files, or to the available disk space?
Yes, it may be a problem related to the ndb.* databases becoming larger than 2GB. I have also got strange messages from NetCDF in that case (even though this particular one is new to me). I am not a NetCDF guru, but as I said I have experienced similar crashes, so here are some suggestions that I hope may help you.
One way to (usually) prevent problems with large ndb.* files is to invoke Yambo with the -S option, which fragments the databases (see yambo -H), as in the sketch below.
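
For example (a minimal sketch; yambo -H lists all the command-line options):

    # list the command-line options, including -S
    yambo -H
    # run with database fragmentation enabled, from the very first run onwards
    yambo -S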
However, if you did not use this option from the beginning and you have a RESTART directory (and the calculation has already taken some time, so that starting from scratch is not a good option), restarting with -S may cause a bit of a mess (at least I was able to generate a mess in such a case).
Another option (maybe safer in the restart case) is to reconfigure Yambo with the --enable-largedb feature (for more info run ./configure --help in the Yambo directory), which should enable NetCDF to work with files larger than 2GB (even though I do not know whether this works only on 64-bit architectures, or how it may affect performance, I/O, etc.), and then see if the program gets past the point where it now crashes. A possible sequence is sketched below.
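
Something like (a minimal sketch, assuming a standard source tree; check ./configure --help for the exact option spelling in your version):

    # rebuild Yambo with support for >2GB NetCDF databases
    ./configure --enable-largedb
    make    # the exact make target may vary with the Yambo version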

Cheers,
m
Dr Myrta Grüning
School of Mathematics and Physics
Queen's University Belfast - Northern Ireland

http://www.researcherid.com/rid/B-1515-2009

Conor Hogan
Posts: 111
Joined: Tue Mar 17, 2009 12:17 pm

Re: [ERROR][NetCDF]

Post by Conor Hogan » Fri May 15, 2009 1:48 pm

Another common reason for this error is that the netCDF databases have not been created properly for some reason. Check that the SAVE directory contains a wf_0000n_... directory for every k-point used, and in particular that the sizes of all the SAVE/wf*/ndb.fragment files are roughly the same. It may simply be that you have to run a2y/p2y again. A quick shell check is sketched below.
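
For example (a minimal sketch; the directory and file names follow the pattern above, and the expected count is your number of k-points, 452 here):

    # one wf_* directory per k-point?
    ls -d SAVE/wf_* | wc -l
    # are all fragment files roughly the same size?
    ls -lh SAVE/wf_*/ndb.fragment*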
Dr. Conor Hogan
CNR-ISM, via Fosso del Cavaliere, 00133 Roma, Italy;
Department of Physics and European Theoretical Spectroscopy Facility (ETSF),
University of Rome "Tor Vergata".
