File too large

Deals with issues related to computation of optical spectra in reciprocal space: RPA, TDDFT, local field effects.

Moderators: Davide Sangalli, andrea.ferretti, myrta gruning, andrea marini, Daniele Varsano, Conor Hogan

tolsen
Posts: 25
Joined: Thu Oct 28, 2010 9:45 am

File too large

Post by tolsen » Wed Mar 02, 2011 8:53 am

Hi

I get the following error message when running a BSE calculation on MoS2 with a 10x30x30 k-point grid (846 irreducible k-points).


.
.
.
<---> [FFT-BSK] Mesh size: 12 12 45
<---> [WF-BSK loader] Wfs (re)loading | | [000%] --(E) --(X)
<---> [WF-BSK loader] Wfs (re)loading |####################| [100%] --(E) --(X)
<---> [M 1.192 Gb] Alloc BS_W (0.167)
<---> [05.01] Main loop


<---> BSK | | [000%] --(E) --(X)
P01: [ERROR] STOP signal received while in :[05.01] Main loop
P01: [ERROR][NetCDF] File too large



It seems to work fine when I perform the calculation on the system with fewer k-points. I use the -S option. Which file becomes too large? And is there any way around this problem?

BR
Thomas Olsen
Post Doc
Technical University of Denmark

andrea marini
Posts: 325
Joined: Mon Mar 16, 2009 4:27 pm

Re: File too large

Post by andrea marini » Fri Mar 04, 2011 10:00 am

This is strange. The -S option, especially in the case of many k-points, should fragment the BS kernel into small pieces. Can you check that the code is actually creating the fragments in the SAVE folder? Otherwise please post the GS input file so that we can try to reproduce the error.
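A quick way to check this from the command line (a sketch; the ndb.BS_Q1_fragments_* pattern follows the fragment names reported elsewhere in this thread, so adjust it if your databases are named differently):

```shell
# Count the BSE kernel fragments written so far in the SAVE folder.
ls SAVE/ndb.BS_Q1_fragments_* 2>/dev/null | wc -l

# List the databases largest-first, to spot anything approaching 2 GiB.
ls -lhS SAVE/ 2>/dev/null | head
```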

Andrea
Andrea MARINI
Istituto di Struttura della Materia, CNR, (Italy)

myrta gruning
Posts: 240
Joined: Tue Mar 17, 2009 11:38 am

Re: File too large

Post by myrta gruning » Fri Mar 04, 2011 12:43 pm

Hallo Thomas

Besides checking that the BS matrix is fragmented, check the SAVE dir for files larger than 2 GB. Also attach the input; that may help us see what is causing the error.
Best,
m
Dr Myrta Grüning
School of Mathematics and Physics
Queen's University Belfast - Northern Ireland

http://www.researcherid.com/rid/B-1515-2009

tolsen

Re: File too large

Post by tolsen » Mon Mar 07, 2011 2:41 pm

Hi

There are lots of BS fragments in the SAVE directory. The calculation fails at the ndb.BS_Q1_fragments_112_744 fragment. I guess it should go up to ndb.BS_Q1_fragments_001_846. The RESTART/db file says:

Section Completed 277030 . To reach 358282

I tried to attach the yambo.in file, but I was told that the extension is not allowed.

BR
Thomas

myrta gruning

Re: File too large

Post by myrta gruning » Mon Mar 07, 2011 3:23 pm

tolsen wrote:There are lots of BS fragments in the SAVE directory. The calculation fails at the ndb.BS_Q1_fragments_112_744 fragment.
What is their size?
tolsen wrote:I tried to attach the yambo.in file, but I was told that the extension is not allowed.
It is just text; you can simply cut & paste it.

Best,
m

tolsen

Re: File too large

Post by tolsen » Mon Mar 07, 2011 4:45 pm

Hi

Good point:-) Here it is:

optics # [R OPT] Optics
bse # [R BSK] Bethe Salpeter Equation.
bss # [R BSS] Bethe Salpeter Equation solver
StdoHash= 20 # [IO] Live-timing Hashes
Nelectro= 36.00000 # Electrons number
ElecTemp= 0.000000 eV # Electronic Temperature
OccTresh=0.1000E-4 # Occupation treshold (metallic bands)
More_IO_Path= "." # [IO] Additional I/O directory
Com_Path= "." # [IO] Communication directory
FFTGvecs= 2967 RL # [FFT] Plane-waves
NonPDirs= "none" # [X/BSS] Non periodic chartesian directions (X,Y,Z,XY...)
#KfnQPdb= "E < ../GW/SAVE/ndb.QP" # [EXTQP BSK BSS] Database
KfnQP_N= 1 # [EXTQP BSK BSS] Interpolation neighbours
% KfnQP_E
0.800000 | 1.000000 | 1.000000 | # [EXTQP BSK BSS] E parameters (c/v)
%
LongPath= "none" # [Xd] Longitudinal gauge path
BSresKmod= "xc" # [BSK] Resonant Kernel mode. (`x`;`c`;`d`)
BScplKmod= "none" # [BSK] Coupling Kernel mode. (`x`;`c`;`d`;`u`)
% BSEQptR
1 | 1 | # [BSK] Transferred momenta range
%
% BSEBands
16 | 20 | # [BSK] Bands range
%
BSENGBlk= 111 RL # [BSK] Screened interaction block size
BSENGexx= 2967 RL # [BSK] Exchange components
% BSEEhEny
-1.000000 |-1.000000 | eV # [BSK] Electron-hole energy range
%
BSEClmns=0 # [BSK] Kernel Columns
% BSehWind
100.0000 | 100.0000 | # [BSK] [o/o] E/h coupling pairs energy window
%
BoseCut= 0.10000 # [BOSE] Finite Tel Bose function cutoff
BSSmod= "d" # [BSS] Solvers `h/d/i/t`
% BEnRange
0.00000 | 10.00000 | eV # [BSS] Energy range
%
% BDmRange
0.10000 | 0.10000 | eV # [BSS] Damping range
%
BDmERef= 0.000000 eV # [BSS] Damping energy reference
BEnSteps= 200 # [BSS] Energy steps
% BLongDir
1.000000 | 0.000000 | 0.000000 | # [BSS] [cc] Electric Field
%
WRbsWF # [BSS] Write to disk excitonic the FWs

and the static dielectric function was calculated with the same parameters (100 bands and 111 G-vectors in the response function).

BR
Thomas

myrta gruning

Re: File too large

Post by myrta gruning » Tue Mar 08, 2011 12:13 am

Hallo Thomas

I cannot spot any problem in the input. You use a small number of bands, so the size of the fragments should not be that large.

Can you check anyway whether there are files exceeding 2 GB?

Code: Select all

find SAVE/ -size +2G -ls


Which version of Netcdf are you using?

Best,
m

tolsen

Re: File too large

Post by tolsen » Tue Mar 08, 2011 8:57 am

Hi Myrta

There are no files larger than 2 GB, and I am using NetCDF 4.0.1.

BR
Thomas

andrea marini

Re: File too large

Post by andrea marini » Tue Mar 08, 2011 10:28 am

Thomas, I see very few possibilities here. The NetCDF error message is not generated by Yambo itself. In mod_IO.F you can find the line

Code: Select all

/src/modules/mod_IO.F:       write (msg,'(2a)') '[NetCDF] ',trim(nf90_strerror(status))
As you can see, the error message is composed using the nf90_strerror function, which is an internal NetCDF error routine. Therefore the "file too large" problem is internal to NetCDF. From the NetCDF man page you can see that

Code: Select all

4.4 Large File Support
======================
It is possible to write netCDF files that exceed 2 GiByte on platforms
that have "Large File Support" (LFS). Such files are
platform-independent to other LFS platforms, but trying to open them on
an older platform without LFS yields a "file too large" error.
Therefore the only possibility is that you are not using LFS. The point is that at the moment I do not remember the details of the LFS setup: Yambo must be compiled with LFS enabled, and in addition NetCDF itself and/or the platform must also support LFS. Myrta, maybe you remember the details better than me.
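As a side note, one way to tell whether a netCDF classic-format file was written with the 64-bit-offset (large-file) variant is to inspect its magic bytes: CDF-1 files start with the bytes `C D F \001` (the 2 GiB offset limits apply), while 64-bit-offset CDF-2 files start with `C D F \002`. A sketch (SAVE/ndb.em1s is just a placeholder name; point it at any ndb.* database from your run):

```shell
# Print the 4-byte magic of a netCDF database:
#   "C D F 001" -> classic CDF-1 format (2 GiB offset limits)
#   "C D F 002" -> 64-bit-offset CDF-2 format (Large File Support)
head -c 4 SAVE/ndb.em1s | od -c
```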

Andrea
