Convergence and disk space of BSE calculation

Deals with issues related to computation of optical spectra in reciprocal space: RPA, TDDFT, local field effects.

Moderators: Davide Sangalli, andrea.ferretti, myrta gruning, andrea marini, Daniele Varsano, Conor Hogan

Flex
Posts: 37
Joined: Fri Mar 25, 2016 4:21 pm

Convergence and disk space of BSE calculation

Post by Flex » Wed May 25, 2016 3:06 pm

Hello,

I am currently working on BSE calculations of the optical properties of monolayer black phosphorus, based on pwscf calculations. I did the required scf and nscf runs (with a 20x20 k-point grid) and converted with p2y without problems. That gave me about 600,000 G-vectors, which I reduced to 10,000 during the initialization. I have 121 Q-points.

Attached (BSE.in/txt) is the input I obtained with the command yambo -o b -k sex -y h. I reduced some parameters (BSENGexx, BSENGBlk, and the block size) in order to do a first run, fast rather than precise.

This is where I ran into the first problems. Even with these much-reduced parameters, the calculation seems to need huge computing power and RAM. I had to run it on 16 nodes with 16 cores each and 16 GB of RAM per core; with fewer resources it would have taken an entire week, and this is only the first, imprecise run. The maximum job time on my cluster is about one week.

(By the way, I also did GW calculations earlier, and they needed far fewer resources.)

Is this normal? Do I have to explicitly add parameters to optimize the parallelization? Are my parameters still too large?

So I launched the calculation on 16x16 cores and it ran quite well until about halfway through, when I saw that the SAVE directory had grown to about 600 GB of disk space and I had to abort it.
This is a problem for me, because I work on a cluster shared with other people and I only have about 1 TB of disk space I can use, of which roughly 300 GB is already taken by other pwscf and GW calculations.

Is there a way to reduce the disk space used? How much disk space does a full-precision BSE calculation need? (By full precision I mean good enough to give reliable results.)

I can attach other logs and files if needed.

Thanks in advance
Thierry Clette
Student at Université Libre de Bruxelles, Belgium

amolina
Posts: 135
Joined: Fri Jul 15, 2011 11:23 am
Location: Valencia, Spain
Contact:

Re: Convergence and disk space of BSE calculation

Post by amolina » Wed May 25, 2016 3:51 pm

Dear Flex,

Looking at your input file, I see you have set BSEBands = [1,30]. That is a huge number of bands for a Bethe-Salpeter calculation. You should restrict this range to the valence and conduction bands close to the gap, which are the ones that matter for the absorption spectrum. A good first test is to include just one valence and one conduction band for the convergence study.
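For reference, a minimal sketch of how such a restricted range would look in the yambo input. The band indices here are hypothetical (they assume the gap of monolayer black phosphorus falls between bands 10 and 11; check the report file for the actual occupations):

```
% BSEBands
  9 | 12 |      # two valence + two conduction bands around the gap (indices are an assumption)
%
```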

Cheers,
Alejandro.
Alejandro Molina-Sánchez
Institute of Materials Science (ICMUV)
University of Valencia, Spain

Flex

Re: Convergence and disk space of BSE calculation

Post by Flex » Wed May 25, 2016 4:14 pm

Oh, so that was the problem.

I guess the spectrum can already be quite accurate with about 5-10 bands around the gap, then.

Thank you very much for the quick answer.
Thierry Clette
Student at Université Libre de Bruxelles, Belgium

Daniele Varsano
Posts: 4198
Joined: Tue Mar 17, 2009 2:23 pm
Contact:

Re: Convergence and disk space of BSE calculation

Post by Daniele Varsano » Wed May 25, 2016 4:20 pm

Dear Flex,

First, please note that you are using a very old version of yambo. I strongly suggest updating to a newer version, as you will have more flexibility in the parallelization strategy.

Next, as Alejandro suggested, for BSE you most probably do not need deep bands: start with a few bands across the Fermi level and enlarge the window until convergence. I cannot see your report file, but there you can find the dimension of the BSE matrix, which is Nc*Nv*Nk, where Nk runs over the entire BZ, so it can explode easily.
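To make this concrete, here is a back-of-the-envelope estimate (a sketch; the band counts are assumptions chosen to match the numbers quoted in this thread) of how the dense BSE matrix grows with the band window:

```python
# Rough estimate of the BSE Hamiltonian size. The matrix dimension is
# Nv * Nc * Nk, and storing it densely in double-precision complex costs
# 16 bytes per element. Numbers below are illustrative, not from a real run.

def bse_matrix_gb(nv, nc, nk):
    """Return (matrix dimension, dense storage in GB)."""
    dim = nv * nc * nk
    return dim, dim ** 2 * 16 / 1e9

# A wide window like BSEBands = [1,30] (assuming ~10 valence bands)
# on a 20x20 grid, i.e. Nk = 400 in the full BZ:
print(bse_matrix_gb(10, 20, 400))   # dimension 80000, roughly 100 GB

# Just 2 valence and 2 conduction bands on the same grid:
print(bse_matrix_gb(2, 2, 400))     # dimension 1600, a few tens of MB
```

This is why restricting BSEBands to the bands near the gap changes the memory and disk footprint by orders of magnitude.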

Finally, in version 4.x of the code there is also a flag to avoid writing the BS matrix to disk.
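If memory serves, in the 4.x input this is controlled through the DBsIOoff variable; treat the exact name and syntax as an assumption and check the documentation of your yambo version:

```
DBsIOoff = "BS"    # assumption: switches off I/O of the BS database (yambo 4.x)
```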

Best,

Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/
