Running optical spectrum in parallel

Deals with issues related to computation of optical spectra in reciprocal space: RPA, TDDFT, local field effects.

Moderators: Davide Sangalli, andrea.ferretti, myrta gruning, andrea marini, Daniele Varsano, Conor Hogan

hammouri
Posts: 34
Joined: Fri Feb 09, 2018 11:25 pm

Running optical spectrum in parallel

Post by hammouri » Wed Mar 30, 2022 3:29 am

Hi all,
I am following the steps in this tutorial, http://www.yambo-code.org/wiki/index.ph ... parameters, for the optical absorption.
I'm using 8 CPUs, but the r_screen_dipoles_em1s file reports 1 for cores, threads, nodes, etc., so why doesn't it show that I'm running on 8 CPUs?
Also, how can I check the progress of my job? I see nothing in the log, err, and out files that I set in the submit file.

Thank you in advance!

Code:

    ooooo   oooo ..     ooo        ooo ooooooooo.    .oooo.
     `88.   .8" .88.    `88.       .88 `88"   `Y8b  dP"  `Yb
      `88. .8" .8"88.    888b     d"88  88     888 88      88
       `88.8" .8" `88.   8 Y88. .P  88  88oooo888" 88      88
        `88" .88ooo888.  8  `888"   88  88    `88b 88      88
         88 .8"     `88. 8    Y     88  88    .88P `8b    d8"
        o88o88o      888o8          88 o88bood8P"   `Ybod8P"


           Version 5.0.4 Revision 19595 Hash 896bffc02
                            Branch is
                        MPI+HDF5_IO Build
                    http://www.yambo-code.org

 03/29/2022 at 13:04 yambo @ submit-1.chtc.wisc.edu
 ==================================================

 Cores               :  1
 Threads per core    :  1
 Threads total       :  1
 Nodes Computing     :  1
 Nodes IO            :  1

 Fragmented WFs      : yes
 CORE databases      : .
 Additional I/O      : .
 Communications      : .
 Input file          : 02_screening.in
 Report file         : ./r_screen_dipoles_em1s
 Verbose log/report  : no

 Precision           : SINGLE

 [RD./SAVE//ns.db1]--------------------------------------------------------------
  Bands                                            :   40
  K-points                                         :  286
  G-vectors                                        :  21559 [RL space]
  Components                                       :  2730 [wavefunctions]
  Symmetries                                       :   48 [spatial]
  Spinor components                                :  1
  Spin polarizations                               :  1
  Temperature                                      :  0.000000 [eV]
  Electrons                                        :  44.00000
  WF G-vectors                                     :   3431
  Max atoms/species                                :  3
  No. of atom species                              :  3
  Exact exchange fraction in XC                    :  0.000000
  Exact exchange screening in XC                   :  0.000000
  Magnetic symmetries                              : no
 - S/N 009369 ---------------------------------------------- v.05.00.04 r.19595 -

 [02] CORE Variables Setup
 =========================
  [02.04] K-grid lattice
  ======================

  Compatible Grid is   : 3D
  Base K vectors       :  K_min[ 1 ]  K_min[ 2 ]  K_min[ 3 ]
  K_min[ 1 ] :  0.000000  0.050000  0.000000 [rlu]
  K_min[ 2 ] :  0.050000  0.000000  0.000000 [rlu]
  K_min[ 3 ] :  0.000000  0.000000 -0.050000 [rlu]
  Grid dimensions      :  20  20  20
  K lattice UC volume  :  0.689179E-4 [a.u.]

  [02.05] Energies & Occupations
  ==============================

  [X] === General ===
  [X] Electronic Temperature                        :  0.000000  0.000000 [eV K]
  [X] Bosonic    Temperature                        :  0.000000  0.000000 [eV K]
  [X] Finite Temperature mode                       : no
  [X] El. density                                   :  0.65998E+24 [cm-3]
  [X] Fermi Level                                   :  4.528345 [eV]

  [X] === Gaps and Widths ===
  [X] Conduction Band Min                           :  1.534023 [eV]
  [X] Valence Band Max                              :  0.000000 [eV]
  [X] Filled Bands                                  :  22
  [X] Empty Bands                                   :   23   40
  [X] Direct Gap                                    :  2.253872 [eV]
  [X] Direct Gap localized at k-point               :  1
  [X] Indirect Gap                                  :  1.534023 [eV]
  [X] Indirect Gap between k-points                 :  286    1
  [X] Last valence band width                       :  0.719849 [eV]
  [X] 1st conduction band width                     :  3.714666 [eV]

Code:

## Submit file options for running your program
executable = run_script
log = 02_screening.log
output = 02_screening.out
error = 02_screening.err

# arguments = (if you want to pass any to the shell script)
should_transfer_files = YES
when_to_transfer_output = ON_EXIT
transfer_input_files = pwscf.save/
requirements = (HasChtcSoftware == true)
request_cpus = 8
request_memory = 32GB
request_disk = 32GB

queue 1
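One way to keep request_cpus and the -np in run_script consistent is to forward the count as an argument; a sketch, assuming HTCondor's usual submit-macro expansion of $(request_cpus):

Code:

# Sketch: forward the CPU count so the script need not hard-code 8
arguments = $(request_cpus)

# run_script would then use it, e.g.:
#   mpirun -np "$1" ./yambo -F 02_screening.in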

Code:

#!/bin/bash
. /etc/profile.d/modules.sh
module load OpenMPI/3.1.4-GCC-8.3.0

mpirun -np 8 ./yambo -F <02_screening.in

Code:

#                                                                     
#  __  __   ________   ___ __ __    _______   ______                  
# /_/\/_/\ /_______/\ /__//_//_/\ /_______/\ /_____/\                 
# \ \ \ \ \\::: _  \ \\::\| \| \ \\::: _  \ \\:::_ \ \                
#  \:\_\ \ \\::(_)  \ \\:.      \ \\::(_)  \/_\:\ \ \ \               
#   \::::_\/ \:: __  \ \\:.\-/\  \ \\::  _  \ \\:\ \ \ \              
#     \::\ \  \:.\ \  \ \\. \  \  \ \\::(_)  \ \\:\_\ \ \             
#      \__\/   \__\/\__\/ \__\/ \__\/ \_______\/ \_____\/             
#                                                                     
#                                                                     
#       Version 5.0.4 Revision 19595 Hash 896bffc02                   
#                        Branch is                                    
#                    MPI+HDF5_IO Build                                
#                http://www.yambo-code.org                            
#
screen                           # [R] Inverse Dielectric Matrix
em1s                             # [R][Xs] Statically Screened Interaction
dipoles                          # [R] Oscillator strenghts (or dipoles)
Chimod= "HARTREE"                # [X] IP/Hartree/ALDA/LRC/PF/BSfxc
% BndsRnXs
   1 |  40 |                         # [Xs] Polarization function bands
%
NGsBlkXs= 4                RL    # [Xs] Response block size
% LongDrXs
 1.000000 | 1.000000 | 1.000000 |        # [Xs] [cc] Electric Field
%
XTermKind= "none"                # [X] X terminator ("none","BG" Bruneval-Gonze)
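For reference, the parallel-parameters tutorial also shows how to set the parallel structure explicitly in the input. A minimal sketch for 8 MPI tasks, assuming the Yambo 5.x variable names X_and_IO_CPU/X_and_IO_ROLEs for the screening and DIP_CPU/DIP_ROLEs for the dipoles; the entries on each CPU line should multiply to the number of tasks:

Code:

# Sketch only: explicit parallel structure for 8 MPI tasks
X_and_IO_CPU= "1 1 1 2 4"        # [PARALLEL] CPUs for each role
X_and_IO_ROLEs= "q g k c v"      # [PARALLEL] CPUs roles (q,g,k,c,v)
DIP_CPU= "1 2 4"                 # [PARALLEL] CPUs for each role
DIP_ROLEs= "k c v"               # [PARALLEL] CPUs roles (k,c,v)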
Hammouri, M.
Assistant Professor of Physics
University of Wisconsin
USA

Daniele Varsano
Posts: 3808
Joined: Tue Mar 17, 2009 2:23 pm

Re: Running optical spectrum in parallel

Post by Daniele Varsano » Wed Mar 30, 2022 8:16 am

Dear Hammouri,

I'm not sure that this is the problem, but the correct syntax is:

Code:

mpirun -np 8 ./yambo -F 02_screening.in

Next, check that the mpirun you are using is the right one, i.e. that it matches the mpif90 wrapper you used to compile the code.
Also check whether your mpirun command needs a machinefile option.
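For example, a minimal sketch of a revised run_script (assuming OpenMPI's mpirun; the log file name at the end is indicative, since with more than one task yambo writes per-task logs in a LOG/ folder):

Code:

#!/bin/bash
. /etc/profile.d/modules.sh
module load OpenMPI/3.1.4-GCC-8.3.0

# Check which MPI launcher is actually picked up:
which mpirun
mpirun --version

# -F takes the file name directly; no '<' redirection:
mpirun -np 8 ./yambo -F 02_screening.in

# To follow progress, tail one of the per-task logs, e.g.:
#   tail -f LOG/l_screen_dipoles_em1s_CPU_1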

Best,
Daniele
Dr. Daniele Varsano
S3-CNR Institute of Nanoscience and MaX Center, Italy
MaX - Materials design at the Exascale
http://www.nano.cnr.it
http://www.max-centre.eu/
