===== GPAW =====

=== Settings Scientific linux 6.5 (VSC 1+2) ===
The following revisions of GPAW and ASE were used; specify the revisions in ''install_gpaw_vsc_sl65.sh'':

* gpaw (?): 11253 ; ase (?): 3547
* gpaw 0.10.0: 11364 ; ase 3.8.1: 3440
* gpaw-setups-0.9.9672
* numpy-1.6.2

* MPI Version: impi-4.1.x
* FFTW from INTEL MKL
* files: {{:doku:gpaw:sl65:site.cfg.numpy.sl65_icc_mkl.txt}}, {{:doku:gpaw:sl65:install_gpaw_vsc_sl65.sh}}, {{:doku:gpaw:sl65:customize_sl65_icc_mkl.py}}, {{:doku:gpaw:sl65:config.py}}
Notes:
* The library ''libxc'' had to be installed.
* In ''customize_sl65_icc_mkl.py'', ''extra_link_args += ['-lxc']'' had to be added.
* In ''config.py'', ''mpicompiler = 'mpiicc' '' needed to be set.

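The revisions listed above can be fetched with subversion, mirroring the checkout commands spelled out for the other setups below (a sketch; the revision pair is taken from the second bullet above, adjust as needed):
<code>
svn checkout https://svn.fysik.dtu.dk/projects/gpaw/trunk gpaw -r 11364
svn checkout https://svn.fysik.dtu.dk/projects/ase/trunk ase -r 3440
</code>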
=== Settings Scientific linux 6.4 (VSC 1+2) ===
The following versions of GPAW and ASE were used:
* gpaw: svn checkout https://svn.fysik.dtu.dk/projects/gpaw/trunk gpaw -r 10428
* ase: svn checkout https://svn.fysik.dtu.dk/projects/ase/trunk ase -r 2801
* gpaw-setups-0.8.7929
* numpy-1.6.2

* MPI Version: impi-4.1.0.024
* FFTW from INTEL MKL
* files: {{:doku:gpaw:site.cfg.numpy.sl64_icc_mkl.txt}}, {{:doku:gpaw:install_gpaw_vsc_sl64.sh}}, {{:doku:gpaw:customize_sl64_icc_mkl.py}}, {{:doku:gpaw:config.py}}

* With Scientific Linux 6.4 the header file ''/usr/include/python2.6/modsupport.h'' causes problems; to compile GPAW, some RedHat-specific lines at the bottom of the file had to be commented out.

=== Settings VSC-1 centos 5.7 (old) ===
The following versions of GPAW and ASE were used:
* gpaw: svn checkout https://svn.fysik.dtu.dk/projects/gpaw/trunk gpaw -r 9616
* ase: svn checkout https://svn.fysik.dtu.dk/projects/ase/trunk ase -r 2801
* gpaw-setups-0.8.7929
* numpy-1.6.2

* MPI Version: mvapich2_intel_qlc-1.6 ; the QLogic MPI did not work
* FFTW from INTEL MKL
* files: {{:doku:gpaw:site.cfg.numpy.vsc1.txt}}, {{:doku:gpaw:install_gpaw_vsc_x.sh}}, {{:doku:gpaw:customize_vsc1_icc.py}}, {{:doku:gpaw:config.py}}
=== Settings VSC-2 scientific linux 6.1 (old) ===
The following versions of GPAW and ASE were used:
* gpaw: svn checkout https://svn.fysik.dtu.dk/projects/gpaw/trunk gpaw -r 9616
* ase: svn checkout https://svn.fysik.dtu.dk/projects/ase/trunk ase -r 2801
* gpaw-setups-0.8.7929
* numpy-1.6.2

* MPI Version: intel_mpi_intel64-4.0.3.008
* FFTW from INTEL MKL
* files: {{:doku:gpaw:site.cfg.numpy.vsc2.txt}}, {{:doku:gpaw:install_gpaw_vsc_x.sh}}, {{:doku:gpaw:customize_vsc2_icc.py}}, {{:doku:gpaw:config.py}}

=== Settings gpaw qmmm version ===
As the other versions, but with a special config file:
* {{:doku:gpaw:config.qmmm.py}}
* use the other config files for your specific cluster / OS

=== Installation procedure ===
* To install, download the files from one of the settings sections above to a directory of your choice.
* Edit the file ''install_gpaw_*.sh'' to match your setup.
* Execute ''install_gpaw_*.sh'':
<code>
bash install_gpaw_*.sh all
</code>
* Test GPAW with:
<code>
gpaw-python <DIR_TO_GPAW_INST_BIN>/gpaw-test
</code>

For certain problems, an assert statement in
<code>
~/gpaw_inst_latest/lib64/python2.6/site-packages/gpaw/wavefunctions/lcao.py
</code>
has to be commented out (approx. line 239):
<code>
#assert abs(c_n.imag).max() < 1e-14
</code>
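If the fix has to be applied to several installations, the line can also be commented out non-interactively with ''sed''. The sketch below demonstrates this on a scratch copy; in practice, point ''LCAO'' at the real ''lcao.py'' path shown above:
<code>
# Sketch: comment out the assert non-interactively. Demonstrated on a
# scratch copy; replace LCAO with the real lcao.py path shown above.
LCAO=$(mktemp)
printf '        assert abs(c_n.imag).max() < 1e-14\n' > "$LCAO"
sed -i 's/^\( *\)assert abs(c_n.imag).max() < 1e-14/\1#assert abs(c_n.imag).max() < 1e-14/' "$LCAO"
cat "$LCAO"
</code>
The leading whitespace is preserved by the backreference, so the file's indentation stays valid Python.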

=== Running GPAW jobs ===
== Job submission using all cores on the compute nodes (VSC-1 and VSC-2) ==
<code>
#!/bin/sh
#$ -N Cl5_4x4x1
#$ -pe mpich 256
#$ -V

mpirun -machinefile $TMPDIR/machines -np $NSLOTS gpaw-python static.py --domain=None --band=1 --sl_default=4,4,64
</code>

== Job submission using half of the cores on the compute nodes on VSC-2 ==
If each of your processes requires more than 2 GB (and less than 4 GB) of memory, you can use the parallel environment ''mpich8''. This allocates only 8 processes on each node while still starting 256 processes, distributed over 32 nodes. It is necessary to use the variable ''$NSLOTS_REDUCED'' instead of ''$NSLOTS'' in that case.
<code>
#!/bin/sh
#$ -N Cl5_4x4x1
#$ -pe mpich8 256
#$ -V

mpirun -machinefile $TMPDIR/machines -np $NSLOTS_REDUCED gpaw-python static.py --domain=None --band=1 --sl_default=4,4,64
</code>
If even more memory per process is required, the environments ''mpich4'', ''mpich2'', and ''mpich1'' are also available, as discussed in [[doku:ompmpi|Hybrid OpenMP/MPI jobs]].
Alternatively, you can simply start your GPAW job with more processes, which reduces the amount of memory per process. GPAW usually scales well with the number of processes.

A significant speed-up is seen **in our test case** when ''--sl_default'' is set. The first two parameters, which define the BLACS process grid, should be similar in size to get an optimal memory distribution over the nodes.
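A near-square pair for the first two ''--sl_default'' parameters can be picked with a small shell snippet (a hypothetical helper, not part of GPAW; ''p'' is the desired number of BLACS processes, which need not equal the total number of MPI ranks, as in the job scripts above where 4x4 is used with 256 ranks):
<code>
# Hypothetical helper: find a near-square factorisation r x c = p for
# the first two --sl_default parameters.
p=16
r=$(awk -v p=$p 'BEGIN { r = int(sqrt(p)); while (p % r) r--; print r }')
c=$((p / r))
echo "--sl_default=$r,$c,64"
</code>
For ''p=16'' this prints ''--sl_default=4,4,64''; the third value (the ScaLAPACK block size) is kept at 64 as in the examples above.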