======= Mathematica Batch Jobs =======
=== SLURM ===
In order to use Mathematica, you first have to load it with the ''module load'' command:
<code>
[username@l32]$ module avail                    # select a Mathematica version
[username@l32]$ module load Mathematica/10.0.2  # load the desired version
[username@l32]$ module list                     # check loaded modules
Currently Loaded Modulefiles:
  1) Mathematica/10.0.2
</code>
(See also the introduction to the [[https://wiki.vsc.ac.at/doku.php?id=doku:slurm|module command]].)
  
Now, Mathematica can be called by
<code>
[username@l32]$ math
</code>
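For a plain serial run no parallel setup is needed; a minimal job script along the following lines can be used. This is a sketch: the file names ''math-input.m'' and ''jobSerial.sh'' are made up for illustration, while the module and the ''-L'' license request mirror the parallel example further down.

```shell
#!/bin/bash
#
#SBATCH -J serial              # job name
#SBATCH -N 1                   # one node is enough for a serial run
#SBATCH -L mathematica@vsc     # request a Mathematica license token

module purge
module load Mathematica/10.0.2

# feed the Mathematica commands in math-input.m to the kernel
math -run < math-input.m
```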
  
==== Example: Parallel Mathematica Task ====
  
Mathematica script example ''math-vsc3.m'':
<code>
(* Mathematica binary and node list are provided by the job script *)
GetEnvironment["MATH_BIN"]
math = Environment["MATH_BIN"]
kernelsperhost = 16;
hosts = Import["nodelist", "List"];

(* configuration for starting remote kernels *)
Needs["SubKernels`RemoteKernels`"]
$RemoteCommand = "ssh -x -f -l `3` `1` " <> math <> " -mathlink -linkmode Connect `4` -linkname '`2`' -subkernel -noinit"

(* the master kernel already runs on the first host, so start one kernel less there *)
LaunchKernels[RemoteMachine[hosts[[1]], kernelsperhost - 1]];

imin = 2;
imax = Length[hosts];
idelta = 1;

Do[
  LaunchKernels[RemoteMachine[hosts[[i]], kernelsperhost]];
  , {i, imin, imax, idelta}]

(* actual calculation *)
primelist = ParallelTable[Prime[k], {k, 1, 20000000}];
Print[primelist]
</code>
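With the values used here (two hosts from ''#SBATCH -N 2'' in the job script, 16 kernels per host, one slot on the first host left for the already-running master kernel), the script launches 31 subkernels. A quick sanity check of that arithmetic:

```shell
# Subkernels launched by math-vsc3.m: (kernelsperhost - 1) on the first
# host (where the master kernel already runs) plus kernelsperhost on
# each of the remaining hosts.
nodes=2            # from "#SBATCH -N 2"
kernelsperhost=16
subkernels=$(( (kernelsperhost - 1) + (nodes - 1) * kernelsperhost ))
echo "$subkernels subkernels"   # -> 31 subkernels
```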
  
sbatch job script ''jobPar.sh'':
<code>
#!/bin/bash
#
#SBATCH -J par
#SBATCH -N 2
#SBATCH -L mathematica@vsc

module purge
module load Mathematica/10.0.2  # load the desired version

export MATH_BIN=`which math`
#export MATH_PROC=16
scontrol show hostnames $SLURM_NODELIST > nodelist
# execute prolog for getting access to the license everywhere
srun hostname

math -run < math-vsc3.m
</code>
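''scontrol show hostnames'' expands SLURM's compressed nodelist notation (e.g. ''n[371-372]'') into one hostname per line, which is exactly the format that ''Import["nodelist","List"]'' reads in the Mathematica script. Off the cluster, the expansion can be illustrated with a small pure-bash sketch (the helper function and the example hostnames are made up; on the cluster, use ''scontrol'' itself):

```shell
#!/bin/bash
# Illustration of what `scontrol show hostnames` does for a simple
# compressed nodelist such as "n[371-372]": print one hostname per line.
expand_nodelist() {
  local list=$1
  if [[ $list =~ ^([^[]+)\[([0-9]+)-([0-9]+)\]$ ]]; then
    local prefix=${BASH_REMATCH[1]}
    local lo=${BASH_REMATCH[2]} hi=${BASH_REMATCH[3]}
    local width=${#lo}
    for ((i = 10#$lo; i <= 10#$hi; i++)); do
      printf '%s%0*d\n' "$prefix" "$width" "$i"
    done
  else
    echo "$list"   # plain hostname, nothing to expand
  fi
}

expand_nodelist "n[371-372]"   # prints n371 and n372, one per line
```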
<code>
[username@l32]$ sbatch jobPar.sh      # submit your job
[username@l32]$ squeue -u username    # check state of your job
</code>
  
==== Remote kernel connection ====

The setup follows
[[https://cc-mathematik.univie.ac.at/services/vsc3-cluster/remote-mathematica-kernel/|Local Mathematica with remote kernel]],
with the following **shell command to launch the kernel**:

for VSC3:
<code>
/Users/<LOCAL-USERNAME>/Library/tunnel.sh <VSC-USERNAME>@localhost:9998 /opt/sw/x86_64/generic/Mathematica/11.3/bin/math `linkname`
</code>
  
for VSC4:
<code>
/Users/<LOCAL-USERNAME>/Library/tunnel.sh <VSC-USERNAME>@localhost:9998 /opt/sw/vsc4/VSC/x86_64/generic/Mathematica/12.3.1/bin/math `linkname`
</code>
Note that on VSC4, the salloc command (as described on the cc-mathematik.univie.ac.at page linked above) does not need the Mathematica license; simply omit the -L parameter.
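The target ''<VSC-USERNAME>@localhost:9998'' implies that ''tunnel.sh'' reaches the cluster through a local SSH port forward. The exact setup is described in the linked instructions; purely as a hypothetical sketch (the host name and option choices below are assumptions, not taken from this page), such a forward could be opened with OpenSSH like this:

```shell
# Hypothetical port forward: make local port 9998 reach the SSH daemon
# of the VSC login node, so that <VSC-USERNAME>@localhost:9998 connects
# to the cluster. The host name vsc3.vsc.ac.at is an assumption --
# substitute your actual login node.
ssh -f -N -L 9998:vsc3.vsc.ac.at:22 <VSC-USERNAME>@vsc3.vsc.ac.at
```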
  • doku/mathematica.txt
  • Last modified: 2022/09/20 13:41
  • by groda