===== General Workflow =====
==== Pre- and postprocessing on the local machine, computation on the cluster ====
The subsequent figure shows an overview of the general workflow if you use Fluent on your local machine for pre- and postprocessing and the cluster for solving your case.\\ For this workflow a graphical connection isn't necessary.
===== Input file =====
Create a journal file (fluent.jou), which is written in a dialect of Lisp called Scheme.
This file may be very short, i.e., instructing Fluent to do no more than read the case, run the solver, and write the results.
However, it may contain up to every single instruction of a case file executed during the run.
You always have the choice between writing a certain command in the journal file or issuing it in the graphical user interface (GUI).

A basic form of the journal file could look like this (text-interface commands; adapt the file names and the number of iterations to your case):
<code>
; minimal example journal – adapt file names and commands to your case
/file/read-case mycase.cas.gz
/solve/initialize/initialize-flow
/solve/iterate 100
/file/write-data myresult.dat.gz
exit yes
</code>
The ''exit yes'' statement at the end closes Fluent once the calculation has finished.
Keep in mind to set the appropriate paths for the cluster. Here the files will be saved in the same directory as the journal file. For the sake of clarity it can be better to create a separate folder for the output files.
Such a path is a relative reference to the directory where your *.cas.gz file is located. Also keep in mind that a folder referenced in the journal file has to exist before the job starts.
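For example, a hypothetical write command in the journal file that saves the data into a subfolder ''output'' (relative to the location of your case file; folder and file names are placeholders) could look like this:

<code>
; "output" has to exist already before the run starts
/file/write-data output/myresult.dat.gz
</code>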
----
===== Job script =====
==== Single node script ====
A script for running Ansys/Fluent on a single node uses the shared-memory ''fluent'' call and does not need a host file.
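The following sketch shows such a single node script; the partition, QOS, and core count are taken from the multiple node example below, and ''ANSYS/2021R1'' is a placeholder for whichever ANSYS module is installed on your system:

<code>
#!/bin/sh
#SBATCH -J fluent_1                # Job name
#SBATCH -N 1                       # a single node
#SBATCH --partition=mem_0096       # mem_0096, mem_0384, mem_0768
#SBATCH --qos=mem_0096
#SBATCH --ntasks-per-node=48       # number of cores
#SBATCH --threads-per-core=1       # disable hyperthreading

module purge
module load ANSYS/2021R1           # placeholder – pick an installed ANSYS module

JOURNALFILE=fluent_transient       # .jou file

# single node with shared memory, no host file needed
fluent 3ddp -g -t $SLURM_NTASKS -i $JOURNALFILE.jou > $JOURNALFILE.log
</code>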

==== Multiple node script ====
A script for running Ansys/Fluent on multiple nodes could look like this:

<code>
#!/bin/sh
#SBATCH -J fluent_1                  # Job name
#SBATCH -N 2                         # Number of nodes
#SBATCH --partition=mem_0096         # mem_0096, mem_0384, mem_0768
#SBATCH --qos=mem_0096
#SBATCH --ntasks-per-node=48         # number of cores
#SBATCH --threads-per-core=1         # disable hyperthreading

module purge
module load ANSYS/

unset SLURM_JOBID
export FLUENT_AFFINITY=0
export SLURM_ENABLED=1
export SCHEDULER_TIGHT_COUPLING=1
FL_SCHEDULER_HOST_FILE=slurm.${SLURM_JOB_ID}.hosts
/bin/rm -rf ${FL_SCHEDULER_HOST_FILE}
scontrol show hostnames "${SLURM_JOB_NODELIST}" > ${FL_SCHEDULER_HOST_FILE}

JOURNALFILE=fluent_transient         # .jou file

if [ $SLURM_NNODES -eq 1 ]; then
    # Single node with shared memory
    fluent 3ddp -g -t $SLURM_NTASKS -i $JOURNALFILE.jou > $JOURNALFILE.log
else
    # Multi-node
    fluent 3ddp -platform=intel -g -t $SLURM_NTASKS -pib.ofed -mpi=intel -cnf=${FL_SCHEDULER_HOST_FILE} -i $JOURNALFILE.jou > $JOURNALFILE.log
fi
</code>
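
Assuming the job script is saved as ''fluent_run.sh'' (the name is arbitrary), it can be submitted and monitored with the usual Slurm commands:

<code>
sbatch fluent_run.sh      # submit the job
squeue -u $USER           # check its status
</code>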