====== Ansys ======

===== General Workflow =====
To set up your workflow for Ansys most effectively, follow the steps described below.
===== Module =====
Discover the available versions of Ansys by executing the following command in your terminal:
<code>
module avail 2>&1 | grep -i Ansys
</code>
and load your preferred version:
<code>
module load <module-name>
</code>
==== Pre- and postprocessing locally & remote computation on VSC ====
For Ansys Workbench and Ansys Fluent, the RSM scheduler is an effective option. Comprehensive guidance on its usage can be found here:
=== Ansys Fluent ===

The following provides a comprehensive overview of the general workflow for running Fluent on the cluster.

For learning the workflow, a small test case is used:

  - Start Ansys Fluent and open the .cas file.
  - Write out a .cas and a .dat file.
  - Submit the job script on the cluster: <code>
sbatch fluent_run.sh
</code>
  - Check the allocated number of processes by logging into the compute nodes and inspecting the running processes.
  - Check the status of your job in the fluent.out file (not the slurm.out file).
  - When the job is finished, open/read the .dat file and check the results.
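The submission step above can also be wrapped in a small helper that reports the Slurm job ID (a sketch: ''sbatch --parsable'' is a standard Slurm option that prints only the job ID; the function name is our own):

<code>
# Sketch: submit a job script and report the Slurm job ID.
# 'sbatch --parsable' prints only the job ID; requires Slurm.
submit_fluent() {
    jobid=$(sbatch --parsable "$1") || return 1
    echo "Submitted job $jobid"
}

# usage: submit_fluent fluent_run.sh
# afterwards, check the queue with: squeue -u "$USER"
</code>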
All files needed for this testcase are provided here:
===== Input file =====
You can now create your own input file (**Fluent journal file**, e.g. ''fluent.jou'').

This file instructs Ansys Fluent to read, run, and write files. It may be very short, i.e., only telling the code to **read, run and write**, but it may also contain every single instruction of a case file executed during the run.

Alternatively, you always have the choice between writing a certain command in the journal file or issuing it in the graphical user interface (GUI).

A basic template for the journal file is provided below:
<code>
# -----------------------------------------------------------
# SAMPLE JOURNAL FILE
# read case file (*.cas.gz) that had previously been prepared
file/read-case mycase.cas.gz
# initialize the solution and run the solver
solve/initialize/initialize-flow
solve/iterate 100
# write out the data file and exit
file/write-data mycase.dat.gz
exit yes
</code>
Preferably, these file paths are set in the graphical user interface:

{{: }}

Keep in mind to set the appropriate path for your directories. Here, the backup files will be saved in the same directory as the journal file; for the sake of clarity it can be better to create an additional directory for these backup files. The path is a relative reference to the directory where your *.cas.gz file is located.

The ''exit yes'' command at the end of the journal file closes Fluent once the run has finished.
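If you prefer not to edit the journal file in an editor, it can also be generated directly from the shell (a sketch; the case and data file names are placeholders for your own files):

<code>
# Write a minimal journal file from the shell (file names are examples)
cat > fluent.jou <<'EOF'
file/read-case mycase.cas.gz
solve/initialize/initialize-flow
solve/iterate 100
file/write-data mycase.dat.gz
exit yes
EOF
</code>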
----
===== Job script =====
When executing Slurm jobs for Fluent, the solver version passed to the ''fluent'' command defines the dimensionality and precision of the simulation:

  * ''2d'': 2D solver, single precision
  * ''2ddp'': 2D solver, double precision
  * ''3d'': 3D solver, single precision
  * ''3ddp'': 3D solver, double precision
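For illustration, the solver argument can be kept in a shell variable so the same script serves 2D and 3D cases (a sketch; the variable names are our own, and the ''fluent'' binary is only available after loading the module, so the command is only echoed here):

<code>
# Pick the solver type once; the rest of the command stays the same
SOLVER=3ddp        # one of: 2d, 2ddp, 3d, 3ddp
NTASKS=24
echo "fluent $SOLVER -g -t $NTASKS -i fluent.jou"
</code>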
==== Single node script ====
A script for running Ansys/Fluent on a single node:
<code>
#!/bin/sh
#SBATCH -J fluent
#SBATCH -N 1
#SBATCH -o job.%j.out
#SBATCH --ntasks-per-node=24

module purge
module load <module-name>

JOURNALFILE=fluent.jou

time fluent 3ddp -g -t 24 < "./$JOURNALFILE" > fluent.out
</code>
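Once the job is running, a quick sanity check on the solver log looks like this (a sketch; ''fluent.out'' is the file the run command in the job script redirects to):

<code>
# Show the last lines of the solver log, or a hint if it does not exist yet
if [ -f fluent.out ]; then
    tail -n 5 fluent.out
else
    echo "fluent.out not found - job has not produced output yet"
fi
</code>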