Please use the Table of Contents or your browser's quick search to find what you're looking for, as this document is auto-generated from a presentation and context may not always be recognizable without the corresponding talk.
The job of filesystems is to keep data in a structured way. Every filesystem has a filesystem root, directories and files. The filesystem root is a special object which acts as an entry-point for using that filesystem. All other objects (directories and files) are structured below the root.
Two main concepts emerged for using more than one filesystem on a single machine:
1. Special handling (a.k.a. drive letters)

   This is how Windows handles filesystems. It explicitly shows which drive you're working on.

2. Virtual filesystem

   This is a more subtle approach, used by Linux, where you specify at boot time which filesystem will be used for which part of the virtual filesystem.
Linux’ virtual filesystem tree represents all the files and directories that are reachable from the system. The nice part is that you can work on a Linux machine and not care about whether your file is on the network or on a local filesystem. The main difference for users is the performance delivered by different filesystems.
This is how the (virtual) filesystem looks on Linux:
- "/" denotes the root directory
- /dir1/dir2/object is an absolute path, starting at the root
- dir2/object is a relative path, starting at the current directory
- "." is a reference to the directory itself
- ".." is a reference to the parent directory

Next to basic storage and organization of data, filesystems have different properties and functionality: most filesystems provide a way to store and access attributes and different kinds of special files, and some filesystems provide various advanced features.
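To get a feel for the path notation above, here is a small sketch you can try; the user and directory names are hypothetical:

cd /                 # change into the filesystem root
ls /home             # an absolute path always starts with /
cd /home/myname      # hypothetical home directory
ls dir1/dir2         # a relative path is resolved from the current directory
ls .                 # list the current directory itself
ls ..                # list the parent directory, here /home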
The home directory of the user is located on an NFS filesystem, which ensures that all parts of the cluster have a consistent view of files.
The filesystem behind the $SCRATCH variable is located on a tmpfs filesystem, which is a double-edged sword: it is fast, but since it uses RAM as the storage device it limits the amount of memory available for programs. Also, data stored in a tmpfs can only be used on the host itself.
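A common pattern is to stage data into $SCRATCH, work there, and copy the results back before they are needed elsewhere; the file and program names below are only placeholders:

cp "$HOME/input.dat" "$SCRATCH/"   # stage input onto the fast, RAM-backed tmpfs
cd "$SCRATCH"
./simulation input.dat             # placeholder for your actual program
cp results.dat "$HOME/project/"    # copy results back: tmpfs is per-host and not permanent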
This is how the prompt looks by default:
[myname@l3_ ~]$
The prompt is defined by the $PS1 shell variable; you can inspect its current value with:

echo $PS1
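As a small example of customizing it, bash understands escape sequences such as \u (user), \h (host) and \w (working directory) inside $PS1:

export PS1='[\u@\h \w]\$ '   # user@host plus current directory, ending in $ (or # for root)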
Ways to get help when you’re stuck:
Most of the time when a command doesn't act as expected, it shows an error message. From this point you have multiple approaches (some examples follow the list below):
- the -h / --help flag
- man <COMMAND> will be available for most programs too; if not, man -K <KEYWORD> will search all man-pages that contain the keyword (FYI: press 'q' to quit)
- an alternative to man is info <COMMAND>, which is like a browser from the '80s
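A few concrete invocations, using ls as a stand-in for any command and an arbitrary keyword:

ls --help          # short usage information printed by the program itself
man ls             # the full manual page (quit with q)
man -K "scratch"   # full-text search through all man pages for a keyword
info ls            # the info browser; quit with q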
To execute a program, we call it:

gcc FizzBuzz.c -o FizzBuzz
./FizzBuzz
false
module load non-existent-module
echo $?
The examples show compiling a program, executing the result, trying to load a module on our cluster and checking if the previous command succeeded.
- gcc is a program that is in a directory specified by the $PATH variable and will be found without specifying its exact location.
- ./FizzBuzz is a newly compiled executable, which is not found by looking at $PATH, so we explicitly add ./ to show that we want to execute it from the current directory.
- module load non-existent-module fails, as the module command can't find non-existent-module. Whenever a command fails, its return value is set to a value other than zero. The manuals of some commands map return values to error descriptions to aid debugging.
- echo $? is a command that prints the return value of the previous command.

Your shell keeps a log of all the commands you executed.
The history command is used to access this history. You can also search it interactively with the <CTRL>-R keys or step back through it with the <Up-Arrow> key.
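For example (the command number 42 is of course specific to your own history):

history | tail -n 5   # show the last five commands
history | grep gcc    # search the history for earlier gcc calls
!42                   # re-run command number 42 from the history listing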
The default way to apply parameters to a program is to write a space-separated list of parameters after the program when calling it. These parameters are either short, single-character flags (like -h) or long options (like --help), where some parameters also take additional arguments.
For most commands you can combine multiple single-character parameters. This doesn’t change the meaning of the parameters, but is limited to single-character parameters which don’t take extra arguments.
COMMAND -j 2 -a -b -c
COMMAND -j 2 -abc
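With a real command the two forms look like this; the flags below are standard ls options:

ls -l -a -h   # long listing, all files, human-readable sizes
ls -lah       # exactly the same, with the single-character flags combined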
One thing to look out for is the order of parameters. Most of the time no specific order is required, but watch out for things like copying the target over the source file. Also take care to keep parameters and their arguments together.
COMMAND <SRC> <DEST>        # OK
COMMAND <DEST> <SRC>        # PROBABLY WRONG
COMMAND -j 2 --color auto   # OK
COMMAND -j auto --color 2   # PROBABLY WRONG
Whenever a parameter has to contain a character that is either unprintable or reserved for the shell, you can use:
COMMAND This\ is\ a\ single\ parameter
COMMAND "This is a single parameter"
COMMAND 'This is a single parameter'
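The quoting styles differ in one important way: double quotes still expand variables, single quotes do not. A small sketch, with a hypothetical filename containing a space:

FILE="my data.txt"
touch "$FILE"                     # one argument, so exactly one file is created
echo "$HOME is expanded here"     # double quotes: the variable is replaced
echo '$HOME stays literal here'   # single quotes: printed as-is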
You can define aliases in your shell. These are usually used to shorten names for commands which are used often with a fixed set of parameters, or where you have to be careful to get things right. These aliases are accessible as if they were commands.
alias ll='ls -alh'
alias rm='rm -i'
alias myProject='cd $ProjectDir; testSuite; compile && testSuite; cd -'
After that, you can use the aliases synonymously.
ll         # Same as 'ls -alh'
rm         # Same as 'rm -i'
myProject  # Same as 'cd $ProjectDir; testSuite; compile && testSuite; cd -'
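To inspect or undo aliases (assuming the definitions above are in place):

alias         # list all aliases defined in the current shell
type ll       # show what 'll' expands to
unalias rm    # remove the 'rm' alias again
\rm file      # or bypass an alias for a single call by escaping the name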
Patterns are an easy way of defining multiple arguments which are mostly the same. The pattern will match anything in its place. The other concept is an expansion; in this case only the explicitly defined alternatives are generated.
You can try these commands and see what they do. These are all totally safe, even if you modify the arguments.
ls file.???
ls *.*
echo {{A..Z},{a..z}}
echo {{A,B},{X,Y}}
echo {A..Z},{a..z}
echo {A,B}{X,Y}
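Expansions are handy for everyday tasks; the file and directory names here are only examples:

cp results.txt{,.bak}            # expands to: cp results.txt results.txt.bak
mkdir -p project/{src,doc,test}  # create three subdirectories in one call
echo run_{1..5}.log              # run_1.log run_2.log ... run_5.log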
Often you need to specify some string, but patterns and expansions aren't enough to cover all possibilities. In these cases you can use a regular expression, also known as a regex. Regexes are used by editors for search and replace, by the egrep command for filtering through files, and inside many scripts to validate parameters.
.+                 # Match any character, once or more
\.                 # match a dot
(A|a)p{2}le        # apple, Apple
^[^aeiouAEIOU]+$   # any line of only non-vowels
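To try these out with egrep (the filename is a placeholder, and the dictionary path may differ on your system):

echo "Apple" | egrep '(A|a)p{2}le'               # regexes also work on piped input
egrep '\.log' filelist.txt                       # lines containing a literal ".log"
egrep '^[^aeiouAEIOU]+$' /usr/share/dict/words   # dictionary words without vowels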
For a detailed explanation see Wikipedia, Regular-expressions.info or try regex101.
If you want to challenge yourself, try Regex Crossword!
In the shell language there are a few ways to organize the execution path. The most important ones are:
The simplest mechanism for control flow is to chain commands together as a simple "if COMMAND then NEXTCOMMAND else ERRORCOMMAND". Since this would be cumbersome to write, most shells provide a short syntax for it: COMMAND && NEXTCOMMAND || ERRORCOMMAND. If a command should be run without relying on the return value of its predecessor, it is written COMMAND; NEXTCOMMAND. And if you only want to execute further commands in one case (but not the other), you don't even have to specify both branches.
false ; echo "Should I be Printed?"
false && echo "Should I be Printed?"
false || echo "Should I be Printed?"
Should I be Printed?
Should I be Printed?
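A more practical chain, reusing the FizzBuzz example from earlier: run the program only if it compiled, and report if either step fails:

gcc FizzBuzz.c -o FizzBuzz && ./FizzBuzz || echo "compile or run failed"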
The other way to execute commands conditionally is to use loops. You can loop over files or numeric arguments, until either the loop condition is false or a break is encountered (a numeric loop with break is sketched after the examples below).
for i in *
do
    mv $i{,.bak}
done
while true
do
    echo "Annoying Hello World"
    sleep 3
done
The same loops can be written as one-liners:

for i in *; do mv $i{,.bak}; done
while true; do echo "Annoying Hello World"; sleep 3; done
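And a numeric loop with a break, as mentioned above; the limit of 5 is arbitrary:

for i in $(seq 1 10)
do
    if [ $i -gt 5 ]; then
        break                # leave the loop early
    fi
    echo "iteration $i"
done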
If is similar to the previous chaining of commands, except that it is more verbose and nicer to read if you have many commands to execute in one branch of the decision. For more conditions the elif (else if) statement can be used. If you use a lot of elifs and you only check one variable with them, you should consider using a case statement.
if [ $VARIABLE1 ]
then
    COMMAND1
elif [ $VARIABLE2 ]
then
    COMMAND2
else
    COMMAND3
fi
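A concrete variant using the test operators -f (regular file exists) and -d (directory exists), assuming nothing beyond a normal home directory:

if [ -f "$HOME/.bashrc" ]
then
    echo "found a .bashrc"
elif [ -d "$HOME" ]
then
    echo "no .bashrc, but the home directory exists"
else
    echo "no home directory at all"
fi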
Case statements are for querying all states of a single variable and making a decision based on that. They can match some simple expansions, which do NOT follow the general syntax of bash expansions. Also, alternative matches are processed when separated with | (the pipe character).
case $VARIABLE in
    [0-9] | [1-2][0-9])
        COMMAND1
        ;;
    *)
        COMMAND2
        ;;
esac
Write output to a file or file-descriptor
| Command | Redirect | Append | Description |
|---|---|---|---|
| program | > std.log | >> std.log | redirect stdout to a file |
| program | 2> err.log | 2>> err.log | redirect stderr to a file |
| program | 2>&1 | | redirect stderr to stdout |
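Applied to the FizzBuzz program from earlier (the log file names are arbitrary):

./FizzBuzz > std.log                # stdout into a file (overwriting it)
./FizzBuzz 2> err.log               # stderr into a separate file
./FizzBuzz >> std.log 2>> err.log   # append to both instead of overwriting
./FizzBuzz > all.log 2>&1           # both streams into a single file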
Write output into the input-stream of another process
| Command | Pipe | Description |
|---|---|---|
| program | \| grep -i foo | pipe stdout into grep |
| program | \| tee file1 file2 | overwrite files and stdout |
| program | \| tee -a file | append to files and stdout |
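Again with placeholder names, a few pipes in action:

./FizzBuzz | grep -i fizz                           # keep only lines matching "fizz"
./FizzBuzz | tee run1.log run2.log                  # print the output and write it to both files
./FizzBuzz 2>&1 | tee -a run.log | grep -i error    # pipes and redirections can be combined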
LANG=en_US.UTF-8 bash    # set LANG only for this one bash invocation
export LANG=en_US.UTF-8  # export LANG to every program started from this shell

env                      # list all environment variables
echo ${LANG}
echo $PWD

unset LANG               # remove the variable from the current shell
env -u LANG              # show the environment with LANG removed
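The difference between a plain assignment and export is whether child processes see the variable; MYVAR is just a throwaway name:

MYVAR="hello"               # set only in the current shell
bash -c 'echo "${MYVAR}"'   # prints an empty line: the child shell does not see it
export MYVAR
bash -c 'echo "${MYVAR}"'   # prints "hello": exported variables are inherited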
Some variables that could affect you are:
$EDITOR  # the default editor for the CLI
$PAGER   # utility to read long streams
$PATH    # program paths, in priority order
If you're aiming for programming, these could be more interesting:
$LIBRARY_PATH     # libraries to link by the compiler
$LD_LIBRARY_PATH  # libraries to link at runtime
$CC               # sometimes used to set default C compiler
$CFLAGS           # default flags for compiling C
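A sketch of how these might be set for software installed under a personal prefix; the path $HOME/opt/mylib is purely hypothetical:

export LIBRARY_PATH="$HOME/opt/mylib/lib:$LIBRARY_PATH"
export LD_LIBRARY_PATH="$HOME/opt/mylib/lib:$LD_LIBRARY_PATH"
export CC=gcc
export CFLAGS="-O2 -Wall"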
If you have a lot of self-compiled binaries:
export PATH="./:$HOME/bin/:$PATH"
Just to ensure that you are able to run your scripts
Change the owner of files and directories with:
# only works with root privileges
chown user file
chown -R user:group dirs files
Change the mode (permissions) of files and directories with:
chmod -R u=rwx,g+w,o-rwx dirs files
chmod 640 files
chmod 750 dirs
chmod 750 executables
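The numeric modes are just the permission bits added up per group (r=4, w=2, x=1, one digit each for user, group and others):

chmod 640 file   # rw- r-- ---  : owner reads/writes, group reads, others nothing
chmod 750 dir    # rwx r-x ---  : owner full access, group may enter and read, others nothing
ls -l file dir   # shows the resulting permission strings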
A little test program, which we mark as executable and hand it over to the corresponding interpreter:
cat << EOF > test.sh
echo "${LANG}"
echo "${PATH}"
EOF
chmod +x test.sh
bash test.sh
Don’t we have an OS, capable of executing everything it recognizes as an executable?
Yes, we do!
cat << EOF > test.sh
#!/bin/bash
echo "${LANG}"
echo "${PATH}"
EOF
chmod +x test.sh
./test.sh
Programming in bash would be cumbersome without functions, so here we go:
allNumbersFromTo () {
    echo "1 2 3"
}
This isn't good, as we're only getting a fixed set of numbers. Let's try a recursive approach:
allNumbersFromTo () {
    num=$1
    max=$2
    echo "${num}"
    if [ $num -lt $max ]; then
        allNumbersFromTo "$(($num + 1))" $max
    fi
}
A simpler, iterative version uses a for loop over seq:

allNumbersFromTo () {
    min=$1
    max=$2
    for num in $(seq $min $max)
    do
        echo "${num}"
    done
}
allNumbersFromTo 1 10
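A quick check, assuming one of the definitions above has been typed into the current shell:

allNumbersFromTo 3 5           # prints 3, 4 and 5, each on its own line
nums=$(allNumbersFromTo 1 10)  # or capture the output in a variable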
# .bashrc

# Source global definitions
if [ -f /etc/bashrc ]; then
    . /etc/bashrc
fi

# User specific aliases and functions
alias sq='squeue -u $USER'
alias rm='rm -i'

export PATH="./:$HOME/bin:$PATH"