Bash Commands

My commonly used commands

This is a page for my personal reference. TLDR pages has a more comprehensive command collection.

Background settings

Take the local repo of this blog as an example.

Show current path.

$ pwd

Distinguish files from folders.

$ ls -l
total 28
drwxrwxr-x 2 vin100 vin100 4096 Jun 28 21:00 archetypes
-rw-rw-r-- 1 vin100 vin100 2734 Aug 28 17:30 config.toml
drwxrwxr-x 4 vin100 vin100 4096 Jun 29 00:47 content
drwxrwxr-x 3 vin100 vin100 4096 Aug 25 15:29 layouts
drwxrwxr-x 9 vin100 vin100 4096 Jun 29 00:47 public
drwxrwxr-x 4 vin100 vin100 4096 Aug 25 13:36 static
drwxrwxr-x 3 vin100 vin100 4096 Aug 27 18:06 themes


Advanced password generator (apg).

My preferred way: partial management.

  1. Save a partial password: apg -a 1 -n 1 -m 6 -x 8 -M SNCL > output.txt
  2. Manage the partial passwords.
  3. (De)centralized local storage. (bare repo on USB devices)

Flags explanations: (in alphabetical order)

  • -a [0|1]: algorithm
    • 0: can be “pronounced”
    • 1: “random” string
  • -m: minimum password length
  • -M: mode
    • c: should include capital letter
    • C: must include capital letter
    • l: should include lowercase letter
    • L: must include lowercase letter
    • n: should include number
    • N: must include number
    • s: should include special character
    • S: must include special character
  • -n: number of output passwords
  • -x: maximum password length


apt-get: CLI package manager. Requires sudo privileges.

sudo apt-get …  Function
update          update the local package index
upgrade         upgrade package versions
install         install a package
remove          remove a package
purge           remove a package together with its config files
autoremove      automatically remove unnecessary dependencies
clean           remove unused local cache files
-s              simulate, print only, dry run


I redirect interested readers to the post on my old blog to avoid duplicating efforts.


I use it to extract column(s). Double quotes " don’t work because the shell would expand $9 before awk sees it.

$ ls -dl * | awk '{print $9, $5}'
archetypes 4096
config.toml 2861
content 4096
layouts 4096
public 4096
static 4096
themes 4096

It can be used to extract the Git remote URL from git remote -v. The two output rows stand for the fetch and push URLs.

$ git remote -v | awk '{print $2}'

It’s a sequence of /PAT/ {ACTION} pairs. At most one of these two parts can be omitted. I suggest man mawk for a concise user guide. Things are executed record-wise, and the record separator (RS) is, by default, the newline character. Setting it to the empty string puts awk in “paragraph mode”: records are then separated by blank lines.

/PAT/ can be BEGIN and END. The former is useful for string substitutions and printing table headers.

Some built-in variables:

  • NR: current record number (e.g. NR==1{next} 1 prints all lines except the first one)
  • NF: number of fields (e.g. $NF stands for the last field.)
  • RS: input record separator (default: newline ↵)
  • ORS: output record separator (default: newline ↵)
  • FS: input field separator (default: white space ␣)
  • OFS: output field separator (default: white space ␣)
  • RSTART: regex match() start index (0 if no match)
  • RLENGTH: regex match() length (-1 if no match)
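NR and NF can be combined in a one-liner; the input lines below are made up for illustration:

```shell
# Skip the header line (NR==1), then print the first and last
# fields of every remaining record.
printf 'name qty\napple 3\npear 5\n' |
  awk 'NR==1 {next} {print $1, "last field:", $NF}'
# apple last field: 3
# pear last field: 5
```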

Some string functions:

  • match(str, regex): return the character position of the leftmost longest match (0 if no match); also sets RSTART and RLENGTH

  • substr(str, start[, len]): return the len-character-long substring of str starting at position start. If len is omitted, return the whole suffix of str starting from start.

  • sub(pat, repl[, target]): substitute the leftmost longest match of pat in target ($0 if omitted) with repl. Return the number of matches. (either 0 or 1)

  • gsub(pat, repl[, target]): substitute all matches of pat in target ($0 if omitted) with repl. Return the number of matches.

    In section 9.1.3 of the GNU Awk User’s Guide, the following incorrect example is shown. At first, I didn’t see why it’s wrong.

      gsub(/xyz/, "pdq", substr($0, 5, 20))  # WRONG

    Even though some commercial variants of awk allow this, it’s not portable across awk implementations. As a result, it should be avoided.

    The reason it’s wrong is that (g)sub attempts to assign the value pdq to the part of the substring that matches xyz, but the return value of substr() is not assignable.
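A portable workaround, sketched below: copy the slice into a variable, substitute there, then splice it back into $0. (The input string is made up.)

```shell
echo 'abcdxyzefgh' | awk '{
  s = substr($0, 5, 3)                   # copy the slice out ("xyz")
  gsub(/xyz/, "pdq", s)                  # edit the copy
  $0 = substr($0, 1, 4) s substr($0, 8)  # splice it back
  print
}'
# abcdpdqefgh
```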

  • split(str, arr[, fldsp[, sparr]]): split str according to pattern fldsp (FS if omitted) and store the pieces in arr. Return length(arr). (The fourth argument sparr, which collects the separators, is gawk-only.)

      split("123.456.789", arr, ".", sparr)
      arr[1] = "123"
      arr[2] = "456"
      arr[3] = "789"
      sparr[1] = sparr[2] = "."
  • system(cmd): execute the command represented by the variable cmd.

    The following command attempts to fetch a local copy of each remote image file referenced in the input. The rationale for doing that is to avoid hot-linking. This command would fail for a line containing both an unrelated https link and sample.png, but it’s good enough to replace repeated copy and paste in a very simple file.

      $ awk '/https/ && /png/ {cmd="curl -L -O "$NF;system(cmd)}'


GNU’s Bourne-again shell. Use bash -c [CMD] to execute CMD.

Key  SIGNAL   effect
^C   SIGINT   interrupt the current foreground process
^D   NIL      send EOF; normally terminates a bash utility
^Z   SIGTSTP  suspend the current process (see also: fg and bg)

The caret notation ^★ means <C-★>. For example, ^D means <C-d>.

Some built-in commands:

  • pwd: print current directory

  • cd [PATH]: change directory. If PATH is omitted, switch to ~.

  • history [n]: print the last n commands with line numbers.

  • alias foo=[CMD]: set up foo as a shorthand for CMD; usually appears in ~/.bashrc

  • jobs: display a list of current shell process(es)

  • bg [jobspec] / fg [jobspec]: run suspended job in the background / foreground.

  • kill [jobspec]: send SIGTERM to the job jobspec to terminate it.

    jobspec  meaning
    %+       current job, same as % and %%
    %-       previous job
    %[n]     job number n
    %foo     the job whose command begins with foo
    %?foo    any job whose command contains foo

    In fact, bg and fg can be omitted. Simply calling the jobspec (say, %2) will run the job in fg. Appending & to the jobspec (say, %1 &) will run the job in bg.

    Usage with a process ID: kill [OPTION(S)] [PID]. PIDs can be consulted from ps aux. When referring to processes in jobs, prepend the job ID with %. (e.g. kill %3)

    • -TSTP: “polite” stop (SIGTSTP), suspend a job instead of killing it. (Thanks to Steve Burdine.)
    • -STOP: “hard” stop (SIGSTOP), use -TSTP instead if possible
    • -CONT: continue job (SIGCONT)

    Some thoughts on job signs: when a running job (in either bg or fg, of whatever sign: +, -, or none) is suspended, through ^Z or kill -TSTP, it automatically becomes the current job. As a result,

    • newly suspended job → %+
    • old %+ → %-

    Some thoughts on %foo and %?foo: if foo matches multiple jobs, bash will throw an error. However, Zsh won’t: the most recent matching job will be run.

  • read [VAR]: read a line from STDIN and store it in the shell variable VAR. This avoids exposing the value in bash history.

  • which [CMD]: find out the absolute path of CMD

  • while: get loop variable values from file

    while read url; do echo "$url"; done < FILE (Source)

    See also: ruakh’s explanation for while IFS= ...

  • command [-Vv] [CMD] [ARG(S)]

    • -v: print command absolute path or alias
    • -V: explain command type of CMD in a complete sentence
    • no flags: execute CMD, bypassing shell functions and aliases (e.g. strips the colors from an aliased ls).

Some shell evaluations:

  • $(( 5%3 )): do basic arithmetic

  • $(...)/`...`: command evaluation

  • =~: binary operator for regex matches

      $ [[ "foobar" =~ bar ]] && echo matched

Some shell strings manipulations:

Sotapme’s answer on Stack Overflow refers to an old Linux journal article that explains clearly the shell parameter expansion.

Four trimming combinations: ${variable#pattern}, …

from \ match  shortest  longest
start         #         ##
end           %         %%

Example: file (extension) name/path extraction

$ foo=/tmp/my.dir/filename.tar.gz
$ path=${foo%/*}
$ echo $path
/tmp/my.dir
$ file=${foo##*/}
$ echo $file
filename.tar.gz
$ base=${file%%.*}
$ echo $base
filename
$ ext=${file#*.}
$ echo $ext
tar.gz

Boolean operators:

  • !: NOT

      if [ ! -d "$DIR" ]
  • &&: AND

    Can be used for chaining commands in list constructs. cmd2 will be executed only if cmd1 returns true (zero exit status). The list terminates if cmd1 returns false (non-zero).

      cmd1 && cmd2
  • ||: OR

    cmd2 will be executed only if cmd1 returns false (nonzero). The command terminates if cmd1 returns true (zero).

      cmd1 || cmd2

&& and || can be chained together. The following two commands are equivalent.

    cmd1 && cmd2 || cmd3
    (cmd1 && cmd2) || cmd3

In the first command, even if cmd1 returns false, the list won’t terminate: cmd3 still runs. Note that such chaining is not a true if-then-else, since cmd3 also runs when cmd1 succeeds but cmd2 fails.
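A minimal demonstration, with true and false standing in for cmd1 and cmd2:

```shell
# cmd1 succeeds, cmd2 fails -- cmd3 still runs:
true && false || echo 'cmd3 runs'
# cmd3 runs

# A real if-then-else would skip the else branch here:
if true; then false; else echo 'else branch'; fi
# (no output)
```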


GNU’s basic calculator (bc)

Flags explanations: (in alphabetical order)

  • -l: load math library

    syntax  function
    a       $\arctan$
    s       $\sin$
    c       $\cos$
    l       $\ln$ (natural log)
  • -q: quiet, don’t display GNU’s standard welcome message


Display hard disk UUIDs (blkid).


Concatenate (combine) files and display them (in STDOUT).

  • -n: display line number

This can be used with the “null device” to show a string without leaving a trace in the bash history.

$ cat > /dev/null
foo bar ...
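For example, -n numbers the output lines (the input here is made up):

```shell
printf 'foo\nbar\n' | cat -n
#      1	foo
#      2	bar
```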


General usage: convert [input-option] [input-file] [output-option] [output-file].

Supported formats: JPG, PNG, GIF, SVG, etc

GUI software (e.g. GIMP) enables previewing processed images, which is necessary for image sharpening. Therefore, I only list a few options below.

format conversion

$ convert foo.ppm -quality [num] bar.jpg

[num] takes a value from 1 to 100. The higher the value, the better the quality and the larger the file.

image manipulation

This is good for batch processing.


  • -crop WxH+X+Y: crop a W px $\times$ H px region from (X,Y)
  • -rotate [DEG]: rotate input-file by DEG clockwise.
  • -resize [DIM1] [DIM2]: resize image (if DIM2 is missing, the largest dimension will be used)


Like wget, curl grabs stuff from the Internet. It supports all common protocols. (HTTP(S), (S)FTP, etc)

Basic usage:

$ curl [FLAG(S)] [URL]

It writes to STDOUT. An -o flag can be passed to specify output file.

$ curl -o foo.txt

File downloading

To keep the file name, use -O.

$ curl -O

To download multiple files, use -O in series.

$ curl -O [URL1] -O [URL2] ...

To continue an interrupted download, use -C like the following line.

$ curl -C - -O

This saves time and bandwidth in case of network interruption, and this can be applied to large files like ISO files for GNU/Linux distros.

URL redirects

If a page is “moved (permanently)” or the URL is the shortlink of another page, use -L to enable URL redirects.

$ curl       # no output
$ curl -L    # many lines of output

GET request

$ curl -i -X GET

See also: wget

Shorten GitHub URL

The online version of GitHub’s URL shortener doesn’t allow user-supplied short name.

Successful outcome

$ curl -i -F "url=" \
-F "code=bjsm18"; echo
HTTP/1.1 201 Created
Server: Cowboy
Connection: keep-alive
Date: Wed, 19 Dec 2018 21:38:46 GMT
Status: 201 Created
Content-Type: text/html;charset=utf-8
Content-Length: 45
X-Xss-Protection: 1; mode=block
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-Runtime: 0.034660
X-Node: 1ba43bb4-4d53-46d8-85c0-3c882f10dc56
X-Revision: 392798d237fc1aa5cd55cada10d2945773e741a8
Strict-Transport-Security: max-age=31536000; includeSubDomains
Via: 1.1 vegur

Failed outcome

$ curl -i -F "url=" \
-F "code=vbjz"; echo
HTTP/1.1 422 Unprocessable Entity
Server: Cowboy
Connection: keep-alive
Date: Wed, 19 Dec 2018 21:38:13 GMT
Status: 422 Unprocessable Entity
Content-Type: text/html;charset=utf-8
Content-Length: 114
X-Xss-Protection: 1; mode=block
X-Content-Type-Options: nosniff
X-Frame-Options: SAMEORIGIN
X-Runtime: 0.011171
X-Node: db5e77f9-b4e8-41b3-bb6c-a85a5c4493c1
X-Revision: 392798d237fc1aa5cd55cada10d2945773e741a8
Strict-Transport-Security: max-age=31536000; includeSubDomains
Via: 1.1 vegur

The URL was supposed to be shortened to "vbjz", but an existing short link already uses that code!


  1. HowtoForge
  2. curl tutorial
  3. curl POST examples


Cut out and display the selected part(s) of each line of input.

  • -c[LIST]: select character according to [LIST]
  • -d[DELIM]: set the field delimiter (default is tab ↹)
  • -f[LIST]: select fields according to [LIST]
  • -z: \0-delimited. Useful for handling strings containing newlines and white spaces, e.g. output of find -print0.
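Two minimal examples of -d/-f and -c (the input strings are made up):

```shell
# Pick the 2nd :-separated field.
echo 'alice:x:1000' | cut -d: -f2
# x

# Pick characters 2 through 4.
echo 'abcdef' | cut -c2-4
# bcd
```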

I discovered this new way of extracting columns with custom delimiters.

$ git --version
git version 2.17.1
$ git --version | cut -d' ' -f1  # returns git
$ git --version | cut -d' ' -f2  # returns version
$ git --version | cut -d' ' -f3  # returns 2.17.1

However, this can’t be used for extracting the Git remote URL from git remote -v due to the tab \t separating the first two columns.

$ git remote -v | head -1 | od -An -c
   o   r   i   g   i   n  \t   g   i   t   @   g   i   t   l   a
   b   .   c   o   m   :   V   i   n   c   e   n   t   T   a   m
   /   v   i   n   c   e   n   t   t   a   m   .   g   i   t   l
   a   b   .   i   o   .   g   i   t       (   f   e   t   c   h
   )  \n

In this case, awk has to be used.


Display or adjust system date. Default to current time (zone).

Flags explanations: (in alphabetical order)

  • -d [STR]: convert STR to +%c

      $ date -d '@1536336779'
      Friday, September 07, 2018 PM06:12:59 CEST
  • -f [FILE]: read from FILE line by line and convert to +%c

  • -I[d|h|m|s|n]: ISO 8601 (default to d)

    [dhmsn] output
    n 2018-09-07T18:12:59,822423484+02:00
    s 2018-09-07T18:12:59+02:00
    m 2018-09-07T18:12+02:00
    h 2018-09-07T18+02:00
    d 2018-09-07
  • -R: for sending emails 📧

      $ date -R -d "2018-09-07 18:12:59"
      Fri, 07 Sep 2018 18:12:59 +0200
  • --rfc-3339=[d|s|n]: similar to -I with small differences (a space instead of T, a period instead of a comma)

    [dsn] output
    n 2018-09-07 18:12:59.822423484+02:00
    s 2018-09-07 18:12:59+02:00
    d 2018-09-07
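The flags above can be tried against the fixed timestamp used throughout this section (epoch 1536336779); pinning TZ=UTC makes the output reproducible:

```shell
# -d '@N' parses N seconds since the Unix epoch.
TZ=UTC date -d '@1536336779' +'%F %T'
# 2018-09-07 16:12:59

# RFC 2822 form, handy for email headers.
TZ=UTC date -R -d '@1536336779'
# Fri, 07 Sep 2018 16:12:59 +0000
```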


$ echo $LC_TIME
$ date +%A
+% e.g. Remarks
z +0200
:z +02:00
::z +02:00:00
:::z +02 shortest numeric time zone
c Friday, September 07, 2018 PM06:12:59 CEST locale’s date and time
Y 2018
C 20
y 18
q 3 quarter
m 09
B September
b Sep same as %h
U 35 Week no. (week starts from Sunday, 00–53)
V 36 ISO week no. (week starts from Monday, 01–53)
W 35 Week no. (week starts from Monday, 00–53)
j 250 day of year (001–366); jour in French
F 2018-09-07 Full date: %Y-%m-%d
x Friday, September 07, 2018 locale’s date
w 6
A Friday
a Fri
d 07
e 7 %_d
p PM blank if unknown
P pm idem
r PM06:12:59 CEST locale’s 12-hour time
T 18:12:59
X 06:12:59 CEST locale’s time
R 18:12
H 18
k 18 %_H
I 06
l 6 %_I
M 12
s 1536336779 seconds elapsed since 01/01/1970 00:00:00 UTC
S 59
n Enter ↵
t Tab ↹


Optional flags in between % and [char] meaning
- no padding
_ pad with space ␣
0 pad with 0
^ try uppercase


Acronym meaning
LC locale
CEST Central European Summer Time


Disk free. Returns the amount of used and available disk space.

If a file is specified in the argument, df will return the row which represents the file system containing the file.

$ df
Filesystem     1K-blocks     Used Available Use% Mounted on
udev             3982080        0   3982080   0% /dev
tmpfs             802728     1304    801424   1% /run
/dev/sda7       29396988 10706500  17174152  39% /
tmpfs            4013620    21500   3992120   1% /dev/shm
tmpfs               5120        4      5116   1% /run/lock
tmpfs            4013620        0   4013620   0% /sys/fs/cgroup
/dev/sda6         463826   151423    283936  35% /boot
tmpfs             802724       12    802712   1% /run/user/1000
  • -B[SIZE]: set unit to SIZE. GB is the SI counterpart of G.

      $ df -BGB
      Filesystem     1GB-blocks  Used Available Use% Mounted on
      udev                  5GB   0GB       5GB   0% /dev
      tmpfs                 1GB   1GB       1GB   1% /run
      /dev/sda7            31GB  11GB      18GB  39% /
      tmpfs                 5GB   1GB       5GB   1% /dev/shm
      tmpfs                 1GB   1GB       1GB   1% /run/lock
      tmpfs                 5GB   0GB       5GB   0% /sys/fs/cgroup
      /dev/sda6             1GB   1GB       1GB  35% /boot
      tmpfs                 1GB   1GB       1GB   1% /run/user/1000

    This flag doesn’t give accurate results due to rounding errors. Use this with care.

  • -h: human readable sizes

      $ df -h
      Filesystem      Size  Used Avail Use% Mounted on
      udev            3.8G     0  3.8G   0% /dev
      tmpfs           784M  1.3M  783M   1% /run
      /dev/sda7        29G   11G   17G  39% /
      tmpfs           3.9G   22M  3.9G   1% /dev/shm
      tmpfs           5.0M  4.0K  5.0M   1% /run/lock
      tmpfs           3.9G     0  3.9G   0% /sys/fs/cgroup
      /dev/sda6       453M  148M  278M  35% /boot
      tmpfs           784M   12K  784M   1% /run/user/1000
  • -H: SI counterpart of -h

      $ df -H
      Filesystem      Size  Used Avail Use% Mounted on
      udev            4.1G     0  4.1G   0% /dev
      tmpfs           822M  1.4M  821M   1% /run
      /dev/sda7        31G   11G   18G  39% /
      tmpfs           4.2G   23M  4.1G   1% /dev/shm
      tmpfs           5.3M  4.1k  5.3M   1% /run/lock
      tmpfs           4.2G     0  4.2G   0% /sys/fs/cgroup
      /dev/sda6       475M  156M  291M  35% /boot
      tmpfs           822M   13k  822M   1% /run/user/1000
  • -t: file system type

      $ df -t ext4
      Filesystem     1K-blocks     Used Available Use% Mounted on
      /dev/sda7       29396988 10706516  17174136  39% /
      /dev/sda6         463826   151423    283936  35% /boot
  • --total: add a grand-total row at the bottom.

  • -x: opposite of -t

See also: du


🆚 Display the difference between two text files.

Basic usage: diff [FILE1] [FILE2]

More useful form, showing diff hunks with context: diff -u [FILE1] [FILE2]
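A minimal sketch (file names and contents are made up):

```shell
printf 'a\nb\nc\n' > old.txt
printf 'a\nB\nc\n' > new.txt
diff -u old.txt new.txt
# --- old.txt  ...
# +++ new.txt  ...
# @@ -1,3 +1,3 @@
#  a
# -b
# +B
#  c
```

diff exits with status 1 when the files differ, which matters in scripts run under set -e.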


Dealing with packages. I only know the listing functionalities.

Flags explanations: (in alphabetical order)

  • -l [PKG(S)]: display package name, version, architecture and description

  • -L [PKG]: show installed files’ absolute path

      $ dpkg -L g++


Display disk usage in KB. Only folders are shown by default.

General usage: du [FLAG(S)] [DIR(S)] ...

If the argument [DIR(S)] is omitted, the size of every (sub)folder (including hidden ones) will be displayed.

$ du

Display the size of each subfolder in the folder layouts.

$ du layouts
12 layouts/partials
20 layouts/

Flags explanations: (in alphabetical order)

  • -a: also include all files

  • -c: include grand total at the bottom

      $ du -c layouts static
      12  layouts/partials
      20  layouts
      8   static/css
      196 static/img
      212 static
      232 total
  • -d [n]: max depth

    Process contents of [DIR] up to at most n level(s) deep.

    I found the concept of “level” hard to understand when I ran this in . because the output was cluttered with folders holding binary objects.

    Let me illustrate this idea with the following example.

      $ du -d 2 content
      8       content/post/2018-08-29-csb-theorem
      16      content/post/2018-07-17-rodeo
      248     content/post/2018-07-07-upgraded-to-linux-mint-19
      500     content/post/2018-08-18-ubuntu-18-04-installation-on-fujitsu-ah557
      76      content/post/2018-07-26-web-image-optimisation-with-gimp
      920     content/post/2018-07-04-fujitsu-lh532-fan-cleaning
      388     content/post/2018-08-23-brighten-image-with-gimp
      1412    content/post/2018-07-23-fujitsu-lh532-keyboard-cleaning
      3624    content/post
      12      content/page/bash-commands
      20      content/page
      3652    content
      (content sits at level 0, content/page at level 1, and content/page/bash-commands at level 2)
  • -h: human readable

      $ du -h layouts
      12K layouts/partials
      20K layouts/
  • --exclude=[FILE]

  • -s: summary, display only [DIR(S)]’s size (equivalent to -d 0). This can be used to measure the size of a folder.

      $ du -s static content
      212     static
      3656    content
  • --time: also display the last modification time in the middle of each row

      $ du --time static
      8       2018-08-28 16:58        static/css
      196     2018-07-26 15:47        static/img
      212     2018-08-28 16:58        static

See also: df


Display all arguments (in STDOUT).

$ echo foo bar
foo bar
  • -e: enable interpretation of backslash escapes

    • \a: shell beep (disabled in Ubuntu by default)

            $ echo -e "\a"
    • \n: newline ↵

    • \t: tab ↹

  • -n: don’t output ↵ at the end

      $ echo -en "\a"


Find files under PATH, then -print them and/or -exec commands on them. PATH defaults to the current path ..

General usage:

  • -print / -print0: Display files: find [PATH] [FLAG(S)]

    • You may add -print at the end. This won’t change the display, but useful in conjunction with -exec.
    • You may use -print0 so that each output is delimited by null character \0 instead of newline ↵.

    In Ubuntu’s default location of Sublime Text 3 user config files, compare the output of -print

          find ~/.config/sublime-text-3/ -maxdepth 1 -print | od -c
          0000000   /   h   o   m   e   /   v   i   n   1   0   0   /   .   c   o
          0000020   n   f   i   g   /   s   u   b   l   i   m   e   -   t   e   x
          0000040   t   -   3   /  \n   /   h   o   m   e   /   v   i   n   1   0
          0000060   0   /   .   c   o   n   f   i   g   /   s   u   b   l   i   m
          0000100   e   -   t   e   x   t   -   3   /   I   n   s   t   a   l   l
          0000120   e   d       P   a   c   k   a   g   e   s  \n   /   h   o   m
          0000140   e   /   v   i   n   1   0   0   /   .   c   o   n   f   i   g
          0000160   /   s   u   b   l   i   m   e   -   t   e   x   t   -   3   /
          0000200   L   o   c   a   l  \n   /   h   o   m   e   /   v   i   n   1
          0000220   0   0   /   .   c   o   n   f   i   g   /   s   u   b   l   i
          0000240   m   e   -   t   e   x   t   -   3   /   L   i   b  \n   /   h
          0000260   o   m   e   /   v   i   n   1   0   0   /   .   c   o   n   f
          0000300   i   g   /   s   u   b   l   i   m   e   -   t   e   x   t   -
          0000320   3   /   C   a   c   h   e  \n   /   h   o   m   e   /   v   i
          0000340   n   1   0   0   /   .   c   o   n   f   i   g   /   s   u   b
          0000360   l   i   m   e   -   t   e   x   t   -   3   /   P   a   c   k
          0000400   a   g   e   s  \n

    with -print0 using od -c.

          find ~/.config/sublime-text-3/ -maxdepth 1 -print0 | od -c
          0000000   /   h   o   m   e   /   v   i   n   1   0   0   /   .   c   o
          0000020   n   f   i   g   /   s   u   b   l   i   m   e   -   t   e   x
          0000040   t   -   3   /  \0   /   h   o   m   e   /   v   i   n   1   0
          0000060   0   /   .   c   o   n   f   i   g   /   s   u   b   l   i   m
          0000100   e   -   t   e   x   t   -   3   /   I   n   s   t   a   l   l
          0000120   e   d       P   a   c   k   a   g   e   s  \0   /   h   o   m
          0000140   e   /   v   i   n   1   0   0   /   .   c   o   n   f   i   g
          0000160   /   s   u   b   l   i   m   e   -   t   e   x   t   -   3   /
          0000200   L   o   c   a   l  \0   /   h   o   m   e   /   v   i   n   1
          0000220   0   0   /   .   c   o   n   f   i   g   /   s   u   b   l   i
          0000240   m   e   -   t   e   x   t   -   3   /   L   i   b  \0   /   h
          0000260   o   m   e   /   v   i   n   1   0   0   /   .   c   o   n   f
          0000300   i   g   /   s   u   b   l   i   m   e   -   t   e   x   t   -
          0000320   3   /   C   a   c   h   e  \0   /   h   o   m   e   /   v   i
          0000340   n   1   0   0   /   .   c   o   n   f   i   g   /   s   u   b
          0000360   l   i   m   e   -   t   e   x   t   -   3   /   P   a   c   k
          0000400   a   g   e   s  \0

    The output of -print0 can be consumed safely by \0-aware tools such as xargs -0 or a while read -d '' loop.

  • -exec: Execute commands for each matching file: find [PATH] [FLAG(S)] -exec [CMD]

      $ find archetypes -exec file {} \;
      archetypes: directory
      archetypes/ ASCII text

    {} is a placeholder for each instance of matching file; \; terminates the -exec command.

    -exec expects a bash command instead of an if-else statement or a for loop. Therefore, there’s no way to place them under -exec unless they are wrapped with sh -c. However, I’ve never tried this since I’ve no idea how to put {} \; inside sh -c.
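For the record, here is a sketch of that sh -c wrapper (the file pattern is made up): the matches are passed as positional arguments, so the inner loop can iterate over them safely.

```shell
# The trailing "sh" fills $0 inside the inner shell; {} + appends
# the matches as $1, $2, ..., and "for f" iterates over "$@".
find . -name '*.md' -exec sh -c '
  for f; do
    echo "processing: $f"
  done
' sh {} +
```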

More options:

  • -type [d|f|l]: file type

    d directory
    f file
    l symbolic link
  • -mindepth [n], -maxdepth [n]: works like du’s -d flag.

  • -and: AND operator, allow conjunction of -print(0) and -exec.

  • -path [PATH]: matches PATH.

  • -o: OR operator, useful when used with -prune

  • -prune: return TRUE when a directory is matched. This is useful to exclude path when used with -path.

      $ ls static
      css  google191a8de293cb8fe1.html  img
      $ find static -path static/css -prune -o
      find: expected an expression after '-o'
      $ find static -path static/css -prune -o -print
  • -[a|c|m][time|min] [+|-][n]: operate on files last X‘ed n Y(s) ago; n should be an integer (+n: more than n, -n: less than n).

    a accessed
    c status changed
    m modified


    time days
    min minutes
  • can be piped to a while loop with delimiter -d'' for batch file operations.

      find . -print0 | while IFS= read -d '' -r file; do ...; done


The most popular VCS (version control system). Here’s a minimal collection of commands needed for starters. I’ve put more advanced Git commands in a separate page.

getting started

start from cmd
scratch git init
existing project git clone <source_url> <target_path>

add files to be tracked

Basic usage:

goal cmd
add some files git add <PAT>
add all files git add .

git add adds files contents from the working tree to the index.

Technically, git add stages the file(s) matching the <PAT>tern to the index for committing.

Adding empty folder is forbidden.

Here’s a useful trick to list modified non-ignored (tracked and untracked) files. -n stands for --dry-run.

$ git add -An .
add 'content/page/bash-commands/'

Note that the behaviour of git add . has changed since Git 2.0.

unstage files

Avoid committing changes in certain file(s) to the version history. To be used for files in progress.

$ git reset HEAD -- <unready_file>

Update: A newer command git restore is available since Git 2.23. It’s more intuitive, comprehensible and specific than git checkout.

$ git restore -- <pathspec>

(re)move files

To rename files, use git mv <old> <new>, which mv’s the file and then “informs” Git of this operation. (stages the mv operation)

To remove (unnecessary) files from the repo, consider git rm, which rm’s the file(s) matching the <PAT>tern and then “informs” Git of this operation. (stages the rm operation)

Keep the file?  example                 cmd                    meaning
yes             RSA/GPG private keys    git rm --cached <PAT>  remove matching file(s) from the index only
no              system generated files  git rm <PAT>           remove matching file(s) from the index and the file system

ignore files

Simply put the patterns of files that you don’t wish to be tracked into ./.gitignore, one per line. Git will read this .gitignore file and take it into account.
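For instance, to ignore all log files (the file names below are made up):

```shell
# Ignore every .log file in this repo.
echo '*.log' >> .gitignore

touch debug.log notes.txt
git status --short    # notes.txt shows up as untracked; debug.log doesn't
# ?? .gitignore
# ?? notes.txt
```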

show status

If you aren’t sure whether your files are changed, use this.

$ git status
On branch master
Your branch is up to date with 'origin/master'.

Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)

    modified:   content/page/bash-commands/

no changes added to commit (use "git add" and/or "git commit -a")

You may run git diff / git diff --cached to view the unstaged / staged differences.

commit changes

Write the changes on the staged files to the history. Vim will pop up by default.

$ git commit

If you don’t want to work with Vim, you may either

  • configure the variable core.editor, or

  • input your commit message using STDIN.

      $ git commit -m "commit message"

In case that you want to abort a Git operation that automatically opens a Vim session (e.g. git commit, git rebase, etc), exit Vim with an error code with :cq.

view changes

  • show commits: git log
  • show code change: git show
  • view project at a certain commit: git checkout <SHA1-hash>

At this stage, one will be able to manage his/her own repo locally.

work with different parallel versions

  • branching: for testing new features on a new branch without cracking the core stuff

      $ git branch <branch-name>

    Newer syntax since Git 2.23 (-c creates and switches to the branch):

      $ git switch -c <branch-name>
  • check branch(es): git branch only shows local branch(es)

    • -a: show remote-tracking branch(es) also
    • -v: show the SHA-1 hash at the tip of branch
  • switch between branches: git checkout <branch-name> (or git switch <branch-name> since Git 2.23)

  • delete branch: git branch -d <branch-name>

  • rename branch: git branch -m <old-branch> <new-branch>

  • compare branches/commits: git diff <branch1>..<branch2>

    • two dots: usual diff -u

        $ git merge
        $ git diff HEAD^..HEAD  # view changes introduced by previous merge
    • three dots: function like two dots, but compare their common ancestor with branch2 instead, useful for checking merge conflicts.

    The two arguments for git diff can technically be any Git references (commits, branches, tags).

  • merge changes from other local branch: git merge <branch-name>

work with others

If you want to save your code elsewhere or share it, upload your repo to a hosting service or an SSH remote.

  1. Add/set remote:
    • Existing repo: git remote set-url <remote-name> <addr>
    • New repo started from scratch: git remote add <remote-name> <addr>
    • <remote-name>: take this to be origin for the first time if you’re newbie.
    • <addr>: an HTTPS or SSH remote URL.
  2. Submit your changes:
    • first time: git push -u <remote-name> <branch-name>. Set upstream branch to track in remote followed by sending out your commits.
    • afterwards: git push
    • delete remote branch: git push <remote-name> :<branch-name>
    • There’s no way to rename a remote branch. Stack Overflow suggests cloning and removing the target branch instead.
  3. Get updates from others:
    • git fetch <remote-name> <branch-name>: download a remote branch without merging it against the current branch.
    • git checkout <remote-name>/<branch-name>: view a remote branch
    • git checkout -b <loc-branch> <remote-name>/<remote-branch-name>: create a local branch from a remote one so that the former tracks the latter. (Newer syntax: git switch -c <loc-branch> <remote-name>/<remote-branch-name>; -c means --create)
    • git merge <remote-name>/<branch-name>: merge it against the current branch.
      • --ff: fast-forward if possible, otherwise create merge commit. default behaviour.
        • If FETCH_HEAD is a descendant of HEAD, then the history will be linear: HEAD is fast-forwarded to FETCH_HEAD.
        • Otherwise, a merge commit will be created.
      • --no-ff: always create a merge commit
      • --ff-only: fast-forward only. If it’s not possible, abort the merge with an error code.
      • --squash: group all remote branch commits into one commit then append it to HEAD.
        • no merge relationship is made
        • no commit is produced right away: either follow with git commit -m "..." or use git merge --commit instead. (--commit is incompatible with --squash)
        • refer to Stack Overflow’s comparison on “squash and merge” (git merge --squash) and “rebase and merge” (git rebase, the --merge strategy is implied from the absence of -s).
    • git pull <remote-name> <branch-name>: perform git fetch and git merge.
    • git pull --rebase/git pull -r: perform git fetch and git rebase to get a linear commit history instead of multiple parents in the HEAD commit. Useful when you’ve already committed to a branch whose upstream has also changed, and you want a linear commit history.
      1. fetch commits from remote (git fetch)
      2. modify commit history by applying local commits on top of remote ones (git rebase).
    • abandon a merge: git merge --abort
    • revert a single commit
      • non-merge commit: git revert <commit>
      • merge commit: in git revert -m [1|2] <commit>, -m selects either the first or the second parent of <commit>.
    • See my GitHub pull request tutorial to get the latest features in development from GitHub.
  4. Save your unfinished work when switching to another branch having different versions of the current unstaged file(s).
    • git stash: save your unfinished work.
    • git stash pop: apply and delete the most recent stashed changes.
    • git stash apply: apply and keep the most recent stashed changes.
    • git stash list: list the stashes.
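The stash workflow above can be sketched in a throwaway repo (the directory, file name and commit messages below are invented for the demo):

```shell
# Stash uncommitted work, verify the worktree is clean, then restore it.
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
echo v1 > notes.txt
git add notes.txt && git commit -qm "add notes"
echo draft >> notes.txt      # unfinished work
git stash                    # save it; notes.txt reverts to "v1"
cat notes.txt                # → v1
git stash pop                # restore the draft and drop stash@{0}
tail -n 1 notes.txt          # → draft
```

After `git stash pop`, `git stash list` is empty again; use `git stash apply` instead if you want to keep the stash entry around.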

modify history with git-rebase

Given the following graph.

      - A - B - C  topic*
D - E - F - G  master

On the branch topic, these two commands do the same thing.

$ git rebase master
$ git rebase master topic

They give the following graph.

              - A' - B' - C'  topic*
D - E - F - G  master

I find a great example that explains the use of --onto.

D - E - F - G  master
      - A - B - C  server
          - H - K  client

The following three-argument command seems scary at first glance.

$ git rebase --onto master server client

What it does:

  1. Find the common ancestor of 2nd & 3rd arguments. (i.e. A)
  2. Extract the branch A..K excluding A. (i.e. H - K)
  3. Place the extracted branch on top of 1st argument. (i.e. G)


              - H' - K'  client
D - E - F - G  master
      - A - B - C  server

N.B.: H and K are still accessible with their SHA-1 hashes, but the branch name client will point to K' instead of K after this Git rebase.

Now it’s easy to understand what the following command does.

$ git rebase -i HEAD~N

It opens an editor session (--interactive mode) containing N commits (from HEAD~{N-1} to HEAD in chronological order). It’s used for modifying the last N commits on the current branch.

The commands git commit --fixup=<commit> and git rebase -i --autosquash <commit>~ can be used to fix a commit then rebase it into current branch.
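A hedged sketch of that fixup workflow in a scratch repo (file names and commit messages are invented; setting GIT_SEQUENCE_EDITOR=true accepts the auto-arranged todo list without opening an editor):

```shell
# Amend an older commit with --fixup, then let --autosquash fold it in.
set -e
tmp=$(mktemp -d); cd "$tmp"
git init -q
git config user.email demo@example.com
git config user.name demo
git commit -q --allow-empty -m "init"
echo alpha > a.txt && git add a.txt && git commit -qm "add a"
echo beta  > b.txt && git add b.txt && git commit -qm "add b"
echo apple > a.txt                      # a.txt should have said "apple"
git add a.txt
git commit -q --fixup=HEAD~1            # message becomes "fixup! add a"
# <commit> being fixed is now HEAD~2, so rebase from its parent:
GIT_SEQUENCE_EDITOR=true git rebase -q -i --autosquash HEAD~2~
git log --oneline | wc -l               # → 3 (init, add a, add b)
cat a.txt                               # → apple
```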


A librarian-like VCS that manages binary files through symbolic links.

Similar to git, I’ll only put the necessary commands to keep a repo running.

getting started

$ git init
$ git annex init "machine1"

add files

$ git annex add <file>

commit staged changes

$ git commit -m "<commit-msg>"

After git annex add, one can operate on the file as in git.

add local remote

For example, back up data on a USB drive.

  1. Get git-annex started on usb.
  2. Add usb as remote on machine1: git remote add usb <path>
  3. and vice versa: git remote add machine1 <path>
  4. Sync remotes: git annex sync [--remote=<remote-name>]
  5. Send some files to usb: git annex sync --content

Working with the “library”:

  • Update catalogue: git annex sync
  • Catalogue check: git annex whereis <file>
  • Demand a copy: git annex copy <file> --from usb
  • Drop unnecessary copies: git annex drop <file>

If one of the remotes is down, git annex sync throws an error. To avoid this, set --remote=....

Before dropping a copy, check its other copies with git annex whereis <file>.

If, for some reason, you don’t want git-annex to touch your project, add an empty file .noannex at the root level. This is useful if your project is managed by other software. (say, rsync)

Further reading:

  1. Lee Hinman’s tutorial
  2. Git annex — un cas d’utilisation
  3. Git annex’s official walkthrough


Globally search for a regular expression and print matching lines.

General usage:

  • read from file(s): grep [FLAG(S)] [PATTERN] [FILE]
  • pipe from STDIN: [cmd] | grep [FLAG(S)] [PATTERN]
  • read from STDIN:
    1. Type grep [PATTERN].

       $ grep foo
    2. Input the content to be searched. Each line is terminated with newline ↵; end the whole input with Ctrl-D.

       $ grep foo
    3. Your search content is duplicated, with the matching part highlighted.

Flags explanations: (in alphabetical order)

  • -A [n]: print n lines after the matching line

  • -B [n]: print n lines before the matching line

      $ grep -B 3 Author config.toml
      #  src = "img/hexagon.jpg"
      #  desc = "Hexagon"
  • -C [n]: print n lines of context around the match; shorthand for -A and -B with the same argument

  • -c: output the number of matching lines instead of the matching parts

      $ grep -c menu config.toml
  • -E: use extended regular expressions (ERE) in the pattern

  • -i: ignore case

  • -I: ignore binary files

      $ cd content/post/2018-07-23-fujitsu-lh532-keyboard-cleaning
      $ grep if *
      Binary file 20180708_234331_HDR.jpg matches
      Binary file 20180709_001359_HDR.jpg matches
      ... the N key was fixed: *no* noticeable difference from other normal size

    Use this flag to ignore the JPG files (and match the text files only).

      $ grep -I if *
      ... the N key was fixed: *no* noticeable difference from other normal size
  • -l: show matched file path

  • -L: show unmatched file path

  • -n: also show matching line number

      $ grep -n menu config.toml
  • -o: print only matching part. Useful for string extraction.

  • -P: Perl mode, useful for lazy match (.*? for minimum expansion).

  • -r: recursive

      $ grep -r bl[au] content static
      content/post/" aria-hidden></i>, Windows 10 often falls into blue screen.
      content/post/2018-07-04-fujitsu-lh532-fan-cleaning/ can put a tiny electronic component into intensive care.  To
      content/post/2018-08-23-brighten-image-with-gimp/ | <i class="fa fa-square blue"></i>--<i class="fa fa-square yellow" aria-hidden></i>
      static/css/ {color: #0000FF;}
  • -R: like -r, but dereference links

  • -v: inverse match

  • -w: match whole word, similar to \< and \> in Vim.

  • -q: quiet

grep doesn’t work for multi-line regex match. Use sed or awk instead.
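As a sketch of the sed alternative (assuming GNU sed, where \n in the pattern matches a newline inside the pattern space), N pulls the next line into the pattern space so a two-line pattern can be matched:

```shell
$ printf 'foo\nbar\nbaz\n' | sed -n 'N;/foo\nbar/p'
foo
bar
```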

Print first n lines of a file or STDIN. (n = 10 by default) Works like, but opposite of, tail.
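For example, keep only the first two lines of piped input:

```shell
$ seq 5 | head -n 2
1
2
```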


Display binary files as blocks of hexadecimal numbers.

  • -c: character

See also: od


Display and/or modify connection info.

  • connections
    • e...: ethernet, wired connection
    • lo: localhost
    • w...: Wi-Fi
  • inet: internal IP address

To get the external IP address, one needs to send a request to an external server. See wget for the command to do so.


More informative than man.

General usage: info [cmd]


View without editing the file, in a separate pager session with some Vim-like key bindings.

General usage: less [FILE]

$ less config.toml

Compare with:

  • more: less leaves nothing on the screen after you finish reading [FILE], which is ideal for text/code files with a lot of lines.
  • vim: less borrows some keys from Vim, but loads faster due to its limited functionality. This suits previewing important files, especially system files.
Key Function
b Scroll one page backward
f Scroll one page forward
d Scroll half page down
u Scroll half page up
g Jump to first line
G Jump to last line
j Move the cursor one line down
k Move the cursor one line up
/ Forward search
? Backword search
n Next match
N Previous match
h Show help page
q Quit

Some of the above keys can be quantified by prepending a number as in Vim.


List files in a directory.

Default bash aliases:

Alias Flags added
l -CF
la -A
ll -alF

Flags explanations: (in alphabetical order)

  • -a: all

  • -A: almost all (except current directory . and parent directory ..)

  • --block-size=[SIZE]: SIZE represents a unit (B, K, M, G).

  • -d: “list directories themselves, not their contents”, i.e. a directory argument is listed as a single entry instead of having its contents expanded. Running it on a folder containing files and subfolders illustrates the difference.

      # see what '-d' does on current path '.'
      $ ls -ld
      drwxrwxr-x 9 vin100 vin100 4096 Aug 27 22:04 .
      # see what '-d' does on all non-hidden folders and files `*`
      $ ls -ld *
      drwxrwxr-x 2 vin100 vin100 4096 Jun 28 21:00 archetypes
      -rw-rw-r-- 1 vin100 vin100 2734 Aug 28 17:30 config.toml
      drwxrwxr-x 4 vin100 vin100 4096 Jun 29 00:47 content
      drwxrwxr-x 3 vin100 vin100 4096 Aug 25 15:29 layouts
      drwxrwxr-x 9 vin100 vin100 4096 Jun 29 00:47 public
      drwxrwxr-x 4 vin100 vin100 4096 Aug 25 13:36 static
      drwxrwxr-x 3 vin100 vin100 4096 Aug 27 18:06 themes
      # take away '-d', compare the two outputs
      $ ls -l *
      -rw-rw-r-- 1 vin100 vin100 2734 Aug 28 17:30 config.toml
      total 4
      -rw-rw-r-- 1 vin100 vin100 84 Jun 28 21:00
      total 12
      -rw-rw-r-- 1 vin100 vin100  279 Aug 28 02:23
      drwxrwxr-x 3 vin100 vin100 4096 Aug 28 17:18 page
      drwxrwxr-x 9 vin100 vin100 4096 Aug 23 13:36 post

    I use -d with * to keep ls from expanding the subfolders.


Display manual page in a separate session. Less detailed than info.

General usage: man [num] [cmd].


Make directory.


Works like cat, but pauses after each screenful, for files with more lines than the screen can hold. Opposite of less.

General usage: more [FILE]


Node.js® Package Manager: manage different versions of Node.js packages.

  • install <package>: install <package> from the official repo.
    • -g: install globally under the user’s home directory.
    • otherwise: install locally under the current working directory.
  • version: list versions of NPM, Node.js and their components.
  • --version: list NPM’s version.
  • help: get concise help.
  • -l: get a full list of options.


Node Version Manager: manage different versions of Node.js. Enables easy switching between versions. Useful for cross-platform development.

  • install <vnum>: install Node.js at version <vnum>.
  • install-latest-npm: try to upgrade to the latest working NPM on the current Node.js version.
  • use <vnum>: use Node.js at version <vnum>.
  • run <vnum> <app>: run <app> with Node.js version <vnum>.
  • help: display help.
  • ls: display installed versions of Node.js.
  • current/version: display the currently active version of Node.js.
  • --version: display the installed version of NVM.


Display binary files as blocks of octal numbers.

  • -c: character

See also: hexdump
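A quick sketch of the -c flag on a two-character input:

```shell
$ printf 'AB\n' | od -c
0000000   A   B  \n
0000003
```

The first column is the byte offset in octal; -An suppresses it.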


OpenSSL tools

I only know two usages.

  1. random base64 encoded string generation.

     $ openssl rand -out output.txt -base64 12
     $ cat output.txt
  2. RSA key generation

    Generate a 2048-bit RSA key pair and write it to key.pem

     $ openssl genrsa -out key.pem 2048

    To add extra security to the private key, it may be encrypted with -des3.

    To export a corresponding public key of a private key, use -pubout.

     $ openssl rsa -in key.pem -outform PEM -pubout -out pubkey.pem


pdfcrop --margins '-30 -30 -250 -150' --clip input.pdf output.pdf
  • --margins '-L -T -R -B'
    • negative margins for cropping input.pdf
    • positive margins for whitespace around input.pdf
    • single number for the four margins of the same width
  • --clip enables clipping support. I don’t know what that’s for.

For more information, see PDFcrop’s man page.


Grep string from PDF file(s).

General usage:

  • Single file: pdfgrep "foo" bar.pdf
  • Multiple files: pdfgrep -R "foo" ./some/path


ps + grep

Basic usage: pgrep [PROC]. Only one pattern can be provided.

  • -f: match full pattern instead of process name
  • -l: show the process name in the 2nd column

It took me a while to understand what full pattern meant. Process 2339 bears the name Web Content, but it’s triggered by firefox. Viewing its command (via ps aux), I understood why this process appeared in the output of pgrep -fl firefox.

$ pgrep -l firefox
2292 firefox
$ pgrep -lf firefox
2292 firefox
2339 Web Content
17363 Web Content
$ ps aux | grep 2339
vin100    2339 24.4  5.2 2230872 419804 ?      Sl   01:43  28:34 /usr/lib/firefox/firefox -contentproc -childID 1 -isForBrowser -prefsLen 10155 -schedulerPrefs 0001,2 -parentBuildID 20180905220717 -greomni /usr/lib/firefox/omni.ja -appomni /usr/lib/firefox/browser/omni.ja -appdir /usr/lib/firefox/browser 2292 true tab
vin100   26892  0.0  0.0  22008  1104 pts/1    S+   03:40   0:00 grep --color=auto 2339


Prints process status.

I only know ps aux, which displays every process. ps ux only shows processes owned by the current user.

  • -C [cmd]: show process cmd only


Split/merge/rotate/encrypt/decrypt PDF files.

I don’t have to explore the encryption functionalities for the moment.

Generic usage:

  • --linearize: optimize file for web.

  • --rotate=[+|-]angle:page-range: possible parameters for angle are 90, 180 and 270.

  • --split-pages=n [INPDF] [OUTPDF]: split file into groups of n pages. n defaults to 1.

      $ qpdf --split-pages foo.pdf bar.pdf
      $ ls bar.pdf
      bar-01.pdf  bar-03.pdf  bar-05.pdf  bar-07.pdf  bar-09.pdf  bar-11.pdf
      bar-02.pdf  bar-04.pdf  bar-06.pdf  bar-08.pdf  bar-10.pdf  bar-12.pdf

    An optional string %d in OUTPDF is replaced with a zero-padded page number.

      $ qpdf --split-pages twopage.pdf o%dut.pdf
      o1ut.pdf  o2ut.pdf
  • --empty: start from an empty PDF instead of an input file, so OUTPDF doesn’t inherit INPDF’s metadata

  • --pages [INPDF] [RANGE] [[INPDF2] [RANGE]...] --: extract pages

    • RANGE: apart from the basic 1-3,5, you may reverse selection order.

    • INPDF*: multiple files don’t need to be separated by --. (The command in Mankier is wrong.)

        $ qpdf --empty --pages input1.pdf 1,6-8 input2.pdf -- output.pdf


  1. Official manual
  2. Mankier


An RPM cheat sheet on Cyberciti.


Remote synchronization. Supports many protocols like FTP, SSH, etc.

General usage: rsync [FLAGS] [SRC] [DEST].

  • SRC: source
    • files
    • folders
      • with trailing /: send contents inside the folder
      • without trailing /: send whole folder
  • DEST: destination, a file path

Flags explanations: (in alphabetical order)

  • -a: archive, preserve file attributes, permissions and [acm]times

  • -u: update, only send file(s) in SRC newer than DEST

  • -v: verbose, print every operation to STDOUT

  • -z: zip, reduce transferred data.

  • --exclude='[SUBDIR]': exclude [SUBDIR]

    To quickly test a Hugo theme without git submodule, I use this command.

      $ rsync -auv --exclude='themes' --exclude-from='.git*' ~/bhdemo-public/ ~/bhdemo
  • --exclude-from='[SUBDIR-LIST]': exclude contents listed in the file [SUBDIR-LIST].

References: Six rsync examples from The Geek Stuff


Record a TTY session into a text file (with one single long line). Requires sudo privileges.

$ sudo screendump 1 > ~/mylog.txt
[sudo] password for vin100:
$ cat ~/mylog.txt

Ubuntu 18.04.1 LTS vin100-LIFEBOOK-LH532 tty1

vin100-LIFEBOOK-LH532 login:



Stream editor

$ sed [FLAG(S)] '[range][cmd]'

Flags explanations: (in alphabetical order)

  • -e: extended mode, use for multiple expressions
  • -i: in place editing
  • -n: no normal output

[range] can refer to a line number ($ meaning the last line), or a scope /[PAT]/. The latter can be used to remove empty lines.

$ git status
On branch master
Your branch is up to date with 'origin/master'.

Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)

        modified:   content/page/bash-commands/
        modified:   content/page/sublime/

no changes added to commit (use "git add" and/or "git commit -a")
$ git status | sed '/^$/d'
On branch master
Your branch is up to date with 'origin/master'.
Changes not staged for commit:
  (use "git add <file>..." to update what will be committed)
  (use "git checkout -- <file>..." to discard changes in working directory)
        modified:   content/page/bash-commands/
        modified:   content/page/sublime/
no changes added to commit (use "git add" and/or "git commit -a")

A [range] can be inverted by !, so sed '$!d' works like tail -1.

Some common [cmd]:

  • a: append
  • d: delete
  • n: next: clear pattern space (PS) and go to next line
  • p: print
  • {...}: can be used with pattern like /PAT/{n;p} for conditional operations.

[cmd]’s are separated by semicolon ;.

Some less common [cmd]:

  • N: go to next line, but append newline ↵ and next line to PS, useful for multi-line regex match.
  • q: quit, allows setting status number
  • x: exchange PS and hold space (HS), can be used to detect if the text has been changed.
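As a sketch of the /PAT/{...} form above, the block /name:/{n;p;} runs n then p only on lines matching the pattern, i.e. it prints the line after each match (input text invented for the demo; the trailing ; before } keeps it portable to non-GNU seds):

```shell
$ printf 'name:\nAlice\nage:\n30\n' | sed -n '/name:/{n;p;}'
Alice
```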


I’ve added Alfred Klomp’s simple Ghostscript wrapper to ~/bin, which has been added to $PATH. The > is optional. The third argument indicates the quality; it defaults to 90, which is barely enough for pure-text documents.

$ ./ foo.pdf > bar.pdf
$ ./ foo.pdf bar.pdf 150

quality/size tradeoff:

  • 150 suffices for text documents in emails 📧.
  • 200 suffices for documents with text and photos.
  • 300 for usual storage 💾
  • 450 / 600 for superior quality.


Shuffle input. (no repetition by default)

Flags explanations: (in alphabetical order)

  • -i: input range

      $ shuf -i 1-10 -n 3
  • -n: output number

  • -r: allow repetition


Start an idle process for n seconds, where n is the first argument.

Remarks: I can’t say that the shell is suspended, despite the apparent effect, as appending & to the command allows it to run in the background.


Sort input file (in ascending alphabetical order by default).

  • -k[n]: sort according to column [n]
  • -n: use numeric order instead
  • -r: reverse order
  • -u: preserve only unique lines after sort
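A small sketch combining the flags (made-up input; without -n, the first line would sort as "10" < "2" lexicographically):

```shell
$ printf '10\n2\n33\n2\n' | sort -n -u
2
10
33
$ printf 'b 2\na 1\nc 3\n' | sort -k2 -n -r
c 3
b 2
a 1
```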


Secure shell: access remote desktop on the network through an encrypted “tunnel”.

  • simplest syntax: ssh [USER]@[DOMAIN]

      $ ssh vin100@

    The shell then prompts you for [USER]’s password. Upon success, you’ll be logged in as [USER].

  • one-shot command: ssh [USER]@[DOMAIN] [CMD]

      $ ssh vin100@ ls ~/quickstart
      archetypes  config.toml  content  layouts  public  static  themes


Generate an SSH key pair. They can be used for remote SSH access and Git service providers.

  • -t: algorithm used (rsa/ed25519/dsa/ecdsa). GitLab documentation suggests that one should favor ED25519 if possible. This encryption algorithm is currently supported by GitHub and GitLab.
  • -C: comment, say email address
  • -b: number of bits of the generated key (n/a for ED25519). A length of 4096 is recommended.
  • -f: specify the input file
  • -y: read input OpenSSH private key and display the OpenSSH public key
  • -o: output private key in a newer, more secure format. (ED25519 keys already use the new format, so there’s no need for this flag.)
  • -l: print the fingerprint of the (public/private) key file. (Public/private gives the same result.)
    • -v: show the ASCII art representation instead.
    • -f: see above. If missing, prompt for file path.
    • -E: specify hash algorithm (sha256 (default)/sha1/md5)
  • -p: change passphrase. (-P and -N are omitted, so that the passphrases won’t be logged.)


Generate an RSA key pair.

$ ssh-keygen -t rsa

Generate an RSA key pair for a Git service provider.

$ ssh-keygen -o -t rsa -b 4096 -C ""

Generate an ED25519 key pair for a Git service provider.

$ ssh-keygen -t ed25519 -C ""

Change passphrase.

$ ssh-keygen -p -o -f <keyname>

Generate the corresponding public RSA key from a private one.

$ ssh-keygen -yf ~/.ssh/id_rsa > ./

Get the MD5 hash of the fingerprint of a public SSH key. (Displayed in GitHub/GitLab’s account settings)

$ ssh-keygen -E md5 -lf ~/.ssh/id_ed25519

Safety precautions:

  1. Encrypting the private key with a secret passphrase can greatly enhance its security.
  2. Never disclose your private key to others, including any remote service/server/storage. (e.g. cloud/USB storage, email) The reason is that a private key represents an identity. If your friend/relative needs remote SSH access, ask them to create a new key pair.
  3. Use one key pair per device, so that intrusion of a device doesn’t compromise the security of other devices.
  4. Using one key pair per remote server won’t enhance security, because the only objects to be protected are the private key(s). If ~/.ssh is unluckily intruded, all private keys will be exposed.


Show info of TTY.

  • Return number of rows and columns

      $ stty size
      43 132
  • Set terminal size

      $ stty cols 80
      $ stty rows 32

See also: tty


Print last n lines of a file or STDIN. (n = 10 by default) Opposite of head

  • -c [n]: output last n bytes. Useful for EOL detection at EOF.
  • -n [m]: output last m lines
  • -z: use null character \0 instead of newline ↵ as line delimiter. It can be used to process the output of find ... -print0.


Archive directories into a single file called tarball. Options:

  • c: create tarball
  • f: specify filename of the output tarball
  • v: verbose
  • x: extract

Common compression options:

  • j: bzip2
  • J: xz
  • z: gzip
  • Z: compress
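Putting the options together, a small sketch that packs a folder into a gzip-compressed tarball and unpacks it elsewhere (directory and file names invented; -C, a standard tar flag not listed above, changes into the target directory before extracting):

```shell
# Create a gzip-compressed tarball, then extract it into another folder.
set -e
tmp=$(mktemp -d); cd "$tmp"
mkdir -p site/content
echo hello > site/content/post.txt
tar czf site.tar.gz site            # c=create, z=gzip, f=output filename
mkdir restore
tar xzf site.tar.gz -C restore      # x=extract
cat restore/site/content/post.txt   # → hello
```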


Redirect command output into STDOUT and a file. To be used for inspecting and capturing command output simultaneously.

$ ls | tee eles
$ cat eles

> captures the command output without showing it (except errors).


Test expressions or file types. Two types of syntax are possible.

$ test {EXP}
$ [ [OPTIONS] {EXP} ]

See the last part of bash for chaining commands with && and ||.

  1. string comparison

     $ [ STRING1 = STRING2 ] && echo true || echo false
     $ [ STRING1 = STRING1 ] && echo true || echo false
     $ [ STRING1 != STRING2 ] && echo true || echo false
     $ [ -n a ] && echo true || echo false  # test string with nonzero length
     $ [ -z "" ] && echo true || echo false  # test string with zero length

    Note that to compare the equality of two strings, a pair of single/double quotes ''/"" aren’t necessary, and only one equality sign = is needed.

    For the last two commands, the spaces are very important, and they can’t be omitted.

    The option -n a is equivalent to a. An application of this is the detection of newline \n at EOF.

     $ test `tail -c1 .gitignore` && echo 'missing EOF!' || echo 'has EOF'
     has EOF
     $ test `tail -c1 themes/beautifulhugo/static/js/katex.min.js` && \
     echo 'missing EOF' || echo 'has EOF'
     missing EOF

    Explanation: (thanks to Oguz Ismail’s answer on Stack Overflow)

    1. The command tail -c1 {FILE} inside the pair of backticks is executed in a subshell environment.
    2. tail -c1
      • if {FILE} has a \n at the EOF: gives a trailing \n;
      • otherwise: gives the last (non-\n) character.
    3. When the command substitution ... gets replaced by STDOUT of tail -c1 in step #2, any trailing \n is trimmed off, resulting in
      • first case in #2: an empty string
      • second case in #2: a nonempty string (the character is untouched)
    4. test evaluates the string with the -n option.
      • first case in #2: false (omitted {EXP} defaults to false)
        1. jump to ||
        2. execute echo 'has EOF'
      • second case in #2: true
        1. proceed to &&
        2. execute echo 'missing EOF'. This command exits normally and it gives the status code zero.
        3. meet || and terminate.
  2. compare integers

     $ [ 6 -eq 4 ] && echo equal || echo not equal
     not equal

    Possible binary operators are:

    • -eq: =
    • -ne: ≠
    • -gt: >
    • -lt: <
    • -ge: ⩾
    • -le: ⩽
  3. compare two files’ modification date (-nt, -ot)

     $ [ .gitignore -ot config.toml ] && echo older || echo newer
  4. test existence of files

    • -d: {FILE} exists and is a directory
    • -e: {FILE} exists
    • -f: {FILE} exists and is a regular file
    • -h/-L: {FILE} exists and is a symbolic link

    In case that {FILE} is a symbolic link, test dereferences it except for -h and -L (because these two options test the existence of symbolic links, so there’s no point dereferencing the links).
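A quick sketch of these file tests in a scratch directory (all names invented):

```shell
set -e
tmp=$(mktemp -d); cd "$tmp"
touch regular
mkdir folder
ln -s regular link
[ -e regular ] && echo exists        # → exists
[ -f regular ] && echo regular       # → regular
[ -d folder ]  && echo directory     # → directory
[ -h link ]    && echo symlink       # → symlink
[ -f link ]    && echo dereferenced  # link is followed, so also a file
[ -e missing ] || echo absent        # → absent
```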


Record the time taken for running a command.

General usage: time [command]

$ time sleep 5

real    0m5.002s
user    0m0.002s
sys     0m0.000s


Output absolute file path of the current terminal. (no argument needed)

  • GUI: /dev/pts/1
  • TTYn: /dev/tty[n]

See also: stty


Translate or remove certain characters. Like other filters (e.g. cat, grep, etc), it reads STDIN and writes to STDOUT.

General usage:

  • Replace character

      $ tr ' ' '_'
      foo bar
  • Delete character

      $ tr -d ' '
      foo bar


Output “locally” unique lines, i.e. remove neighbouring duplicate lines of input file(s) or STDIN.

$ printf '1\n1\n1\n2\n' | uniq
$ printf '1\n2\n1\n2\n' | uniq

See also: sort -u


Improved text editor based on vi, which is preinstalled on every GNU/Linux and FreeBSD distro. (even on macOS)

  • -R: read-only mode
Normal mode key Function
<C-b> Scroll one page backward
<C-f> Scroll one page forward
<C-d> Scroll half page down
<C-u> Scroll half page up
g Jump to first line
G Jump to last line
h Move the cursor one character left
j Move the cursor one character down
k Move the cursor one character up
l Move the cursor one character right
/ Forward search
? Backword search
n Next match
N Previous match
i Enter insert mode before the cursor
:q Quit

P.S. It was my favorite editor.


Word count

  1. Use files: output line, word and byte counts, followed by the file name

     $ wc .gitmodules
       4  11 133 .gitmodules
  2. Use STDIN: also show these three counts, but without the file name

     $ cat .gitmodules | wc
           4      11     133
  • c: byte count
  • w: word count
  • l: line count


From web, get stuff (with its resources).

  • -c: works like -C in curl
  • -N/--timestamping: retrieve file only if the server’s copy is newer than the local one. (Thanks to Steven Penny’s answer.)
  • -O: works like -o in curl
  • -q: quiet, don’t output to STDOUT

To get the external IP address, try

$ wget -qO- ; echo

See also: curl

You may refer to the TLDR page for more useful commands like

  1. Download page with its resources
  2. Download full website
  3. Download recursively a remote folder
  4. Download via authenticated FTP


Rearrange and/or execute arguments.

Output of ls without the -l flag is in ascending order, column-wise.

$ ls -A .git
branches        description  hooks  logs     ORIG_HEAD
COMMIT_EDITMSG  FETCH_HEAD   index  modules  packed-refs
config          HEAD         info   objects  refs

xargs -n [num] treats input as arguments delimited by space ␣, tab ↹ and/or newline ↵. It outputs [num] arguments delimited by space ␣ on each line.

$ ls -A .git | xargs -n 3
branches COMMIT_EDITMSG config
description FETCH_HEAD HEAD
hooks index info
logs modules objects
ORIG_HEAD packed-refs refs

Observe the difference of the output below with the first block in this section.

$ ls -A .git | xargs -n 3 | xargs -n 5
branches COMMIT_EDITMSG config description FETCH_HEAD
HEAD hooks index info logs
modules objects ORIG_HEAD packed-refs refs


Take screenshot of graphical desktop from TTY. (requires sudo privileges)

This can be useful for capturing the login screen.

The following only works for LightDM.

I’ve refined Neroshan’s command on Ask Ubuntu into a shell script.

# USAGE: ./ [file-name]

chvt 7 # On Xubuntu 18.04
#chvt 1 # On Ubuntu 18.04
DISPLAY=:0 XAUTHORITY=/var/run/lightdm/root/:0 xwd -root -out ~/screenshot.xwd
convert ~/screenshot.xwd $1
rm ~/screenshot.xwd
chvt `tty | sed 's:/dev/tty::'`

This script requires one single argument: output file name (with extension name), which is passed to $1. The idea is simple.

  1. Switch to GUI from CLI (TTY1–TTY6 on Xubuntu 18.04; TTY2–TTY7 on Ubuntu 18.04)
  2. Add necessary shell variables. (Adapt it to GDM or other display manager)
  3. Create a temporary XWD file.
  4. Convert this file to a file with your specified file name.
  5. Remove the temporary XWD file.
  6. Switch back to CLI.
Screenshot taken by xwd with the above script from a TTY on Xubuntu 18.04, showing an error after login.

(Last modified on October 20, 2022)