Advanced Bash-Scripting Guide (PART 5)

Part 5. Advanced Topics

   At this point, we are ready to delve into certain of the difficult
   and unusual aspects of scripting. Along the way, we will attempt to
   "push the envelope" in various ways and examine boundary conditions
   (what happens when we move into uncharted territory?).

   Table of Contents
   18. Regular Expressions

        18.1. A Brief Introduction to Regular Expressions
        18.2. Globbing

   19. Here Documents

        19.1. Here Strings

   20. I/O Redirection

        20.1. Using exec
        20.2. Redirecting Code Blocks
        20.3. Applications

   21. Subshells
   22. Restricted Shells
   23. Process Substitution
   24. Functions

        24.1. Complex Functions and Function Complexities
        24.2. Local Variables
        24.3. Recursion Without Local Variables

   25. Aliases
   26. List Constructs
   27. Arrays
   28. Indirect References
   29. /dev and /proc

        29.1. /dev
        29.2. /proc

   30. Network Programming
   31. Of Zeros and Nulls
   32. Debugging
   33. Options
   34. Gotchas
   35. Scripting With Style

        35.1. Unofficial Shell Scripting Stylesheet

   36. Miscellany

        36.1. Interactive and non-interactive shells and scripts
        36.2. Shell Wrappers
        36.3. Tests and Comparisons: Alternatives
        36.4. Recursion: a script calling itself
        36.5. "Colorizing" Scripts
        36.6. Optimizations
        36.7. Assorted Tips
        36.8. Security Issues
        36.9. Portability Issues
        36.10. Shell Scripting Under Windows

   37. Bash, versions 2, 3, and 4

        37.1. Bash, version 2
        37.2. Bash, version 3
        37.3. Bash, version 4

Chapter 18. Regular Expressions


   . . . the intellectual activity associated with software development
   is largely one of gaining insight.

   --Stowe Boyd

   To fully utilize the power of shell scripting, you need to master
   Regular Expressions. Certain commands and utilities commonly used in
   scripts, such as grep, expr, sed and awk, interpret and use REs. As
   of version 3, Bash has acquired its own RE-match operator: =~.
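The =~ operator can be sketched as follows (the variable names here are illustrative only); the matched substring lands in the BASH_REMATCH array:

```shell
#!/bin/bash
# Demonstrates the Bash =~ RE-match operator (requires Bash version 3 or later).

input="abc123"

if [[ $input =~ [0-9]+ ]]        # Does $input contain one or more digits?
then
  echo "Match: ${BASH_REMATCH[0]}"   # Portion of the string that matched.
fi                                   # Prints:  Match: 123
```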

18.1. A Brief Introduction to Regular Expressions

   An expression is a string of characters. Those characters having an
   interpretation above and beyond their literal meaning are called
   metacharacters. A quote symbol, for example, may denote speech by a
   person, ditto, or a meta-meaning [93] for the symbols that follow.
   Regular Expressions are sets of characters and/or metacharacters that
   match (or specify) patterns.

   A Regular Expression contains one or more of the following:

     * A character set. These are the characters retaining their literal
       meaning. The simplest type of Regular Expression consists only of
       a character set, with no metacharacters.
     * An anchor. These designate (anchor) the position in the line of
       text that the RE is to match. For example, ^, and $ are anchors.
     * Modifiers. These expand or narrow (modify) the range of text the
        RE is to match. Modifiers include the asterisk, brackets, and the
        backslash.

   The main uses for Regular Expressions (REs) are text searches and
   string manipulation. An RE matches a single character or a set of
   characters -- a string or a part of a string.

     * The asterisk -- * -- matches any number of repeats of the
       character string or RE preceding it, including zero instances.
       "1133*" matches 11 + one or more 3's: 113, 1133, 1133333, and so
     * The dot -- . -- matches any one character, except a newline. [94]
       "13." matches 13 + at least one of any character (including a
       space): 1133, 11333, but not 13 (additional character missing).
        See Example 16-18 for a demonstration of dot single-character
        matching.
     * The caret -- ^ -- matches the beginning of a line, but sometimes,
       depending on context, negates the meaning of a set of characters
       in an RE.
      * The dollar sign -- $ -- at the end of an RE matches the end of a
        line.
       "XXX$" matches XXX at the end of a line.
       "^$" matches blank lines.
     * Brackets -- [...] -- enclose a set of characters to match in a
       single RE.
       "[xyz]" matches any one of the characters x, y, or z.
       "[c-n]" matches any one of the characters in the range c to n.
       "[B-Pk-y]" matches any one of the characters in the ranges B to P
       and k to y.
       "[a-z0-9]" matches any single lowercase letter or any digit.
       "[^b-d]" matches any character except those in the range b to d.
       This is an instance of ^ negating or inverting the meaning of the
        following RE (taking on a role similar to ! in a different
        context).
       Combined sequences of bracketed characters match common word
       patterns. "[Yy][Ee][Ss]" matches yes, Yes, YES, yEs, and so
       forth. "[0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9][0-9][0-9]" matches
       any Social Security number.
     * The backslash -- \ -- escapes a special character, which means
       that character gets interpreted literally (and is therefore no
       longer special).
       A "\$" reverts back to its literal meaning of "$", rather than
       its RE meaning of end-of-line. Likewise a "\\" has the literal
       meaning of "\".
     * Escaped "angle brackets" -- \<...\> -- mark word boundaries.
       The angle brackets must be escaped, since otherwise they have
       only their literal character meaning.
       "\<the\>" matches the word "the," but not the words "them,"
       "there," "other," etc.

bash$ cat textfile
This is line 1, of which there is only one instance.
 This is the only instance of line 2.
 This is line 3, another line.
 This is line 4.

bash$ grep 'the' textfile
This is line 1, of which there is only one instance.
 This is the only instance of line 2.
 This is line 3, another line.

bash$ grep '\<the\>' textfile
This is the only instance of line 2.

   The only way to be certain that a particular RE works is to test it.

TEST FILE: tstfile                          # No match.
                                            # No match.
Run   grep "1133*"  on this file.           # Match.
                                            # No match.
                                            # No match.
This line contains the number 113.          # Match.
This line contains the number 13.           # No match.
This line contains the number 133.          # No match.
This line contains the number 1133.         # Match.
This line contains the number 113312.       # Match.
This line contains the number 1112.         # No match.
This line contains the number 113312312.    # Match.
This line contains no numbers at all.       # No match.

bash$ grep "1133*" tstfile
Run   grep "1133*"  on this file.           # Match.
 This line contains the number 113.          # Match.
 This line contains the number 1133.         # Match.
 This line contains the number 113312.       # Match.
 This line contains the number 113312312.    # Match.

     * Extended REs. Additional metacharacters added to the basic set.
       Used in egrep, awk, and Perl.
     * The question mark -- ? -- matches zero or one of the previous RE.
       It is generally used for matching single characters.
     * The plus -- + -- matches one or more of the previous RE. It
        serves a role similar to the *, but does not match zero
        occurrences.

# GNU versions of sed and awk can use "+",
# but it needs to be escaped.

echo a111b | sed -ne '/a1\+b/p'
echo a111b | grep 'a1\+b'
echo a111b | gawk '/a1+b/'
# All of above are equivalent.

# Thanks, S.C.

     * Escaped "curly brackets" -- \{ \} -- indicate the number of
       occurrences of a preceding RE to match.
       It is necessary to escape the curly brackets since they have only
       their literal character meaning otherwise. This usage is
       technically not part of the basic RE set.
       "[0-9]\{5\}" matches exactly five digits (characters in the range
       of 0 to 9).


   Curly brackets are not available as an RE in the "classic" (non-POSIX
   compliant) version of awk. However, the GNU extended version of awk,
   gawk, has the --re-interval option that permits them (without being
   escaped).

bash$ echo 2222 | gawk --re-interval '/2{3}/'
2222

   Perl and some egrep versions do not require escaping the curly
   brackets.
     * Parentheses -- ( ) -- enclose a group of REs. They are useful
        with the following "|" operator and in substring extraction using
        expr.
      * The -- | -- "or" RE operator matches any of a set of alternate
        characters.

bash$ egrep 're(a|e)d' misc.txt
People who read seem to be better informed than those who do not.
 The clarinet produces sound by the vibration of its reed.


   Some versions of sed, ed, and ex support escaped versions of the
   extended Regular Expressions described above, as do the GNU
   utilities.

     * POSIX Character Classes. [:class:]
       This is an alternate method of specifying a range of characters
       to match.
     * [:alnum:] matches alphabetic or numeric characters. This is
       equivalent to A-Za-z0-9.
      * [:alpha:] matches alphabetic characters. This is equivalent to
        A-Za-z.
     * [:blank:] matches a space or a tab.
     * [:cntrl:] matches control characters.
     * [:digit:] matches (decimal) digits. This is equivalent to 0-9.
     * [:graph:] (graphic printable characters). Matches characters in
       the range of ASCII 33 - 126. This is the same as [:print:],
       below, but excluding the space character.
     * [:lower:] matches lowercase alphabetic characters. This is
       equivalent to a-z.
     * [:print:] (printable characters). Matches characters in the range
       of ASCII 32 - 126. This is the same as [:graph:], above, but
       adding the space character.
      * [:space:] matches whitespace characters (space and horizontal
        tab).
     * [:upper:] matches uppercase alphabetic characters. This is
       equivalent to A-Z.
      * [:xdigit:] matches hexadecimal digits. This is equivalent to
        0-9A-Fa-f.


   POSIX character classes generally require quoting or double brackets
   ([[ ]]).

bash$ grep [[:digit:]] test.file

# ...
if [[ $arow =~ [[:digit:]] ]]   #  Numerical input?
then       #  POSIX char class
  if [[ $acol =~ [[:alpha:]] ]] # Number followed by a letter? Illegal!
# ...
# From example script.

       These character classes may even be used with globbing, to a
       limited extent.

bash$ ls -l ?[[:digit:]][[:digit:]]?
-rw-rw-r--    1 bozo  bozo         0 Aug 21 14:47 a33b

        POSIX character classes are used in Example 16-21 and Example
        16-22.

   Sed, awk, and Perl, used as filters in scripts, take REs as arguments
   when "sifting" or transforming files or I/O streams. See Example A-12
   and Example A-16 for illustrations of this.

   The standard reference on this complex topic is Friedl's Mastering
   Regular Expressions. Sed & Awk, by Dougherty and Robbins, also gives
   a very lucid treatment of REs. See the Bibliography for more
   information on these books.

18.2. Globbing

   Bash itself cannot recognize Regular Expressions. Inside scripts, it
   is commands and utilities -- such as sed and awk -- that interpret
   REs.

   Bash does carry out filename expansion [95] -- a process known as
   globbing -- but this does not use the standard RE set. Instead,
   globbing recognizes and expands wild cards. Globbing interprets the
   standard wild card characters [96] -- * and ?, character lists in
   square brackets, and certain other special characters (such as ^ for
   negating the sense of a match). There are important limitations on
   wild card characters in globbing, however. Strings containing * will
   not match filenames that start with a dot, as, for example, .bashrc.
   [97] Likewise, the ? has a different meaning in globbing than as part
   of an RE.

bash$ ls -l
total 2
 -rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 a.1
 -rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 b.1
 -rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 c.1
 -rw-rw-r--    1 bozo  bozo       466 Aug  6 17:48
 -rw-rw-r--    1 bozo  bozo       758 Jul 30 09:02 test1.txt

bash$ ls -l t?.sh
-rw-rw-r--    1 bozo  bozo       466 Aug  6 17:48

bash$ ls -l [ab]*
-rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 a.1
 -rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 b.1

bash$ ls -l [a-c]*
-rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 a.1
 -rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 b.1
 -rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 c.1

bash$ ls -l [^ab]*
-rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 c.1
 -rw-rw-r--    1 bozo  bozo       466 Aug  6 17:48
 -rw-rw-r--    1 bozo  bozo       758 Jul 30 09:02 test1.txt

bash$ ls -l {b*,c*,*est*}
-rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 b.1
 -rw-rw-r--    1 bozo  bozo         0 Aug  6 18:42 c.1
 -rw-rw-r--    1 bozo  bozo       758 Jul 30 09:02 test1.txt

   Bash performs filename expansion on unquoted command-line arguments.
   The echo command demonstrates this.

bash$ echo *
a.1 b.1 c.1 test1.txt

bash$ echo t*
test1.txt

bash$ echo t?.sh


   It is possible to modify the way Bash interprets special characters
   in globbing. A set -f command disables globbing, and the nocaseglob
   and nullglob options to shopt change globbing behavior.
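A minimal sketch of the nullglob option, run in an empty scratch directory (the *.txt pattern is arbitrary):

```shell
#!/bin/bash
# With nullglob unset (the default), a pattern matching no files stays literal.
# With nullglob set, the pattern expands to nothing at all.

cd "$(mktemp -d)"        # Work in an empty scratch directory.

echo *.txt               # Prints the literal string:  *.txt

shopt -s nullglob
echo *.txt               # Prints an empty line.

shopt -u nullglob        # Restore the default behavior.
```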

   See also Example 11-4.

Chapter 19. Here Documents


   Here and now, boys.

   --Aldous Huxley, Island

   A here document is a special-purpose code block. It uses a form of
   I/O redirection to feed a command list to an interactive program or a
   command, such as ftp, cat, or the ex text editor.

COMMAND <<InputComesFromHERE
...
...
...
InputComesFromHERE

   A limit string delineates (frames) the command list. The special
   symbol << designates the limit string. This has the effect of
   redirecting the output of a file into the stdin of the program or
   command. It is similar to interactive-program < command-file, where
   command-file contains
command #1
command #2

   The here document alternative looks like this:

interactive-program <<LimitString
command #1
command #2
...
LimitString

   Choose a limit string sufficiently unusual that it will not occur
   anywhere in the command list and confuse matters.

   Note that here documents may sometimes be used to good effect with
   non-interactive utilities and commands, such as, for example, wall.

   Example 19-1. broadcast: Sends message to everyone logged in

wall <<zzz23EndOfMessagezzz23
E-mail your noontime orders for pizza to the system administrator.
    (Add an extra dollar for anchovy or mushroom topping.)
# Additional message text goes here.
# Note: 'wall' prints comment lines.
zzz23EndOfMessagezzz23

# Could have been done more efficiently by
#         wall <message-file
#  However, embedding the message template in a script
#+ is a quick-and-dirty one-off solution.


   Even such unlikely candidates as the vi text editor lend themselves
   to here documents.

   Example 19-2. dummyfile: Creates a 2-line dummy file

# Noninteractive use of 'vi' to edit a file.
# Emulates 'sed'.


if [ -z "$1" ]
  echo "Usage: `basename $0` filename"
  exit $E_BADARGS


# Insert 2 lines in file, then save.
#--------Begin here document-----------#
vi $TARGETFILE <<x23LimitStringx23
i
This is line 1 of the example file.
This is line 2 of the example file.
^[
ZZ
x23LimitStringx23
#----------End here document-----------#

#  Note that ^[ above is a literal escape
#+ typed by Control-V <Esc>.

#  Bram Moolenaar points out that this may not work with 'vim'
#+ because of possible problems with terminal interaction.


   The above script could just as effectively have been implemented with
   ex, rather than vi. Here documents containing a list of ex commands
   are common enough to form their own category, known as ex scripts.
#  Replace all instances of "Smith" with "Jones"
#+ in files with a ".txt" filename suffix.


ORIGINAL=Smith
REPLACEMENT=Jones

for word in $(fgrep -l $ORIGINAL *.txt)
do
  # -------------------------------------
  ex $word <<EOF
  :%s/$ORIGINAL/$REPLACEMENT/g
  :wq
EOF
  # :%s is the "ex" substitution command.
  # :wq is write-and-quit.
  # -------------------------------------
done

   Analogous to "ex scripts" are cat scripts.

   Example 19-3. Multi-line message using cat

#  'echo' is fine for printing single line messages,
#+  but somewhat problematic for message blocks.
#   A 'cat' here document overcomes this limitation.

cat <<End-of-message
This is line 1 of the message.
This is line 2 of the message.
This is line 3 of the message.
This is line 4 of the message.
This is the last line of the message.
End-of-message

#  Replacing line 7, above, with
#+   cat > $Newfile <<End-of-message
#+       ^^^^^^^^^^
#+ writes the output to the file $Newfile, rather than to stdout.

exit 0

# Code below disabled, due to "exit 0" above.

# S.C. points out that the following also works.
echo "-------------------------------------
This is line 1 of the message.
This is line 2 of the message.
This is line 3 of the message.
This is line 4 of the message.
This is the last line of the message.
-------------------------------------"
# However, text may not include double quotes unless they are escaped.

   The - option to mark a here document limit string (<<-LimitString)
   suppresses leading tabs (but not spaces) in the output. This may be
   useful in making a script more readable.

   Example 19-4. Multi-line message, with tabs suppressed
# Same as previous example, but...

#  The - option to a here document <<-
#+ suppresses leading tabs in the body of the document,
#+ but *not* spaces.

cat <<-ENDOFMESSAGE
	This is line 1 of the message.
	This is line 2 of the message.
	This is line 3 of the message.
	This is line 4 of the message.
	This is the last line of the message.
ENDOFMESSAGE
# The output of the script will be flush left.
# Leading tab in each line will not show.

# Above 5 lines of "message" prefaced by a tab, not spaces.
# Spaces not affected by   <<-  .

# Note that this option has no effect on *embedded* tabs.

exit 0

   A here document supports parameter and command substitution. It is
   therefore possible to pass different parameters to the body of the
   here document, changing its output accordingly.

   Example 19-5. Here document with replaceable parameters
# Another 'cat' here document, using parameter substitution.

# Try it with no command-line parameters,   ./scriptname
# Try it with one command-line parameter,   ./scriptname Mortimer
# Try it with one two-word quoted command-line parameter,
#                           ./scriptname "Mortimer Jones"

CMDLINEPARAM=1     #  Expect at least one command-line parameter.

if [ $# -ge $CMDLINEPARAM ]
then
  NAME=$1          #  If more than one command-line param,
                   #+ then just take the first.
else
  NAME="John Doe"  #  Default, if no command-line parameter.
fi

RESPONDENT="the author of this fine script"

cat <<Endofmessage

Hello, there, $NAME.
Greetings to you, $NAME, from $RESPONDENT.

# This comment shows up in the output (why?).
Endofmessage


# Note that the blank lines show up in the output.
# So does the comment.


   This is a useful script containing a here document with parameter
   substitution.

   Example 19-6. Upload a file pair to Sunsite incoming directory

#  Upload file pair (Filename.lsm, Filename.tar.gz)
#+ to incoming directory at Sunsite/UNC (ibiblio.org).
#  Filename.tar.gz is the tarball itself.
#  Filename.lsm is the descriptor file.
#  Sunsite requires "lsm" file, otherwise will bounce contributions.


if [ -z "$1" ]
  echo "Usage: `basename $0` Filename-to-upload"
  exit $E_ARGERROR

Filename=`basename $1`           # Strips pathname out of file name.

Server="ibiblio.org"
Directory="incoming/Linux"
#  These need not be hard-coded into script,
#+ but may instead be changed to command-line argument.

Password="your.e-mail.address"   # Change above to suit.

ftp -n $Server <<End-Of-Session
# -n option disables auto-logon

user anonymous "$Password"       #  If this doesn't work, then try:
                                 #  quote user anonymous "$Password"
bell                             # Ring 'bell' after each file transfer.
cd $Directory
put "$Filename.lsm"
put "$Filename.tar.gz"

exit 0

   Quoting or escaping the "limit string" at the head of a here document
   disables parameter substitution within its body. The reason for this
   is that quoting/escaping the limit string effectively escapes the $,
   `, and \ special characters, and causes them to be interpreted
   literally. (Thank you, Allen Halsey, for pointing this out.)

   Example 19-7. Parameter substitution turned off
#  A 'cat' here-document, but with parameter substitution disabled.

NAME="John Doe"
RESPONDENT="the author of this fine script"

cat <<'Endofmessage'

Hello, there, $NAME.
Greetings to you, $NAME, from $RESPONDENT.

Endofmessage

#   No parameter substitution when the "limit string" is quoted or escaped.
#   Either of the following at the head of the here document would have
#+  the same effect.
#   cat <<"Endofmessage"
#   cat <<\Endofmessage

#   And, likewise:

cat <<"SpecialCharTest"

Directory listing would follow
if limit string were not quoted.
`ls -l`

Arithmetic expansion would take place
if limit string were not quoted.
$((5 + 3))

A single backslash would echo
if limit string were not quoted.
\\

SpecialCharTest


   Disabling parameter substitution permits outputting literal text.
   Generating scripts or even program code is one use for this.

   Example 19-8. A script that generates another script
# Based on an idea by Albert Reiner.

OUTFILE=generated.sh         # Name of the file to generate.

# -----------------------------------------------------------
# 'Here document containing the body of the generated script.
(
cat <<'EOF'
#!/bin/bash

echo "This is a generated shell script."
#  Note that since we are inside a subshell,
#+ we can't access variables in the "outside" script.

echo "Generated file will be named: $OUTFILE"
#  Above line will not work as normally expected
#+ because parameter expansion has been disabled.
#  Instead, the result is literal output.

a=7
b=3

let "c = $a * $b"
echo "c = $c"

exit 0
EOF
) > $OUTFILE
# -----------------------------------------------------------

#  Quoting the 'limit string' prevents variable expansion
#+ within the body of the above 'here document.'
#  This permits outputting literal strings in the output file.

if [ -f "$OUTFILE" ]
  chmod 755 $OUTFILE
  # Make the generated file executable.
  echo "Problem in creating file: \"$OUTFILE\""

#  This method can also be used for generating
#+ C programs, Perl programs, Python programs, Makefiles,
#+ and the like.

exit 0

   It is possible to set a variable from the output of a here document.
   This is actually a devious form of command substitution.
variable=$(cat <<SETVAR
This variable
runs over multiple lines.
SETVAR
)

echo "$variable"

   A here document can supply input to a function in the same script.

   Example 19-9. Here documents and functions

GetPersonalData ()
{
  read firstname
  read lastname
  read address
  read city
  read state
  read zipcode
} # This certainly looks like an interactive function, but...

# Supply input to the above function.
GetPersonalData <<RECORD001
Bozo
Bozeman
2726 Nondescript Dr.
Baltimore
MD
21226
RECORD001

echo "$firstname $lastname"
echo "$address"
echo "$city, $state $zipcode"

exit 0

   It is possible to use : as a dummy command accepting output from a
   here document. This, in effect, creates an "anonymous" here document.

   Example 19-10. "Anonymous" Here Document

: <<TESTVARIABLES
${HOSTNAME?}${USER?}${MAIL?}  # Print error message if one of the variables not set.
TESTVARIABLES

exit $?


   A variation of the above technique permits "commenting out" blocks of
   code.

   Example 19-11. Commenting out a block of code

echo "This line will not echo."
This is a comment line missing the "#" prefix.
This is another comment line missing the "#" prefix.

The above line will cause no error message,
because the Bash interpreter will ignore it.

echo "Exit value of above \"COMMENTBLOCK\" is $?."   # 0
# No error shown.

#  The above technique also comes in useful for commenting out
#+ a block of working code for debugging purposes.
#  This saves having to put a "#" at the beginning of each line,
#+ then having to go back and delete each "#" later.
#  Note that the use of the colon, above, is optional.

echo "Just before commented-out code block."
#  The lines of code between the double-dashed lines will not execute.
#  ===================================================================
: <<DEBUGXXX
for file in *
do
  cat "$file"
done
DEBUGXXX
#  ===================================================================
echo "Just after commented-out code block."

exit 0

#  Note, however, that if a bracketed variable is contained within
#+ the commented-out code block,
#+ then this could cause problems.
#  for example:


  echo "This line will not echo."
  $(rm -rf /tmp/foobar/)
  $(touch my_build_directory/cups/Makefile)

$ sh line 3: foo_bar_bazz: parameter null or not set

# The remedy for this is to strong-quote the 'COMMENTBLOCK' in line 49, above.


# Thank you, Kurt Pfeifle, for pointing this out.


   Yet another twist of this nifty trick makes "self-documenting"
   scripts possible.

   Example 19-12. A self-documenting script
# self-documenting script
# Modification of "".


if [ "$1" = "-h"  -o "$1" = "--help" ]     # Request help.
  echo; echo "Usage: $0 [directory-name]"; echo
  sed --silent -e '/DOCUMENTATIONXX$/,/^DOCUMENTATIONXX$/p' "$0" |
  sed -e '/DOCUMENTATIONXX$/d'; exit $DOC_REQUEST; fi

: <<DOCUMENTATIONXX
List the statistics of a specified directory in tabular format.
The command-line parameter gives the directory to be listed.
If no directory specified or directory specified cannot be read,
then list the current working directory.
DOCUMENTATIONXX


if [ -z "$1" -o ! -r "$1" ]

echo "Listing of "$directory":"; echo
; ls -l "$directory" | sed 1d) | column -t

exit 0

   Using a cat script is an alternate way of accomplishing this.


if [ "$1" = "-h"  -o "$1" = "--help" ]     # Request help.
then                                       # Use a "cat script" . . .
List the statistics of a specified directory in tabular format.
The command-line parameter gives the directory to be listed.
If no directory specified or directory specified cannot be read,
then list the current working directory.


   See also Example A-28, Example A-40, Example A-41, and Example A-42
   for more examples of self-documenting scripts.


   Here documents create temporary files, but these files are deleted
   after opening and are not accessible to any other process.

bash$ bash -c 'lsof -a -p $$ -d0' << EOF
> EOF
lsof    1213 bozo    0r   REG    3,5    0 30386 /tmp/t1213-0-sh (deleted)


   Some utilities will not work inside a here document.


   The closing limit string, on the final line of a here document, must
   start in the first character position. There can be no leading
   whitespace. Trailing whitespace after the limit string likewise
   causes unexpected behavior. The whitespace prevents the limit string
   from being recognized. [98]


echo "----------------------------------------------------------------------"

cat <<LimitString
echo "This is line 1 of the message inside the here document."
echo "This is line 2 of the message inside the here document."
echo "This is the final line of the message inside the here document."
#^^^^Indented limit string. Error! This script will not behave as expected.

echo "----------------------------------------------------------------------"

#  These comments are outside the 'here document',
#+ and should not echo.

echo "Outside the here document."

exit 0

echo "This line had better not echo."  # Follows an 'exit' command.


   Some people very cleverly use a single ! as a limit string. But,
   that's not necessarily a good idea.

# This works.
cat <<!
! Three more exclamations !!!
!

# But . . .
cat <<!
Single exclamation point follows!
# Crashes with an error message.

# However, the following will work.
cat <<EOF
Single exclamation point follows!
EOF
# It's safer to use a multi-character limit string.

   For those tasks too complex for a here document, consider using the
   expect scripting language, which was specifically designed for
   feeding input into interactive programs.

19.1. Here Strings

     A here string can be considered as a stripped-down form of a here
     document.
     It consists of nothing more than COMMAND <<< $WORD,
     where $WORD is expanded and fed to the stdin of COMMAND.

   As a simple example, consider this alternative to the echo-grep
   construction.

# Instead of:
if echo "$VAR" | grep -q txt   # if [[ $VAR = *txt* ]]
# etc.

# Try:
if grep -q "txt" <<< "$VAR"
then   #         ^^^
   echo "$VAR contains the substring sequence \"txt\""
# Thank you, Sebastian Kaminski, for the suggestion.

   Or, in combination with read:

String="This is a string of words."

read -r -a Words <<< "$String"
#  The -a option to "read"
#+ assigns the resulting values to successive members of an array.

echo "First word in String is:    ${Words[0]}"   # This
echo "Second word in String is:   ${Words[1]}"   # is
echo "Third word in String is:    ${Words[2]}"   # a
echo "Fourth word in String is:   ${Words[3]}"   # string
echo "Fifth word in String is:    ${Words[4]}"   # of
echo "Sixth word in String is:    ${Words[5]}"   # words.
echo "Seventh word in String is:  ${Words[6]}"   # (null)
                                                 # Past end of $String.

# Thank you, Francisco Lobo, for the suggestion.

   Example 19-13. Prepending a line to a file
# Add text at beginning of file.
#  Example contributed by Kenny Stauffer,
#+ and slightly modified by document author.


read -p "File: " file   # -p arg to 'read' displays prompt.
if [ ! -e "$file" ]
then   # Bail out if no such file.
  echo "File $file not found."

read -p "Title: " title
cat - $file <<<$title > $file.new

echo "Modified file is $"

exit  # Ends script execution.

  from 'man bash':
  Here Strings
        A variant of here documents, the format is:

                <<<word

        The word is expanded and supplied to the command on its standard input.

  Of course, the following also works:
   sed -e '1i\
   Title: ' $file

   Example 19-14. Parsing a mailbox
#  Script by Francisco Lobo,
#+ and slightly modified and commented by ABS Guide author.
#  Used in ABS Guide with permission. (Thank you!)

# This script will not run under Bash versions < 3.0.

if [ -z "$1" ]
  echo "Usage: $0 mailbox-file"

mbox_grep()  # Parse mailbox file.
{
    declare -i body=0 match=0
    declare -a date sender
    declare mail header value

    while IFS= read -r mail
#         ^^^^                 Reset $IFS.
#  Otherwise "read" will strip leading & trailing space from its input.

    do
       if [[ $mail =~ "^From " ]]   # Match "From" field in message.
       then
          (( body  = 0 ))           # "Zero out" variables.
          (( match = 0 ))
          unset date

       elif (( body ))
       then
            (( match ))
            # echo "$mail"
            # Uncomment above line if you want entire body of message to display.

       elif [[ $mail ]]; then
          IFS=: read -r header value <<< "$mail"
          #                          ^^^  "here string"

          case "$header" in
          [Ff][Rr][Oo][Mm] ) [[ $value =~ "$2" ]] && (( match++ )) ;;
          # Match "From" line.
          [Dd][Aa][Tt][Ee] ) read -r -a date <<< "$value" ;;
          #                                  ^^^
          # Match "Date" line.
          [Rr][Ee][Cc][Ee][Ii][Vv][Ee][Dd] ) read -r -a sender <<< "$value" ;;
          #                                                    ^^^
          # Match IP Address (may be spoofed).
          esac

       else   # Blank line marks the end of the headers.
          (( body++ ))
          (( match  )) &&
          echo "MESSAGE ${date:+of: ${date[*]} }"
       #    Entire $date array             ^
          echo "IP address of sender: ${sender[1]}"
       #    Second field of "Received" line    ^
       fi

    done < "$1" # Redirect the mailbox file into the loop.
}

mbox_grep "$1"  # Send mailbox file to function.

exit $?

# Exercises:
# ---------
# 1) Break the single function, above, into multiple functions,
#+   for the sake of readability.
# 2) Add additional parsing to the script, checking for various keywords.

$ scam_mail
  MESSAGE of Thu, 5 Jan 2006 08:00:56 -0500 (EST)
  IP address of sender:

   Exercise: Find other uses for here strings, such as, for example,
   feeding input to dc.

Chapter 20. I/O Redirection

   There are always three default files [99] open, stdin (the keyboard),
   stdout (the screen), and stderr (error messages output to the
   screen). These, and any other open files, can be redirected.
   Redirection simply means capturing output from a file, command,
   program, script, or even code block within a script (see Example 3-1
   and Example 3-2) and sending it as input to another file, command,
   program, or script.

   Each open file gets assigned a file descriptor. [100] The file
   descriptors for stdin, stdout, and stderr are 0, 1, and 2,
   respectively. For opening additional files, there remain descriptors
   3 to 9. It is sometimes useful to assign one of these additional file
   descriptors to stdin, stdout, or stderr as a temporary duplicate
   link. [101] This simplifies restoration to normal after complex
   redirection and reshuffling (see Example 20-1).
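
   A minimal sketch of that technique (the choice of descriptor 4 and the
   temporary file are arbitrary): a spare descriptor holds a copy of
   stdout while stdout is redirected, then restores it.

```shell
out=$(mktemp)        # Scratch file (name arbitrary).

exec 4>&1            # fd 4 now holds a duplicate of stdout.
exec 1> "$out"       # Redirect stdout to the scratch file.
echo "captured in the file"
exec 1>&4 4>&-       # Restore stdout from fd 4, then close fd 4.

echo "back on the original stdout"
cat "$out"           # captured in the file
rm -f "$out"
```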

   >
      # Redirect stdout to a file.
      # Creates the file if not present, otherwise overwrites it.

      ls -lR > dir-tree.list
      # Creates a file containing a listing of the directory tree.

   : > filename
      # The > truncates file "filename" to zero length.
      # If file not present, creates zero-length file (same effect as 'touch')
      # The : serves as a dummy placeholder, producing no output.

   > filename
      # The > truncates file "filename" to zero length.
      # If file not present, creates zero-length file (same effect as 'touch')
      # (Same result as ": >", above, but this does not work with some shells.

   >>
      # Redirect stdout to a file.
      # Creates the file if not present, otherwise appends to it.

      # Single-line redirection commands (affect only the line they are on):
      # --------------------------------------------------------------------

      # Redirect stdout to file "filename."
      # Redirect and append stdout to file "filename."
      # Redirect stderr to file "filename."
      # Redirect and append stderr to file "filename."
      # Redirect both stdout and stderr to file "filename."
      # This operator is now functional, as of Bash 4, final release.

     # "M" is a file descriptor, which defaults to 1, if not explicitly set.
     # "N" is a filename.
     # File descriptor "M" is redirect to file "N."
     # "M" is a file descriptor, which defaults to 1, if not set.
     # "N" is another file descriptor.


      # Redirecting stdout, one line at a time.

      echo "This statement is sent to the log file, \"$LOGFILE\"." 1>$LOGFILE
      echo "This statement is appended to \"$LOGFILE\"." 1>>$LOGFILE
      echo "This statement is also appended to \"$LOGFILE\"." 1>>$LOGFILE
      echo "This statement is echoed to stdout, and will not appear in \"$LOGF
      # These redirection commands automatically "reset" after each line.

      # Redirecting stderr, one line at a time.

      bad_command1 2>$ERRORFILE       #  Error message sent to $ERRORFILE.
      bad_command2 2>>$ERRORFILE      #  Error message appended to $ERRORFILE.
      bad_command3                    #  Error message echoed to stderr,
                                      #+ and does not appear in $ERRORFILE.
      # These redirection commands also automatically "reset" after each line.

   2>&1
      # Redirects stderr to stdout.
      # Error messages get sent to same place as standard output.
        >>filename 2>&1
            bad_command >>filename 2>&1
            # Appends both stdout and stderr to the file "filename" ...
        2>&1 | [command(s)]
            bad_command 2>&1 | awk '{print $5}'   # found
            # Sends stderr through a pipe.
            # |& was added to Bash 4 as an abbreviation for 2>&1 |.

   i>&j
      # Redirects file descriptor i to j.
      # All output of file pointed to by i gets sent to file pointed to by j.

   >&j
      # Redirects, by default, file descriptor 1 (stdout) to j.
      # All stdout gets sent to file pointed to by j.

   0< filename
    < filename
      # Accept input from a file.
      # Companion command to ">", and often used in combination with it.
      # grep search-word <filename

   [j]<>filename
      #  Open file "filename" for reading and writing,
      #+ and assign file descriptor "j" to it.
      #  If "filename" does not exist, create it.
      #  If file descriptor "j" is not specified, default to fd 0, stdin.
      #  An application of this is writing at a specified place in a file.
      echo 1234567890 > File    # Write string to "File".
      exec 3<> File             # Open "File" and assign fd 3 to it.
      read -n 4 <&3             # Read only 4 characters.
      echo -n . >&3             # Write a decimal point there.
      exec 3>&-                 # Close fd 3.
      cat File                  # ==> 1234.67890
      #  Random access, by golly.

   |
      # Pipe.
      # General purpose process and command chaining tool.
      # Similar to ">", but more general in effect.
      # Useful for chaining commands, scripts, files, and programs together.
      cat *.txt | sort | uniq > result-file
      # Sorts the output of all the .txt files and deletes duplicate lines,
      # finally saves results to "result-file".

   Multiple instances of input and output redirection and/or pipes can
   be combined in a single command line.
command < input-file > output-file

command1 | command2 | command3 > output-file

   See Example 16-31 and Example A-14.

   Multiple output streams may be redirected to one file.
ls -yz >> command.log 2>&1
#  Capture result of illegal options "yz" in file "command.log."
#  Because stderr is redirected to the file,
#+ any error messages will also be there.

#  Note, however, that the following does *not* give the same result.
ls -yz 2>&1 >> command.log
#  Outputs an error message, but does not write to file.
#  More precisely, the command output (in this case, null)
#+ writes to the file, but the error message goes only to stdout.

#  If redirecting both stdout and stderr,
#+ the order of the commands makes a difference.
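
The difference can be demonstrated directly. In this sketch (the helper
function and file names are illustrative, not part of the original
example), the same command runs under both orderings:

```shell
emit()               # Helper: one line to stdout, one line to stderr.
{
  echo "to stdout"
  echo "to stderr" >&2
}

log=$(mktemp)

emit > "$log" 2>&1   # stdout moves to the file first; stderr then joins it.
wc -l < "$log"       # 2  (both lines captured)

emit 2>&1 > "$log"   # stderr joins the *old* stdout (the screen) first;
                     # only stdout lands in the file.
wc -l < "$log"       # 1

rm -f "$log"
```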

   Closing File Descriptors

   n<&-
          Close input file descriptor n.

   0<&-, <&-
          Close stdin.

   n>&-
          Close output file descriptor n.

   1>&-, >&-
          Close stdout.

   Child processes inherit open file descriptors. This is why pipes
   work. To prevent an fd from being inherited, close it.
# Redirecting only stderr to a pipe.

exec 3>&1                              # Save current "value" of stdout.
ls -l 2>&1 >&3 3>&- | grep bad 3>&-    # Close fd 3 for 'grep' (but not 'ls').
#              ^^^^   ^^^^
exec 3>&-                              # Now close it for the remainder of the script.

# Thanks, S.C.

   For a more detailed introduction to I/O redirection see Appendix E.

20.1. Using exec

   An exec <filename command redirects stdin to a file. From that point
   on, all stdin comes from that file, rather than its normal source
   (usually keyboard input). This provides a method of reading a file
   line by line and possibly parsing each line of input using sed and/or awk.

   Example 20-1. Redirecting stdin using exec
# Redirecting stdin using 'exec'.

exec 6<&0          # Link file descriptor #6 with stdin.
                   # Saves stdin.

exec < data-file   # stdin replaced by file "data-file"

read a1            # Reads first line of file "data-file".
read a2            # Reads second line of file "data-file."

echo "Following lines read from file."
echo "-------------------------------"
echo $a1
echo $a2

echo; echo; echo

exec 0<&6 6<&-
#  Now restore stdin from fd #6, where it had been saved,
#+ and close fd #6 ( 6<&- ) to free it for other processes to use.
# <&6 6<&-    also works.

echo -n "Enter data  "
read b1  # Now "read" functions as expected, reading from normal stdin.
echo "Input read from stdin."
echo "----------------------"
echo "b1 = $b1"


exit 0

   Similarly, an exec >filename command redirects stdout to a designated
   file. This sends all command output that would normally go to stdout
   to that file.


   exec N > filename affects the entire script or current shell.
   Redirection in the PID of the script or shell from that point on has
   changed. However . . .

   N > filename affects only the newly-forked process, not the entire
   script or shell.

   Thank you, Ahmed Darwish, for pointing this out.
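
   A small sketch of the distinction (file names arbitrary): redirection
   on a single command affects only that command, while exec moves the
   shell's own descriptor. A subshell contains the exec so it does not
   leak into the calling shell.

```shell
f1=$(mktemp); f2=$(mktemp)

echo "one command only" > "$f1"   # Affects just this one echo.

(                                 # Subshell, to contain the exec.
  exec > "$f2"                    # The subshell's stdout moves, for good.
  echo "first line"
  echo "second line"              # Still going to $f2.
)

wc -l < "$f1"                     # 1
wc -l < "$f2"                     # 2
rm -f "$f1" "$f2"
```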

   Example 20-2. Redirecting stdout using exec


LOGFILE=logfile.txt

exec 6>&1           # Link file descriptor #6 with stdout.
                    # Saves stdout.

exec > $LOGFILE     # stdout replaced with file "logfile.txt".

# ----------------------------------------------------------- #
# All output from commands in this block sent to file $LOGFILE.

echo -n "Logfile: "
echo "-------------------------------------"

echo "Output of \"ls -al\" command"
ls -al
echo; echo
echo "Output of \"df\" command"

# ----------------------------------------------------------- #

exec 1>&6 6>&-      # Restore stdout and close file descriptor #6.

echo "== stdout now restored to default == "
ls -al

exit 0

   Example 20-3. Redirecting both stdin and stdout in the same script
   with exec
# Converts a specified input file to uppercase.


if [ ! -r "$1" ]     # Is specified input file readable?
  echo "Can't read from input file!"
  echo "Usage: $0 input-file output-file"
fi                   #  Will exit with same error
                     #+ even if input file ($1) not specified (why?).

if [ -z "$2" ]
  echo "Need to specify output file."
  echo "Usage: $0 input-file output-file"
  exit $E_WRONG_ARGS

exec 4<&0
exec < $1            # Will read from input file.

exec 7>&1
exec > $2            # Will write to output file.
                     # Assumes output file writable (add check?).

# -----------------------------------------------
    cat - | tr a-z A-Z   # Uppercase conversion.
#   ^^^^^                # Reads from stdin.
#           ^^^^^^^^^^   # Writes to stdout.
# However, both stdin and stdout were redirected.
# Note that the 'cat' can be omitted.
# -----------------------------------------------

exec 1>&7 7>&-       # Restore stdout.
exec 0<&4 4<&-       # Restore stdin.

# After restoration, the following line prints to stdout as expected.
echo "File \"$1\" written to \"$2\" as uppercase conversion."

exit 0

   I/O redirection is a clever way of avoiding the dreaded inaccessible
   variables within a subshell problem.

   Example 20-4. Avoiding a subshell
# Suggested by Matthew Walker.



Lines=0

cat myfile.txt | while read line;
                 do {
                   echo $line
                   (( Lines++ ));  #  Incremented values of this variable
                                   #+ inaccessible outside loop.
                                   #  Subshell problem.
                 }
                 done

echo "Number of lines read = $Lines"     # 0
                                         # Wrong!

echo "------------------------"

exec 3<> myfile.txt
while read line <&3
do {
  echo "$line"
  (( Lines++ ));                   #  Incremented values of this variable
                                   #+ accessible outside loop.
                                   #  No subshell, no problem.
}
done
exec 3>&-

echo "Number of lines read = $Lines"     # 8


exit 0

# Lines below not seen by script.

$ cat myfile.txt

Line 1.
Line 2.
Line 3.
Line 4.
Line 5.
Line 6.
Line 7.
Line 8.

20.2. Redirecting Code Blocks

   Blocks of code, such as while, until, and for loops, and even if/then
   test blocks, can also incorporate redirection of stdin. Even a
   function may use this form of redirection (see Example 24-11). The <
   operator at the end of the code block accomplishes this.

   Example 20-5. Redirected while loop

if [ -z "$1" ]
then       # Default, if no filename specified.
#+ Filename=${}
#  can replace the above test (parameter substitution).



while [ "$name" != Smith ]  # Why is variable $name in quotes?
  read name                 # Reads from $Filename, rather than stdin.
  echo $name
  let "count += 1"
done <"$Filename"           # Redirects stdin to file $Filename.
#    ^^^^^^^^^^^^

echo; echo "$count names read"; echo

exit 0

#  Note that in some older shell scripting languages,
#+ the redirected loop would run as a subshell.
#  Therefore, $count would return 0, the initialized value outside the loop.
#  Bash and ksh avoid starting a subshell *whenever possible*,
#+ so that this script, for example, runs correctly.
#  (Thanks to Heiner Steven for pointing this out.)

#  However . . .
#  Bash *can* sometimes start a subshell in a PIPED "while-read" loop,
#+ as distinct from a REDIRECTED "while" loop.

echo -e "1\n2\n3" | while read l
     do abc="$l"
        echo $abc
echo $abc

#  Thanks, Bruno de Oliveira Schneider, for demonstrating this
#+ with the above snippet of code.
#  And, thanks, Brian Onn, for correcting an annotation error.

   Example 20-6. Alternate form of redirected while loop

# This is an alternate form of the preceding script.

#  Suggested by Heiner Steven
#+ as a workaround in those situations when a redirect loop
#+ runs as a subshell, and therefore variables inside the loop
#+ do not keep their values upon loop termination.

if [ -z "$1" ]
then     # Default, if no filename specified.

exec 3<&0                 # Save stdin to file descriptor 3.
exec 0<"$Filename"        # Redirect standard input.


while [ "$name" != Smith ]
  read name               # Reads from redirected stdin ($Filename).
  echo $name
  let "count += 1"
done                      #  Loop reads from file $Filename
                          #+ because of line 20.

#  The original version of this script terminated the "while" loop with
#+      done <"$Filename"
#  Exercise:
#  Why is this unnecessary?

exec 0<&3                 # Restore old stdin.
exec 3<&-                 # Close temporary fd 3.

echo; echo "$count names read"; echo

exit 0

   Example 20-7. Redirected until loop
# Same as previous example, but with "until" loop.

if [ -z "$1" ]
then         # Default, if no filename specified.

# while [ "$name" != Smith ]
until [ "$name" = Smith ]     # Change  !=  to =.
  read name                   # Reads from $Filename, rather than stdin.
  echo $name
done <"$Filename"             # Redirects stdin to file $Filename.
#    ^^^^^^^^^^^^

# Same results as with "while" loop in previous example.

exit 0

   Example 20-8. Redirected for loop

if [ -z "$1" ]
then          # Default, if no filename specified.

line_count=`wc $Filename | awk '{ print $1 }'`
#           Number of lines in target file.
#  Very contrived and kludgy, nevertheless shows that
#+ it's possible to redirect stdin within a "for" loop...
#+ if you're clever enough.
# More concise is     line_count=$(wc -l < "$Filename")

for name in `seq $line_count`  # Recall that "seq" prints sequence of numbers.
# while [ "$name" != Smith ]   --   more complicated than a "while" loop   --
do
  read name                    # Reads from $Filename, rather than stdin.
  echo $name
  if [ "$name" = Smith ]       # Need all this extra baggage here.
  then
    break
  fi
done <"$Filename"              # Redirects stdin to file $Filename.
#    ^^^^^^^^^^^^

exit 0

   We can modify the previous example to also redirect the output of the loop.

   Example 20-9. Redirected for loop (both stdin and stdout redirected)

if [ -z "$1" ]
then          # Default, if no filename specified.

Savefile=$         # Filename to save results in.
FinalName=Jonah                # Name to terminate "read" on.

line_count=`wc $Filename | awk '{ print $1 }'`  # Number of lines in target fi

for name in `seq $line_count`
  read name
  echo "$name"
  if [ "$name" = "$FinalName" ]
done < "$Filename" > "$Savefile"     # Redirects stdin to file $Filename,
#    ^^^^^^^^^^^^^^^^^^^^^^^^^^^       and saves it to backup file.

exit 0

   Example 20-10. Redirected if/then test

if [ -z "$1" ]
then   # Default, if no filename specified.


if [ "$TRUE" ]          # if true    and   if :   also work.
 read name
 echo $name
fi <"$Filename"
#  ^^^^^^^^^^^^

# Reads only first line of file.
# An "if/then" test has no way of iterating unless embedded in a loop.

exit 0

   Example 20-11. Data file for above examples

#  This is a data file for
#+ "", "", "", "", "".

   Redirecting the stdout of a code block has the effect of saving its
   output to a file. See Example 3-2.

   Here documents are a special case of redirected code blocks. That
   being the case, it should be possible to feed the output of a here
   document into the stdin for a while loop.

# This example by Albert Siersema
# Used with permission (thanks!).

function doesOutput()
 # Could be an external command too, of course.
 # Here we show you can use a function as well.
{
  ls -al *.jpg | awk '{print $5,$9}'
}

nr=0          #  We want the while loop to be able to manipulate these and
totalSize=0   #+ to be able to see the changes after the while finished.

while read fileSize fileName ; do
  echo "$fileName is $fileSize bytes"
  let nr++
  totalSize=$((totalSize+fileSize))   # Or: "let totalSize+=fileSize"
done<<EOF
$(doesOutput)
EOF

echo "$nr files totaling $totalSize bytes"

20.3. Applications

   Clever use of I/O redirection permits parsing and stitching together
   snippets of command output (see Example 15-7). This permits
   generating report and log files.

   Example 20-12. Logging events
# Author: Stephane Chazelas.
# Used in ABS Guide with permission.

# Event logging to a file.
# Must be run as root (for write access in /var/log).

ROOT_UID=0     # Only users with $UID 0 have root privileges.
E_NOTROOT=67   # Non-root exit error.

if [ "$UID" -ne "$ROOT_UID" ]
  echo "Must be root to run this script."
  exit $E_NOTROOT


# === Uncomment one of the two lines below to activate script. ===
# LOG_EVENTS=1
# LOG_VARS=1


log()  # Writes time and date to log file.
{
echo "$(date)  $*" >&7     # This *appends* the date to the file.
#     ^^^^^^^  command substitution
                           # See below.
}

case $LOG_LEVEL in
 1) exec 3>&2         4> /dev/null 5> /dev/null;;
 2) exec 3>&2         4>&2         5> /dev/null;;
 3) exec 3>&2         4>&2         5>&2;;
 *) exec 3> /dev/null 4> /dev/null 5> /dev/null;;
esac

FD_LOGVARS=6
if [[ $LOG_VARS ]]
then exec 6>> /var/log/vars.log
else exec 6> /dev/null                     # Bury output.
fi

FD_LOGEVENTS=7
if [[ $LOG_EVENTS ]]
then
  # exec 7 >(exec gawk '{print strftime(), $0}' >> /var/log/event.log)
  # Above line fails in versions of Bash more recent than 2.04. Why?
  exec 7>> /var/log/event.log              # Append to "event.log".
  log                                      # Write time and date.
else exec 7> /dev/null                     # Bury output.
fi

echo "DEBUG3: beginning" >&${FD_DEBUG3}

ls -l >&5 2>&4                             # command1 >&5 2>&4

echo "Done"                                # command2

echo "sending mail" >&${FD_LOGEVENTS}
# Writes "sending mail" to file descriptor #7.

exit 0

Chapter 21. Subshells

   Running a shell script launches a new process, a subshell.

   Definition: A subshell is a child process launched by a shell (or
   shell script).

   A subshell is a separate instance of the command processor -- the
   shell that gives you the prompt at the console or in an xterm window.
   Just as your commands are interpreted at the command-line prompt,
   similarly does a script batch-process a list of commands. Each shell
   script running is, in effect, a subprocess (child process) of the
   parent shell.

   A shell script can itself launch subprocesses. These subshells let
   the script do parallel processing, in effect executing multiple
   subtasks simultaneously.


(
# Inside parentheses, and therefore a subshell . . .
while [ 1 ]   # Endless loop.
do
  echo "Subshell running . . ."
done
)

#  Script will run forever,
#+ or at least until terminated by a Ctl-C.

exit $?  # End of script (but will never get here).

Now, run the script:

sh subshell-test.sh

And, while the script is running, from a different xterm:

ps -ef | grep subshell-test.sh

UID       PID   PPID  C STIME TTY      TIME     CMD
500       2698  2502  0 14:26 pts/4    00:00:00 sh subshell-test.sh
500       2699  2698 21 14:26 pts/4    00:00:24 sh subshell-test.sh


PID 2698, the script, launched PID 2699, the subshell.

Note: The "UID ..." line would be filtered out by the "grep" command,
but is shown here for illustrative purposes.

   In general, an external command in a script forks off a subprocess,
   [102] whereas a Bash builtin does not. For this reason, builtins
   execute more quickly and use fewer system resources than their
   external command equivalents.

   Command List within Parentheses

   ( command1; command2; command3; ... )
          A command list embedded between parentheses runs as a
          subshell.

   Variables in a subshell are not visible outside the block of code in
   the subshell. They are not accessible to the parent process, to the
   shell that launched the subshell. These are, in effect, variables
   local to the child process.

   Example 21-1. Variable scope in a subshell


echo "We are outside the subshell."
echo "Subshell level OUTSIDE subshell = $BASH_SUBSHELL"
# Bash, version 3, adds the new         $BASH_SUBSHELL variable.
echo; echo

outer_variable=Outer
global_variable=
#  Define global variable for "storage" of
#+ value of subshell variable.

(
echo "We are inside the subshell."
echo "Subshell level INSIDE subshell = $BASH_SUBSHELL"
inner_variable=Inner

echo "From inside subshell, \"inner_variable\" = $inner_variable"
echo "From inside subshell, \"outer\" = $outer_variable"

global_variable="$inner_variable"   #  Will this allow "exporting"
                                    #+ a subshell variable?
)

echo; echo
echo "We are outside the subshell."
echo "Subshell level OUTSIDE subshell = $BASH_SUBSHELL"

if [ -z "$inner_variable" ]
  echo "inner_variable undefined in main body of shell"
  echo "inner_variable defined in main body of shell"

echo "From main body of shell, \"inner_variable\" = $inner_variable"
#  $inner_variable will show as blank (uninitialized)
#+ because variables defined in a subshell are "local variables".
#  Is there a remedy for this?
echo "global_variable = "$global_variable""  # Why doesn't this work?


# =======================================================================

# Additionally ...

echo "-----------------"; echo

var=41                                                 # Global variable.

( let "var+=1"; echo "\$var INSIDE subshell = $var" )  # 42

echo "\$var OUTSIDE subshell = $var"                   # 41
#  Variable operations inside a subshell, even to a GLOBAL variable
#+ do not affect the value of the variable outside the subshell!

exit 0

#  Question:
#  --------
#  Once having exited a subshell,
#+ is there any way to reenter that very same subshell
#+ to modify or access the subshell variables?

   See also $BASHPID and Example 34-2.

   Definition: The scope of a variable is the context in which it has
   meaning, in which it has a value that can be referenced. For example,
   the scope of a local variable lies only within the function, block of
   code, or subshell within which it is defined, while the scope of a
   global variable is the entire script in which it appears.


   While the $BASH_SUBSHELL internal variable indicates the nesting
   level of a subshell, the $SHLVL variable shows no change within a
   subshell.

echo " \$BASH_SUBSHELL outside subshell       = $BASH_SUBSHELL"           # 0
  ( echo " \$BASH_SUBSHELL inside subshell        = $BASH_SUBSHELL" )     # 1
  ( ( echo " \$BASH_SUBSHELL inside nested subshell = $BASH_SUBSHELL" ) ) # 2
# ^ ^                           *** nested ***                        ^ ^


echo " \$SHLVL outside subshell = $SHLVL"       # 3
( echo " \$SHLVL inside subshell  = $SHLVL" )   # 3 (No change!)

   Directory changes made in a subshell do not carry over to the parent
   shell.

   Example 21-2. List User Profiles
# Print all user profiles.

# This script written by Heiner Steven, and modified by the document author.

FILE=.bashrc  #  File containing user profile,
              #+ was ".profile" in original script.

for home in `awk -F: '{print $6}' /etc/passwd`
do
  [ -d "$home" ] || continue    # If no home directory, go to next.
  [ -r "$home" ] || continue    # If not readable, go to next.
  (cd $home; [ -e $FILE ] && less $FILE)
done

#  When script terminates, there is no need to 'cd' back to original directory
#+ because 'cd $home' takes place in a subshell.

exit 0

   A subshell may be used to set up a "dedicated environment" for a
   command group.
(
  IFS=:
  PATH=/bin
  unset TERMINFO
  set -C
  shift 5
  exit 3 # Only exits the subshell!
)
# The parent shell has not been affected, and the environment is preserved.

   As seen here, the exit command only terminates the subshell in which
   it is running, not the parent shell or script.

   One application of such a "dedicated environment" is testing whether
   a variable is defined.
if (set -u; : $variable) 2> /dev/null
then
  echo "Variable is set."
fi     #  Variable has been set in current script,
       #+ or is an internal Bash variable,
       #+ or is present in environment (has been exported).

# Could also be written [[ ${variable-x} != x || ${variable-y} != y ]]
# or                    [[ ${variable-x} != x$variable ]]
# or                    [[ ${variable+x} = x ]]
# or                    [[ ${variable-x} != x ]]
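
A quick sketch contrasting the subshell test with one of the
parameter-substitution alternatives (the variable name is arbitrary):

```shell
unset maybe_var

if (set -u; : $maybe_var) 2> /dev/null
then echo "set"
else echo "unset"                            # unset
fi

maybe_var=hello

[[ ${maybe_var+x} = x ]] && echo "set now"   # set now
```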

   Another application is checking for a lock file:
if (set -C; : > lock_file) 2> /dev/null
then
  :   # lock_file didn't exist: no user running the script
else
  echo "Another user is already running that script."
  exit 65
fi

#  Code snippet by Stéphane Chazelas,
#+ with modifications by Paulo Marcel Coelho Aragao.


   Processes may execute in parallel within different subshells. This
   permits breaking a complex task into subcomponents processed
   concurrently.

   Example 21-3. Running parallel processes in subshells
        (cat list1 list2 list3 | sort | uniq > list123) &
        (cat list4 list5 list6 | sort | uniq > list456) &
        # Merges and sorts both sets of lists simultaneously.
        # Running in background ensures parallel execution.
        # Same effect as
        #   cat list1 list2 list3 | sort | uniq > list123 &
        #   cat list4 list5 list6 | sort | uniq > list456 &

        wait   # Don't execute the next command until subshells finish.

        diff list123 list456

   Redirecting I/O to a subshell uses the "|" pipe operator, as in ls
   -al | (command).
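
   A trivial sketch of this form (the input strings are arbitrary): the
   parenthesized command list on the right side of the pipe runs as a
   subshell whose stdin is the pipe.

```shell
# The subshell consumes the piped stream with "read".
printf 'alpha\nbeta\n' | ( read first; echo "First line was: $first" )
# First line was: alpha
```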


   A code block between curly brackets does not launch a subshell.

   { command1; command2; command3; . . . commandN; }

echo "$var1"   # 23

{ var1=76; }
echo "$var1"   # 76

Chapter 22. Restricted Shells

   Disabled commands in restricted shells

          Running a script or portion of a script in restricted mode
          disables certain commands that would otherwise be available.
          This is a security measure intended to limit the privileges of
          the script user and to minimize possible damage from running
          the script.

   The following commands and actions are disabled:

     * Using cd to change the working directory.
     * Changing the values of the $PATH, $SHELL, $BASH_ENV, or $ENV
       environmental variables.
     * Reading or changing the $SHELLOPTS, shell environmental options.
     * Output redirection.
     * Invoking commands containing one or more /'s.
     * Invoking exec to substitute a different process for the shell.
     * Various other commands that would enable monkeying with or
       attempting to subvert the script for an unintended purpose.
     * Getting out of restricted mode within the script.
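
   These restrictions can be probed from an ordinary shell by handing
   one-liners to bash -r (the probe file name below is arbitrary); each
   attempt fails with a "restricted" error and a nonzero exit status.

```shell
# Each command exits nonzero under the restricted shell.
bash -r -c 'cd /tmp' 2>/dev/null \
  || echo "cd: disabled in restricted mode"
bash -r -c 'echo test > /tmp/rsh-probe.txt' 2>/dev/null \
  || echo "output redirection: disabled"
bash -r -c '/bin/ls' 2>/dev/null \
  || echo "slash-containing command names: disabled"
```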

   Example 22-1. Running a script in restricted mode

#  Starting the script with "#!/bin/bash -r"
#+ runs entire script in restricted mode.


echo "Changing directory."
cd /usr/local
echo "Now in `pwd`"
echo "Coming back home."
echo "Now in `pwd`"

# Everything up to here in normal, unrestricted mode.

set -r
# set --restricted    has same effect.
echo "==> Now in restricted mode. <=="


echo "Attempting directory change in restricted mode."
cd ..
echo "Still in `pwd`"


echo "\$SHELL = $SHELL"
echo "Attempting to change shell in restricted mode."
echo "\$SHELL= $SHELL"


echo "Attempting to redirect output in restricted mode."
ls -l /usr/bin > bin.files
ls -l bin.files    # Try to list attempted file creation effort.


exit 0

Chapter 23. Process Substitution

   Piping the stdout of a command into the stdin of another is a
   powerful technique. But, what if you need to pipe the stdout of
   multiple commands? This is where process substitution comes in.

   Process substitution feeds the output of a process (or processes)
   into the stdin of another process.


   Command list enclosed within parentheses

   >(command_list)

   <(command_list)


          Process substitution uses /dev/fd/<n> files to send the
          results of the process(es) within parentheses to another
          process. [103]


   There is no space between the "<" or ">" and the parentheses.
   Space there would give an error message.

bash$ echo >(true)
/dev/fd/63

bash$ echo <(true)
/dev/fd/63

bash$ echo >(true) <(true)
/dev/fd/63 /dev/fd/62

bash$ wc <(cat /usr/share/dict/linux.words)
 483523  483523 4992010 /dev/fd/63

bash$ grep script /usr/share/dict/linux.words | wc
    262     262    3601

bash$ wc <(grep script /usr/share/dict/linux.words)
    262     262    3601 /dev/fd/63


   Bash creates a pipe with two file descriptors, fIn and fOut. The
   stdin of true connects to fOut (dup2(fOut, 0)), then Bash passes a
   /dev/fd/fIn argument to echo. On systems lacking /dev/fd/<n> files,
   Bash may use temporary files. (Thanks, S.C.)

   Process substitution can compare the output of two different
   commands, or even the output of different options to the same
   command.
bash$ comm <(ls -l) <(ls -al)
total 12
-rw-rw-r--    1 bozo bozo       78 Mar 10 12:58 File0
-rw-rw-r--    1 bozo bozo       42 Mar 10 12:58 File2
-rw-rw-r--    1 bozo bozo      103 Mar 10 12:58
        total 20
        drwxrwxrwx    2 bozo bozo     4096 Mar 10 18:10 .
        drwx------   72 bozo bozo     4096 Mar 10 17:58 ..
        -rw-rw-r--    1 bozo bozo       78 Mar 10 12:58 File0
        -rw-rw-r--    1 bozo bozo       42 Mar 10 12:58 File2
        -rw-rw-r--    1 bozo bozo      103 Mar 10 12:58

   Process substitution can compare the contents of two directories --
   to see which filenames are in one, but not the other.

   diff <(ls $first_directory) <(ls $second_directory)

   Some other usages and uses of process substitution:

read -a list < <( od -Ad -w24 -t u2 /dev/urandom )
#  Read a list of random numbers from /dev/urandom,
#+ process with "od"
#+ and feed into stdin of "read" . . .

#  From "insertion-sort.bash" example script.
#  Courtesy of JuanJo Ciarlante.

cat <(ls -l)
# Same as     ls -l | cat

sort -k 9 <(ls -l /bin) <(ls -l /usr/bin) <(ls -l /usr/X11R6/bin)
# Lists all the files in the 3 main 'bin' directories, and sorts by filename.
# Note that three (count 'em) distinct commands are fed to 'sort'.

diff <(command1) <(command2)    # Gives difference in command output.

tar cf >(bzip2 -c > file.tar.bz2) $directory_name
# Calls "tar cf /dev/fd/?? $directory_name", and "bzip2 -c > file.tar.bz2".
# Because of the /dev/fd/<n> system feature,
# the pipe between both commands does not need to be named.
# This can be emulated.
mkfifo pipe      # Requires a named pipe.
bzip2 -c < pipe > file.tar.bz2&
tar cf pipe $directory_name
rm pipe
#        or
exec 3>&1
tar cf /dev/fd/4 $directory_name 4>&1 >&3 3>&- | bzip2 -c > file.tar.bz2 3>&-
exec 3>&-

# Thanks, Stéphane Chazelas

   Here is a method of circumventing the problem of an echo piped to a
   while-read loop running in a subshell.

   Example 23-1. Code block redirection without forking
# wr-ps.bash: while-read loop with process substitution.

# This example contributed by Tomas Pospisek.
# (Heavily edited by the ABS Guide author.)


echo "random input" | while read i
  global=3D": Not available outside the loop."
  # ... because it runs in a subshell.

echo "\$global (from outside the subprocess) = $global"
# $global (from outside the subprocess) =

echo; echo "--"; echo

while read i
do
  echo $i
  global=": Available outside the loop."
  # ... because it does *not* run in a subshell.
done < <( echo "random input" )
#    ^ ^

echo "\$global (using process substitution) = $global"
# random input
# $global (using process substitution) = : Available outside the loop.

echo; echo "##########"; echo

# And likewise . . .

declare -a inloop
cat $0 | while read line
  # It runs in a subshell, so ...
echo "OUTPUT = "
echo ${inloop[*]}           # ... nothing echoes.

echo; echo "--"; echo

declare -a outloop
while read line
  # It does *not* run in a subshell, so ...
done < <( cat $0 )
echo "OUTPUT = "
echo ${outloop[*]}          # ... the entire script echoes.

exit $?

   This is a similar example.

   Example 23-2. Redirecting the output of process substitution into a loop
#!/bin/bash
# psub.bash

# As inspired by Diego Molina (thanks!).

declare -a array0
while read
do
  array0[${#array0[@]}]="$REPLY"
done < <( sed -e 's/bash/CRASH-BANG!/' $0 | grep bin | awk '{print $1}' )
#  Sets the default 'read' variable, $REPLY, by process substitution,
#+ then copies it into an array.

echo "${array0[@]}"

exit $?

# ====================================== #

bash psub.bash

#!/bin/CRASH-BANG! done #!/bin/CRASH-BANG!

   A reader sent in the following interesting example of process
   substitution.

# Script fragment taken from SuSE distribution:

# --------------------------------------------------------------#
while read  des what mask iface; do
# Some commands ...
done < <(route -n)
#    ^ ^  First < is redirection, second is process substitution.

# To test it, let's make it do something.
while read  des what mask iface; do
  echo $des $what $mask $iface
done < <(route -n)

# Output:
# Kernel IP routing table
# Destination     Gateway         Genmask         Flags Metric Ref    Use Iface
# 127.0.0.0       0.0.0.0         255.0.0.0       U     0      0        0 lo
# --------------------------------------------------------------#

#  As Stéphane Chazelas points out,
#+ an easier-to-understand equivalent is:
route -n |
  while read des what mask iface; do   # Variables set from output of pipe.
    echo $des $what $mask $iface
  done  #  This yields the same output as above.
        #  However, as Ulrich Gayer points out . . .
        #+ this simplified equivalent uses a subshell for the while loop,
        #+ and therefore the variables disappear when the pipe terminates.

# --------------------------------------------------------------#

#  However, Filip Moritz comments that there is a subtle difference
#+ between the above two examples, as the following shows.

route -n | while read x; do ((y++)); done
echo $y # $y is still unset

while read x; do ((y++)); done < <(route -n)
echo $y # $y has the number of lines of output of route -n

# More generally speaking,
: | x=x
# seems to start a subshell like
: | ( x=x )
# while
x=x < <(:)
# does not

# This is useful, when parsing csv and the like.
# That is, in effect, what the original SuSE code fragment does.

Chapter 24. Functions

   Like "real" programming languages, Bash has functions, though in a
   somewhat limited implementation. A function is a subroutine, a code
   block that implements a set of operations, a "black box" that
   performs a specified task. Wherever there is repetitive code, when a
   task repeats with only slight variations in procedure, then consider
   using a function.

function function_name {
command...
}

or

function_name () {
command...
}

   This second form will cheer the hearts of C programmers (and is more
   portable).

   As in C, the function's opening bracket may optionally appear on the
   second line.

function_name ()
{
command...
}

   A function may be "compacted" into a single line.

fun () { echo "This is a function"; echo; }
#                                 ^     ^

   In this case, however, a semicolon must follow the final command in
   the function.

fun () { echo "This is a function"; echo } # Error!
#                                       ^

   Functions are called, triggered, simply by invoking their names. A
   function call is equivalent to a command.

   Example 24-1. Simple functions


funky ()
{ # This is about as simple as functions get.
  echo "This is a funky function."
  echo "Now exiting funky function."
} # Function declaration must precede call.


fun ()
{ # A somewhat more complex function.
  i=0
  REPEATS=30
  JUST_A_SECOND=1

  echo
  echo "And now the fun really begins."
  echo

  sleep $JUST_A_SECOND    # Hey, wait a second!
  while [ $i -lt $REPEATS ]
  do
    echo "----------FUNCTIONS---------->"
    echo "<------------ARE-------------"
    echo "<------------FUN------------>"
    echo
    let "i+=1"
  done
}

# Now, call the functions.

funky
fun
exit 0

   The function definition must precede the first call to it. There is
   no method of "declaring" the function, as, for example, in C.

f1
# Will give an error message, since function "f1" not yet defined.

declare -f f1      # This doesn't help either.
f1                 # Still an error message.

# However...

f1 ()
{
  echo "Calling function \"f2\" from within function \"f1\"."
  f2
}

f2 ()
{
  echo "Function \"f2\"."
}

f1  #  Function "f2" is not actually called until this point,
    #+ although it is referenced before its definition.
    #  This is permissible.

    # Thanks, S.C.


   Functions may not be empty!

empty ()
{
}

exit 0  # Will not exit here!

# $ sh
# line 6: syntax error near unexpected token `}'
# line 6: `}'

# $ echo $?
# 2

# However ...

not_quite_empty ()
{
  illegal_command
} #  A script containing this function will *not* bomb
  #+ as long as the function is not called.

# Thank you, Thiemo Kellner, for pointing this out.

   It is even possible to nest a function within another function,
   although this is not very useful.

f1 ()
{

  f2 () # nested
  {
    echo "Function \"f2\", inside \"f1\"."
  }

}

f2  #  Gives an error message.
    #  Even a preceding "declare -f f2" wouldn't help.


f1  #  Does nothing, since calling "f1" does not automatically call "f2".
f2  #  Now, it's all right to call "f2",
    #+ since its definition has been made visible by calling "f1".

    # Thanks, S.C.

   Function declarations can appear in unlikely places, even where a
   command would otherwise go.

ls -l | foo() { echo "foo"; }  # Permissible, but useless.

if [ "$USER" = bozo ]
then
  bozo_greet ()   # Function definition embedded in an if/then construct.
  {
    echo "Hello, Bozo."
  }
fi

bozo_greet        # Works only for Bozo, and other users get an error.

# Something like this might be useful in some contexts.
NO_EXIT=1   # Will enable function definition below.

[[ $NO_EXIT -eq 1 ]] && exit() { true; }     # Function definition in an "and-list".
# If $NO_EXIT is 1, declares "exit ()".
# This disables the "exit" builtin by aliasing it to "true".

exit  # Invokes "exit ()" function, not "exit" builtin.

# Or, similarly:

[ -f "$filename" ] &&
foo () { rm -f "$filename"; echo "File "$filename" deleted."; } ||
foo () { echo "File "$filename" not found."; touch bar; }


# Thanks, S.C. and Christopher Head

   Functions can take strange forms.

  _(){ for i in {1..10}; do echo -n "$FUNCNAME"; done; echo; }
# ^^^         No space between function name and parentheses.
#             This doesn't always work. Why not?

# Now, let's invoke the function.
  _         # __________
#             ^^^^^^^^^^   10 underscores (10 x function name)!
# A "naked" underscore is an acceptable function name.


   What happens when different versions of the same function appear in a
   script?
#  As Yan Chen points out,
#  when a function is defined multiple times,
#  the final version is what is invoked.
#  This is not, however, particularly useful.

func ()
{
  echo "First version of func ()."
}

func ()
{
  echo "Second version of func ()."
}

func   # Second version of func ().

exit $?

#  It is even possible to use functions to override
#+ or preempt system commands.
#  Of course, this is *not* advisable.
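
   As a sketch of how such an override works (and how to undo it), a
   function may shadow a command of the same name; the command builtin
   bypasses function lookup, which prevents infinite recursion:

```shell
#!/bin/bash
# Sketch only: a function shadowing the "ls" command. Not advisable!

ls ()
{
  command ls -F "$@"   #  The "command" builtin skips function lookup,
}                      #+ so this invokes the real ls, not itself.

type -t ls    # function
ls /tmp       # Invokes the function, which runs "ls -F /tmp".

unset -f ls   # Remove the override.
type -t ls    # file
```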

24.1. Complex Functions and Function Complexities

   Functions may process arguments passed to them and return an exit
   status to the script for further processing.
   function_name $arg1 $arg2

   The function refers to the passed arguments by position (as if they
   were positional parameters), that is, $1, $2, and so forth.

   Example 24-2. Function Taking Parameters
# Functions and parameters

DEFAULT=default                             # Default param value.

func2 () {
   if [ -z "$1" ]                           # Is parameter #1 zero length?
   then
     echo "-Parameter #1 is zero length.-"  # Or no parameter passed.
   else
     echo "-Param #1 is \"$1\".-"
   fi

   variable=${1-$DEFAULT}                   #  What does
   echo "variable = $variable"              #+ parameter substitution show?
                                            #  ---------------------------
                                            #  It distinguishes between
                                            #+ no param and a null param.

   if [ "$2" ]
   then
     echo "-Parameter #2 is \"$2\".-"
   fi

   return 0
}


echo "Nothing passed."
func2                          # Called with no params

echo "Zero-length parameter passed."
func2 ""                       # Called with zero-length param

echo "Null parameter passed."
func2 "$uninitialized_param"   # Called with uninitialized param

echo "One parameter passed."
func2 first           # Called with one param

echo "Two parameters passed."
func2 first second    # Called with two params

echo "\"\" \"second\" passed."
func2 "" second       # Called with zero-length first parameter
echo                  # and ASCII string as a second one.

exit 0


   The shift command works on arguments passed to functions (see Example

   But, what about command-line arguments passed to the script? Does a
   function see them? Well, let's clear up the confusion.

   Example 24-3. Functions and command-line args passed to the script
#  Call this script with a command-line argument,
#+ something like $0 arg1.

func ()
{
echo "$1"
}

echo "First call to function: no arg passed."
echo "See if command-line arg is seen."
func
# No! Command-line arg not seen.

echo "============================================================"
echo "Second call to function: command-line arg passed explicitly."
func $1
# Now it's seen!

exit 0

   In contrast to certain other programming languages, shell scripts
   normally pass only value parameters to functions. Variable names
   (which are actually pointers), if passed as parameters to functions,
   will be treated as string literals. Functions interpret their
   arguments literally.

   Indirect variable references (see Example 37-2) provide a clumsy sort
   of mechanism for passing variable pointers to functions.

   Example 24-4. Passing an indirect reference to a function
# Passing an indirect reference to a function.

echo_var ()
{
echo "$1"
}

message=Hello
Hello=Goodbye

echo_var "$message"        # Hello
# Now, let's pass an indirect reference to the function.
echo_var "${!message}"     # Goodbye

echo "-------------"

# What happens if we change the contents of "hello" variable?
Hello="Hello, again!"
echo_var "$message"        # Hello
echo_var "${!message}"     # Hello, again!

exit 0

   The next logical question is whether parameters can be dereferenced
   after being passed to a function.

   Example 24-5. Dereferencing a parameter passed to a function
# Dereferencing parameter passed to a function.
# Script by Bruce W. Clare.

dereference ()
{
     y=\$"$1"   # Name of variable.
     echo $y    # $Junk

     x=`eval "expr \"$y\" "`
     echo $1=$x
     eval "$1=\"Some Different Text \""  # Assign new value.
}

Junk="Some Text"
echo $Junk "before"    # Some Text before

dereference Junk
echo $Junk "after"     # Some Different Text after

exit 0

   Example 24-6. Again, dereferencing a parameter passed to a function
# Dereferencing a parameter passed to a function.
#                (Complex Example)

ITERATIONS=3  # How many times to get input.

my_read () {
  #  Called with my_read varname,
  #+ outputs the previous value between brackets as the default value,
  #+ then asks for a new value.

  local local_var

  echo -n "Enter a value "
  eval 'echo -n "[$'$1'] "'  #  Previous value.
# eval echo -n "[\$$1] "     #  Easier to understand,
                             #+ but loses trailing space in user prompt.
  read local_var
  [ -n "$local_var" ] && eval $1=\$local_var

  # "And-list": if "local_var" then set "$1" to its value.
}

icount=1
while [ "$icount" -le "$ITERATIONS" ]
do
  my_read var
  echo "Entry #$icount = $var"
  let "icount += 1"
done

# Thanks to Stephane Chazelas for providing this instructive example.

exit 0

   Exit and Return

   exit status
          Functions return a value, called an exit status. This is
          analogous to the exit status returned by a command. The exit
          status may be explicitly specified by a return statement,
          otherwise it is the exit status of the last command in the
          function (0 if successful, and a non-zero error code if not).
          This exit status may be used in the script by referencing it
          as $?. This mechanism effectively permits script functions to
          have a "return value" similar to C functions.
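
          For instance (function name hypothetical), the exit status can
          encode a small result read back through $?:

```shell
#!/bin/bash
#  A minimal illustration: a function's exit status,
#+ read via $?, serves as its "return value."

is_even ()
{
  return $(( $1 % 2 ))   # 0 (success) if even, 1 if odd.
}

is_even 4; echo $?   # 0
is_even 7; echo $?   # 1
```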

   return
          Terminates a function. A return command [104] optionally takes
          an integer argument, which is returned to the calling script
          as the "exit status" of the function, and this exit status is
          assigned to the variable $?.

          Example 24-7. Maximum of two numbers

# Maximum of two integers.

E_PARAM_ERR=250    # If less than 2 params passed to function.
EQUAL=251          # Return value if both params equal.
#  Error values out of range of any
#+ params that might be fed to the function.

max2 ()             # Returns larger of two numbers.
{                   # Note: numbers compared must be less than 250.
if [ -z "$2" ]
then
  return $E_PARAM_ERR
fi

if [ "$1" -eq "$2" ]
then
  return $EQUAL
else
  if [ "$1" -gt "$2" ]
  then
    return $1
  else
    return $2
  fi
fi
}

max2 33 34
return_val=$?

if [ "$return_val" -eq $E_PARAM_ERR ]
then
  echo "Need to pass two parameters to the function."
elif [ "$return_val" -eq $EQUAL ]
  then
    echo "The two numbers are equal."
  else
    echo "The larger of the two numbers is $return_val."
fi

exit 0

#  Exercise (easy):
#  ---------------
#  Convert this to an interactive script,
#+ that is, have the script ask for input (two numbers).


   For a function to return a string or array, use a dedicated variable.
count_lines_in_etc_passwd ()
{
  [[ -r /etc/passwd ]] && REPLY=$(echo $(wc -l < /etc/passwd))
  #  If /etc/passwd is readable, set REPLY to line count.
  #  Returns both a parameter value and status information.
  #  The 'echo' seems unnecessary, but . . .
  #+ it removes excess whitespace from the output.
}

if count_lines_in_etc_passwd
then
  echo "There are $REPLY lines in /etc/passwd."
else
  echo "Cannot count lines in /etc/passwd."
fi

# Thanks, S.C.

          Example 24-8. Converting numbers to Roman numerals


# Arabic number to Roman numeral conversion
# Range: 0 - 200
# It's crude, but it works.

# Extending the range and otherwise improving the script
#+ is left as an exercise.

# Usage: roman number-to-convert

LIMIT=200
E_ARG_ERR=65
E_OUT_OF_RANGE=66

if [ -z "$1" ]
then
  echo "Usage: `basename $0` number-to-convert"
  exit $E_ARG_ERR
fi

num=$1
if [ "$num" -gt $LIMIT ]
then
  echo "Out of range!"
  exit $E_OUT_OF_RANGE
fi

to_roman ()   # Must declare function before first call to it.
{
number=$1
factor=$2
rchar=$3
let "remainder = number - factor"
while [ "$remainder" -ge 0 ]
do
  echo -n $rchar
  let "number -= factor"
  let "remainder = number - factor"
done

return $number
       # Exercises:
       # ---------
       # 1) Explain how this function works.
       #    Hint: division by successive subtraction.
       # 2) Extend the range of the function.
       #    Hint: use "echo" and command-substitution capture.
}

to_roman $num 100 C
num=$?
to_roman $num 90 LXXXX
num=$?
to_roman $num 50 L
num=$?
to_roman $num 40 XL
num=$?
to_roman $num 10 X
num=$?
to_roman $num 9 IX
num=$?
to_roman $num 5 V
num=$?
to_roman $num 4 IV
num=$?
to_roman $num 1 I
num=$?
echo
# Successive calls to conversion function!
# Is this really necessary??? Can it be simplified?

exit



          See also Example 11-28.


   The largest positive integer a function can return is 255. The return
   command is closely tied to the concept of exit status, which accounts
   for this particular limitation. Fortunately, there are various
   workarounds for those situations requiring a large integer return
   value from a function.
   Example 24-9. Testing large return values in a function

# The largest positive value a function can return is 255.

return_test ()         # Returns whatever passed to it.
{
  return $1
}

return_test 27         # o.k.
echo $?                # Returns 27.

return_test 255        # Still o.k.
echo $?                # Returns 255.

return_test 257        # Error!
echo $?                # Returns 1 (return code for miscellaneous error).

# ======================================================
return_test -151896    # Do large negative numbers work?
echo $?                # Will this return -151896?
                       # No! It returns 168.
#  Version of Bash before 2.05b permitted
#+ large negative integer return values.
#  Newer versions of Bash plug this loophole.
#  This may break older scripts.
#  Caution!
# ======================================================

exit 0

   A workaround for obtaining large integer "return values" is to simply
   assign the "return value" to a global variable.
Return_Val=   # Global variable to hold oversize return value of function.

alt_return_test ()
{
  fvar=$1
  Return_Val=$fvar
  return   # Returns 0 (success).
}

alt_return_test 1
echo $?                              # 0
echo "return value = $Return_Val"    # 1

alt_return_test 256
echo "return value = $Return_Val"    # 256

alt_return_test 257
echo "return value = $Return_Val"    # 257

alt_return_test 25701
echo "return value = $Return_Val"    #25701

   A more elegant method is to have the function echo its "return value
   to stdout," and then capture it by command substitution. See the
   discussion of this in Section 36.7.
   Example 24-10. Comparing two large integers
# Maximum of two LARGE integers.

#  This is the previous "max.sh" example,
#+ modified to permit comparing large integers.

EQUAL=0             # Return value if both params equal.
E_PARAM_ERR=-99999  # Not enough params passed to function.
#           ^^^^^^    Out of range of any params that might be passed.

max2 ()             # "Returns" larger of two numbers.
{
if [ -z "$2" ]
then
  echo $E_PARAM_ERR
  return
fi

if [ "$1" -eq "$2" ]
then
  echo $EQUAL
else
  if [ "$1" -gt "$2" ]
  then
    retval=$1
  else
    retval=$2
  fi
fi

echo $retval        # Echoes (to stdout), rather than returning value.
                    # Why?
}

return_val=$(max2 33001 33997)
#            ^^^^             Function name
#                 ^^^^^ ^^^^^ Params passed
#  This is actually a form of command substitution:
#+ treating a function as if it were a command,
#+ and assigning the stdout of the function to the variable "return_val."

# ========================= OUTPUT ========================
if [ "$return_val" -eq "$E_PARAM_ERR" ]
then
  echo "Error in parameters passed to comparison function!"
elif [ "$return_val" -eq "$EQUAL" ]
  then
    echo "The two numbers are equal."
  else
    echo "The larger of the two numbers is $return_val."
fi
# =========================================================

exit 0

#  Exercises:
#  ---------
#  1) Find a more elegant way of testing
#+    the parameters passed to the function.
#  2) Simplify the if/then structure at "OUTPUT."
#  3) Rewrite the script to take input from command-line parameters.

   Here is another example of capturing a function "return value."
   Understanding it requires some knowledge of awk.

month_length ()  # Takes month number as an argument.
{                # Returns number of days in month.
monthD="31 28 31 30 31 30 31 31 30 31 30 31"  # Declare as local?
echo "$monthD" | awk '{ print $'"${1}"' }'    # Tricky.
#                             ^^^^^^^^^
# Parameter passed to function  ($1 -- month number), then to awk.
# Awk sees this as "print $1 . . . print $12" (depending on month number)
# Template for passing a parameter to embedded awk script:
#                                 $'"${script_parameter}"'

#  Needs error checking for correct parameter range (1-12)
#+ and for February in leap year.
}

# ----------------------------------------------
# Usage example:
month=4        # April, for example (4th month).
days_in=$(month_length $month)
echo $days_in  # 30
# ----------------------------------------------

          See also Example A-7 and Example A-37.

          Exercise: Using what we have just learned, extend the previous
          Roman numerals example to accept arbitrarily large input.


   Redirecting the stdin of a function
          A function is essentially a code block, which means its stdin
          can be redirected (as in Example 3-1).

          Example 24-11. Real name from username

# From username, gets "real name" from /etc/passwd.

ARGCOUNT=1       # Expect one arg.
E_WRONGARGS=85

file=/etc/passwd
pattern=$1

if [ $# -ne "$ARGCOUNT" ]
then
  echo "Usage: `basename $0` USERNAME"
  exit $E_WRONGARGS
fi

file_excerpt ()    #  Scan file for pattern,
{                  #+ then print relevant portion of line.
  while read line  # "while" does not necessarily need [ condition ]
  do
    echo "$line" | grep $1 | awk -F":" '{ print $5 }'
    # Have awk use ":" delimiter.
  done
} <$file  # Redirect into function's stdin.

file_excerpt $pattern

# Yes, this entire script could be reduced to
#       grep PATTERN /etc/passwd | awk -F":" '{ print $5 }'
# or
#       awk -F: '/PATTERN/ {print $5}'
# or
#       awk -F: '($1 == "username") { print $5 }' # real name from username
# However, it might not be as instructive.

exit 0

          There is an alternate, and perhaps less confusing method of
          redirecting a function's stdin. This involves redirecting the
          stdin to an embedded bracketed code block within the function.

# Instead of:
Function ()
{
 ...
 } < file

# Try this:
Function ()
{
  {
    ...
   } < file
}

# Similarly,

Function ()  # This works.
{
  {
   echo $*
  } | tr a b
}

Function ()  # This doesn't work.
{
  echo $*
} | tr a b   # A nested code block is mandatory here.

# Thanks, S.C.


   Emmanuel Rouat's sample bashrc file contains some instructive
   examples of functions.

24.2. Local Variables

   What makes a variable local?

   local variables
          A variable declared as local is one that is visible only
          within the block of code in which it appears. It has local
          scope. In a function, a local variable has meaning only within
          that function block.

          Example 24-12. Local variable visibility

# Global and local variables inside a function.

func ()
{
  local loc_var=23       # Declared as local variable.
  echo                   # Uses the 'local' builtin.
  echo "\"loc_var\" in function = $loc_var"
  global_var=999         # Not declared as local.
                         # Defaults to global.
  echo "\"global_var\" in function = $global_var"
}

func

# Now, to see if local variable "loc_var" exists outside function.

echo "\"loc_var\" outside function = $loc_var"
                                      # $loc_var outside function =
                                      # No, $loc_var not visible globally.
echo "\"global_var\" outside function = $global_var"
                                      # $global_var outside function = 999
                                      # $global_var is visible globally.

exit 0
#  In contrast to C, a Bash variable declared inside a function
#+ is local *only* if declared as such.


   Before a function is called, all variables declared within the
   function are invisible outside the body of the function, not just
   those explicitly declared as local.

func ()
{
global_var=37    #  Visible only within the function block
                 #+ before the function has been called.
}                #  END OF FUNCTION

echo "global_var = $global_var"  # global_var =
                                 #  Function "func" has not yet been called,
                                 #+ so $global_var is not visible here.

func

echo "global_var = $global_var"  # global_var = 37
                                 # Has been set by function call.


   As Evgeniy Ivanov points out, when declaring and setting a local
   variable in a single command, apparently the order of operations is
   to first set the variable, and only afterwards restrict it to local
   scope. This is reflected in the return value.

echo "==OUTSIDE Function (global)=="
t=$(exit 1)
echo $?      # 1
             # As expected.

function0 ()
{

echo "==INSIDE Function=="
echo "Global"
t0=$(exit 1)
echo $?      # 1
             # As expected.

echo

echo "Local declared & assigned in same command."
local t1=$(exit 1)
echo $?      # 0
             # Unexpected!
#  Apparently, the variable assignment takes place before
#+ the local declaration.
#+ The return value is for the latter.

echo "Local declared, then assigned (separate commands)."
local t2
t2=$(exit 1)
echo $?      # 1
             # As expected.

}

function0

24.2.1. Local variables and recursion.

   Recursion is an interesting and sometimes useful form of
   self-reference. Herbert Mayer defines it as ". . . expressing an
   algorithm by using a simpler version of that same algorithm . . ."

   Consider a definition defined in terms of itself, [105] an expression
   implicit in its own expression, [106] a snake swallowing its own
   tail, [107] or . . . a function that calls itself. [108]

   Example 24-13. Demonstration of a simple recursive function
# Demonstration of recursion.

RECURSIONS=9   # How many times to recurse.
r_count=0      # Must be global. Why?

recurse ()
{
  var="$1"

  while [ "$var" -ge 0 ]
  do
    echo "Recursion count = "$r_count"  +-+  \$var = "$var""
    (( var-- )); (( r_count++ ))
    recurse "$var"  #  Function calls itself (recurses)
  done              #+ until what condition is met?
}

recurse $RECURSIONS

exit $?

   Example 24-14. Another simple demonstration
# A script that defines "recursion" in a rather graphic way.

RECURSIONS=10
r_count=0
sp=" "

define_recursion ()
{
  ((r_count++))
  sp="$sp"" "
  echo -n "$sp"
  echo "\"The act of recurring ... \""   # Per 1913 Webster's dictionary.

  while [ $r_count -le $RECURSIONS ]
  do
    define_recursion
  done
}

echo
echo "Recursion: "
define_recursion
echo

exit $?

   Local variables are a useful tool for writing recursive code, but
   this practice generally involves a great deal of computational
   overhead and is definitely not recommended in a shell script. [109]

   Example 24-15. Recursion, using a local variable

#               factorial
#               ---------

# Does bash permit recursion?
# Well, yes, but...
# It's so slow that you gotta have rocks in your head to try it.


MAX_ARG=5
E_WRONG_ARGS=85
E_RANGE_ERR=86

if [ -z "$1" ]
then
  echo "Usage: `basename $0` number"
  exit $E_WRONG_ARGS
fi

if [ "$1" -gt $MAX_ARG ]
then
  echo "Out of range ($MAX_ARG is maximum)."
  #  Let's get real now.
  #  If you want greater range than this,
  #+ rewrite it in a Real Programming Language.
  exit $E_RANGE_ERR
fi

fact ()
{
  local number=$1
  #  Variable "number" must be declared as local,
  #+ otherwise this doesn't work.
  if [ "$number" -eq 0 ]
  then
    factorial=1    # Factorial of 0 = 1.
  else
    let "decrnum = number - 1"
    fact $decrnum  # Recursive function call (the function calls itself).
    let "factorial = $number * $?"
  fi

  return $factorial
}

fact $1
echo "Factorial of $1 is $?."

exit 0

   Also see Example A-15 for an example of recursion in a script. Be
   aware that recursion is resource-intensive and executes slowly, and
   is therefore generally not appropriate in a script.

24.3. Recursion Without Local Variables

   A function may recursively call itself even without use of local
   variables.

   Example 24-16. The Fibonacci Sequence
# fibo.sh : Fibonacci sequence (recursive)
# Author: M. Cooper
# License: GPL3

# ---------------------------------
# Fibo(0) = 0
# Fibo(1) = 1
# else
#   Fibo(j) = Fibo(j-1) + Fibo(j-2)
# ---------------------------------

MAXTERM=15       # Number of terms (+1) to generate.
MINIDX=2         # If idx is less than 2, then Fibo(idx) = idx.

Fibonacci ()
{
  idx=$1   # Doesn't need to be local. Why not?
  if [ "$idx" -lt "$MINIDX" ]
  then
    echo "$idx"  # First two terms are 0 1 ... see above.
  else
    (( --idx ))  # j-1
    term1=$( Fibonacci $idx )   #  Fibo(j-1)

    (( --idx ))  # j-2
    term2=$( Fibonacci $idx )   #  Fibo(j-2)

    echo $(( term1 + term2 ))
  fi
  #  An ugly, ugly kludge.
  #  The more elegant implementation of recursive fibo in C
  #+ is a straightforward translation of the algorithm in lines 7 - 10.
}

for i in $(seq 0 $MAXTERM)
do  # Calculate $MAXTERM+1 terms.
  FIBO=$(Fibonacci $i)
  echo -n "$FIBO "
done
# 0 1 1 2 3 5 8 13 21 34 55 89 144 233 377 610
# Takes a while, doesn't it? Recursion in a script is slow.

echo


exit 0

   Example 24-17. The Towers of Hanoi
#! /bin/bash
# The Towers Of Hanoi
# Bash script
# Copyright (C) 2000 Amit Singh. All Rights Reserved.
# Tested under Bash version 2.05b.0(13)-release.
# Also works under Bash version 3.x.
#  Used in "Advanced Bash Scripting Guide"
#+ with permission of script author.
#  Slightly modified and commented by ABS author.

#  The Tower of Hanoi is a mathematical puzzle attributed to
#+ Edouard Lucas, a nineteenth-century French mathematician.
#  There are three vertical posts set in a base.
#  The first post has a set of annular rings stacked on it.
#  These rings are disks with a hole drilled out of the center,
#+ so they can slip over the posts and rest flat.
#  The rings have different diameters, and they stack in ascending
#+ order, according to size.
#  The smallest ring is on top, and the largest on the bottom.
#  The task is to transfer the stack of rings
#+ to one of the other posts.
#  You can move only one ring at a time to another post.
#  You are permitted to move rings back to the original post.
#  You may place a smaller ring atop a larger one,
#+ but *not* vice versa.
#  Again, it is forbidden to place a larger ring atop a smaller one.
#  For a small number of rings, only a few moves are required.
#+ For each additional ring,
#+ the required number of moves approximately doubles,
#+ and the "strategy" becomes increasingly complicated.
#  For more information, see pp. 186-92 of
#+ _The Armchair Universe_ by A.K. Dewdney.
#         ...                   ...                    ...
#         | |                   | |                    | |
#        _|_|_                  | |                    | |
#       |_____|                 | |                    | |
#      |_______|                | |                    | |
#     |_________|               | |                    | |
#    |___________|              | |                    | |
#   |             |             | |                    | |
# .--------------------------------------------------------------.
# |**************************************************************|
#          #1                   #2                      #3

E_NOPARAM=66  # No parameter passed to script.
E_BADPARAM=67 # Illegal number of disks passed to script.
Moves=        # Global variable holding number of moves.
              # Modification to original script.

dohanoi() {   # Recursive function.
    case $1 in
    0)
        ;;
    *)
        dohanoi "$(($1-1))" $2 $4 $3
        echo move $2 "-->" $3
        ((Moves++))          # Modification to original script.
        dohanoi "$(($1-1))" $4 $3 $2
        ;;
    esac
}

case $# in
    1) case $(($1>0)) in     # Must have at least one disk.
       1)  # Nested case statement.
           dohanoi $1 1 3 2
           echo "Total moves = $Moves"   # 2^n - 1, where n = # of disks.
           exit 0;
           ;;
       *)
           echo "$0: illegal value for number of disks";
           exit $E_BADPARAM;
           ;;
       esac
    ;;
    *)
       echo "usage: $0 N"
       echo "       Where \"N\" is the number of disks."
       exit $E_NOPARAM;
    ;;
esac

# Exercises:
# ---------
# 1) Would commands beyond this point ever be executed?
#    Why not? (Easy)
# 2) Explain the workings of the "dohanoi" function.
#    (Difficult -- see the Dewdney reference, above.)

Chapter 25. Aliases

   A Bash alias is essentially nothing more than a keyboard shortcut, an
   abbreviation, a means of avoiding typing a long command sequence. If,
   for example, we include alias lm="ls -l | more" in the ~/.bashrc
   file, then each lm [110] typed at the command-line will automatically
   be replaced by a ls -l | more. This can save a great deal of typing
   at the command-line and avoid having to remember complex combinations
   of commands and options. Setting alias rm="rm -i" (interactive mode
   delete) may save a good deal of grief, since it can prevent
   inadvertently deleting important files.

   In a script, aliases have very limited usefulness. It would be nice
   if aliases could assume some of the functionality of the C
   preprocessor, such as macro expansion, but unfortunately Bash does
   not expand arguments within the alias body. [111] Moreover, a script
   fails to expand an alias itself within "compound constructs," such as
   if/then statements, loops, and functions. An added limitation is that
   an alias will not expand recursively. Almost invariably, whatever we
   would like an alias to do could be accomplished much more effectively
   with a function.
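   As a case in point, the lm alias above can be recast as a shell
   function. A minimal sketch: unlike the alias, the function expands
   inside compound constructs, works in scripts without expand_aliases,
   and passes its arguments through.

```shell
#  A function does everything the 'lm' alias does,
#+ but also works inside if/then statements, loops, and scripts,
#+ and it accepts arguments.
lm() {
  ls -l "$@" | more
}

# lm /etc   is now equivalent to   ls -l /etc | more
```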

   Example 25-1. Aliases within a script

#!/bin/bash

shopt -s expand_aliases
# Must set this option, else script will not expand aliases.

# First, some fun.
alias Jesse_James='echo "\"Alias Jesse James\" was a 1959 comedy starring Bob Hope."'
Jesse_James

echo; echo; echo;

alias ll="ls -l"
# May use either single (') or double (") quotes to define an alias.

echo "Trying aliased \"ll\":"
ll /usr/X11R6/bin/mk*   #* Alias works.


directory=/usr/X11R6/bin/
prefix=mk*  # See if wild card causes problems.
echo "Variables \"directory\" + \"prefix\" = $directory$prefix"

alias lll="ls -l $directory$prefix"

echo "Trying aliased \"lll\":"
lll         # Long listing of all files in /usr/X11R6/bin starting with mk.
# An alias can handle concatenated variables -- including wild card -- o.k.



if [ TRUE ]
then
  alias rr="ls -l"
  echo "Trying aliased \"rr\" within if/then statement:"
  rr /usr/X11R6/bin/mk*   #* Error message results!
  # Aliases not expanded within compound statements.
  echo "However, previously expanded alias still recognized:"
  ll /usr/X11R6/bin/mk*
fi


count=0
while [ $count -lt 3 ]
do
  alias rrr="ls -l"
  echo "Trying aliased \"rrr\" within \"while\" loop:"
  rrr /usr/X11R6/bin/mk*   #* Alias will not expand here either.
                           # line 57: rrr: command not found
  let count+=1
done

echo; echo

alias xyz='cat $0'   # Script lists itself.
                     # Note strong quotes.
#  This seems to work,
#+ although the Bash documentation suggests that it shouldn't.
#  However, as Steve Jacobson points out,
#+ the "$0" parameter expands immediately upon declaration of the alias.

exit 0

   The unalias command removes a previously set alias.

   Example 25-2. unalias: Setting and unsetting an alias

#!/bin/bash

shopt -s expand_aliases  # Enables alias expansion.

alias llm='ls -al | more'
llm

echo

unalias llm              # Unset alias.
llm
# Error message results, since 'llm' no longer recognized.

exit 0

bash$ ./
total 6
drwxrwxr-x    2 bozo     bozo         3072 Feb  6 14:04 .
drwxr-xr-x   40 bozo     bozo         2048 Feb  6 14:04 ..
-rwxr-xr-x    1 bozo     bozo          199 Feb  6 14:04

./ llm: command not found

Chapter 26. List Constructs

   The and list and or list constructs provide a means of processing a
   number of commands consecutively. These can effectively replace
   complex nested if/then or even case statements.

   Chaining together commands

   and list

command-1 && command-2 && command-3 && ... command-n

          Each command executes in turn, provided that the previous
          command has given a return value of true (zero). At the first
          false (non-zero) return, the command chain terminates (the
          first command returning false is the last one to execute).

          Example 26-1. Using an and list to test for command-line
          arguments

#!/bin/bash
# and list

if [ ! -z "$1" ] && echo "Argument #1 = $1" && [ ! -z "$2" ] && \
#                ^^                         ^^               ^^
echo "Argument #2 = $2"
then
  echo "At least 2 arguments passed to script."
  # All the chained commands return true.
else
  echo "Fewer than 2 arguments passed to script."
  # At least one of the chained commands returns false.
fi
# Note that "if [ ! -z $1 ]" works, but its alleged equivalent,
#   "if [ -n $1 ]" does not.
#     However, quoting fixes this.
#  if "[ -n "$1" ]" works.
#           ^  ^    Careful!
# It is always best to QUOTE the variables being tested.

# This accomplishes the same thing, using "pure" if/then statements.
if [ ! -z "$1" ]
then
  echo "Argument #1 = $1"
fi

if [ ! -z "$2" ]
then
  echo "Argument #2 = $2"
  echo "At least 2 arguments passed to script."
else
  echo "Fewer than 2 arguments passed to script."
fi
# It's longer and more ponderous than using an "and list".

exit $?

          Example 26-2. Another command-line arg test using an and list


ARGS=1        # Number of arguments expected.
E_BADARGS=85  # Exit value if incorrect number of args passed.

test $# -ne $ARGS && \
#    ^^^^^^^^^^^^ condition #1
echo "Usage: `basename $0` $ARGS argument(s)" && exit $E_BADARGS
#                                             ^^
#  If condition #1 tests true (wrong number of args passed to script),
#+ then the rest of the line executes, and script terminates.

# Line below executes only if the above test fails.
echo "Correct number of arguments passed to this script."

exit 0

# To check exit value, do a "echo $?" after script termination.

          Of course, an and list can also set variables to a default
          value.

arg1=$@ && [ -z "$arg1" ] && arg1=DEFAULT

              # Set $arg1 to command-line arguments, if any.
              # But . . . set to DEFAULT if not specified on command-line.
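          The same default-setting effect is more commonly obtained with
          the ${parameter:-default} form of parameter substitution,
          sketched here with a hypothetical greet function:

```shell
#  ${parameter:-default} expands to $parameter if it is set
#+ and non-empty, and to "default" otherwise.

greet()
{
  local name=${1:-DEFAULT}   # Falls back to DEFAULT if no argument given.
  echo "Hello, $name"
}

greet          # Hello, DEFAULT
greet Bash     # Hello, Bash
```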

   or list

command-1 || command-2 || command-3 || ... command-n

          Each command executes in turn for as long as the previous
          command returns false. At the first true return, the command
          chain terminates (the first command returning true is the last
          one to execute). This is obviously the inverse of the "and
          list".

          Example 26-3. Using or lists in combination with an and list


#!/bin/bash
# A not-so-cunning file deletion utility.
#  Usage: delete filename

E_BADARGS=85  # No filename passed.

if [ -z "$1" ]
then
  echo "Usage: `basename $0` filename"
  exit $E_BADARGS  # No arg? Bail out.
else
  file=$1          # Set filename.
fi

[ ! -f "$file" ] && echo "File \"$file\" not found. \
Cowardly refusing to delete a nonexistent file."
# AND LIST, to give error message if file not present.
# Note echo message continuing on to a second line after an escape.

[ ! -f "$file" ] || (rm -f "$file"; echo "File \"$file\" deleted.")
# OR LIST, to delete file if present.

# Note logic inversion above.
# AND LIST executes on true, OR LIST on false.

exit $?


   If the first command in an or list returns true, the chain terminates
   there: the list as a whole succeeds, and none of the remaining
   commands run.

# ==> The following snippets from the /etc/rc.d/init.d/single
#+==> script by Miquel van Smoorenburg
#+==> illustrate use of "and" and "or" lists.
# ==> "Arrowed" comments added by document author.

[ -x /usr/bin/clear ] && /usr/bin/clear
  # ==> If /usr/bin/clear exists, then invoke it.
  # ==> Checking for the existence of a command before calling it
  #+==> avoids error messages and other awkward consequences.

  # ==> . . .

# If they want to run something in single user mode, might as well run it...
for i in /etc/rc1.d/S[0-9][0-9]* ; do
        # Check if the script is there.
        [ -x "$i" ] || continue
  # ==> If corresponding file in $PWD *not* found,
  #+==> then "continue" by jumping to the top of the loop.

        # Reject backup files and files generated by rpm.
        case "$1" in
                *.rpmsave|*.rpmorig|*.rpmnew|*~|*.orig)
                        continue;;
        esac
        [ "$i" = "/etc/rc1.d/S00single" ] && continue
  # ==> Set script name, but don't execute it yet.
        $i start
done

  # ==> . . .


   The exit status of an and list or an or list is the exit status of
   the last command executed.
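   This is easy to verify by inspecting $? after each list:

```shell
#  The exit status of a list is the status of
#+ the last command that actually ran.

true && false          # Both commands run; the last to run is 'false'.
status1=$?             # 1

false || true          # 'false' fails, so 'true' runs; status is 0.
status2=$?             # 0

false && true          # Chain stops at 'false'; 'true' never runs.
status3=$?             # 1

echo "$status1 $status2 $status3"    # 1 0 1
```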

   Clever combinations of and and or lists are possible, but the logic
   may easily become convoluted and require close attention to operator
   precedence rules, and possibly extensive debugging.

false && true || echo false         # false

# Same result as
( false && true ) || echo false     # false
# But NOT
false && ( true || echo false )     # (nothing echoed)

#  Note left-to-right grouping and evaluation of statements,
#+ since the logic operators "&&" and "||" have equal precedence.

#  It's usually best to avoid such complexities.

#  Thanks, S.C.

   See Example A-7 and Example 7-4 for illustrations of using and / or
   list constructs to test variables.

Chapter 27. Arrays

   Newer versions of Bash support one-dimensional arrays. Array elements
   may be initialized with the variable[xx] notation. Alternatively, a
   script may introduce the entire array by an explicit declare -a
   variable statement. To dereference (retrieve the contents of) an
   array element, use curly bracket notation, that is, ${element[xx]}.

   Example 27-1. Simple array usage


#!/bin/bash

area[11]=23
area[13]=37
area[51]=UFOs

#  Array members need not be consecutive or contiguous.

#  Some members of the array can be left uninitialized.
#  Gaps in the array are okay.
#  In fact, arrays with sparse data ("sparse arrays")
#+ are useful in spreadsheet-processing software.

echo -n "area[11] = "
echo ${area[11]}    #  {curly brackets} needed.

echo -n "area[13] = "
echo ${area[13]}

echo "Contents of area[51] are ${area[51]}."

# Contents of uninitialized array variable print blank (null variable).
echo -n "area[43] = "
echo ${area[43]}
echo "(area[43] unassigned)"


# Sum of two array variables assigned to third
area[5]=`expr ${area[11]} + ${area[13]}`
echo "area[5] = area[11] + area[13]"
echo -n "area[5] = "
echo ${area[5]}

area[6]=`expr ${area[11]} + ${area[51]}`
echo "area[6] = area[11] + area[51]"
echo -n "area[6] = "
echo ${area[6]}
# This fails because adding an integer to a string is not permitted.

echo; echo; echo

# -----------------------------------------------------------------
# Another array, "area2".
# Another way of assigning array variables...
# array_name=( XXX YYY ZZZ ... )

area2=( zero one two three four )

echo -n "area2[0] = "
echo ${area2[0]}
# Aha, zero-based indexing (first element of array is [0], not [1]).

echo -n "area2[1] = "
echo ${area2[1]}    # [1] is second element of array.
# -----------------------------------------------------------------

echo; echo; echo

# -----------------------------------------------
# Yet another array, "area3".
# Yet another way of assigning array variables...
# array_name=([xx]=XXX [yy]=YYY ...)

area3=([17]=seventeen [24]=twenty-four)

echo -n "area3[17] = "
echo ${area3[17]}

echo -n "area3[24] = "
echo ${area3[24]}
# -----------------------------------------------

exit 0

   As we have seen, a convenient way of initializing an entire array is
   the array=( element1 element2 ... elementN ) notation.

base64_charset=( {A..Z} {a..z} {0..9} + / = )
               #  Using extended brace expansion
               #+ to initialize the elements of the array.
               #  Excerpted from vladz's "" script
               #+ in the "Contributed Scripts" appendix.

   Bash permits array operations on variables, even if the variables are
   not explicitly declared as arrays.

string=abcABC123ABCabc

echo ${string[@]}               # abcABC123ABCabc
echo ${string[*]}               # abcABC123ABCabc
echo ${string[0]}               # abcABC123ABCabc
echo ${string[1]}               # No output!
                                # Why?
echo ${#string[@]}              # 1
                                # One element in the array.
                                # The string itself.

# Thank you, Michael Zick, for pointing this out.

   Once again this demonstrates that Bash variables are untyped.

   Example 27-2. Formatting a poem

#!/bin/bash
# Pretty-prints one of the ABS Guide author's favorite poems.

# Lines of the poem (single stanza).
Line[1]="I do not know which to prefer,"
Line[2]="The beauty of inflections"
Line[3]="Or the beauty of innuendoes,"
Line[4]="The blackbird whistling"
Line[5]="Or just after."
# Note that quoting permits embedding whitespace.

# Attribution.
Attrib[1]=" Wallace Stevens"
Attrib[2]="\"Thirteen Ways of Looking at a Blackbird\""
# This poem is in the Public Domain (copyright expired).


tput bold   # Bold print.

for index in 1 2 3 4 5    # Five lines.
do
  printf "     %s\n" "${Line[index]}"
done

for index in 1 2          # Two attribution lines.
do
  printf "          %s\n" "${Attrib[index]}"
done

tput sgr0   # Reset terminal.
            # See 'tput' docs.


exit 0

# Exercise:
# --------
# Modify this script to pretty-print a poem from a text data file.

   Array variables have a syntax all their own, and even standard Bash
   commands and operators have special options adapted for array use.

   Example 27-3. Various array operations

#!/bin/bash
# More fun with arrays.

array=( zero one two three four five )
# Element 0   1   2    3     4    5

echo ${array[0]}       #  zero
echo ${array:0}        #  zero
                       #  Parameter expansion of first element,
                       #+ starting at position # 0 (1st character).
echo ${array:1}        #  ero
                       #  Parameter expansion of first element,
                       #+ starting at position # 1 (2nd character).

echo "--------------"

echo ${#array[0]}      #  4
                       #  Length of first element of array.
echo ${#array}         #  4
                       #  Length of first element of array.
                       #  (Alternate notation)

echo ${#array[1]}      #  3
                       #  Length of second element of array.
                       #  Arrays in Bash have zero-based indexing.

echo ${#array[*]}      #  6
                       #  Number of elements in array.
echo ${#array[@]}      #  6
                       #  Number of elements in array.

echo "--------------"

array2=( [0]="first element" [1]="second element" [3]="fourth element" )
#            ^     ^       ^     ^      ^       ^     ^      ^       ^
# Quoting permits embedding whitespace within individual array elements.

echo ${array2[0]}      # first element
echo ${array2[1]}      # second element
echo ${array2[2]}      #
                       # Skipped in initialization, and therefore null.
echo ${array2[3]}      # fourth element
echo ${#array2[0]}     # 13    (length of first element)
echo ${#array2[*]}     # 3     (number of elements in array)


   Many of the standard string operations work on arrays.

   Example 27-4. String operations on arrays

#!/bin/bash
# String operations on arrays.

# Script by Michael Zick.
# Used in ABS Guide with permission.
# Fixups: 05 May 08, 04 Aug 08.

#  In general, any string operation using the ${name ... } notation
#+ can be applied to all string elements in an array,
#+ with the ${name[@] ... } or ${name[*] ...} notation.

arrayZ=( one two three four five five )


# Trailing Substring Extraction
echo ${arrayZ[@]:0}     # one two three four five five
#                ^        All elements.

echo ${arrayZ[@]:1}     # two three four five five
#                ^        All elements following element[0].

echo ${arrayZ[@]:1:2}   # two three
#                  ^      Only the two elements after element[0].

echo "---------"

# Substring Removal

# Removes shortest match from front of string(s).

echo ${arrayZ[@]#f*r}   # one two three five five
#               ^       # Applied to all elements of the array.
                        # Matches "four" and removes it.

# Longest match from front of string(s)
echo ${arrayZ[@]##t*e}  # one two four five five
#               ^^      # Applied to all elements of the array.
                        # Matches "three" and removes it.

# Shortest match from back of string(s)
echo ${arrayZ[@]%h*e}   # one two t four five five
#               ^       # Applied to all elements of the array.
                        # Matches "hree" and removes it.

# Longest match from back of string(s)
echo ${arrayZ[@]%%t*e}  # one two four five five
#               ^^      # Applied to all elements of the array.
                        # Matches "three" and removes it.

echo "----------------------"

# Substring Replacement

# Replace first occurrence of substring with replacement.
echo ${arrayZ[@]/fiv/XYZ}   # one two three four XYZe XYZe
#               ^           # Applied to all elements of the array.

# Replace all occurrences of substring.
echo ${arrayZ[@]//iv/YY}    # one two three four fYYe fYYe
                            # Applied to all elements of the array.

# Delete all occurrences of substring.
# Not specifying a replacement defaults to 'delete' ...
echo ${arrayZ[@]//fi/}      # one two three four ve ve
#               ^^          # Applied to all elements of the array.

# Replace front-end occurrences of substring.
echo ${arrayZ[@]/#fi/XY}    # one two three four XYve XYve
#                ^          # Applied to all elements of the array.

# Replace back-end occurrences of substring.
echo ${arrayZ[@]/%ve/ZZ}    # one two three four fiZZ fiZZ
#                ^          # Applied to all elements of the array.

echo ${arrayZ[@]/%o/XX}     # one twXX three four five five
#                ^          # Why?

echo "-----------------------------"

replacement() {
    echo -n "!!!"
}
echo ${arrayZ[@]/%e/$(replacement)}
#                ^  ^^^^^^^^^^^^^^
# on!!! two thre!!! four fiv!!! fiv!!!
# The stdout of replacement() is the replacement string.
# Q.E.D: The replacement action is, in effect, an 'assignment.'

echo "------------------------------------"

#  Accessing the "for-each":
echo ${arrayZ[@]//*/$(replacement optional_arguments)}
#                ^^ ^^^^^^^^^^^^^
# !!! !!! !!! !!! !!! !!!

#  Now, if Bash would only pass the matched string
#+ to the function being called . . .


exit 0

#  Before reaching for a Big Hammer -- Perl, Python, or all the rest --
#  recall:
#    $( ... ) is command substitution.
#    A function runs as a sub-process.
#    A function writes its output (if echo-ed) to stdout.
#    Assignment, in conjunction with "echo" and command substitution,
#+   can read a function's stdout.
#    The name[@] notation specifies (the equivalent of) a "for-each"
#+   operation.
#  Bash is more powerful than you think!

   Command substitution can construct the individual elements of an
   array.

   Example 27-5. Loading the contents of a script into an array

#!/bin/bash
# Loads this script into an array.
# Inspired by an e-mail from Chris Martin (thanks!).

script_contents=( $(cat "$0") )  #  Stores contents of this script ($0)
                                 #+ in an array.

for element in $(seq 0 $((${#script_contents[@]} - 1)))
  do                #  ${#script_contents[@]}
                    #+ gives number of elements in the array.
                    #  Question:
                    #  Why is  seq 0  necessary?
                    #  Try changing it to seq 1.
  echo -n "${script_contents[$element]}"
                    # List each field of this script on a single line.
# echo -n "${script_contents[element]}" also works because of ${ ... }.
  echo -n " -- "    # Use " -- " as a field separator.
  done


exit 0

# Exercise:
# --------
#  Modify this script so it lists itself
#+ in its original format,
#+ complete with whitespace, line breaks, etc.

   In an array context, some Bash builtins have a slightly altered
   meaning. For example, unset deletes array elements, or even an entire
   array.

   Example 27-6. Some special properties of arrays

#!/bin/bash

declare -a colors
#  All subsequent commands in this script will treat
#+ the variable "colors" as an array.

echo "Enter your favorite colors (separated from each other by a space)."

read -a colors    # Enter at least 3 colors to demonstrate features below.
#  Special option to 'read' command,
#+ allowing assignment of elements in an array.


element_count=${#colors[@]}
# Special syntax to extract number of elements in array.
#     element_count=${#colors[*]} works also.
#  The "@" variable allows word splitting within quotes
#+ (extracts variables separated by whitespace).
#  This corresponds to the behavior of "$@" and "$*"
#+ in positional parameters.


index=0

while [ "$index" -lt "$element_count" ]
do    # List all the elements in the array.
  echo ${colors[$index]}
  #    ${colors[index]} also works because it's within ${ ... } brackets.
  let "index = $index + 1"
  # Or:
  #    ((index++))
done
# Each array element listed on a separate line.
# If this is not desired, use  echo -n "${colors[$index]} "
# Doing it with a "for" loop instead:
#   for i in "${colors[@]}"
#   do
#     echo "$i"
#   done
# (Thanks, S.C.)


# Again, list all the elements in the array, but using a more elegant method.
  echo ${colors[@]}          # echo ${colors[*]} also works.


# The "unset" command deletes elements of an array, or entire array.
unset colors[1]              # Remove 2nd element of array.
                             # Same effect as   colors[1]=
echo  ${colors[@]}           # List array again, missing 2nd element.

unset colors                 # Delete entire array.
                             #  unset colors[*] and
                             #+ unset colors[@] also work.
echo; echo -n "Colors gone."
echo ${colors[@]}            # List array again, now empty.

exit 0

   As seen in the previous example, either ${array_name[@]} or
   ${array_name[*]} refers to all the elements of the array. Similarly,
   to get a count of the number of elements in an array, use either
   ${#array_name[@]} or ${#array_name[*]}. ${#array_name} is the length
   (number of characters) of ${array_name[0]}, the first element of the
   array.

   Example 27-7. Of empty arrays and empty elements

#!/bin/bash

#  Thanks to Stephane Chazelas for the original example,
#+ and to Michael Zick and Omair Eshkenazi, for extending it.
#  And to Nathan Coulter for clarifications and corrections.

# An empty array is not the same as an array with empty elements.

  array0=( first second third )
  array1=( '' )   # "array1" consists of one empty element.
  array2=( )      # No elements . . . "array2" is empty.
  array3=(   )    # What about this array?

echo "Elements in array0:  ${array0[@]}"
echo "Elements in array1:  ${array1[@]}"
echo "Elements in array2:  ${array2[@]}"
echo "Elements in array3:  ${array3[@]}"
echo "Length of first element in array0 = ${#array0}"
echo "Length of first element in array1 = ${#array1}"
echo "Length of first element in array2 = ${#array2}"
echo "Length of first element in array3 = ${#array3}"
echo "Number of elements in array0 = ${#array0[*]}"  # 3
echo "Number of elements in array1 = ${#array1[*]}"  # 1  (Surprise!)
echo "Number of elements in array2 = ${#array2[*]}"  # 0
echo "Number of elements in array3 = ${#array3[*]}"  # 0

# ===================================================================


# Try extending those arrays.

# Adding an element to an array.
array0=( "${array0[@]}" "new1" )
array1=( "${array1[@]}" "new1" )
array2=( "${array2[@]}" "new1" )
array3=( "${array3[@]}" "new1" )


# or
array0[${#array0[*]}]="new2"
array1[${#array1[*]}]="new2"
array2[${#array2[*]}]="new2"
array3[${#array3[*]}]="new2"

# When extended as above, arrays are 'stacks' ...
# Above is the 'push' ...
# The stack 'height' is:
height=${#array2[@]}
echo "Stack height for array2 = $height"

# The 'pop' is:
unset array2[${#array2[@]}-1]   #  Arrays are zero-based,
height=${#array2[@]}            #+ which means first element has index 0.
echo "POP"
echo "New stack height for array2 = $height"


# List only 2nd and 3rd elements of array0.
from=1              # Zero-based numbering.
array3=( ${array0[@]:1:2} )
echo "Elements in array3:  ${array3[@]}"

# Works like a string (array of characters).
# Try some other "string" forms.

# Replacement:
array4=( ${array0[@]/second/2nd} )
echo "Elements in array4:  ${array4[@]}"

# Replace all matching wildcarded string.
array5=( ${array0[@]//new?/old} )
echo "Elements in array5:  ${array5[@]}"

# Just when you are getting the feel for this . . .
array6=( ${array0[@]#*new} )
echo # This one might surprise you.
echo "Elements in array6:  ${array6[@]}"

array7=( ${array0[@]#new1} )
echo # After array6 this should not be a surprise.
echo "Elements in array7:  ${array7[@]}"

# Which looks a lot like . . .
array8=( ${array0[@]/new1/} )
echo "Elements in array8:  ${array8[@]}"

#  So what can one say about this?

#  The string operations are performed on
#+ each of the elements in var[@] in succession.
#  Therefore : Bash supports string vector operations.
#  If the result is a zero length string,
#+ that element disappears in the resulting assignment.
#  However, if the expansion is in quotes, the null elements remain.

#  Michael Zick:    Question, are those strings hard or soft quotes?
#  Nathan Coulter:  There is no such thing as "soft quotes."
#!    What's really happening is that
#!+   the pattern matching happens after
#!+   all the other expansions of [word]
#!+   in cases like ${parameter#word}.

zap='new*'
array9=( ${array0[@]/$zap/} )
echo "Number of elements in array9:  ${#array9[@]}"
array9=( "${array0[@]/$zap/}" )
echo "Elements in array9:  ${array9[@]}"
# This time the null elements remain.
echo "Number of elements in array9:  ${#array9[@]}"

# Just when you thought you were still in Kansas . . .
array10=( ${array0[@]#$zap} )
echo "Elements in array10:  ${array10[@]}"
# But, the asterisk in zap won't be interpreted if quoted.
array10=( ${array0[@]#"$zap"} )
echo "Elements in array10:  ${array10[@]}"
# Well, maybe we _are_ still in Kansas . . .
# (Revisions to above code block by Nathan Coulter.)

#  Compare array7 with array10.
#  Compare array8 with array9.

#  Reiterating: No such thing as soft quotes!
#  Nathan Coulter explains:
#  Pattern matching of 'word' in ${parameter#word} is done after
#+ parameter expansion and *before* quote removal.
#  In the normal case, pattern matching is done *after* quote removal.


   The relationship of ${array_name[@]} and ${array_name[*]} is
   analogous to that between $@ and $*. This powerful array notation has
   a number of uses.
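   The distinction surfaces only when the expansion is double-quoted,
   exactly as with "$@" versus "$*". A minimal sketch:

```shell
colors=( red green blue )

#  Quoted, [@] yields one word per element . . .
for c in "${colors[@]}"; do echo "$c"; done
# red
# green
# blue

#  . . . while [*] glues all elements into a single word,
#+ joined by the first character of $IFS (a space, by default).
for c in "${colors[*]}"; do echo "$c"; done
# red green blue
```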

# Copying an array.
array2=( "${array1[@]}" )
# or
array2="${array1[@]}"

#  However, this fails with "sparse" arrays,
#+ arrays with holes (missing elements) in them,
#+ as Jochen DeSmet points out.
# ------------------------------------------
  array1[0]=0
# array1[1] not assigned
  array1[2]=2
  array2=( "${array1[@]}" )       # Copy it?

echo ${array2[0]}      # 0
echo ${array2[2]}      # (null), should be 2
# ------------------------------------------

# Adding an element to an array.
array=( "${array[@]}" "new element" )
# or
array[${#array[*]}]="new element"

# Thanks, S.C.


   The array=( element1 element2 ... elementN ) initialization
   operation, with the help of command substitution, makes it possible
   to load the contents of a text file into an array.



#!/bin/bash

filename=sample_file

#            cat sample_file
#            1 a b c
#            2 d e fg

declare -a array1

array1=( `cat "$filename"`)                #  Loads contents
#         List file to stdout              #+ of $filename into array1.
#  array1=( `cat "$filename" | tr '\n' ' '`)
#                            change linefeeds in file to spaces.
#  Not necessary because Bash does word splitting,
#+ changing linefeeds to spaces.

echo ${array1[@]}            # List the array.
#                              1 a b c 2 d e fg
#  Each whitespace-separated "word" in the file
#+ has been assigned to an element of the array.

element_count=${#array1[*]}
echo $element_count          # 8
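   In Bash version 4 and later, the mapfile builtin (synonym: readarray)
   loads a file into an array one line per element, without the word
   splitting shown above. A minimal sketch, which re-creates the sample
   file in the current directory first:

```shell
#  Bash 4+ only.
printf '1 a b c\n2 d e fg\n' > sample_file   # Re-create sample_file.

mapfile -t lines < sample_file    #  The -t option strips the trailing
                                  #+ newline from each element.

echo ${#lines[@]}      # 2            (two lines, not eight words)
echo "${lines[0]}"     # 1 a b c

rm sample_file         # Clean up.
```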

   Clever scripting makes it possible to add array operations.

   Example 27-8. Initializing arrays
#! /bin/bash
# array-assign.bash

#  Array operations are Bash-specific,
#+ hence the ".bash" in the script name.

# Copyright (c) Michael S. Zick, 2003, All rights reserved.
# License: Unrestricted reuse in any form, for any purpose.
# Version: $ID$
# Clarification and additional comments by William Park.

#  Based on an example provided by Stephane Chazelas
#+ which appeared in an earlier version of the
#+ Advanced Bash Scripting Guide.

# Output format of the 'times' command:
# User CPU <space> System CPU
# User CPU of dead children <space> System CPU of dead children

#  Bash has two versions of assigning all elements of an array
#+ to a new array variable.
#  Both drop 'null reference' elements
#+ in Bash versions 2.04 and later.
#  An additional array assignment that maintains the relationship of
#+ [subscript]=value for arrays may be added to newer versions.

#  Constructs a large array using an internal command,
#+ but anything creating an array of several thousand elements
#+ will do just fine.

declare -a bigOne=( /dev/* )  # All the files in /dev . . .
echo 'Conditions: Unquoted, default IFS, All-Elements-Of'
echo "Number of elements in array is ${#bigOne[@]}"

# set -vx

echo '- - testing: =( ${array[@]} ) - -'
declare -a bigTwo=( ${bigOne[@]} )
# Note parens:    ^              ^

echo '- - testing: =${array[@]} - -'
declare -a bigThree=${bigOne[@]}
# No parentheses this time.

#  Comparing the numbers shows that the second form, pointed out
#+ by Stephane Chazelas, is faster.
#  As William Park explains:
#+ The bigTwo array assigned element by element (because of parentheses),
#+ whereas bigThree assigned as a single string.
#  So, in essence, you have:
#                   bigTwo=( [0]="..." [1]="..." [2]="..." ... )
#                   bigThree=( [0]="... ... ..." )
#  Verify this by:  echo ${bigTwo[0]}
#                   echo ${bigThree[0]}

#  I will continue to use the first form in my example descriptions
#+ because I think it is a better illustration of what is happening.

#  The reusable portions of my examples will actually contain
#+ the second form where appropriate because of the speedup.

# MSZ: Sorry about that earlier oversight folks.

#  Note:
#  ----
#  The "declare -a" statements in lines 32 and 44
#+ are not strictly necessary, since it is implicit
#+ in the  Array=( ... )  assignment form.
#  However, eliminating these declarations slows down
#+ the execution of the following sections of the script.
#  Try it, and see.

exit 0


   Adding a superfluous declare -a statement to an array declaration may
   speed up execution of subsequent operations on the array.

   Example 27-9. Copying and concatenating arrays
#! /bin/bash
# This script written by Michael Zick.
# Used here with permission.

#  How-To "Pass by Name & Return by Name"
#+ or "Building your own assignment statement".

CpArray_Mac() {

# Assignment Command Statement Builder

    echo -n 'eval '
    echo -n "$2"                    # Destination name
    echo -n '=( ${'
    echo -n "$1"                    # Source name
    echo -n '[@]} )'

# That could all be a single command.
# Matter of style only.
}

declare -f CopyArray                # Function "Pointer"
CopyArray=CpArray_Mac               # Statement Builder


Hype()
{

# Hype the array named $1.
# (Splice it together with array containing "Really Rocks".)
# Return in array named $2.

    local -a TMP
    local -a hype=( Really Rocks )

    $($CopyArray $1 TMP)
    TMP=( ${TMP[@]} ${hype[@]} )
    $($CopyArray TMP $2)
}

declare -a before=( Advanced Bash Scripting )
declare -a after

echo "Array Before = ${before[@]}"

Hype before after

echo "Array After = ${after[@]}"

# Too much hype?

echo "What ${after[@]:3:2}?"

declare -a modest=( ${after[@]:2:1} ${after[@]:3:2} )
#                    ---- substring extraction ----

echo "Array Modest = ${modest[@]}"

# What happened to 'before' ?

echo "Array Before = ${before[@]}"

exit 0

   Example 27-10. More on concatenating arrays
#! /bin/bash
# array-append.bash

# Copyright (c) Michael S. Zick, 2003, All rights reserved.
# License: Unrestricted reuse in any form, for any purpose.
# Version: $ID$
# Slightly modified in formatting by M.C.

# Array operations are Bash-specific.
# Legacy UNIX /bin/sh lacks equivalents.

#  Pipe the output of this script to 'more'
#+ so it doesn't scroll off the terminal.
#  Or, redirect output to a file.

declare -a array1=( zero1 one1 two1 )
# Subscript packed.
declare -a array2=( [0]=zero2 [2]=two2 [3]=three2 )
# Subscript sparse -- [1] is not defined.

echo '- Confirm that the array is really subscript sparse. -'
echo "Number of elements: 4"        # Hard-coded for illustration.
for (( i = 0 ; i < 4 ; i++ ))
do
    echo "Element [$i]: ${array2[$i]}"
done
# See also the more general code example in basics-reviewed.bash.

declare -a dest

# Combine (append) two arrays into a third array.
echo 'Conditions: Unquoted, default IFS, All-Elements-Of operator'
echo '- Undefined elements not present, subscripts not maintained. -'
# # The undefined elements do not exist; they are not being dropped.

dest=( ${array1[@]} ${array2[@]} )
# dest=${array1[@]}${array2[@]}     # Strange results, possibly a bug.

# Now, list the result.
echo '- - Testing Array Append - -'

cnt=${#dest[@]}

echo "Number of elements: $cnt"
for (( i = 0 ; i < cnt ; i++ ))
do
    echo "Element [$i]: ${dest[$i]}"
done

# Assign an array to a single array element (twice).
dest[0]=${array1[@]}
dest[1]=${array2[@]}

# List the result.
echo '- - Testing modified array - -'
cnt=${#dest[@]}

echo "Number of elements: $cnt"
for (( i = 0 ; i < cnt ; i++ ))
do
    echo "Element [$i]: ${dest[$i]}"
done

# Examine the modified second element.
echo '- - Reassign and list second element - -'

declare -a subArray=${dest[1]}
cnt=${#subArray[@]}

echo "Number of elements: $cnt"
for (( i = 0 ; i < cnt ; i++ ))
do
    echo "Element [$i]: ${subArray[$i]}"
done

#  The assignment of an entire array to a single element
#+ of another array using the '=${ ... }' array assignment
#+ has converted the array being assigned into a string,
#+ with the elements separated by a space (the first character of IFS).

# If the original elements didn't contain whitespace . . .
# If the original array isn't subscript sparse . . .
# Then we could get the original array structure back again.

# Restore from the modified second element.
echo '- - Listing restored element - -'

declare -a subArray=( ${dest[1]} )
cnt=${#subArray[@]}

echo "Number of elements: $cnt"
for (( i = 0 ; i < cnt ; i++ ))
do
    echo "Element [$i]: ${subArray[$i]}"
done
echo '- - Do not depend on this behavior. - -'
echo '- - This behavior is subject to change - -'
echo '- - in versions of Bash newer than version 2.05b - -'

# MSZ: Sorry about any earlier confusion folks.

exit 0
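One caveat the example above does not dwell on: whether the all-elements expansions are quoted determines if elements containing whitespace survive an append intact. A minimal sketch, with arbitrary array names:

```shell
#!/bin/bash
# Quoted vs. unquoted array append (illustrative names).

a=( "two words" single )
b=( third )

unquoted=( ${a[@]} ${b[@]} )     # Word-splits "two words" into two elements.
quoted=( "${a[@]}" "${b[@]}" )   # Preserves each element intact.

echo "${#unquoted[@]}"           # 4
echo "${#quoted[@]}"             # 3
```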


   Arrays permit deploying old familiar algorithms as shell scripts.
   Whether this is necessarily a good idea is left for the reader to
   decide.

   Example 27-11. The Bubble Sort
# Bubble sort, of sorts.

# Recall the algorithm for a bubble sort. In this particular version...

#  With each successive pass through the array to be sorted,
#+ compare two adjacent elements, and swap them if out of order.
#  At the end of the first pass, the "heaviest" element has sunk to bottom.
#  At the end of the second pass, the next "heaviest" one has sunk next to bottom.
#  And so forth.
#  This means that each successive pass needs to traverse less of the array.
#  You will therefore notice a speeding up in the printing of the later passes.

exchange()
{
  # Swaps two members of the array.
  local temp=${Countries[$1]} #  Temporary storage
                              #+ for element getting swapped out.
  Countries[$1]=${Countries[$2]}
  Countries[$2]=$temp

  return
}


declare -a Countries  #  Declare array,
                      #+ optional here since it's initialized below.

#  Is it permissible to split an array variable over multiple lines
#+ using an escape (\)?
#  Yes.

Countries=(Netherlands Ukraine Zaire Turkey Russia Yemen Syria \
Brazil Argentina Nicaragua Japan Mexico Venezuela Greece England \
Israel Peru Canada Oman Denmark Wales France Kenya \
Xanadu Qatar Liechtenstein Hungary)

# "Xanadu" is the mythical place where, according to Coleridge,
#+ Kubla Khan did a pleasure dome decree.

clear                      # Clear the screen to start with.

echo "0: ${Countries[*]}"  # List entire array at pass 0.

number_of_elements=${#Countries[@]}
let "comparisons = $number_of_elements - 1"

count=1 # Pass number.

while [ "$comparisons" -gt 0 ]          # Beginning of outer loop
do

  index=0  # Reset index to start of array after each pass.

  while [ "$index" -lt "$comparisons" ] # Beginning of inner loop
  do
    if [ ${Countries[$index]} \> ${Countries[`expr $index + 1`]} ]
    #  If out of order...
    #  Recalling that \> is ASCII comparison operator
    #+ within single brackets.

    #  if [[ ${Countries[$index]} > ${Countries[`expr $index + 1`]} ]]
    #+ also works.
    then
      exchange $index `expr $index + 1`  # Swap.
    fi
    let "index += 1"  # Or,   index+=1   on Bash, ver. 3.1 or newer.
  done # End of inner loop

# ----------------------------------------------------------------------
# Paulo Marcel Coelho Aragao suggests for-loops as a simpler alternative.
# for (( last = $number_of_elements - 1 ; last > 0 ; last-- ))
##                     Fix by C.Y. Hunt          ^   (Thanks!)
# do
#     for (( i = 0 ; i < last ; i++ ))
#     do
#         [[ "${Countries[$i]}" > "${Countries[$((i+1))]}" ]] \
#             && exchange $i $((i+1))
#     done
# done
# ----------------------------------------------------------------------

let "comparisons -= 1" #  Since "heaviest" element bubbles to bottom,
                       #+ we need do one less comparison each pass.

echo "$count: ${Countries[@]}"  # Print resultant array at end of each pass.
let "count += 1"                # Increment pass count.

done                            # End of outer loop
                                # All done.

exit 0


   Is it possible to nest arrays within arrays?

# "Nested" array.

#  Michael Zick provided this example,
#+ with corrections and clarifications by William Park.

AnArray=( $(ls --inode --ignore-backups --almost-all \
        --directory --full-time --color=none --time=status \
        --sort=time -l ${PWD} ) )  # Commands and options.

# Spaces are significant . . . and don't quote anything in the above.

SubArray=( ${AnArray[@]:11:1}  ${AnArray[@]:6:5} )
#  This array has six elements:
#+     SubArray=( [0]=${AnArray[11]} [1]=${AnArray[6]} [2]=${AnArray[7]}
#      [3]=${AnArray[8]} [4]=${AnArray[9]} [5]=${AnArray[10]} )
#  Arrays in Bash are (circularly) linked lists
#+ of type string (char *).
#  So, this isn't actually a nested array,
#+ but it's functionally similar.

echo "Current directory and date of last status change:"
echo "${SubArray[@]}"

exit 0
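The SubArray construction above relies on element slicing, ${array[@]:offset:length}. A minimal standalone sketch, with arbitrary names:

```shell
#!/bin/bash
# Array element slicing, as used to build SubArray above.

arr=( zero one two three four five )

slice=( "${arr[@]:2:3}" )   # Three elements, starting at subscript 2.
echo "${slice[@]}"          # two three four
echo "${#slice[@]}"         # 3
```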


   Embedded arrays in combination with indirect references create some
   fascinating possibilities.

   Example 27-12. Embedded arrays and indirect references
# Embedded arrays and indirect references.

# This script by Dennis Leeuw.
# Used with permission.
# Modified by document author.


ARRAY1=(
        VAR1_1=value11
        VAR1_2=value12
        VAR1_3=value13
)

ARRAY2=(
        VARIABLE="test"
        STRING="VAR1=value1 VAR2=value2 VAR3=value3"
        ARRAY21=${ARRAY1[*]}
)       # Embed ARRAY1 within this second array.

function print () {
        IFS=$'\n'       #  To print each array element
                        #+ on a separate line.
        TEST1="ARRAY2[*]"
        local ${!TEST1} # See what happens if you delete this line.
        #  Indirect reference.
        #  This makes the components of $TEST1
        #+ accessible to this function.

        #  Let's see what we've got so far.
        echo "\$TEST1 = $TEST1"       #  Just the name of the variable.
        echo; echo
        echo "{\$TEST1} = ${!TEST1}"  #  Contents of the variable.
                                      #  That's what an indirect
                                      #+ reference does.
        echo "-------------------------------------------"; echo

        # Print variable
        echo "Variable VARIABLE: $VARIABLE"

        # Print a string element
        TEST2="STRING[*]"
        local ${!TEST2}      # Indirect reference (as above).
        echo "String element VAR2: $VAR2 from STRING"

        # Print an array element
        TEST2="ARRAY21[*]"
        local ${!TEST2}      # Indirect reference (as above).
        echo "Array element VAR1_1: $VAR1_1 from ARRAY21"
}

print

exit 0

#   As the author of the script notes,
#+ "you can easily expand it to create named-hashes in bash."
#   (Difficult) exercise for the reader: implement this.


   Arrays enable implementing a shell script version of the Sieve of
   Eratosthenes. Of course, a resource-intensive application of this
   nature should really be written in a compiled language, such as C. It
   runs excruciatingly slowly as a script.

   Example 27-13. The Sieve of Eratosthenes
# (

# Sieve of Eratosthenes
# Ancient algorithm for finding prime numbers.

#  This runs a couple of orders of magnitude slower
#+ than the equivalent program written in C.

LOWER_LIMIT=1       # Starting with 1.
UPPER_LIMIT=1000    # Up to 1000.
# (You may set this higher . . . if you have time on your hands.)


PRIME=1
NON_PRIME=0

let SPLIT=UPPER_LIMIT/2
# Optimization:
# Need to test numbers only halfway to upper limit. Why?

declare -a Primes
# Primes[] is an array.

initialize ()
{
# Initialize the array.

i=$LOWER_LIMIT
until [ "$i" -gt "$UPPER_LIMIT" ]
do
  Primes[i]=$PRIME
  let "i += 1"
done
#  Assume all array members guilty (prime)
#+ until proven innocent.
}

print_primes ()
{
# Print out the members of the Primes[] array tagged as prime.

i=$LOWER_LIMIT

until [ "$i" -gt "$UPPER_LIMIT" ]
do

  if [ "${Primes[i]}" -eq "$PRIME" ]
  then
    printf "%8d" $i
    # 8 spaces per number gives nice, even columns.
  fi

  let "i += 1"

done
}


sift () # Sift out the non-primes.
{

let i=$LOWER_LIMIT+1
# Let's start with 2.

until [ "$i" -gt "$UPPER_LIMIT" ]
do

if [ "${Primes[i]}" -eq "$PRIME" ]
# Don't bother sieving numbers already sieved (tagged as non-prime).
then

  t=$i

  while [ "$t" -le "$UPPER_LIMIT" ]
  do
    let "t += $i "
    Primes[t]=$NON_PRIME
    # Tag as non-prime all multiples.
  done

fi

  let "i += 1"

done
}

# ==============================================
# main ()
# Invoke the functions sequentially.
initialize
sift
print_primes
# This is what they call structured programming.
# ==============================================

echo

exit 0

# -------------------------------------------------------- #
# Code below line will not execute, because of 'exit.'

#  This improved version of the Sieve, by Stephane Chazelas,
#+ executes somewhat faster.

# Must invoke with command-line argument (limit of primes).

UPPER_LIMIT=$1                  # From command-line.
let SPLIT=UPPER_LIMIT/2         # Halfway to max number.

Primes=( '' $(seq $UPPER_LIMIT) )

i=1
until (( ( i += 1 ) > SPLIT ))  # Need check only halfway.
do
  if [[ -n ${Primes[i]} ]]
  then
    t=$i
    until (( ( t += i ) > UPPER_LIMIT ))
    do
      Primes[t]=
    done
  fi
done
echo ${Primes[*]}

exit $?

   Example 27-14. The Sieve of Eratosthenes, Optimized
# Optimized Sieve of Eratosthenes
# Script by Jared Martin, with very minor changes by ABS Guide author.
# Used in ABS Guide with permission (thanks!).

# Based on script in Advanced Bash Scripting Guide.
# (

# (reference)
# Check results against

# Necessary but not sufficient would be, e.g.,
#     (($(sieve 7919 | wc -w) == 1000)) && echo "7919 is the 1000th prime"

UPPER_LIMIT=${1:?"Need an upper limit of primes to search."}

Primes=( '' $(seq ${UPPER_LIMIT}) )

typeset -i i t
Primes[i=1]='' # 1 is not a prime.
until (( ( i += 1 ) > (${UPPER_LIMIT}/i) ))  # Need check only ith-way.
  do                                         # Why?
    if ((${Primes[t=i*(i-1), i]}))
    # Obscure, but instructive, use of arithmetic expansion in subscript.
    then
      until (( ( t += i ) > ${UPPER_LIMIT} ))
        do Primes[t]=; done
    fi
  done

# echo ${Primes[*]}
echo   # Change to original script for pretty-printing (80-col. display).
printf "%8d" ${Primes[*]}
echo; echo

exit $?

   Compare these array-based prime number generators with alternatives
   that do not use arrays, Example A-15, and Example 16-46.


   Arrays lend themselves, to some extent, to emulating data structures
   for which Bash has no native support.

   Example 27-15. Emulating a push-down stack
# push-down stack simulation

#  Similar to the CPU stack, a push-down stack stores data items
#+ sequentially, but releases them in reverse order, last-in first-out.

BP=100            #  Base Pointer of stack array.
                  #  Begin at element 100.

SP=$BP            #  Stack Pointer.
                  #  Initialize it to "base" (bottom) of stack.

Data=             #  Contents of stack location.
                  #  Must use global variable,
                  #+ because of limitation on function return range.

                  # 100     Base pointer       <-- Base Pointer
                  #  99     First data item
                  #  98     Second data item
                  # ...     More data
                  #         Last data item     <-- Stack pointer

declare -a stack

push()            # Push item on stack.
{
if [ -z "$1" ]    # Nothing to push?
then
  return
fi

let "SP -= 1"     # Bump stack pointer.
stack[$SP]=$1

return
}

pop()                    # Pop item off stack.
{
Data=                    # Empty out data item.

if [ "$SP" -eq "$BP" ]   # Stack empty?
then
  return
fi                       #  This also keeps SP from getting past 100,
                         #+ i.e., prevents a runaway stack.

Data=${stack[$SP]}
let "SP += 1"            # Bump stack pointer.
return
}

status_report()          # Find out what's happening.
{
echo "-------------------------------------"
echo "REPORT"
echo "Stack Pointer = $SP"
echo "Just popped \""$Data"\" off the stack."
echo "-------------------------------------"
echo
}

# =======================================================
# Now, for some fun.


# See if you can pop anything off empty stack.
pop
status_report

push garbage
pop
status_report     # Garbage in, garbage out.

value1=23;        push $value1
value2=skidoo;    push $value2
value3=LAST;      push $value3

pop               # LAST
pop               # skidoo
pop               # 23
status_report     # Last-in, first-out!

#  Notice how the stack pointer decrements with each push,
#+ and increments with each pop.


exit 0

# =======================================================

# Exercises:
# ---------

# 1)  Modify the "push()" function to permit pushing
#   + multiple elements on the stack with a single function call.

# 2)  Modify the "pop()" function to permit popping
#   + multiple elements from the stack with a single function call.

# 3)  Add error checking to the critical functions.
#     That is, return an error code, depending on
#   + successful or unsuccessful completion of the operation,
#   + and take appropriate action.

# 4)  Using this script as a starting point,
#   + write a stack-based 4-function calculator.


   Fancy manipulation of array "subscripts" may require intermediate
   variables. For projects involving this, again consider using a more
   powerful programming language, such as Perl or C.

   Example 27-16. Complex array application: Exploring a weird
   mathematical series

# Douglas Hofstadter's notorious "Q-series":

# Q(1) = Q(2) = 1
# Q(n) = Q(n - Q(n-1)) + Q(n - Q(n-2)), for n>2

#  This is a "chaotic" integer series with strange
#+ and unpredictable behavior.
#  The first 20 terms of the series are:
#  1 1 2 3 3 4 5 5 6 6 6 8 8 8 10 9 10 11 11 12

#  See Hofstadter's book, _Goedel, Escher, Bach: An Eternal Golden Braid_,
#+ p. 137, ff.

LIMIT=100     # Number of terms to calculate.
LINEWIDTH=20  # Number of terms printed per line.

Q[1]=1        # First two terms of series are 1.
Q[2]=1

echo "Q-series [$LIMIT terms]:"
echo -n "${Q[1]} "             # Output first two terms.
echo -n "${Q[2]} "

for ((n=3; n <= $LIMIT; n++))  # C-like loop expression.
do   # Q[n] = Q[n - Q[n-1]] + Q[n - Q[n-2]]  for n>2
#    Need to break the expression into intermediate terms,
#+   since Bash doesn't handle complex array arithmetic very well.

  let "n1 = $n - 1"        # n-1
  let "n2 = $n - 2"        # n-2

  t0=`expr $n - ${Q[n1]}`  # n - Q[n-1]
  t1=`expr $n - ${Q[n2]}`  # n - Q[n-2]

  T0=${Q[t0]}              # Q[n - Q[n-1]]
  T1=${Q[t1]}              # Q[n - Q[n-2]]

Q[n]=`expr $T0 + $T1`      # Q[n - Q[n-1]] + Q[n - Q[n-2]]
echo -n "${Q[n]} "

if [ `expr $n % $LINEWIDTH` -eq 0 ]    # Format output.
then   #      ^ modulo
  echo # Break lines into neat chunks.
fi

done

echo
exit 0

#  This is an iterative implementation of the Q-series.
#  The more intuitive recursive implementation is left as an exercise.
#  Warning: calculating this series recursively takes a VERY long time
#+ via a script. C/C++ would be orders of magnitude faster.


   Bash supports only one-dimensional arrays, though a little trickery
   permits simulating multi-dimensional ones.

   Example 27-17. Simulating a two-dimensional array, then tilting it
# Simulating a two-dimensional array.

# A one-dimensional array consists of a single row.
# A two-dimensional array stores rows sequentially.

# 5 X 5 Array.
Rows=5
Columns=5

declare -a alpha     # char alpha [Rows] [Columns];
                     # Unnecessary declaration. Why?

load_alpha ()
{
local rc=0
local index

for i in A B C D E F G H I J K L M N O P Q R S T U V W X Y
do     # Use different symbols if you like.
  local row=`expr $rc / $Columns`
  local column=`expr $rc % $Rows`
  let "index = $row * $Rows + $column"
  alpha[$index]=$i
# alpha[$row][$column]
  let "rc += 1"
done

#  Simpler would be
#+   declare -a alpha=( A B C D E F G H I J K L M N O P Q R S T U V W X Y )
#+ but this somehow lacks the "flavor" of a two-dimensional array.
}

print_alpha ()
{
local row=0
local index

echo

while [ "$row" -lt "$Rows" ]   #  Print out in "row major" order:
do                             #+ columns vary,
                               #+ while row (outer loop) remains the same.
  local column=0

  echo -n "       "            #  Lines up "square" array with rotated one.

  while [ "$column" -lt "$Columns" ]
  do
    let "index = $row * $Rows + $column"
    echo -n "${alpha[index]} "  # alpha[$row][$column]
    let "column += 1"
  done

  let "row += 1"
  echo
done

# The simpler equivalent is
#     echo ${alpha[*]} | xargs -n $Columns

echo
}


filter ()     # Filter out negative array indices.
{
echo -n "  "  # Provides the tilt.
              # Explain how.

if [[ "$1" -ge 0 &&  "$1" -lt "$Rows" && "$2" -ge 0 && "$2" -lt "$Columns" ]]
then
    let "index = $1 * $Rows + $2"
    # Now, print it rotated.
    echo -n " ${alpha[index]}"
    #           alpha[$row][$column]
fi
}

rotate ()  #  Rotate the array 45 degrees --
{          #+ "balance" it on its lower lefthand corner.
local row
local column

for (( row = Rows; row > -Rows; row-- ))
  do       # Step through the array backwards. Why?

  for (( column = 0; column < Columns; column++ ))
    do

    if [ "$row" -ge 0 ]
    then
      let "t1 = $column - $row"
      let "t2 = $column"
    else
      let "t1 = $column"
      let "t2 = $column + $row"
    fi

    filter $t1 $t2   # Filter out negative array indices.
                     # What happens if you don't do this?
    done

  echo; echo
done
}

#  Array rotation inspired by examples (pp. 143-146) in
#+ "Advanced C Programming on the IBM PC," by Herbert Mayer
#+ (see bibliography).
#  This just goes to show that much of what can be done in C
#+ can also be done in shell scripting.


#--------------- Now, let the show begin. ------------#
load_alpha     # Load the array.
print_alpha    # Print it out.
rotate         # Rotate it 45 degrees counterclockwise.

exit 0

# This is a rather contrived, not to mention inelegant simulation.

# Exercises:
# ---------
# 1)  Rewrite the array loading and printing functions
#     in a more intuitive and less kludgy fashion.
# 2)  Figure out how the array rotation functions work.
#     Hint: think about the implications of backwards-indexing an array.
# 3)  Rewrite this script to handle a non-square array,
#     such as a 6 X 4 one.
#     Try to minimize "distortion" when the array is rotated.

   A two-dimensional array is essentially equivalent to a
   one-dimensional one, but with additional addressing modes for
   referencing and manipulating the individual elements by row and
   column position.
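The addressing-mode equivalence can be stated concretely: element (row, column) of an R x C grid lives at flat subscript row * C + column. A minimal sketch, with arbitrary names:

```shell
#!/bin/bash
#  Row-major index mapping for a simulated two-dimensional array:
#+ element (row, column) of an R x C "matrix" lives at
#+ subscript  row * C + column  in a flat one-dimensional array.

Rows=3; Columns=4
declare -a flat

for (( r = 0; r < Rows; r++ ))
do
  for (( c = 0; c < Columns; c++ ))
  do
    let "index = r * Columns + c"
    flat[index]="($r,$c)"
  done
done

echo "${flat[7]}"   # (1,3)  -- row 1, column 3, since 1*4+3 = 7.
```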

   For an even more elaborate example of simulating a two-dimensional
   array, see Example A-10.


   For more interesting scripts using arrays, see:

     * Example 12-3
     * Example 16-46
     * Example A-22
     * Example A-44
     * Example A-41
     * Example A-42

Chapter 28. Indirect References

   We have seen that referencing a variable, $var, fetches its value.
   But, what about the value of a value? What about $$var?

   The actual notation is \$$var, usually preceded by an eval (and
   sometimes an echo). This is called an indirect reference.

   Example 28-1. Indirect Variable References
# Indirect variable referencing.
# Accessing the contents of the contents of a variable.

# First, let's fool around a little.

var=23

echo "\$var   = $var"           # $var   = 23
# So far, everything as expected. But ...

echo "\$\$var  = $$var"         # $$var  = 4570var
#  Not useful ...
#  \$\$ expanded to PID of the script
#  -- refer to the entry on the $$ variable --
#+ and "var" is echoed as plain text.
#  (Thank you, Jakob Bohm, for pointing this out.)

echo "\\\$\$var = \$$var"       # \$$var = $23
#  As expected. The first $ is escaped and pasted on to
#+ the value of var ($var = 23 ).
#  Meaningful, but still not useful.

# Now, let's start over and do it the right way.

# ============================================== #

a=letter_of_alphabet   # Variable "a" holds the name of another variable.
letter_of_alphabet=z

# Direct reference.
echo "a = $a"          # a = letter_of_alphabet

# Indirect reference.
  eval a=\$$a
# ^^^        Forcing an eval(uation), and ...
#        ^   Escaping the first $ ...
# ------------------------------------------------------------------------
# The 'eval' forces an update of $a, sets it to the updated value of \$$a.
# So, we see why 'eval' so often shows up in indirect reference notation.
# ------------------------------------------------------------------------
  echo "Now a = $a"    # Now a = z


# Now, let's try changing the second-order reference.

t=table_cell_3
table_cell_3=24
echo "\"table_cell_3\" = $table_cell_3"            # "table_cell_3" = 24
echo -n "dereferenced \"t\" = "; eval echo \$$t    # dereferenced "t" = 24
# In this simple case, the following also works (why?).
#         eval t=\$$t; echo "\"t\" = $t"


NEW_VAL=387
table_cell_3=$NEW_VAL
echo "Changing value of \"table_cell_3\" to $NEW_VAL."
echo "\"table_cell_3\" now $table_cell_3"
echo -n "dereferenced \"t\" now "; eval echo \$$t
# "eval" takes the two arguments "echo" and "\$$t" (set equal to $table_cell_3).


# (Thanks, Stephane Chazelas, for clearing up the above behavior.)

#   A more straightforward method is the ${!t} notation, discussed in the
#+ "Bash, version 2" section.
#   See also

exit 0

   Indirect referencing in Bash is a multi-step process. First, take the
   name of a variable: varname. Then, reference it: $varname. Then,
   reference the reference: $$varname. Then, escape the first $:
   \$$varname. Finally, force a reevaluation of the expression and
   assign it: eval newvar=\$$varname.
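The multi-step recipe above condenses to a few lines. A minimal sketch, with arbitrary variable names:

```shell
#!/bin/bash
# The indirect-reference recipe, condensed (illustrative names).

varname=greeting      # A variable holding another variable's name.
greeting=hello        # The variable it points to.

eval newvar=\$$varname   # Escape the first $, then force re-evaluation.
echo "$newvar"           # hello
```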

   Of what practical use is indirect referencing of variables? It gives
   Bash a little of the functionality of pointers in C, for instance, in
   table lookup. And, it also has some other very interesting
   applications. . . .

   Nils Radtke shows how to build "dynamic" variable names and evaluate
   their contents. This can be useful when sourcing configuration files.


# ---------------------------------------------
# This could be "sourced" from a separate file.
# ---------------------------------------------
isdnMyProviderRemoteNet=172.16.0.100
isdnYourProviderRemoteNet=10.0.0.10
isdnOnlineService="MyProvider"

remoteNet=$(eval "echo \$$(echo isdn${isdnOnlineService}RemoteNet)")
remoteNet=$(eval "echo \$$(echo isdnMyProviderRemoteNet)")
remoteNet=$(eval "echo \$isdnMyProviderRemoteNet")
remoteNet=$(eval "echo $isdnMyProviderRemoteNet")

echo "$remoteNet"    # 172.16.0.100

# ================================================================

#  And, it gets even better.

#  Consider the following snippet given a variable named getSparc,
#+ but no such variable getIa64:

getSparc="true"
unset getIa64

chkMirrorArchs () {
  arch="$1";
  if [ "$(eval "echo \${$(echo get$(echo -ne $arch |
       sed 's/^\(.\).*/\1/g' | tr 'a-z' 'A-Z'; echo $arch |
       sed 's/^.\(.*\)/\1/g')):-false}")" = true ]
  then
     return 0;
  else
     return 1;
  fi
}

chkMirrorArchs sparc
echo $?        # 0
               # True

chkMirrorArchs Ia64
echo $?        # 1
               # False

# Notes:
# -----
# Even the to-be-substituted variable name part is built explicitly.
# The parameters to the chkMirrorArchs calls are all lower case.
# The variable name is composed of two parts: "get" and "Sparc" . . .

   Example 28-2. Passing an indirect reference to awk

#  Another version of the "column totaler" script
#+ that adds up a specified column (of numbers) in the target file.
#  This one uses indirect references.


ARGS=2
E_WRONGARGS=85

if [ $# -ne "$ARGS" ] # Check for proper number of command-line args.
then
   echo "Usage: `basename $0` filename column-number"
   exit $E_WRONGARGS
fi

filename=$1         # Name of file to operate on.
column_number=$2    # Which column to total up.

#===== Same as original script, up to this point =====#

# A multi-line awk script is invoked by
#   awk "
#   ...
#   ...
#   ...
#   "

# Begin awk script.
# -------------------------------------------------
awk "

{ total += \$${column_number} # Indirect reference
}
END {
     print total
     }

     " "$filename"
# Note that awk doesn't need an eval preceding \$$.
# -------------------------------------------------
# End awk script.

#  Indirect variable reference avoids the hassles
#+ of referencing a shell variable within the embedded awk script.
#  Thanks, Stephane Chazelas.

exit $?


   This method of indirect referencing is a bit tricky. If the second
   order variable changes its value, then the first order variable must
   be properly dereferenced (as in the above example). Fortunately, the
   ${!variable} notation introduced with version 2 of Bash (see Example
   37-2 and Example A-22) makes indirect referencing more intuitive.
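For comparison, the table-cell lookup can be redone with the ${!variable} notation, with no eval needed:

```shell
#!/bin/bash
#  ${!variable} expands to the value of the variable
#+ whose name is the value of "variable" -- indirection in one step.

t=table_cell_3
table_cell_3=24

echo "${!t}"      # 24
```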

   Bash does not support pointer arithmetic, and this severely limits
   the usefulness of indirect referencing. In fact, indirect referencing
   in a scripting language is, at best, something of an afterthought.

Chapter 29. /dev and /proc

   A Linux or UNIX filesystem typically has the /dev and /proc
   special-purpose directories.

29.1. /dev

   The /dev directory contains entries for the physical devices that may
   or may not be present in the hardware. [112] Appropriately enough,
   these are called device files. As an example, the hard drive
   partitions containing the mounted filesystem(s) have entries in /dev,
   as df shows.

bash$ df
Filesystem           1k-blocks      Used Available Use%
 Mounted on
 /dev/hda6               495876    222748    247527  48% /
 /dev/hda1                50755      3887     44248   9% /boot
 /dev/hda8               367013     13262    334803   4% /home
 /dev/hda5              1714416   1123624    503704  70% /usr

   Among other things, the /dev directory contains loopback devices,
   such as /dev/loop0. A loopback device is a gimmick that allows an
   ordinary file to be accessed as if it were a block device. [113] This
   permits mounting an entire filesystem within a single large file. See
   Example 17-8 and Example 17-7.
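The loopback idea can be sketched in a few lines: carve out an ordinary file, put a filesystem in it, and mount it. The paths below are illustrative only, and the commented-out mkfs/mount steps require root privileges and e2fsprogs:

```shell
#!/bin/bash
# Loopback sketch (illustrative paths; mkfs/mount need root).

dd if=/dev/zero of=/tmp/diskimage bs=1k count=1024 2>/dev/null
                                      # A 1 MB empty file.
# mke2fs -F -q /tmp/diskimage         # Put an ext2 filesystem in it.
# mount -o loop /tmp/diskimage /mnt   # Mount the file as a block device.
```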

   A few of the pseudo-devices in /dev have other specialized uses, such
   as /dev/null, /dev/zero, /dev/urandom, /dev/sda1 (hard drive
   partition), /dev/udp (User Datagram Protocol port), and /dev/tcp.

   For instance:

   To manually mount a USB flash drive, append the following line to
   /etc/fstab. [114]
   /dev/sda1    /mnt/flashdrive    auto    noauto,user,noatime    0 0

   (See also Example A-23.)

   Checking whether a disk is in the CD-burner (soft-linked to
   /dev/cdrom):
head -1 /dev/hdc

#  head: cannot open '/dev/hdc' for reading: No medium found
#  (No disc in the drive.)

#  head: error reading '/dev/hdc': Input/output error
#  (There is a disk in the drive, but it can't be read;
#+  possibly it's an unrecorded CDR blank.)

#  Stream of characters and assorted gibberish
#  (There is a pre-recorded disk in the drive,
#+ and this is raw output -- a stream of ASCII and binary data.)
#  Here we see the wisdom of using 'head' to limit the output
#+ to manageable proportions, rather than 'cat' or something similar.

#  Now, it's just a matter of checking/parsing the output and taking
#+ appropriate action.

   When executing a command on a /dev/tcp/$host/$port pseudo-device
   file, Bash opens a TCP connection to the associated socket.

   A socket is a communications node associated with a specific I/O
   port. (This is analogous to a hardware socket, or receptacle, for a
   connecting cable.) It permits data transfer between hardware devices
   on the same machine, between machines on the same network, between
   machines across different networks, and, of course, between machines
   at different locations on the Internet.

   The following examples assume an active Internet connection.

   Getting the time from
bash$ cat </dev/tcp/
53082 04-03-18 04:26:54 68 0 0 502.3 UTC(NIST) *

   [Mark contributed this example.]

   Generalizing the above into a script:



Time=$(cat </dev/tcp/"$URL")
UTC=$(echo "$Time" | awk '{print$3}')   # Third field is UTC (GMT) time.
# Exercise: modify this for different time zones.

echo "UTC Time = "$UTC""

   Downloading a URL:
bash$ exec 5<>/dev/tcp/
bash$ echo -e "GET / HTTP/1.0\n" >&5
bash$ cat <&5

   [Thanks, Mark and Mihai Maties.]

   Example 29-1. Using /dev/tcp for troubleshooting
# /dev/tcp redirection to check Internet connection.

# Script by Troy Engel.
# Used with permission.

TCP_HOST=news-15.net       # A known spam-friendly ISP.
TCP_PORT=80                # Port 80 is http.

# Try to connect. (Somewhat similar to a 'ping' . . .)
echo "HEAD / HTTP/1.0" >/dev/tcp/${TCP_HOST}/${TCP_PORT}
MYEXIT=$?

If bash was compiled with --enable-net-redirections, it has the capability of
using a special character device for both TCP and UDP redirections. These
redirections are used identically as STDIN/STDOUT/STDERR. The device entries
are 30,36 for /dev/tcp:

  mknod /dev/tcp c 30 36

From the bash reference:
    If host is a valid hostname or Internet address, and port is an integer
port number or service name, Bash attempts to open a TCP connection to the
corresponding socket.

if [ "X$MYEXIT" = "X0" ]; then
  echo "Connection successful. Exit code: $MYEXIT"
else
  echo "Connection unsuccessful. Exit code: $MYEXIT"
fi

exit $MYEXIT

   Example 29-2. Playing music


# Author: Antonio Macchi
# Used in ABS Guide with permission

#  /dev/dsp default = 8000 frames per second, 8 bits per frame (1 byte),
#+ 1 channel (mono)

duration=2000       # If 8000 bytes = 1 second, then 2000 = 1/4 second.
volume=$'\xc0'      # Max volume = \xff (or \x00).
mute=$'\x80'        # No volume = \x80 (the middle).

function mknote ()  # $1=Note Hz in bytes (e.g. A = 440Hz ::
{                   #+ 8000 fps / 440 = 16 :: A = 16 bytes per second)
  for t in `seq 0 $duration`
  do
    test $(( $t % $1 )) = 0 && echo -n $volume || echo -n $mute
  done
}

e=`mknote 49`
g=`mknote 41`
a=`mknote 36`
b=`mknote 32`
c=`mknote 30`
cis=`mknote 29`
d=`mknote 27`
e2=`mknote 24`
n=`mknote 32767`
# European notation.

echo -n "$g$e2$d$c$d$c$a$g$n$g$e$n$g$e2$d$c$c$b$c$cis$n$cis$d \
$n$g$e2$d$c$d$c$a$g$n$g$e$n$g$a$d$c$b$a$b$c" > /dev/dsp
# dsp = Digital Signal Processor

exit      # A "bonny" example of a shell script!

29.2. /proc

   The /proc directory is actually a pseudo-filesystem. The files in
   /proc mirror currently running system and kernel processes and
   contain information and statistics about them.

bash$ cat /proc/devices
Character devices:
   1 mem
   2 pty
   3 ttyp
   4 ttyS
   5 cua
   7 vcs
  10 misc
  14 sound
  29 fb
  36 netlink
 128 ptm
 136 pts
 162 raw
 254 pcmcia

 Block devices:
   1 ramdisk
   2 fd
   3 ide0
   9 md

bash$ cat /proc/interrupts
   0:      84505          XT-PIC  timer
   1:       3375          XT-PIC  keyboard
   2:          0          XT-PIC  cascade
   5:          1          XT-PIC  soundblaster
   8:          1          XT-PIC  rtc
  12:       4231          XT-PIC  PS/2 Mouse
  14:     109373          XT-PIC  ide0
 NMI:          0
 ERR:          0

bash$ cat /proc/partitions
major minor  #blocks  name     rio rmerge rsect ruse wio wmerge wsect wuse run
ning use aveq

    3     0    3007872 hda 4472 22260 114520 94240 3551 18703 50384 549710 0 1
11550 644030
    3     1      52416 hda1 27 395 844 960 4 2 14 180 0 800 1140
    3     2          1 hda2 0 0 0 0 0 0 0 0 0 0 0
    3     4     165280 hda4 10 0 20 210 0 0 0 0 0 210 210

bash$ cat /proc/loadavg
0.13 0.42 0.27 2/44 1119

bash$ cat /proc/apm
1.16 1.2 0x03 0x01 0xff 0x80 -1% -1 ?

bash$ cat /proc/acpi/battery/BAT0/info
present:                 yes
 design capacity:         43200 mWh
 last full capacity:      36640 mWh
 battery technology:      rechargeable
 design voltage:          10800 mV
 design capacity warning: 1832 mWh
 design capacity low:     200 mWh
 capacity granularity 1:  1 mWh
 capacity granularity 2:  1 mWh
 model number:            IBM-02K6897
 serial number:            1133
 battery type:            LION
 OEM info:                Panasonic

bash$ fgrep Mem /proc/meminfo
MemTotal:       515216 kB
 MemFree:        266248 kB

   Shell scripts may extract data from certain of the files in /proc.

FS=iso                       # ISO filesystem support in kernel?

grep $FS /proc/filesystems   # iso9660

   kernel_version=$( awk '{ print $3 }' /proc/version )

CPU=$( awk '/model name/ {print $5}' < /proc/cpuinfo )

if [ "$CPU" = "Pentium(R)" ]
then
  run_some_commands
  ...
else
  run_other_commands
  ...
fi
cpu_speed=$( fgrep "cpu MHz" /proc/cpuinfo | awk '{print $4}' )
#  Current operating speed (in MHz) of the cpu on your machine.
#  On a laptop this may vary, depending on use of battery
#+ or AC power.

# Get the command-line parameters of a process.

OPTION=cmdline

# Identify PID.
pid=$( echo $(pidof "$1") | awk '{ print $1 }' )
# Get only first            ^^^^^^^^^^^^^^^^^^ of multiple instances.

echo "Process ID of (first instance of) "$1" = $pid"
echo -n "Command-line arguments: "
cat /proc/"$pid"/"$OPTION" | xargs -0 echo
#   Formats output:        ^^^^^^^^^^^^^^^
#   (Thanks, Han Holl, for the fixup!)

echo; echo

# For example:
# sh xterm



bus_speed=$(fgrep -m 1 "$text" $devfile | awk '{print $9}')
#                 ^^^^ Stop after first match.

if [ "$bus_speed" = "$USB1" ]
then
  echo "USB 1.1 port found."
  # Do something appropriate for USB 1.1.
fi


   It is even possible to control certain peripherals with commands sent
   to the /proc directory.
          root# echo on > /proc/acpi/ibm/light

   This turns on the Thinklight in certain models of IBM/Lenovo
   Thinkpads. (May not work on all Linux distros.)

   Of course, caution is advised when writing to /proc.

   The /proc directory contains subdirectories with unusual numerical
   names. Every one of these names maps to the process ID of a currently
   running process. Within each of these subdirectories, there are a
   number of files that hold useful information about the corresponding
   process. The stat and status files keep running statistics on the
   process, the cmdline file holds the command-line arguments the
   process was invoked with, and the exe file is a symbolic link to the
   complete path name of the invoking process. There are a few more such
   files, but these seem to be the most interesting from a scripting

   Example 29-3. Finding the process associated with a PID
# Gives complete path name to process associated with pid.

ARGNO=1           # Number of arguments the script expects.
E_WRONGARGS=65
E_NOPERMISSION=77
PROCFILE=exe      # The /proc file of interest (see below).

if [ $# -ne $ARGNO ]
then
  echo "Usage: `basename $0` PID-number" >&2  # Error message >stderr.
  exit $E_WRONGARGS
fi
pidno=$( ps ax | grep $1 | awk '{ print $1 }' | grep $1 )
# Checks for pid in "ps" listing, field #1.
# Then makes sure it is the actual process, not the process invoked by this sc
# The last "grep $1" filters out this possibility.
#    pidno=$( ps ax | awk '{ print $1 }' | grep $1 )
#    also works, as Teemu Huovila, points out.

if [ -z "$pidno" ]  #  If, after all the filtering, the result is a zero-length string,
then                #+ no running process corresponds to the pid given.
  echo "No such process running."
  exit 1
fi
# Alternatively:
#   if ! ps $1 > /dev/null 2>&1
#   then                # no running process corresponds to the pid given.
#     echo "No such process running."
#    fi

# To simplify the entire process, use "pidof".

if [ ! -r "/proc/$1/$PROCFILE" ]  # Check for read permission.
then
  echo "Process $1 running, but..."
  echo "Can't get read permission on /proc/$1/$PROCFILE."
  exit $E_NOPERMISSION  # Ordinary user can't access some files in /proc.
fi

# The last two tests may be replaced by:
#    if ! kill -0 $1 > /dev/null 2>&1 # '0' is not a signal, but
                                      # this will test whether it is possible
                                      # to send a signal to the process.
#    then echo "PID doesn't exist or you're not its owner" >&2
#    exit $E_BADPID
#    fi

exe_file=$( ls -l /proc/$1 | grep "exe" | awk '{ print $11 }' )
# Or       exe_file=$( ls -l /proc/$1/exe | awk '{print $11}' )
#  /proc/pid-number/exe is a symbolic link
#+ to the complete path name of the invoking process.

if [ -e "$exe_file" ]  #  If /proc/pid-number/exe exists,
then                   #+ then the corresponding process exists.
  echo "Process #$1 invoked by $exe_file."
else
  echo "No such process running."
fi

#  This elaborate script can *almost* be replaced by
#       ps ax | grep $1 | awk '{ print $5 }'
#  However, this will not work...
#+ because the fifth field of 'ps' is argv[0] of the process,
#+ not the executable file path.
# However, either of the following would work.
#       find /proc/$1/exe -printf '%l\n'
#       lsof -aFn -p $1 -d txt | sed -ne 's/^n//p'

# Additional commentary by Stephane Chazelas.

exit 0

   Example 29-4. On-line connect status
#  Note that this script may need modification
#+ to work with a wireless connection.

PROCNAME=pppd        # ppp daemon
PROCFILENAME=status  # Where to look.
INTERVAL=2           # Update every 2 seconds.

pidno=$( ps ax | grep -v "ps ax" | grep -v grep | grep $PROCNAME |
awk '{ print $1 }' )

# Finding the process number of 'pppd', the 'ppp daemon'.
# Have to filter out the process lines generated by the search itself.
#  However, as Oleg Philon points out,
#+ this could have been considerably simplified by using "pidof".
#  pidno=$( pidof $PROCNAME )
#  Moral of the story:
#+ When a command sequence gets too complex, look for a shortcut.

if [ -z "$pidno" ]   # If no pid, then process is not running.
then
  echo "Not connected."
  exit 0
else
  echo "Connected."; echo
fi

while [ true ]       # Endless loop, script can be improved here.
do
  if [ ! -e "/proc/$pidno/$PROCFILENAME" ]
  # While process running, then "status" file exists.
  then
    echo "Disconnected."
    exit 0
  fi

  netstat -s | grep "packets received"  # Get some connect statistics.
  netstat -s | grep "packets delivered"

  sleep $INTERVAL
  echo; echo
done

exit 0

# As it stands, this script must be terminated with a Control-C.

#    Exercises:
#    ---------
#    Improve the script so it exits on a "q" keystroke.
#    Make the script more user-friendly in other ways.
#    Fix the script to work with wireless/DSL connections.


   In general, it is dangerous to write to the files in /proc, as this
   can corrupt the filesystem or crash the machine.

Chapter 30. Network Programming


   The Net's a cross between an elephant and a white elephant sale: it
   never forgets, and it's always crap.


   A Linux system has quite a number of tools for accessing,
   manipulating, and troubleshooting network connections. We can
   incorporate some of these tools into scripts -- scripts that expand
   our knowledge of networking, useful scripts that can facilitate the
   administration of a network.
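As one small sketch of shell-level networking, Bash's built-in /dev/tcp pseudo-device can probe whether a TCP port answers. (The helper name port_open below is ours, for illustration; it is not a standard command.)

```shell
#!/bin/bash
# Sketch: probe a TCP port with Bash's /dev/tcp pseudo-device.
# "port_open" is an illustrative helper name, not a real utility.

port_open ()   # usage: port_open host port
{
  # Attempt the connect in a subshell, so a failed redirection
  # does not abort the calling script.
  ( exec 3<>"/dev/tcp/$1/$2" ) 2>/dev/null
}

if port_open 127.0.0.1 22
then
  echo "Port 22 answers."
else
  echo "Port 22 closed or filtered."
fi
```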

   Here is a simple CGI script that demonstrates connecting to a remote
   server.

   Example 30-1. Print the server environment
# by Michael Zick
# Used with permission

# May have to change the location for your site.
# (At the ISP's servers, Bash may not be in the usual place.)
# Other places: /usr/bin or /usr/local/bin
# Might even try it without any path in sha-bang.

# Disable filename globbing.
set -f

# Header tells browser what to expect.
# (The blank line after the header is required by the CGI protocol.)
echo Content-type: text/plain
echo

echo CGI/1.0 test script report:

echo environment settings:

echo whereis bash?
whereis bash

echo who are we?
echo ${BASH_VERSINFO[*]}

echo argc is $#. argv is "$*".

# CGI/1.0 expected environment variables.


exit 0

# Here document to give short instructions.

1) Drop this in your directory.
2) Then, open


   For security purposes, it may be helpful to identify the IP addresses
   a computer is accessing.

   Example 30-2. IP addresses
# List the IP addresses your computer is connected to.

#  Inspired by Greg Bledsoe's script,
#  Linux Journal, 09 March 2011.
#    URL:
#  Greg licensed his script under the GPL2,
#+ and as a derivative, this script is likewise GPL2.

connection_type=TCP      # Also try UDP.
field=2           # Which field of the output we're interested in.
no_match=LISTEN   # Filter out records containing this. Why?
lsof_args=-ni     # -i lists Internet-associated files.
                  # -n preserves numerical IP addresses.
                  # What happens without the -n option? Try it.
#       Delete the router info.

lsof "$lsof_args" | grep $connection_type | grep -v "$no_match" |
      awk '{print $9}' | cut -d : -f $field | sort | uniq |
      sed s/"^$router"//

#  Bledsoe's script assigns the output of a filtered IP list,
#  (similar to lines 19-22, above) to a variable.
#  He checks for multiple connections to a single IP address,
#  then uses:
#    iptables -I INPUT -s $ip -p tcp -j REJECT --reject-with tcp-reset
#  ... within a 60-second delay loop to bounce packets from DDOS attacks.

#  Exercise:
#  --------
#  Use the 'iptables' command to extend this script
#+ to reject connection attempts from well-known spammer IP domains.

   More examples of network programming:

    1. Getting the time from
    2. Downloading a URL
    3. A GRE tunnel
    4. Checking if an Internet server is up
    5. Example 16-41
    6. Example A-28
    7. Example A-29
    8. Example 29-1

   See also the networking commands in the System and Administrative
   Commands chapter and the communications commands in the External
   Filters, Programs and Commands chapter.

Chapter 31. Of Zeros and Nulls


   Faultily faultless, icily regular, splendidly null

   Dead perfection; no more.

   --Alfred Lord Tennyson

   /dev/zero ... /dev/null

   Uses of /dev/null
          Think of /dev/null as a black hole. It is essentially the
          equivalent of a write-only file. Everything written to it
          disappears. Attempts to read or output from it result in
          nothing. All the same, /dev/null can be quite useful from both
          the command-line and in scripts.

          Suppressing stdout.

cat $filename >/dev/null
# Contents of the file will not list to stdout.

          Suppressing stderr (from Example 16-3).

rm $badname 2>/dev/null
#           So error messages [stderr] deep-sixed.

          Suppressing output from both stdout and stderr.

cat $filename 2>/dev/null >/dev/null
# If "$filename" does not exist, there will be no error message output.
# If "$filename" does exist, the contents of the file will not list to stdout.
# Therefore, no output at all will result from the above line of code.
#  This can be useful in situations where the return code from a command
#+ needs to be tested, but no output is desired.
# cat $filename &>/dev/null
#     also works, as Baris Cicek points out.

          Deleting contents of a file, but preserving the file itself,
          with all attendant permissions (from Example 2-1 and Example

cat /dev/null > /var/log/messages
#  : > /var/log/messages   has same effect, but does not spawn a new process.

cat /dev/null > /var/log/wtmp
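A small sketch showing that truncation empties a file while leaving the file itself and its permissions intact (the temporary filename here comes from mktemp, purely for demonstration):

```shell
#!/bin/bash
# Sketch: emptying a file in place preserves the file and its mode.

tmp=$(mktemp)
chmod 640 "$tmp"
echo "some data" > "$tmp"

: > "$tmp"                # Truncate; spawns no new process.

[ -s "$tmp" ] || echo "file is now empty"
stat -c '%a' "$tmp"       # Permissions preserved: 640
rm -f "$tmp"
```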

          Automatically emptying the contents of a logfile (especially
          good for dealing with those nasty "cookies" sent by commercial
          Web sites):

          Example 31-1. Hiding the cookie jar

# Obsolete Netscape browser.
# Same principle applies to newer browsers.

if [ -f ~/.netscape/cookies ]  # Remove, if exists.
then
  rm -f ~/.netscape/cookies
fi

ln -s /dev/null ~/.netscape/cookies
# All cookies now get sent to a black hole, rather than saved to disk.

   Uses of /dev/zero
          Like /dev/null, /dev/zero is a pseudo-device file, but it
          actually produces a stream of nulls (binary zeros, not the
          ASCII kind). Output written to /dev/zero disappears, and it is
          fairly difficult to actually read the nulls emitted there,
          though it can be done with od or a hex editor. The chief use
          of /dev/zero is creating an initialized dummy file of
          predetermined length intended as a temporary swap file.
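For instance, dd can pull a predetermined number of null bytes out of /dev/zero. A minimal sketch (the temporary filename comes from mktemp):

```shell
#!/bin/bash
# Sketch: create a 1 KB file of binary zeros with dd and /dev/zero.

tmp=$(mktemp)
dd if=/dev/zero of="$tmp" bs=512 count=2 2>/dev/null   # 2 x 512 bytes.

stat -c '%s' "$tmp"                   # Size in bytes: 1024
cmp -s "$tmp" <(head -c 1024 /dev/zero) && echo "all zeros"
rm -f "$tmp"
```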

          Example 31-2. Setting up a swapfile using /dev/zero

# Creating a swap file.

#  A swap file provides a temporary storage cache
#+ which helps speed up certain filesystem operations.

ROOT_UID=0         # Root has $UID 0.
E_WRONG_USER=85    # Not root?

BLOCKSIZE=1024     # 1K (1024 byte) blocks.
MINBLOCKS=40       # Minimum number of blocks.

# This script must be run as root.
if [ "$UID" -ne "$ROOT_UID" ]
then
  echo; echo "You must be root to run this script."; echo
  exit $E_WRONG_USER
fi

blocks=${1:-$MINBLOCKS}          #  Set to default of 40 blocks,
                                 #+ if nothing specified on command-line.
# This is the equivalent of the command block below.
# --------------------------------------------------
# if [ -n "$1" ]
# then
#   blocks=$1
# else
#   blocks=$MINBLOCKS
# fi
# --------------------------------------------------

if [ "$blocks" -lt $MINBLOCKS ]
then
  blocks=$MINBLOCKS              # Must be at least 40 blocks long.
fi

echo "Creating swap file of size $blocks blocks (KB)."
dd if=/dev/zero of=$FILE bs=$BLOCKSIZE count=$blocks  # Zero out file.
mkswap $FILE $blocks             # Designate it a swap file.
swapon $FILE                     # Activate swap file.
retcode=$?                       # Everything worked?
#  Note that if one or more of these commands fails,
#+ then it could cause nasty problems.

#  Exercise:
#  Rewrite the above block of code so that if it does not execute
#+ successfully, then:
#    1) an error message is echoed to stderr,
#    2) all temporary files are cleaned up, and
#    3) the script exits in an orderly fashion with an
#+      appropriate error code.

echo "Swap file created and activated."

exit $retcode

          Another application of /dev/zero is to "zero out" a file of a
          designated size for a special purpose, such as mounting a
          filesystem on a loopback device (see Example 17-8) or
          "securely" deleting a file (see Example 16-60).

          Example 31-3. Creating a ramdisk


#  A "ramdisk" is a segment of system RAM memory
#+ which acts as if it were a filesystem.
#  Its advantage is very fast access (read/write time).
#  Disadvantages: volatility, loss of data on reboot or powerdown.
#+                less RAM available to system.
#  Of what use is a ramdisk?
#  Keeping a large dataset, such as a table or dictionary on ramdisk,
#+ speeds up data lookup, since memory access is much faster than disk access.

E_NON_ROOT_USER=70             # Must run as root.
ROOTUSER_NAME=root

MOUNTPT=/mnt/ramdisk
SIZE=2000                      # 2K blocks (change as appropriate)
BLOCKSIZE=1024                 # 1K (1024 byte) block size
DEVICE=/dev/ram0               # First ram device

username=`id -nu`
if [ "$username" != "$ROOTUSER_NAME" ]
then
  echo "Must be root to run \"`basename $0`\"."
  exit $E_NON_ROOT_USER
fi

if [ ! -d "$MOUNTPT" ]         #  Test whether mount point already there,
then                           #+ so no error if this script is run
  mkdir $MOUNTPT               #+ multiple times.
fi
dd if=/dev/zero of=$DEVICE count=$SIZE bs=$BLOCKSIZE  # Zero out RAM device.
                                                      # Why is this necessary?
mke2fs $DEVICE                 # Create an ext2 filesystem on it.
mount $DEVICE $MOUNTPT         # Mount it.
chmod 777 $MOUNTPT             # Enables ordinary user to access ramdisk.
                               # However, must be root to unmount it.
# Need to test whether above commands succeed. Could cause problems otherwise.
# Exercise: modify this script to make it safer.

echo "\"$MOUNTPT\" now available for use."
# The ramdisk is now accessible for storing files, even by an ordinary user.

#  Caution, the ramdisk is volatile, and its contents will disappear
#+ on reboot or power loss.
#  Copy anything you want saved to a regular directory.

# After reboot, run this script to again set up ramdisk.
# Remounting /mnt/ramdisk without the other steps will not work.

#  Suitably modified, this script can be invoked in /etc/rc.d/rc.local,
#+ to set up ramdisk automatically at bootup.
#  That may be appropriate on, for example, a database server.

exit 0

          In addition to all the above, /dev/zero is needed by ELF
          (Executable and Linking Format) UNIX/Linux binaries.

Chapter 32. Debugging


   Debugging is twice as hard as writing the code in the first place.
   Therefore, if you write the code as cleverly as possible, you are, by
   definition, not smart enough to debug it.

   --Brian Kernighan

   The Bash shell contains no built-in debugger, and only bare-bones
   debugging-specific commands and constructs. Syntax errors or outright
   typos in the script generate cryptic error messages that are often of
   no help in debugging a non-functional script.

   Example 32-1. A buggy script

# This is a buggy script.
# Where, oh where is the error?


a=37

if [$a -gt 27 ]
then
  echo $a
fi

exit 0

   Output from script:
   ./ [37: command not found

   What's wrong with the above script? Hint: after the if.

   Example 32-2. Missing keyword
# What error message will this generate?

for a in 1 2 3
do
  echo "$a"
# done     # Required keyword 'done' commented out in line 7.

exit 0

   Output from script: line 10: syntax error: unexpected end of file

   Note that the error message does not necessarily reference the line
   in which the error occurs, but the line where the Bash interpreter
   finally becomes aware of the error.

   Error messages may disregard comment lines in a script when reporting
   the line number of a syntax error.

   What if the script executes, but does not work as expected? This is
   the all too familiar logic error.

   Example 32-3. test24: another buggy script

#  This script is supposed to delete all filenames in current directory
#+ containing embedded spaces.
#  It doesn't work.
#  Why not?

badname=`ls | grep ' '`

# Try this:
# echo "$badname"

rm "$badname"

exit 0

   Try to find out what's wrong with Example 32-3 by uncommenting the
   echo "$badname" line. Echo statements are useful for seeing whether
   what you expect is actually what you get.

   In this particular case, rm "$badname" will not give the desired
   results because $badname should not be quoted. Placing it in quotes
   ensures that rm has only one argument (it will match only one
   filename). A partial fix is to remove the quotes from $badname and to
   reset $IFS to contain only a newline, IFS=$'\n'. However, there are
   simpler ways of going about it.
# Correct methods of deleting filenames containing spaces.
rm *\ *
rm *" "*
rm *' '*
# Thank you. S.C.
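Another hedged sketch of the same idea: a quoted glob loop handles such names one at a time, since pathname expansion (unlike command substitution) never word-splits filenames. The throwaway directory here comes from mktemp, purely for demonstration.

```shell
#!/bin/bash
# Sketch: remove space-laden filenames safely with a glob loop.

dir=$(mktemp -d)
touch "$dir/bad name 1" "$dir/bad name 2" "$dir/goodname"

for f in "$dir"/*' '*        # Each match expands as one whole filename.
do
  rm -- "$f"                 # "--" guards against leading-dash names.
done

ls "$dir"                    # Only "goodname" remains.
rm -rf "$dir"
```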

   Summarizing the symptoms of a buggy script,

    1. It bombs with a "syntax error" message, or
    2. It runs, but does not work as expected (logic error).
    3. It runs, works as expected, but has nasty side effects (logic
       bomb).
   Tools for debugging non-working scripts include

    1. Inserting echo statements at critical points in the script to
       trace the variables, and otherwise give a snapshot of what is
       going on.


   Even better is an echo that echoes only when debug is on.
### debecho (debug-echo), by Stefano Falsetto ###
### Will echo passed parameters only if DEBUG is set to a value. ###
debecho () {
  if [ ! -z "$DEBUG" ]; then
     echo "$1" >&2
     #         ^^^ to stderr
  fi
}

DEBUG=on
Whatever=whatnot
debecho $Whatever   # whatnot

DEBUG=
debecho $Whatever   # (Will not echo.)

    2. Using the tee filter to check processes or data flows at critical
       points.
    3. Setting option flags -n -v -x
       sh -n scriptname checks for syntax errors without actually
       running the script. This is the equivalent of inserting set -n or
       set -o noexec into the script. Note that certain types of syntax
       errors can slip past this check.
       sh -v scriptname echoes each command before executing it. This is
       the equivalent of inserting set -v or set -o verbose in the
       script.
       The -n and -v flags work well together. sh -nv scriptname gives a
       verbose syntax check.
       sh -x scriptname echoes the result of each command, but in an
       abbreviated manner. This is the equivalent of inserting set -x or
       set -o xtrace in the script.
       Inserting set -u or set -o nounset in the script runs it, but
       gives an unbound variable error message at each attempt to use an
       undeclared variable.
    4. Using an "assert" function to test a variable or condition at
       critical points in a script. (This is an idea borrowed from C.)
       Example 32-4. Testing a condition with an assert


assert ()                 #  If condition false,
{                         #+ exit from script
                          #+ with appropriate error message.
  E_PARAM_ERR=98          #  Not enough parameters passed to function.
  E_ASSERT_FAILED=99      #  Assert condition false.

  if [ -z "$2" ]          #  Not enough parameters passed
  then                    #+ to assert() function.
    return $E_PARAM_ERR   #  No damage done.
  fi

  lineno=$2

  if [ ! $1 ]
  then
    echo "Assertion failed:  \"$1\""
    echo "File \"$0\", line $lineno"    # Give name of file and line number.
    exit $E_ASSERT_FAILED
  # else
  #   return
  #   and continue executing the script.
  fi
} # Insert a similar assert() function into a script you need to debug.

a=5
b=4
condition="$a -lt $b"     #  Error message and exit from script.
                          #  Try setting "condition" to something else
                          #+ and see what happens.

assert "$condition" $LINENO
# The remainder of the script executes only if the "assert" does not fail.

# Some commands.
# Some more commands . . .
echo "This statement echoes only if the \"assert\" does not fail."
# . . .
# More commands . . .

exit $?

    5. Using the $LINENO variable and the caller builtin.
    6. Trapping at exit.
       The exit command in a script triggers a signal 0, terminating the
       process, that is, the script itself. [116] It is often useful to
       trap the exit, forcing a "printout" of variables, for example.
       The trap must be the first command in the script.

   Trapping signals

   trap
          Specifies an action on receipt of a signal; also useful for
          debugging.

   A signal is a message sent to a process, either by the kernel or
   another process, telling it to take some specified action (usually to
   terminate). For example, hitting a Control-C sends a user interrupt,
   an INT signal, to a running program.

          A simple instance:

trap '' 2
# Ignore interrupt 2 (Control-C), with no action specified.

trap 'echo "Control-C disabled."' 2
# Message when Control-C pressed.
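A compact sketch of the mechanism: install a handler, deliver a signal to the script itself, then restore the default disposition.

```shell
#!/bin/bash
# Sketch: count occurrences of SIGUSR1 with a trap handler.

count=0
trap 'count=$((count + 1))' USR1   # Handler just increments a counter.

kill -USR1 $$                      # Signal ourselves, twice.
kill -USR1 $$

trap - USR1                        # Restore default USR1 behavior.
echo "handled $count signals"      # handled 2 signals
```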

   Example 32-5. Trapping at exit
# Hunting variables with a trap.

trap 'echo Variable Listing --- a = $a  b = $b' EXIT
#  EXIT is the name of the signal generated upon exit from a script.
#  The command specified by the "trap" doesn't execute until
#+ the appropriate signal is sent.

echo "This prints before the \"trap\" --"
echo "even though the script sees the \"trap\" first."



exit 0
#  Note that commenting out the 'exit' command makes no difference,
#+ since the script exits in any case after running out of commands.

   Example 32-6. Cleaning up after Control-C
# A quick 'n dirty script to check whether you are on-line yet.

umask 177  # Make sure temp files are not world readable.

#  Note that $LOGFILE must be readable
#+ (as root, chmod 644 /var/log/messages).
#  Create a "unique" temp file name, using process id of the script.
#     Using 'mktemp' is an alternative.
#     For example:
#     TEMPFILE=`mktemp temp.XXXXXX`
#  At logon, the line "remote IP address"
#                      appended to /var/log/messages.
#  How many lines in log file to check.

#  Cleans up the temp file if script interrupted by control-c.


while [ $TRUE ]  #Endless loop.
do
  tail -n 100 $LOGFILE > $TEMPFILE
  #  Saves last 100 lines of system log file as temp file.
  #  Necessary, since newer kernels generate many log messages at log on.
  search=`grep $KEYWORD $TEMPFILE`
  #  Checks for presence of the "IP address" phrase,
  #+ indicating a successful logon.

  if [ ! -z "$search" ] #  Quotes necessary because of possible spaces.
  then
     echo "On-line"
     rm -f $TEMPFILE    #  Clean up temp file.
     exit $ONLINE
  else
     echo -n "."        #  The -n option to echo suppresses newline,
                        #+ so you get continuous rows of dots.
  fi

  sleep 1
done

#  Note: if you change the KEYWORD variable to "Exit",
#+ this script can be used while on-line
#+ to check for an unexpected logoff.

# Exercise: Change the script, per the above note,
#           and prettify it.

exit 0

# Nick Drage suggests an alternate method:

while true
  do ifconfig ppp0 | grep UP 1> /dev/null && echo "connected" && exit 0
  echo -n "."   # Prints dots (.....) until connected.
  sleep 2
done

# Problem: Hitting Control-C to terminate this process may be insufficient.
#+         (Dots may keep on echoing.)
# Exercise: Fix this.

# Stephane Chazelas has yet another alternative:


while ! tail -n 1 "$LOGFILE" | grep -q "$KEYWORD"
do echo -n .
done
echo "On-line"

# Exercise: Discuss the relative strengths and weaknesses
#           of each of these various approaches.


   The DEBUG argument to trap causes a specified action to execute after
   every command in a script. This permits tracing variables, for
   example.

   Example 32-7. Tracing a variable

trap 'echo "VARIABLE-TRACE> \$variable = \"$variable\""' DEBUG
# Echoes the value of $variable after every command.


variable=29
echo "  Just initialized \$variable to $variable."

let "variable *= 3"
echo "  Just multiplied \$variable by 3."


#  The "trap 'command1 . . . command2 . . .' DEBUG" construct is
#+ more appropriate in the context of a complex script,
#+ where inserting multiple "echo $variable" statements might be
#+ awkward and time-consuming.

# Thanks, Stephane Chazelas for the pointer.

Output of script:

VARIABLE-TRACE> $variable = ""
VARIABLE-TRACE> $variable = "29"
  Just initialized $variable to 29.
VARIABLE-TRACE> $variable = "29"
VARIABLE-TRACE> $variable = "87"
  Just multiplied $variable by 3.
VARIABLE-TRACE> $variable = "87"

   Of course, the trap command has other uses aside from debugging, such
   as disabling certain keystrokes within a script (see Example A-43).

   Example 32-8. Running multiple processes (on an SMP box)
# Running multiple processes on an SMP box.
# Author: Tedman Eng

#  This is the first of two scripts,
#+ both of which must be present in the current working directory.

LIMIT=$1         # Total number of process to start
NUMPROC=4        # Number of concurrent threads (forks?)
PROCID=1         # Starting Process ID
echo "My PID is $$"

function start_thread() {
        if [ $PROCID -le $LIMIT ] ; then
                ./ $PROCID&
                let "PROCID++"
        else
                echo "Limit reached."
                wait
                exit
        fi
}

while [ "$NUMPROC" -gt 0 ]; do
        start_thread
        let "NUMPROC--"
done

while true
do
        trap "start_thread" SIGRTMIN
done


exit 0

# ======== Second script follows ========

# Running multiple processes on an SMP box.
# This script is called by
# Author: Tedman Eng

temp=$RANDOM
index=$1
shift
let "temp %= 5"
let "temp += 4"
echo "Starting $index  Time:$temp" "$@"
sleep ${temp}
echo "Ending $index"

exit 0

# ======================= SCRIPT AUTHOR'S NOTES ======================= #
#  It's not completely bug free.
#  I ran it with limit = 500 and after the first few hundred iterations,
#+ one of the concurrent threads disappeared!
#  Not sure if this is collisions from trap signals or something else.
#  Once the trap is received, there's a brief moment while executing the
#+ trap handler but before the next trap is set.  During this time, it may
#+ be possible to miss a trap signal, thus miss spawning a child process.

#  No doubt someone may spot the bug and will be writing
#+ . . . in the future.

# ===================================================================== #

# ----------------------------------------------------------------------#

# The following is the original script written by Vernia Damiano.
# Unfortunately, it doesn't work properly.


#  Must call script with at least one integer parameter
#+ (number of concurrent processes).
#  All other parameters are passed through to the processes started.

INDICE=8        # Total number of process to start
TEMPO=5         # Maximum sleep time per process
E_BADARGS=65    # No arg(s) passed to script.

if [ $# -eq 0 ] # Check for at least one argument passed to script.
then
  echo "Usage: `basename $0` number_of_processes [passed params]"
  exit $E_BADARGS
fi

NUMPROC=$1              # Number of concurrent process
PARAMETRI=( "$@" )      # Parameters of each process

function avvia() {
         local temp
         local index
         temp=$RANDOM
         index=$1
         shift
         let "temp %= $TEMPO"
         let "temp += 1"
         echo "Starting $index Time:$temp" "$@"
         sleep ${temp}
         echo "Ending $index"
         kill -s SIGRTMIN $$
}

function parti() {
         if [ $INDICE -gt 0 ] ; then
              avvia $INDICE "${PARAMETRI[@]}" &
              let "INDICE--"
         else
              trap : SIGRTMIN
         fi
}

trap parti SIGRTMIN

while [ "$NUMPROC" -gt 0 ]; do
         parti
         let "NUMPROC--"
done

wait

exit $?

I had the need to run a program, with specified options, on a number of
different files, using a SMP machine. So I thought [I'd] keep running
a specified number of processes and start a new one each time . . . one
of these terminates.

The "wait" instruction does not help, since it waits for a given process
or *all* processes started in the background. So I wrote [this] bash script
that can do the job, using the "trap" instruction.
  --Vernia Damiano


   trap '' SIGNAL (two adjacent apostrophes) disables SIGNAL for the
   remainder of the script. trap SIGNAL restores the functioning of
   SIGNAL once more. This is useful to protect a critical portion of a
   script from an undesirable interrupt.

        trap '' 2  # Signal 2 is Control-C, now disabled.
        trap 2     # Reenables Control-C

   Version 3 of Bash adds the following internal variables for use by
   the debugger.

    1. $BASH_ARGC
       Number of command-line arguments passed to script, similar to $#.
    2. $BASH_ARGV
       Final command-line parameter passed to script, equivalent to ${!#}.
    3. $BASH_COMMAND
       Command currently executing.
    4. $BASH_EXECUTION_STRING
       The option string following the -c option to Bash.
    5. $BASH_LINENO
       In a function, indicates the line number of the function call.
    6. $BASH_REMATCH
       Array variable associated with =~ conditional regex matching.
    7. $BASH_SOURCE
       This is the name of the script, usually the same as $0.
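A short sketch exercising a few of these variables (run as a standalone script; the reported line number depends on where the call appears):

```shell
#!/bin/bash
# Sketch: a few of the Bash 3 debugging variables in action.

where_am_i ()
{
  # BASH_LINENO[0] holds the line from which the caller invoked us.
  echo "called from line ${BASH_LINENO[0]} of ${BASH_SOURCE[0]:-$0}"
}

where_am_i

if [[ "abc" =~ a(b)c ]]                 # Populates BASH_REMATCH.
then
  echo "captured: ${BASH_REMATCH[1]}"   # captured: b
fi
```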

Chapter 33. Options

   Options are settings that change shell and/or script behavior.

   The set command enables options within a script. At the point in the
   script where you want the options to take effect, use set -o
   option-name or, in short form, set -option-abbrev. These two forms
   are equivalent.


      set -o verbose
      # Echoes all commands before executing.


      set -v
      # Exact same effect as above.


   To disable an option within a script, use set +o option-name or set
   +option-abbrev.


      set -o verbose
      # Command echoing on.

      set +o verbose
      # Command echoing off.
      # Not echoed.

      set -v
      # Command echoing on.

      set +v
      # Command echoing off.

      exit 0

   An alternate method of enabling options in a script is to specify
   them immediately following the #! script header.

      #!/bin/bash -x
      # Body of script follows.

   It is also possible to enable script options from the command line.
   Some options that will not work with set are available this way.
   Among these is -i, which forces a script to run interactively.

   bash -v script-name

   bash -o verbose script-name

   The following is a listing of some useful options. They may be
   specified in either abbreviated form (preceded by a single dash) or
   by complete name (preceded by a double dash or by -o).

   Table 33-1. Bash options
   Abbreviation Name Effect
   -B brace expansion Enable brace expansion (default setting = on)
   +B brace expansion Disable brace expansion
   -C noclobber Prevent overwriting of files by redirection (may be
   overridden by >|)
   -D (none) List double-quoted strings prefixed by $, but do not
   execute commands in script
   -a allexport Export all defined variables
   -b notify Notify when jobs running in background terminate (not of
   much use in a script)
   -c ... (none) Read commands from ...
   checkjobs   Informs user of any open jobs upon shell exit. Introduced
   in version 4 of Bash, and still "experimental." Usage: shopt -s
   checkjobs (Caution: may hang!)
   -e errexit Abort script at first error, when a command exits with
   non-zero status (except in until or while loops, if-tests, list
   constructs)
   -f noglob Filename expansion (globbing) disabled
   globstar globbing star-match Enables the ** globbing operator
   (version 4+ of Bash). Usage: shopt -s globstar
   -i interactive Script runs in interactive mode
   -n noexec Read commands in script, but do not execute them (syntax
   check)
   -o Option-Name (none) Invoke the Option-Name option
   -o posix POSIX Change the behavior of Bash, or invoked script, to
   conform to POSIX standard.
   -o pipefail pipe failure Causes a pipeline to return the exit status
   of the last command in the pipe that returned a non-zero return
   value
   -p privileged Script runs as "suid" (caution!)
   -r restricted Script runs in restricted mode (see Chapter 22).
   -s stdin Read commands from stdin
   -t (none) Exit after first command
   -u nounset Attempt to use undefined variable outputs error message,
   and forces an exit
   -v verbose Print each command to stdout before executing it
   -x xtrace Similar to -v, but expands commands
   - (none) End of options flag. All other arguments are positional
   -- (none) Unset positional parameters. If arguments given (-- arg1
   arg2), positional parameters set to arguments.
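The effect of toggling some of these options mid-script can be sketched as
follows (the temporary filename is illustrative only):

```shell
#!/bin/bash
# Toggling options inside a script with 'set'.

set -o noclobber    # Same as set -C: protect existing files from '>'.
echo "first" > "/tmp/noclobber_demo.$$"
echo "second" 2>/dev/null > "/tmp/noclobber_demo.$$" \
  || echo "noclobber blocked the overwrite."
echo "third" >| "/tmp/noclobber_demo.$$"   # >| overrides noclobber.
set +o noclobber    # Turn it back off.
rm -f "/tmp/noclobber_demo.$$"

set -o pipefail     # Pipeline returns last non-zero exit status.
false | true
echo "Exit status of 'false | true' with pipefail: $?"   # 1, not 0.
set +o pipefail
```

Note that the 2>/dev/null must precede the output redirection, so that the
shell's "cannot overwrite existing file" complaint is already diverted when
the redirection fails.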

Chapter 34. Gotchas


   Turandot: The riddles are three, death is one!

   Calaf: No, no! The riddles are three, life is one!


   Here are some (non-recommended!) scripting practices that will bring
   excitement into an otherwise dull life.

     * Assigning reserved words or characters to variable names.

case=value0       # Causes problems.
23skidoo=value1   # Also problems.
# Variable names starting with a digit are reserved by the shell.
# Try _23skidoo=value1. Starting variables with an underscore is okay.

# However . . .   using just an underscore will not work.
echo $_           # $_ is a special variable set to last arg of last command.
# But . . .       _ is a valid function name!

xyz((!*=value2    # Causes severe problems.
# As of version 3 of Bash, periods are not allowed within variable names.

     * Using a hyphen or other reserved characters in a variable name
       (or function name).

var-1=23          # Causes problems.
# Use 'var_1' instead.

function-whatever ()   # Error
# Use 'function_whatever ()' instead.

# As of version 3 of Bash, periods are not allowed within function names.
function.whatever ()   # Error
# Use 'functionWhatever ()' instead.

     * Using the same name for a variable and a function. This can make
       a script difficult to understand.

do_something ()
{
  echo "This function does something with \"$1\"."
}

do_something=do_something

do_something do_something

# All this is legal, but highly confusing.

     * Using whitespace inappropriately. In contrast to other
       programming languages, Bash can be quite finicky about
       whitespace.
var1 = 23   # 'var1=23' is correct.
# On line above, Bash attempts to execute command "var1"
# with the arguments "=" and "23".

let c = $a - $b   # Instead:   let c=$a-$b   or   let "c = $a - $b"

if [ $a -le 5]    # if [ $a -le 5 ]   is correct.
#           ^^      if [ "$a" -le 5 ]   is even better.
                  # [[ $a -le 5 ]] also works.

     * Not terminating with a semicolon the final command in a code
       block within curly brackets.

{ ls -l; df; echo "Done." }
# bash: syntax error: unexpected end of file

{ ls -l; df; echo "Done."; }
#                        ^     ### Final command needs semicolon.

     * Assuming uninitialized variables (variables before a value is
       assigned to them) are "zeroed out". An uninitialized variable has
       a value of null, not zero.


echo "uninitialized_var = $uninitialized_var"
# uninitialized_var =

# However . . .
# if $BASH_VERSION >= 4.2; then

if [[ ! -v uninitialized_var ]]
then
  uninitialized_var=0   # Initialize it to zero!
fi

     * Mixing up = and -eq in a test. Remember, = is for comparing
       literal variables and -eq for integers.

if [ "$a" = 273 ]      # Is $a an integer or string?
if [ "$a" -eq 273 ]    # If $a is an integer.

# Sometimes you can interchange -eq and = without adverse consequences.
# However . . .

a=273.0   # Not an integer.

if [ "$a" = 273 ]
then
  echo "Comparison works."
else
  echo "Comparison does not work."
fi    # Comparison does not work.

# Same with   a=" 273"  and a="0273".

# Likewise, problems trying to use "-eq" with non-integer values.

if [ "$a" -eq 273.0 ]
then
  echo "a = $a"
fi  # Aborts with an error message.
# [: 273.0: integer expression expected

     * Misusing string comparison operators.
       Example 34-1. Numerical and string comparison are not equivalent

# Trying to use a string comparison on integers.

number=1

#  The following while-loop has two errors:
#+ one blatant, and the other subtle.

while [ "$number" < 5 ]    # Wrong! Should be:  while [ "$number" -lt 5 ]
do
  echo -n "$number "
  let "number += 1"
done
#  Attempt to run this bombs with the error message:
#+ line 10: 5: No such file or directory
#  Within single brackets, "<" must be escaped,
#+ and even then, it's still wrong for comparing integers.

echo "---------------------"

number=1   # Start over.

while [ "$number" \< 5 ]    #  1 2 3 4
do                          #
  echo -n "$number "        #  It *seems* to work, but . . .
  let "number += 1"         #+ it actually does an ASCII comparison,
done                        #+ rather than a numerical one.

echo; echo "---------------------"

# This can cause problems. For example:

lesser=5
greater=105

if [ "$greater" \< "$lesser" ]
then
  echo "$greater is less than $lesser"
fi                          # 105 is less than 5
#  In fact, "105" actually is less than "5"
#+ in a string comparison (ASCII sort order).


exit 0

     * Attempting to use let to set string variables.

let "a = hello, you"
echo "$a"   # 0

     * Sometimes variables within "test" brackets ([ ]) need to be
       quoted (double quotes). Failure to do so may cause unexpected
       behavior. See Example 7-6, Example 20-5, and Example 9-6.
     * Quoting a variable containing whitespace prevents splitting.
       Sometimes this produces unintended consequences.
     * Commands issued from a script may fail to execute because the
       script owner lacks execute permission for them. If a user cannot
       invoke a command from the command-line, then putting it into a
       script will likewise fail. Try changing the attributes of the
       command in question, perhaps even setting the suid bit (as root,
       of course).
     * Attempting to use - as a redirection operator (which it is not)
       will usually result in an unpleasant surprise.

command1 2> - | command2
# Trying to redirect error output of command1 into a pipe . . .
# . . . will not work.

command1 2>& - | command2  # Also futile.

Thanks, S.C.

     * Using Bash version 2+ functionality may cause a bailout with
       error messages. Older Linux machines may have version 1.XX of
       Bash as the default installation.


# Since Chet Ramey is constantly adding features to Bash,
# you may set $minimum_version to 2.XX, 3.XX, or whatever is appropriate.

minimum_version=2.0

if [ "$BASH_VERSION" \< "$minimum_version" ]
then
  echo "This script works only with Bash, version $minimum_version or greater."
  echo "Upgrade strongly recommended."
fi

     * Using Bash-specific functionality in a Bourne shell script
       (#!/bin/sh) on a non-Linux machine may cause unexpected behavior.
       A Linux system usually aliases sh to bash, but this does not
       necessarily hold true for a generic UNIX machine.
     * Using undocumented features in Bash turns out to be a dangerous
       practice. In previous releases of this book there were several
       scripts that depended on the "feature" that, although the maximum
       value of an exit or return value was 255, that limit did not
       apply to negative integers. Unfortunately, in version 2.05b and
       later, that loophole disappeared. See Example 24-9.
     * In certain contexts, a misleading exit status may be returned.
       This may occur when setting a local variable within a function or
       when assigning an arithmetic value to a variable.
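Both cases can be reproduced in a few lines. In the first, local returns
its own (successful) status and masks the failure of the command
substitution; in the second, an arithmetic expression that evaluates to
zero returns status 1.

```shell
#!/bin/bash
#  Two cases where $? does not mean what it seems to.

#  1) 'local' masks the exit status of a command substitution.
lookup () {
  local result=$(false)   # 'false' sets status 1 . . .
  echo "$?"               # . . . but $? is 0, the status of 'local' itself.
}
lookup                    # Prints 0.

#  Separating declaration from assignment preserves the status:
lookup2 () {
  local result
  result=$(false)
  echo "$?"               # Now 1, as expected.
}
lookup2

#  2) An arithmetic assignment evaluating to zero returns status 1.
let "n = 0" ; echo "$?"   # 1, although the assignment succeeded.
let "n = 5" ; echo "$?"   # 0.
```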
     * A script with DOS-type newlines (\r\n) will fail to execute,
       since #!/bin/bash\r\n is not recognized as the same as the
       expected #!/bin/bash\n. The fix is to convert the script to
       UNIX-style newlines.
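One portable way to do the conversion is to strip the carriage returns with
tr (the temporary filenames below are illustrative only; dos2unix, where
installed, does the same job):

```shell
#!/bin/bash
#  Converting DOS-type newlines (\r\n) to UNIX-style (\n) with 'tr'.

printf 'echo "Hello"\r\n' > "/tmp/dosfile.$$"   # Simulate a DOS-format file.

tr -d '\r' < "/tmp/dosfile.$$" > "/tmp/unixfile.$$"

bash "/tmp/unixfile.$$"    # Hello

rm -f "/tmp/dosfile.$$" "/tmp/unixfile.$$"
```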


echo "Here"

unix2dos $0    # Script changes itself to DOS format.
chmod 755 $0   # Change back to execute permission.
               # The 'unix2dos' command removes execute permission.

./$0           # Script tries to run itself again.
               # But it won't work as a DOS file.

echo "There"

exit 0

     * A shell script headed by #!/bin/sh will not run in full
       Bash-compatibility mode. Some Bash-specific functions might be
       disabled. Scripts that need complete access to all the
       Bash-specific extensions should start with #!/bin/bash.
     * Putting whitespace in front of the terminating limit string of a
       here document will cause unexpected behavior in a script.
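The rule can be sketched briefly: the limit string must begin at the start
of the line, unless the <<- form is used, which strips leading tabs (tabs
only, not spaces).

```shell
#!/bin/bash
#  The limit string must begin at the start of the line.

cat <<EOF
Terminator in column 0: the here document ends correctly.
EOF

echo "Script continues normally."

#  If the closing EOF above were indented, Bash would never see it,
#+ and the here document would silently swallow the rest of the script.
#  The <<- variant relaxes this by stripping leading tabs
#+ (tabs only -- spaces still break it).
```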
     * Putting more than one echo statement in a function whose output
       is captured.

add2 ()
{
  echo "Whatever ... "   # Delete this line!
  let "retval = $1 + $2"
  echo $retval
}

num1=12
num2=43
echo "Sum of $num1 and $num2 = $(add2 $num1 $num2)"

#   Sum of 12 and 43 = Whatever ...
#   55

#        The "echoes" concatenate.

       This will not work.
     * A script may not export variables back to its parent process, the
       shell, or to the environment. Just as we learned in biology, a
       child process can inherit from a parent, but not vice versa.

WHATEVER=/home/bozo
export WHATEVER

exit 0

bash$ echo $WHATEVER


       Sure enough, back at the command prompt, $WHATEVER remains unset.
     * Setting and manipulating variables in a subshell, then attempting
       to use those same variables outside the scope of the subshell
       will result an unpleasant surprise.
       Example 34-2. Subshell Pitfalls

# Pitfalls of variables in a subshell.

outer_variable=outer
echo "outer_variable = $outer_variable"

(
# Begin subshell

echo "outer_variable inside subshell = $outer_variable"
inner_variable=inner  # Set
echo "inner_variable inside subshell = $inner_variable"
outer_variable=inner  # Will value change globally?
echo "outer_variable inside subshell = $outer_variable"

# Will 'exporting' make a difference?
#    export inner_variable
#    export outer_variable
# Try it and see.

# End subshell
)

echo "inner_variable outside subshell = $inner_variable"  # Unset.
echo "outer_variable outside subshell = $outer_variable"  # Unchanged.

exit 0

# What happens if you uncomment lines 19 and 20?
# Does it make a difference?

     * Piping echo output to a read may produce unexpected results. In
       this scenario, the read acts as if it were running in a subshell.
       Instead, use the set command (as in Example 15-18).
       Example 34-3. Piping the output of echo to a read

#  Attempting to use 'echo' and 'read'
#+ to assign variables non-interactively.

#   shopt -s lastpipe

a=aaa
b=bbb
c=ccc

echo "one two three" | read a b c
# Try to reassign a, b, and c.

echo "a = $a"  # a = aaa
echo "b = $b"  # b = bbb
echo "c = $c"  # c = ccc
# Reassignment failed.

### However . . .
##  Uncommenting line 6:
#   shopt -s lastpipe
##+ fixes the problem!
### This is a new feature in Bash, version 4.2.

# ------------------------------

# Try the following alternative.

var=`echo "one two three"`
set -- $var
a=$1; b=$2; c=$3

echo "-------"
echo "a = $a"  # a = one
echo "b = $b"  # b = two
echo "c = $c"  # c = three
# Reassignment succeeded.

# ------------------------------

#  Note also that an echo to a 'read' works within a subshell.
#  However, the value of the variable changes *only* within the subshell.

a=aaa          # Starting all over again.

echo; echo
echo "one two three" | ( read a b c;
echo "Inside subshell: "; echo "a = $a"; echo "b = $b"; echo "c = $c" )
# a = one
# b = two
# c = three
echo "-----------------"
echo "Outside subshell: "
echo "a = $a"  # a = aaa
echo "b = $b"  # b = bbb
echo "c = $c"  # c = ccc

exit 0

       In fact, as Anthony Richardson points out, piping to any loop can
       cause a similar problem.

# Loop piping troubles.
#  This example by Anthony Richardson,
#+ with addendum by Wilbert Berendsen.

foundone=false

find $HOME -type f -atime +30 -size 100k |
while true
do
   read f
   echo "$f is over 100KB and has not been accessed in over 30 days"
   echo "Consider moving the file to archives."
   foundone=true
   # ------------------------------------
     echo "Subshell level = $BASH_SUBSHELL"
   # Subshell level = 1
   # Yes, we're inside a subshell.
   # ------------------------------------
done

#  foundone will always be false here since it is
#+ set to true inside a subshell
if [ $foundone = false ]
then
   echo "No files need archiving."
fi

# =====================Now, here is the correct way:=================

foundone=false

for f in $(find $HOME -type f -atime +30 -size 100k)  # No pipe here.
do
   echo "$f is over 100KB and has not been accessed in over 30 days"
   echo "Consider moving the file to archives."
   foundone=true
done

if [ $foundone = false ]
then
   echo "No files need archiving."
fi

# ==================And here is another alternative==================

#  Places the part of the script that reads the variables
#+ within a code block, so they share the same subshell.
#  Thank you, W.B.

find $HOME -type f -atime +30 -size 100k | {
     foundone=false
     while read f
     do
       echo "$f is over 100KB and has not been accessed in over 30 days"
       echo "Consider moving the file to archives."
       foundone=true
     done

     if ! $foundone
     then
       echo "No files need archiving."
     fi
}

       A lookalike problem occurs when trying to write the stdout of a
       tail -f piped to grep.

tail -f /var/log/messages | grep "$ERROR_MSG" >> error.log
#  The "error.log" file will not have anything written to it.
#  As Samuli Kaipiainen points out, this results from grep
#+ buffering its output.
#  The fix is to add the "--line-buffered" parameter to grep.

     * Using "suid" commands within scripts is risky, as it may
       compromise system security. [117]
     * Using shell scripts for CGI programming may be problematic. Shell
       script variables are not "typesafe," and this can cause
       undesirable behavior as far as CGI is concerned. Moreover, it is
       difficult to "cracker-proof" shell scripts.
     * Bash does not handle the double slash (//) string correctly.
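Repeated slashes usually come from careless path concatenation; one
workaround (an assumption on my part, not a fix from the Guide) is to
squeeze them out before handing the path to string-manipulation code:

```shell
#!/bin/bash
#  Squeezing repeated slashes out of a concatenated path with 'tr -s'.

dir="/usr/local/"
file="/bin/mytool"        # Hypothetical path fragments.

path=$(echo "${dir}${file}" | tr -s '/')
echo "$path"              # /usr/local/bin/mytool
```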
     * Bash scripts written for Linux or BSD systems may need fixups to
       run on a commercial UNIX (or Apple OSX) machine. Such scripts
       often employ the GNU set of commands and filters, which have
       greater functionality than their generic UNIX counterparts. This
       is particularly true of such text processing utilities as tr.


   Danger is near thee --

   Beware, beware, beware, beware.

   Many brave hearts are asleep in the deep.

   So beware --


   --A.J. Lamb and H.W. Petrie

Chapter 35. Scripting With Style

   Get into the habit of writing shell scripts in a structured and
   systematic manner. Even on-the-fly and "written on the back of an
   envelope" scripts will benefit if you take a few minutes to plan and
   organize your thoughts before sitting down and coding.

   Herewith are a few stylistic guidelines. This is not (necessarily)
   intended as an Official Shell Scripting Stylesheet.

35.1. Unofficial Shell Scripting Stylesheet

     * Comment your code. This makes it easier for others to understand
       (and appreciate), and easier for you to maintain.

#  It made perfect sense when you wrote it last year,
#+ but now it's a complete mystery.
#  (From a script by Antek Sawicki.)

       Add descriptive headers to your scripts and functions.


#************************************************#
#           written by Bozo Bozeman              #
#                July 05, 2001                   #
#                                                #
#           Clean up project files.              #
#************************************************#

E_BADDIR=85                       # No such directory.
projectdir=/home/bozo/projects    # Directory to clean up.

# --------------------------------------------------------- #
# cleanup_pfiles ()                                         #
# Removes all files in designated directory.                #
# Parameter: $target_directory                              #
# Returns: 0 on success, $E_BADDIR if something went wrong. #
# --------------------------------------------------------- #
cleanup_pfiles ()
{
  if [ ! -d "$1" ]  # Test if target directory exists.
  then
    echo "$1 is not a directory."
    return $E_BADDIR
  fi

  rm -f "$1"/*
  return 0   # Success.
}

cleanup_pfiles $projectdir

exit $?

     * Avoid using "magic numbers," [118] that is, "hard-wired" literal
       constants. Use meaningful variable names instead. This makes the
       script easier to understand and permits making changes and
       updates without breaking the application.

if [ -f /var/log/messages ]
#  A year later, you decide to change the script to check /var/log/syslog.
#  It is now necessary to manually change the script, instance by instance,
#+ and hope nothing breaks.

# A better way:
LOGFILE=/var/log/messages  # Only line that needs to be changed.
if [ -f "$LOGFILE" ]

     * Choose descriptive names for variables and functions.

fl=`ls -al $dirname`                 # Cryptic.
file_listing=`ls -al $dirname`       # Better.

MAXVAL=10   # All caps used for a script constant.
while [ "$index" -le "$MAXVAL" ]

E_NOTFOUND=95                        #  Uppercase for an errorcode,
                                     #+ and name prefixed with E_.
if [ ! -e "$filename" ]
then
  echo "File $filename not found."
  exit $E_NOTFOUND
fi

MAIL_DIRECTORY=/var/spool/mail/bozo  #  Uppercase for an environmental
export MAIL_DIRECTORY                #+ variable.

GetAnswer ()                         #  Mixed case works well for a
{                                    #+ function name, especially
  prompt=$1                          #+ when it improves legibility.
  echo -n $prompt
  read answer
  return $answer
}

GetAnswer "What is your favorite number? "
favorite_number=$?
echo $favorite_number

_uservariable=23                     # Permissible, but not recommended.
# It's better for user-defined variables not to start with an underscore.
# Leave that for system variables.

     * Use exit codes in a systematic and meaningful way.


       See also Appendix D.
       Ender suggests using the exit codes in /usr/include/sysexits.h in
       shell scripts, though these are primarily intended for C and C++
       programming.
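A brief sketch of systematic exit codes, borrowing two values from
sysexits.h (EX_USAGE = 64 and EX_NOINPUT = 66 in that header); the
function and filename are illustrative only:

```shell
#!/bin/bash
#  Systematic, named exit codes instead of bare numbers.

E_USAGE=64      # Command-line usage error.
E_NOINPUT=66    # Input file missing or unreadable.

check_input ()
{
  if [ ! -r "$1" ]
  then
    echo "Cannot read \"$1\"." >&2
    return $E_NOINPUT
  fi
}

check_input "/no/such/file.$$" || echo "check_input returned $?"
# check_input returned 66
```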
     * Use standardized parameter flags for script invocation. Ender
       proposes the following set of flags.

-a      All: Return all information (including hidden file info).
-b      Brief: Short version, usually for other scripts.
-c      Copy, concatenate, etc.
-d      Daily: Use information from the whole day, and not merely
        information for a specific instance/user.
-e      Extended/Elaborate: (often does not include hidden file info).
-h      Help: Verbose usage w/descs, aux info, discussion, help.
        See also -V.
-l      Log output of script.
-m      Manual: Launch man-page for base command.
-n      Numbers: Numerical data only.
-r      Recursive: All files in a directory (and/or all sub-dirs).
-s      Setup & File Maintenance: Config files for this script.
-u      Usage: List of invocation flags for the script.
-v      Verbose: Human readable output, more or less formatted.
-V      Version / License / Copy(right|left) / Contribs (email too).

       See also Section F.1.
     * Break complex scripts into simpler modules. Use functions where
       appropriate. See Example 37-4.
     * Don't use a complex construct where a simpler one will do.

if [ $? -eq 0 ]   # Tests exit status of the preceding command.
# Redundant and non-intuitive.

if command        # Tests the command's success directly.
# More concise (if perhaps not quite as legible).
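A runnable side-by-side of the two forms:

```shell
#!/bin/bash
#  Testing a command's success directly, instead of testing $? afterward.

# Redundant and non-intuitive:
echo "one two three" | grep -q two
if [ $? -eq 0 ]
then
  echo "Found (roundabout way)."
fi

# More concise:
if echo "one two three" | grep -q two
then
  echo "Found (direct way)."
fi
```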


   ... reading the UNIX source code to the Bourne shell (/bin/sh). I was
   shocked at how much simple algorithms could be made cryptic, and
   therefore useless, by a poor choice of code style. I asked myself,
   "Could someone be proud of this code?"

   --Landon Noll

Chapter 36. Miscellany


   Nobody really knows what the Bourne shell's grammar is. Even
   examination of the source code is little help.

   --Tom Duff

36.1. Interactive and non-interactive shells and scripts

   An interactive shell reads commands from user input on a tty. Among
   other things, such a shell reads startup files on activation,
   displays a prompt, and enables job control by default. The user can
   interact with the shell.

   A shell running a script is always a non-interactive shell. All the
   same, the script can still access its tty. It is even possible to
   emulate an interactive shell in a script.
MY_PROMPT='$ '
while :
do
  echo -n "$MY_PROMPT"
  read line
  eval "$line"
done

exit 0

# This example script, and much of the above explanation supplied by
# Stéphane Chazelas (thanks again).

   Let us consider an interactive script to be one that requires input
   from the user, usually with read statements (see Example 15-3). "Real
   life" is actually a bit messier than that. For now, assume an
   interactive script is bound to a tty, a script that a user has
   invoked from the console or an xterm.

   Init and startup scripts are necessarily non-interactive, since they
   must run without human intervention. Many administrative and system
   maintenance scripts are likewise non-interactive. Unvarying
   repetitive tasks cry out for automation by non-interactive scripts.

   Non-interactive scripts can run in the background, but interactive
   ones hang, waiting for input that never comes. Handle that difficulty
   by having an expect script or embedded here document feed input to an
   interactive script running as a background job. In the simplest case,
   redirect a file to supply input to a read statement (read variable
   <file). These particular workarounds make possible general purpose
   scripts that run in either interactive or non-interactive modes.
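In the simplest of those workarounds, the input for a read arrives from a
file or an embedded here document (the filename in the first form is
illustrative only):

```shell
#!/bin/bash
#  Supplying input to 'read' without a user at the keyboard.

#  From a file:
#      read variable < datafile

#  From a here document:
read first_name last_name <<EOF
Bozo Bozeman
EOF

echo "Read: $first_name $last_name"   # Read: Bozo Bozeman
```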

   If a script needs to test whether it is running in an interactive
   shell, it is simply a matter of finding whether the prompt variable,
   $PS1 is set. (If the user is being prompted for input, then the
   script needs to display a prompt.)

if [ -z "$PS1" ] # no prompt?
### if [ -v PS1 ]   # On Bash 4.2+ ...
then
  # non-interactive
  ...
else
  # interactive
  ...
fi

   Alternatively, the script can test for the presence of option "i" in
   the $- flag.

case $- in
*i*)    # interactive shell
;;
*)      # non-interactive shell
;;
esac
# (Courtesy of "UNIX F.A.Q.," 1993)

   However, John Lange describes an alternative method, using the -t
   test operator.

# Test for a terminal!

fd=0   # stdin

#  As we recall, the -t test option checks whether the stdin, [ -t 0 ],
#+ or stdout, [ -t 1 ], in a given script is running in a terminal.
if [ -t "$fd" ]
then
  echo interactive
else
  echo non-interactive
fi

#  But, as John points out:
#    if [ -t 0 ] works ... when you're logged in locally
#    but fails when you invoke the command remotely via ssh.
#    So for a true test you also have to test for a socket.

if [[ -t "$fd" || -p /dev/stdin ]]
then
  echo interactive
else
  echo non-interactive
fi


   Scripts may be forced to run in interactive mode with the -i option
   or with a #!/bin/bash -i header. Be aware that this can cause erratic
   script behavior or show error messages even when no error is present.

36.2. Shell Wrappers

   A wrapper is a shell script that embeds a system command or utility,
   that accepts and passes a set of parameters to that command. [119]
   Wrapping a script around a complex command-line simplifies invoking
   it. This is especially useful with sed and awk.

   A sed or awk script would normally be invoked from the command-line
   by a sed -e 'commands' or awk 'commands'. Embedding such a script in
   a Bash script permits calling it more simply, and makes it reusable.
   This also enables combining the functionality of sed and awk, for
   example piping the output of a set of sed commands to awk. As a saved
   executable file, you can then repeatedly invoke it in its original
   form or modified, without the inconvenience of retyping it on the
   command-line.

   Example 36-1. shell wrapper

# This simple script removes blank lines from a file.
# No argument checking.
# You might wish to add something like:
# if [ -z "$1" ]
# then
#  echo "Usage: `basename $0` target-file"
#  exit $E_NOARGS
# fi

sed -e /^$/d "$1"
# Same as
#    sed -e '/^$/d' filename
# invoked from the command-line.

#  The '-e' means an "editing" command follows (optional here).
#  '^' indicates the beginning of line, '$' the end.
#  This matches lines with nothing between the beginning and the end --
#+ blank lines.
#  The 'd' is the delete command.

#  Quoting the command-line arg permits
#+ whitespace and special characters in the filename.

#  Note that this script doesn't actually change the target file.
#  If you need to do that, redirect its output.


   Example 36-2. A slightly more complex shell wrapper

# a script that substitutes one pattern for
#+ another in a file,
#+ i.e., "sh Smith Jones letter.txt".
#                     Jones replaces Smith.

ARGS=3         # Script requires 3 arguments.
E_BADARGS=85   # Wrong number of arguments passed to script.

if [ $# -ne "$ARGS" ]
then
  echo "Usage: `basename $0` old-pattern new-pattern filename"
  exit $E_BADARGS
fi

old_pattern=$1
new_pattern=$2

if [ -f "$3" ]
then
    file_name=$3
else
    echo "File \"$3\" does not exist."
    exit $E_BADARGS
fi

# -----------------------------------------------
#  Here is where the heavy work gets done.
sed -e "s/$old_pattern/$new_pattern/g" $file_name
# -----------------------------------------------

#  's' is, of course, the substitute command in sed,
#+ and /pattern/ invokes address matching.
#  The 'g,' or global flag causes substitution for EVERY
#+ occurence of $old_pattern on each line, not just the first.
#  Read the 'sed' docs for an in-depth explanation.

exit $?  # Redirect the output of this script to write to a file.

   Example 36-3. A generic shell wrapper that writes to a logfile
#  Generic shell wrapper that performs an operation
#+ and logs it.

# Must set the following two variables.
OPERATION=
#         Can be a complex chain of commands,
#+        for example an awk script or a pipe . . .
LOGFILE=
#         Name of the logfile.
OPTIONS="$@"
#         Command-line arguments, if any, for the operation.


# Log it.
echo "`date` + `whoami` + $OPERATION "$@"" >> $LOGFILE
# Now, do it.
exec $OPERATION "$@"

# It's necessary to do the logging before the operation.
# Why?

   Example 36-4. A shell wrapper around an awk script
# Prints a table of ASCII characters.

START=33   # Range of printable ASCII characters (decimal).
END=127    # Will not work for unprintable characters (> 127).

echo " Decimal   Hex     Character"   # Header.
echo " -------   ---     ---------"

for ((i=START; i<=END; i++))
do
  echo $i | awk '{printf("  %3d       %2x         %c\n", $1, $1, $1)}'
done
# The Bash printf builtin will not work in this context:
#     printf "%c" "$i"

exit 0

#  Decimal   Hex     Character
#  -------   ---     ---------
#    33       21         !
#    34       22         "
#    35       23         #
#    36       24         $
#    . . .
#   122       7a         z
#   123       7b         {
#   124       7c         |
#   125       7d         }

#  Redirect the output of this script to a file
#+ or pipe it to "more":  sh | more

   Example 36-5. A shell wrapper around another awk script

# Adds up a specified column (of numbers) in the target file.
# Floating-point (decimal) numbers okay, because awk can handle them.


ARGS=2
E_WRONGARGS=85

if [ $# -ne "$ARGS" ] # Check for proper number of command-line args.
then
   echo "Usage: `basename $0` filename column-number"
   exit $E_WRONGARGS
fi

filename=$1
column_number=$2
#  Passing shell variables to the awk part of the script is a bit tricky.
#  One method is to strong-quote the Bash-script variable
#+ within the awk script.
#     $'$BASH_SCRIPT_VAR'
#      ^                ^
#  This is done in the embedded awk script below.
#  See the awk documentation for more details.

# A multi-line awk script is here invoked by
#   awk '
#   ...
#   ...
#   ...
#   '

# Begin awk script.
# -----------------------------
awk '

{ total += $'"${column_number}"'
}
END {
     print total
}

' "$filename"
# -----------------------------
# End awk script.

#   It may not be safe to pass shell variables to an embedded awk script,
#+  so Stephane Chazelas proposes the following alternative:
#   ---------------------------------------
#   awk -v column_number="$column_number" '
#   { total += $column_number
#   }
#   END {
#       print total
#   }' "$filename"
#   ---------------------------------------

exit 0

   For those scripts needing a single do-it-all tool, a Swiss army
   knife, there is Perl. Perl combines the capabilities of sed and awk,
   and throws in a large subset of C, to boot. It is modular and
   contains support for everything ranging from object-oriented
   programming up to and including the kitchen sink. Short Perl scripts
   lend themselves to embedding within shell scripts, and there may be
   some substance to the claim that Perl can totally replace shell
   scripting (though the author of the ABS Guide remains skeptical).

   Example 36-6. Perl embedded in a Bash script

# Shell commands may precede the Perl script.
echo "This precedes the embedded Perl script within \"$0\"."
echo "==============================================================="

perl -e 'print "This is an embedded Perl script.\n";'
# Like sed, Perl also uses the "-e" option.

echo "==============================================================="
echo "However, the script may also contain shell and system commands."


   It is even possible to combine a Bash script and Perl script within
   the same file. Depending on how the script is invoked, either the
   Bash part or the Perl part will execute.

   Example 36-7. Bash and Perl scripts combined

echo "Greetings from the Bash part of the script, $0."
# More Bash commands may follow here.

exit 0
# End of Bash part of the script.

# =======================================================

#!/usr/bin/perl
# This part of the script must be invoked with
#    perl -x

print "Greetings from the Perl part of the script, $0.\n";
#      Perl doesn't seem to like "echo" ...
# More Perl commands may follow here.

# End of Perl part of the script.

bash$ bash
Greetings from the Bash part of the script.

bash$ perl -x
Greetings from the Perl part of the script.

   One interesting example of a complex shell wrapper is Martin
   Matusiak's undvd script, which provides an easy-to-use command-line
   interface to the complex mencoder utility. Another example is
   Itzchak Rehberg's Ext3Undel, a set of scripts to recover deleted
   files on an ext3 filesystem.

36.3. Tests and Comparisons: Alternatives

   For tests, the [[ ]] construct may be more appropriate than [ ].
   Likewise, arithmetic comparisons might benefit from the (( ))
   construct.

a=8

# All of the comparisons below are equivalent.
test "$a" -lt 16 && echo "yes, $a < 16"         # "and list"
/bin/test "$a" -lt 16 && echo "yes, $a < 16"
[ "$a" -lt 16 ] && echo "yes, $a < 16"
[[ $a -lt 16 ]] && echo "yes, $a < 16"          # Quoting variables within
(( a < 16 )) && echo "yes, $a < 16"             # [[ ]] and (( )) not necessary.

city="New York"
# Again, all of the comparisons below are equivalent.
test "$city" \< Paris && echo "Yes, Paris is greater than $city"
                                  # Greater ASCII order.
/bin/test "$city" \< Paris && echo "Yes, Paris is greater than $city"
[ "$city" \< Paris ] && echo "Yes, Paris is greater than $city"
[[ $city < Paris ]] && echo "Yes, Paris is greater than $city"
                                  # Need not quote $city.

# Thank you, S.C.

36.4. Recursion: a script calling itself

   Can a script recursively call itself? Indeed.

   Example 36-8. A (useless) script that recursively calls itself

#  Can a script recursively call itself?
#  Yes, but is this of any practical use?
#  (See the following.)


RANGE=10
MAXVAL=9

i=$RANDOM
let "i %= $RANGE"  # Generate a random number between 0 and $RANGE - 1.

if [ "$i" -lt "$MAXVAL" ]
then
  echo "i = $i"
  ./$0             #  Script recursively spawns a new instance of itself.
fi                 #  Each child script does the same, until
                   #+ a generated $i equals $MAXVAL.

#  Using a "while" loop instead of an "if/then" test causes problems.
#  Explain why.

exit 0

# Note:
# ----
# This script must have execute permission for it to work properly.
# This is the case even if it is invoked by an "sh" command.
# Explain why.

   Example 36-9. A (useful) script that recursively calls itself
# phone book

# Written by Rick Boivie, and used with permission.
# Modifications by ABS Guide author.

MINARGS=1     #  Script needs at least one argument.
DATAFILE=./phonebook
              #  A data file in current working directory
              #+ named "phonebook" must exist.
PROGNAME=$0
E_NOARGS=70   #  No arguments error.

if [ $# -lt $MINARGS ]; then
      echo "Usage: $PROGNAME data-to-look-up"
      exit $E_NOARGS
fi

if [ $# -eq $MINARGS ]; then
      grep $1 "$DATAFILE"
      # 'grep' prints an error message if $DATAFILE not present.
else
      ( shift; "$PROGNAME" $* ) | grep $1
      # Script recursively calls itself.
fi

exit 0        #  Script exits here.
              #  Therefore, it's o.k. to put
              #+ non-hashmarked comments and data after this point.

# ------------------------------------------------------------------------
Sample "phonebook" datafile:

John Doe        1555 Main St., Baltimore, MD 21228          (410) 222-3333
Mary Moe        9899 Jones Blvd., Warren, NH 03787          (603) 898-3232
Richard Roe     856 E. 7th St., New York, NY 10009          (212) 333-4567
Sam Roe         956 E. 8th St., New York, NY 10009          (212) 444-5678
Zoe Zenobia     4481 N. Baker St., San Francisco, SF 94338  (415) 501-1631
# ------------------------------------------------------------------------

$bash Roe
Richard Roe     856 E. 7th St., New York, NY 10009          (212) 333-4567
Sam Roe         956 E. 8th St., New York, NY 10009          (212) 444-5678

$bash Roe Sam
Sam Roe         956 E. 8th St., New York, NY 10009          (212) 444-5678

#  When more than one argument is passed to this script,
#+ it prints *only* the line(s) containing all the arguments.

   Example 36-10. Another (useful) script that recursively calls itself
# usermount, written by Anthony Richardson
# Used with permission.

# usage:
# description: mount device, invoking user must be listed in the
#              MNTUSERS group in the /etc/sudoers file.

# ----------------------------------------------------------
#  This is a usermount script that reruns itself using sudo.
#  A user with the proper permissions only has to type

#   usermount /dev/fd0 /mnt/floppy

# instead of

#   sudo usermount /dev/fd0 /mnt/floppy

#  I use this same technique for all of my
#+ sudo scripts, because I find it convenient.
# ----------------------------------------------------------

#  If SUDO_COMMAND variable is not set we are not being run through
#+ sudo, so rerun ourselves. Pass the user's real and group id . . .

if [ -z "$SUDO_COMMAND" ]
then
   mntusr=$(id -u) grpusr=$(id -g) sudo $0 $*
   exit 0
fi

# We will only get here if we are being run by sudo.
/bin/mount $* -o uid=$mntusr,gid=$grpusr

exit 0

# Additional notes (from the author of this script):
# -------------------------------------------------

# 1) Linux allows the "users" option in the /etc/fstab
#    file so that any user can mount removable media.
#    But, on a server, I like to allow only a few
#    individuals access to removable media.
#    I find using sudo gives me more control.

# 2) I also find sudo to be more convenient than
#    accomplishing this task through groups.

# 3) This method gives anyone with proper permissions
#    root access to the mount command, so be careful
#    about who you allow access.
#    You can get finer control over which access can be mounted
#    by using this same technique in separate mntfloppy, mntcdrom,
#    and mntsamba scripts.


   Too many levels of recursion can exhaust the script's stack space,
   causing a segfault.
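   One safeguard is to pass a depth counter along and refuse to recurse
   past a limit. The sketch below shows the idea with a function, so it
   is easy to test; a self-calling script would instead pass the counter
   as a positional parameter to ./$0. MAXDEPTH is an arbitrary
   illustrative value.

```shell
#!/bin/bash
# Depth-limited recursion: stop after MAXDEPTH levels.

MAXDEPTH=5
calls=0

recurse ()
{
  local depth=$1
  calls=$((calls + 1))             # Count every invocation.

  if [ "$depth" -lt "$MAXDEPTH" ]
  then
    recurse $((depth + 1))         # Recurse with an incremented counter.
  fi                               # Otherwise, unwind.
}

recurse 0
echo "Total invocations: $calls"   # 6 (depths 0 through 5)
```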

36.5. "Colorizing" Scripts

   The ANSI [120] escape sequences set screen attributes, such as bold
   text, and color of foreground and background. DOS batch files
   commonly used ANSI escape codes for color output, and so can Bash
   scripts.

   Example 36-11. A "colorized" address database
# "Colorized" version of a crude address database script.

clear                                   # Clear the screen.

echo -n "          "
echo -e '\E[37;44m'"\033[1mContact List\033[0m"
                                        # White on blue background
echo; echo
echo -e "\033[1mChoose one of the following persons:\033[0m"
                                        # Bold
tput sgr0                               # Reset attributes.
echo "(Enter only the first letter of name.)"
echo -en '\E[47;34m'"\033[1mE\033[0m"   # Blue
tput sgr0                               # Reset colors to "normal."
echo "vans, Roland"                     # "[E]vans, Roland"
echo -en '\E[47;35m'"\033[1mJ\033[0m"   # Magenta
tput sgr0
echo "ones, Mildred"
echo -en '\E[47;32m'"\033[1mS\033[0m"   # Green
tput sgr0
echo "mith, Julie"
echo -en '\E[47;31m'"\033[1mZ\033[0m"   # Red
tput sgr0
echo "ane, Morris"

read person

case "$person" in
# Note variable is quoted.

  "E" | "e" )
  # Accept upper or lowercase input.
  echo "Roland Evans"
  echo "4321 Flash Dr."
  echo "Hardscrabble, CO 80753"
  echo "(303) 734-9874"
  echo "(303) 734-9892 fax"
  echo ""
  echo "Business partner & old friend"
  ;;

  "J" | "j" )
  echo "Mildred Jones"
  echo "249 E. 7th St., Apt. 19"
  echo "New York, NY 10009"
  echo "(212) 533-2814"
  echo "(212) 533-9972 fax"
  echo ""
  echo "Girlfriend"
  echo "Birthday: Feb. 11"
  ;;

# Add info for Smith & Zane later.

          * )
   # Default option.
   # Empty input (hitting RETURN) fits here, too.
   echo "Not yet in database."
   ;;

esac

tput sgr0                               # Reset colors to "normal."


exit 0

   Example 36-12. Drawing a box
# Drawing a box using ASCII characters.

# Script by Stefano Palmeri, with minor editing by document author.
# Minor edits suggested by Jim Angstadt.
# Used in the ABS Guide with permission.

###  draw_box function doc  ###

#  The "draw_box" function lets the user
#+ draw a box in a terminal.
#  ROW and COLUMN represent the position
#+ of the upper left angle of the box you're going to draw.
#  ROW and COLUMN must be greater than 0
#+ and less than current terminal dimension.
#  HEIGHT is the number of rows of the box, and must be > 0.
#  HEIGHT + ROW must be <= than current terminal height.
#  WIDTH is the number of columns of the box and must be > 0.
#  WIDTH + COLUMN must be <= than current terminal width.
# E.g.: If your terminal dimension is 20x80,
#  draw_box 2 3 10 45 is good
#  draw_box 2 3 19 45 has bad HEIGHT value (19+2 > 20)
#  draw_box 2 3 18 78 has bad WIDTH value (78+3 > 80)
#  COLOR is the color of the box frame.
#  This is the 5th argument and is optional.
#  0=black 1=red 2=green 3=tan 4=blue 5=purple 6=cyan 7=white.
#  If you pass the function bad arguments,
#+ it will just exit with code 65,
#+ and no messages will be printed on stderr.
#  Clear the terminal before you start to draw a box.
#  The clear command is not contained within the function.
#  This allows the user to draw multiple boxes, even overlapping ones.

###  end of draw_box function doc  ###




draw_box(){

HORZ="-"
VERT="|"
CORNER_CHAR="+"

MINARGS=4
E_BADARGS=65

if [ $# -lt "$MINARGS" ]; then          # If args are less than 4, exit.
    exit $E_BADARGS
fi

# Looking for non digit chars in arguments.
# Probably it could be done better (exercise for the reader?).
if echo $@ | tr -d [:blank:] | tr -d [:digit:] | grep . &> /dev/null; then
   exit $E_BADARGS
fi

BOX_HEIGHT=`expr $3 - 1`   #  -1 correction needed because angle char "+"
BOX_WIDTH=`expr $4 - 1`    #+ is a part of both box height and width.
T_ROWS=`tput lines`        #  Define current terminal dimension
T_COLS=`tput cols`         #+ in rows and columns.

if [ $1 -lt 1 ] || [ $1 -gt $T_ROWS ]; then    #  Start checking if arguments
   exit $E_BADARGS                             #+ are correct.
fi
if [ $2 -lt 1 ] || [ $2 -gt $T_COLS ]; then
   exit $E_BADARGS
fi
if [ `expr $1 + $BOX_HEIGHT + 1` -gt $T_ROWS ]; then
   exit $E_BADARGS
fi
if [ `expr $2 + $BOX_WIDTH + 1` -gt $T_COLS ]; then
   exit $E_BADARGS
fi
if [ $3 -lt 1 ] || [ $4 -lt 1 ]; then
   exit $E_BADARGS
fi                                 # End checking arguments.

plot_char(){                       # Function within a function.
   echo -e "\E[${1};${2}H"$3
}

echo -ne "\E[3${5}m"               # Set box frame color, if defined.

# start drawing the box

count=1                                         #  Draw vertical lines using
for (( r=$1; count<=$BOX_HEIGHT; r++)); do      #+ plot_char function.
  plot_char $r $2 $VERT
  let count=count+1
done

count=1
c=`expr $2 + $BOX_WIDTH`
for (( r=$1; count<=$BOX_HEIGHT; r++)); do
  plot_char $r $c $VERT
  let count=count+1
done

count=1                                        #  Draw horizontal lines using
for (( c=$2; count<=$BOX_WIDTH; c++)); do      #+ plot_char function.
  plot_char $1 $c $HORZ
  let count=count+1
done

count=1
r=`expr $1 + $BOX_HEIGHT`
for (( c=$2; count<=$BOX_WIDTH; c++)); do
  plot_char $r $c $HORZ
  let count=count+1
done

plot_char $1 $2 $CORNER_CHAR                   # Draw box angles.
plot_char $1 `expr $2 + $BOX_WIDTH` $CORNER_CHAR
plot_char `expr $1 + $BOX_HEIGHT` $2 $CORNER_CHAR
plot_char `expr $1 + $BOX_HEIGHT` `expr $2 + $BOX_WIDTH` $CORNER_CHAR

echo -ne "\E[0m"             #  Restore old colors.

P_ROWS=`expr $T_ROWS - 1`    #  Put the prompt at bottom of the terminal.

echo -e "\E[${P_ROWS};1H"
}

# Now, let's try drawing a box.
clear                       # Clear the terminal.
R=2      # Row
C=3      # Column
H=10     # Height
W=45     # Width
col=1    # Color (red)
draw_box $R $C $H $W $col   # Draw the box.

exit 0

# Exercise:
# --------
# Add the option of printing text within the drawn box.

   The simplest, and perhaps most useful ANSI escape sequence is bold
   text, \033[1m ... \033[0m. The \033 represents an escape, the "[1"
   turns on the bold attribute, while the "[0" switches it off. The "m"
   terminates each term of the escape sequence.
bash$ echo -e "\033[1mThis is bold text.\033[0m"

   A similar escape sequence switches on the underline attribute (on an
   rxvt and an aterm).
bash$ echo -e "\033[4mThis is underlined text.\033[0m"


   With an echo, the -e option enables the escape sequences.

   Other escape sequences change the text and/or background color.

bash$ echo -e '\E[34;47mThis prints in blue.'; tput sgr0

bash$ echo -e '\E[33;44m'"yellow text on blue background"; tput sgr0

bash$ echo -e '\E[1;33;44m'"BOLD yellow text on blue background"; tput sgr0


   It's usually advisable to set the bold attribute for light-colored
   foreground text.

   The tput sgr0 restores the terminal settings to normal. Omitting this
   lets all subsequent output from that particular terminal remain blue.


   Since tput sgr0 fails to restore terminal settings under certain
   circumstances, echo -ne \E[0m may be a better choice.

   Use the following template for writing colored text on a colored
   background.

   echo -e '\E[COLOR1;COLOR2mSome text goes here.'

   The "\E[" begins the escape sequence. The semicolon-separated numbers
   "COLOR1" and "COLOR2" specify a foreground and a background color,
   according to the table below. (The order of the numbers does not
   matter, since the foreground and background numbers fall in
   non-overlapping ranges.) The "m" terminates the escape sequence, and
   the text begins immediately after that.

   Note also that single quotes enclose the remainder of the command
   sequence following the echo -e.

   The numbers in the following table work for an rxvt terminal. Results
   may vary for other terminal emulators.

   Table 36-1. Numbers representing colors in Escape Sequences
   Color   Foreground Background
   black   30         40
   red     31         41
   green   32         42
   yellow  33         43
   blue    34         44
   magenta 35         45
   cyan    36         46
   white   37         47
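   Since the foreground codes run 30-37 and the background codes 40-47
   in the same color order, the whole table can be generated with a
   short loop. A sketch (the output appears colorized only on a terminal
   that honors ANSI sequences):

```shell
#!/bin/bash
# Print a sample line for each color in the table.

colors="black red green yellow blue magenta cyan white"

n=0
for name in $colors
do
  fg=$((30 + n))        # Foreground code.
  bg=$((40 + n))        # Background code.
  echo -e "\E[${fg}m fg=$fg \E[0m \E[${bg}m bg=$bg \E[0m $name"
  n=$((n + 1))
done

echo -en "\E[0m"        # Restore normal attributes.
```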

   Example 36-13. Echoing colored text
# Echoing text messages in color.

# Modify this script for your own purposes.
# It's easier than hand-coding color.

black='\E[30;47m'
red='\E[31;47m'
green='\E[32;47m'
yellow='\E[33;47m'
blue='\E[34;47m'
magenta='\E[35;47m'
cyan='\E[36;47m'
white='\E[37;47m'

shopt -s expand_aliases      #  Aliases are off by default in scripts.

alias Reset="tput sgr0"      #  Reset text attributes to normal
                             #+ without clearing screen.

cecho ()                     # Color-echo.
                             # Argument $1 = message
                             # Argument $2 = color
{
local default_msg="No message passed."
                             # Doesn't really need to be a local variable.

message=${1:-$default_msg}   # Defaults to default message.
color=${2:-$black}           # Defaults to black, if not specified.

  echo -e "$color"
  echo "$message"
  Reset                      # Reset to normal.

  return
}


# Now, let's try it out.
# ----------------------------------------------------
cecho "Feeling blue..." $blue
cecho "Magenta looks more like purple." $magenta
cecho "Green with envy." $green
cecho "Seeing red?" $red
cecho "Cyan, more familiarly known as aqua." $cyan
cecho "No color passed (defaults to black)."
       # Missing $color argument.
cecho "\"Empty\" color passed (defaults to black)." ""
       # Empty $color argument.
cecho
       # Missing $message and $color arguments.
cecho "" ""
       # Empty $message and $color arguments.
# ----------------------------------------------------


exit 0

# Exercises:
# ---------
# 1) Add the "bold" attribute to the 'cecho ()' function.
# 2) Add options for colored backgrounds.

   Example 36-14. A "horserace" game
# Very simple horserace simulation.
# Author: Stefano Palmeri
# Used with permission.

#  Goals of the script:
#  playing with escape sequences and terminal colors.
#  Exercise:
#  Edit the script to make it run less randomly,
#+ set up a fake betting shop . . .
#  Um . . . um . . . it's starting to remind me of a movie . . .
#  The script gives each horse a random handicap.
#  The odds are calculated upon horse handicap
#+ and are expressed in European(?) style.
#  E.g., odds=3.75 means that if you bet $1 and win,
#+ you receive $3.75.
#  The script has been tested with a GNU/Linux OS,
#+ using xterm and rxvt, and konsole.
#  On a machine with an AMD 900 MHz processor,
#+ the average race time is 75 seconds.
#  On faster computers the race time would be lower.
#  So, if you want more suspense, reset the USLEEP_ARG variable.
#  Script by Stefano Palmeri.


E_RUNERR=65

# Check if md5sum and bc are installed.
if ! which bc &> /dev/null; then
   echo bc is not installed.
   echo "Can't run . . . "
   exit $E_RUNERR
fi
if ! which md5sum &> /dev/null; then
   echo md5sum is not installed.
   echo "Can't run . . . "
   exit $E_RUNERR
fi

#  Set the following variable to slow down script execution.
#  It will be passed as the argument for usleep (man usleep)
#+ and is expressed in microseconds (500000 = half a second).
USLEEP_ARG=0

#  Clean up the temp directory, restore terminal cursor and
#+ terminal colors -- if script interrupted by Ctl-C.
trap 'echo -en "\E[?25h"; echo -en "\E[0m"; stty echo;\
tput cup 20 0; rm -fr  $HORSE_RACE_TMP_DIR'  TERM EXIT
#  See the chapter on debugging for an explanation of 'trap.'

# Set a unique (paranoid) name for the temp directory the script needs.
HORSE_RACE_TMP_DIR=$HOME/.horserace-`date +%s`-`head -c10 /dev/urandom \
| md5sum | head -c30`

# Create the temp directory and move right in.
mkdir $HORSE_RACE_TMP_DIR
cd $HORSE_RACE_TMP_DIR

#  This function moves the cursor to line $1 column $2 and then prints $3.
#  E.g.: "move_and_echo 5 10 linux" is equivalent to
#+ "tput cup 4 9; echo linux", but with one command instead of two.
#  Note: "tput cup" defines 0 0 the upper left angle of the terminal,
#+ echo defines 1 1 the upper left angle of the terminal.
move_and_echo() {
          echo -ne "\E[${1};${2}H""$3"
}

# Function to generate a pseudo-random number between 1 and 9.
random_1_9 ()
{
    head -c10 /dev/urandom | md5sum | tr -d [a-z] | tr -d 0 | cut -c1
}

#  Two functions that simulate "movement," when drawing the horses.
draw_horse_one() {
               echo -n " "//$MOVE_HORSE//
}
draw_horse_two(){
              echo -n " "\\\\$MOVE_HORSE\\\\
}

# Define current terminal dimension.
N_COLS=`tput cols`
N_LINES=`tput lines`

# Need at least a 20-LINES X 80-COLUMNS terminal. Check it.
if [ $N_COLS -lt 80 ] || [ $N_LINES -lt 20 ]; then
   echo "`basename $0` needs an 80-cols X 20-lines terminal."
   echo "Your terminal is ${N_COLS}-cols X ${N_LINES}-lines."
   exit $E_RUNERR
fi

# Start drawing the race field.

# Need a string of 80 chars. See below.
BLANK80=`seq -s "" 100 | head -c80`


# Set foreground and background colors to white.
echo -ne '\E[37;47m'

# Move the cursor on the upper left angle of the terminal.
tput cup 0 0

# Draw six white lines.
for n in `seq 5`; do
      echo $BLANK80   # Use the 80 chars string to colorize the terminal.
done

# Sets foreground color to black.
echo -ne '\E[30m'

move_and_echo 3 1 "START  1"
move_and_echo 3 75 FINISH
move_and_echo 1 5 "|"
move_and_echo 1 80 "|"
move_and_echo 2 5 "|"
move_and_echo 2 80 "|"
move_and_echo 4 5 "|  2"
move_and_echo 4 80 "|"
move_and_echo 5 5 "V  3"
move_and_echo 5 80 "V"

# Set foreground color to red.
echo -ne '\E[31m'

# Some ASCII art.
move_and_echo 1 8 "..@@@..@@@@@...@@@@@.@...@..@@@@..."
move_and_echo 2 8 ".@...@...@.......@...@...@.@......."
move_and_echo 3 8 ".@@@@@...@.......@...@@@@@.@@@@...."
move_and_echo 4 8 ".@...@...@.......@...@...@.@......."
move_and_echo 5 8 ".@...@...@.......@...@...@..@@@@..."
move_and_echo 1 43 "@@@@...@@@...@@@@..@@@@..@@@@."
move_and_echo 2 43 "@...@.@...@.@.....@.....@....."
move_and_echo 3 43 "@@@@..@@@@@.@.....@@@@...@@@.."
move_and_echo 4 43 "@..@..@...@.@.....@.........@."
move_and_echo 5 43 "@...@.@...@..@@@@..@@@@.@@@@.."

# Set foreground and background colors to green.
echo -ne '\E[32;42m'

# Draw  eleven green lines.
tput cup 5 0
for n in `seq 11`; do
      echo $BLANK80
done

# Set foreground color to black.
echo -ne '\E[30m'
tput cup 5 0

# Draw the fences.
echo "++++++++++++++++++++++++++++++++++++++\
++++++++++++++++++++++++++++++++++++++++++"

tput cup 15 0
echo "++++++++++++++++++++++++++++++++++++++\
++++++++++++++++++++++++++++++++++++++++++"

# Set foreground and background colors to white.
echo -ne '\E[37;47m'

# Draw three white lines.
for n in `seq 3`; do
      echo $BLANK80
done

# Set foreground color to black.
echo -ne '\E[30m'

# Create 9 files to store handicaps.
for n in `seq 10 7 68`; do
      touch $n
done

# Set the first type of "horse" the script will draw.
HORSE_TYPE=2

#  Create position-file and odds-file for every "horse".
#+ In these files, store the current position of the horse,
#+ the type and the odds.
for HN in `seq 9`; do
      touch horse_${HN}_position
      touch odds_${HN}
      echo \-1 > horse_${HN}_position
      echo $HORSE_TYPE >>  horse_${HN}_position
      # Define a random handicap for horse.
      HANDICAP=`random_1_9`
      # Check if the random_1_9 function returned a good value.
      while ! echo $HANDICAP | grep [1-9] &> /dev/null; do
                HANDICAP=`random_1_9`
      done
      # Define last handicap position for horse.
      LHP=`expr $HANDICAP \* 7 + 3`
      for FILE in `seq 10 7 $LHP`; do
            echo $HN >> $FILE
      done

      # Calculate odds.
      case $HANDICAP in
              1) ODDS=`echo $HANDICAP \* 0.25 + 1.25 | bc`
                                 echo $ODDS > odds_${HN}
              ;;
              2 | 3) ODDS=`echo $HANDICAP \* 0.40 + 1.25 | bc`
                                       echo $ODDS > odds_${HN}
              ;;
              4 | 5 | 6) ODDS=`echo $HANDICAP \* 0.55 + 1.25 | bc`
                                             echo $ODDS > odds_${HN}
              ;;
              7 | 8) ODDS=`echo $HANDICAP \* 0.75 + 1.25 | bc`
                                       echo $ODDS > odds_${HN}
              ;;
              9) ODDS=`echo $HANDICAP \* 0.90 + 1.25 | bc`
                                  echo $ODDS > odds_${HN}
              ;;
      esac
done


# Print odds.
print_odds() {
tput cup 6 0
echo -ne '\E[30;42m'
for HN in `seq 9`; do
      echo "#$HN odds->" `cat odds_${HN}`
done
}

# Draw the horses at starting line.
draw_horses() {
tput cup 6 0
echo -ne '\E[30;42m'
for HN in `seq 9`; do
      echo /\\$HN/\\"                               "
done
}


draw_horses
print_odds

echo -ne '\E[47m'
# Wait for an enter key press to start the race.
# The escape sequence '\E[?25l' disables the cursor.
tput cup 17 0
echo -e '\E[?25l'Press [enter] key to start the race...
read -s

#  Disable normal echoing in the terminal.
#  This avoids key presses that might "contaminate" the screen
#+ during the race.
stty -echo

# --------------------------------------------------------
# Start the race.

echo -ne '\E[37;47m'
move_and_echo 18 1 $BLANK80
echo -ne '\E[30m'
move_and_echo 18 1 Starting...
sleep 1

# Set the column of the finish line.
WINNING_POS=74

# Define the time the race started.
START_TIME=`date +%s`

# COL variable needed by following "while" construct.
COL=0

while [ $COL -lt $WINNING_POS ]; do

          MOVE_HORSE=0

          # Check if the random_1_9 function has returned a good value.
          while ! echo $MOVE_HORSE | grep [1-9] &> /dev/null; do
                MOVE_HORSE=`random_1_9`
          done

          # Define old type and position of the "randomized horse".
          HORSE_TYPE=`cat  horse_${MOVE_HORSE}_position | tail -n 1`
          COL=$(expr `cat  horse_${MOVE_HORSE}_position | head -n 1`)

          # Check if the current position is a handicap position.
          if seq 10 7 68 | grep -w $COL &> /dev/null; then
                if grep -w $MOVE_HORSE $COL &> /dev/null; then
                      ADD_POS=0
                      grep -v -w  $MOVE_HORSE $COL > ${COL}_new
                      rm -f $COL
                      mv -f ${COL}_new $COL
                else ADD_POS=1
                fi
          else ADD_POS=1
          fi
          COL=`expr $COL + $ADD_POS`
          echo $COL >  horse_${MOVE_HORSE}_position  # Store new position.

          # Choose the type of horse to draw.
          case $HORSE_TYPE in
                1) HORSE_TYPE=2; DRAW_HORSE=draw_horse_two
                ;;
                2) HORSE_TYPE=1; DRAW_HORSE=draw_horse_one
                ;;
          esac
          echo $HORSE_TYPE >>  horse_${MOVE_HORSE}_position
          # Store current type.

          # Set foreground color to black and background to green.
          echo -ne '\E[30;42m'

          # Move the cursor to new horse position.
          tput cup `expr $MOVE_HORSE + 5` \
          `cat  horse_${MOVE_HORSE}_position | head -n 1`

          # Draw the horse.
          $DRAW_HORSE
          usleep $USLEEP_ARG

           # When all horses have gone beyond field line 15, reprint odds.
           touch fieldline15
           if [ $COL = 15 ]; then
             echo $MOVE_HORSE >> fieldline15
           fi
           if [ `wc -l fieldline15 | cut -f1 -d " "` = 9 ]; then
               print_odds
               : > fieldline15
           fi

          # Define the leading horse.
          HIGHEST_POS=`cat *position | sort -n | tail -1`

          # Set background color to white.
          echo -ne '\E[47m'
          tput cup 17 0
          echo -n Current leader: `grep -w $HIGHEST_POS *position | cut -c7`\
          "                              "

done

# Define the time the race finished.
FINISH_TIME=`date +%s`

# Set background color to green and enable blinking text.
echo -ne '\E[30;42m'
echo -en '\E[5m'

# Make the winning horse blink.
tput cup `expr $MOVE_HORSE + 5` \
`cat  horse_${MOVE_HORSE}_position | head -n 1`
$DRAW_HORSE

# Disable blinking text.
echo -en '\E[25m'

# Set foreground and background color to white.
echo -ne '\E[37;47m'
move_and_echo 18 1 $BLANK80

# Set foreground color to black.
echo -ne '\E[30m'

# Make winner blink.
tput cup 17 0
echo -e "\E[5mWINNER: $MOVE_HORSE\E[25m""  Odds: `cat odds_${MOVE_HORSE}`"\
"  Race time: `expr $FINISH_TIME - $START_TIME` secs"

# Restore cursor and old colors.
echo -en "\E[?25h"
echo -en "\E[0m"

# Restore echoing.
stty echo

# Remove race temp directory.
rm -rf $HORSE_RACE_TMP_DIR

tput cup 19 0

exit 0

   See also Example A-21, Example A-44, Example A-52, and Example A-40.


   There is, however, a major problem with all this. ANSI escape
   sequences are emphatically non-portable. What works fine on some
   terminal emulators (or the console) may work differently, or not at
   all, on others. A "colorized" script that looks stunning on the
   script author's machine may produce unreadable output on someone
   else's. This somewhat compromises the usefulness of colorizing
   scripts, and possibly relegates this technique to the status of a
   gimmick. Colorized scripts are probably inappropriate in a commercial
   setting, i.e., your supervisor might disapprove.

   Alister's ansi-color utility (based on Moshe Jacobson's color
   utility) considerably simplifies using ANSI escape sequences. It
   substitutes a clean and logical syntax for the clumsy constructs
   just discussed.

   Henry/teikedvl has likewise created a utility to simplify creation
   of colorized scripts.

36.6. Optimizations

   Most shell scripts are quick 'n dirty solutions to non-complex
   problems. As such, optimizing them for speed is not much of an issue.
   Consider the case, though, where a script carries out an important
   task, does it well, but runs too slowly. Rewriting it in a compiled
   language may not be a palatable option. The simplest fix would be to
   rewrite the parts of the script that slow it down. Is it possible to
   apply principles of code optimization even to a lowly shell script?

   Check the loops in the script. Time consumed by repetitive operations
   adds up quickly. If at all possible, remove time-consuming operations
   from within loops.

   Use builtin commands in preference to system commands. Builtins
   execute faster and usually do not launch a subshell when invoked.

   Avoid unnecessary commands, particularly in a pipe.
cat "$file" | grep "$word"

grep "$word" "$file"

#  The above command-lines have an identical effect,
#+ but the second runs faster since it launches one fewer subprocess.

   The cat command seems especially prone to overuse in scripts.

   Use the time and times tools to profile computation-intensive
   commands. Consider rewriting time-critical code sections in C, or
   even in assembler.
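   Both points -- preferring builtins and profiling with time -- show up
   in a simple summation loop. The sketch below times shell arithmetic
   against the external expr command; the iteration count is arbitrary,
   and the exact timings will vary from machine to machine:

```shell
#!/bin/bash
# Builtin arithmetic vs. the external 'expr' command.

sum=0
time for ((i=0; i<1000; i++))
do
  sum=$((sum + i))        # Builtin arithmetic: no subprocess.
done

sum=0
time for ((i=0; i<1000; i++))
do
  sum=`expr $sum + $i`    # Forks an external command on every pass.
done

echo "sum = $sum"         # 499500 either way.
```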

   Try to minimize file I/O. Bash is not particularly efficient at
   handling files, so consider using more appropriate tools for this
   within the script, such as awk or Perl.

   Write your scripts in a modular and coherent form, [121] so they can
   be reorganized and tightened up as necessary. Some of the
   optimization techniques applicable to high-level languages may work
   for scripts, but others, such as loop unrolling, are mostly
   irrelevant. Above all, use common sense.

   For an excellent demonstration of how optimization can dramatically
   reduce the execution time of a script, see Example 16-47.

36.7. Assorted Tips

36.7.1. Ideas for more powerful scripts

     * You have a problem that you want to solve by writing a Bash
       script. Unfortunately, you don't know quite where to start. One
       method is to plunge right in and code those parts of the script
       that come easily, and write the hard parts as pseudo-code.


ARGCOUNT=1                     # Need name as argument.

if [ number-of-arguments is-not-equal-to "$ARGCOUNT" ]
#    ^^^^^^^^^^^^^^^^^^^ ^^^^^^^^^^^^^^^
#  Can't figure out how to code this . . .
#+ . . . so write it in pseudo-code.

then
  echo "Usage: name-of-script name"
  #            ^^^^^^^^^^^^^^     More pseudo-code.
  exit 1
fi

. . .

exit 0

# Later on, substitute working code for the pseudo-code.

# Line 6 becomes:
if [ $# -ne "$ARGCOUNT" ]

# Line 12 becomes:
  echo "Usage: `basename $0` name"

       For an example of using pseudo-code, see the Square Root
       exercise.
     * To keep a record of which user scripts have run during a
       particular session or over a number of sessions, add the
       following lines to each script you want to keep track of. This
       will keep a continuing file record of the script names and
       invocation times.

# Append (>>) following to end of each script tracked.

whoami>> $SAVE_FILE    # User invoking the script.
echo $0>> $SAVE_FILE   # Script name.
date>> $SAVE_FILE      # Date and time.
echo>> $SAVE_FILE      # Blank line as separator.

#  Of course, SAVE_FILE defined and exported as environmental variable in ~/.bashrc
#+ (something like ~/.scripts-run)

     * The >> operator appends lines to a file. What if you wish to
       prepend a line to an existing file, that is, to paste it in at
       the beginning?

title="***This is the title line of data text file***"

echo $title | cat - $file >$file.new
# "cat -" concatenates stdout to $file.
#  End result is
#+ to write a new file with $title appended at *beginning*.

       This is a simplified variant of the Example 19-13 script given
       earlier. And, of course, sed can also do this.
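   A self-contained version of the idiom, using a throwaway scratch file
   (the filename is illustrative):

```shell
#!/bin/bash
# Prepend a title line to a data file with "cat -".

file=./prepend-demo.txt                      # Hypothetical scratch file.
printf '%s\n' "line one" "line two" > "$file"

title="***This is the title line of data text file***"

echo "$title" | cat - "$file" > "$file.new"  # stdin (the title), then $file.
mv "$file.new" "$file"                       # Replace the original.

first_line=`head -n 1 "$file"`
echo "$first_line"                           # The title is now line 1.

rm -f "$file"                                # Clean up.
```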
     * A shell script may act as an embedded command inside another
       shell script, a Tcl or wish script, or even a Makefile. It can be
       invoked as an external shell command in a C program using the
       system() call, i.e., system("script_name");.
     * Setting a variable to the contents of an embedded sed or awk
       script increases the readability of the surrounding shell
       wrapper. See Example A-1 and Example 15-20.
     * Put together files containing your favorite and most useful
       definitions and functions. As necessary, "include" one or more of
       these "library files" in scripts with either the dot (.) or
       source command.

# ------ -------

# Note:
# No "#!" here.
# No "live code" either.

# Useful variable definitions

ROOT_UID=0             # Root has $UID 0.
E_NOTROOT=101          # Not root user error.
MAXRETVAL=255          # Maximum (positive) return value of a function.

# Functions

Usage ()               # "Usage:" message.
{
  if [ -z "$1" ]       # No arg passed.
  then
    msg=filename
  else
    msg=$1
  fi

  echo "Usage: `basename $0` "$msg""
}

Check_if_root ()       # Check if root running script.
{                      # From "" example.
  if [ "$UID" -ne "$ROOT_UID" ]
  then
    echo "Must be root to run this script."
    exit $E_NOTROOT
  fi
}

CreateTempfileName ()  # Creates a "unique" temp filename.
{                      # From "" example.
  prefix=temp
  suffix=`eval date +%s`
  Tempfilename=$prefix.$suffix
}

isalpha2 ()            # Tests whether *entire string* is alphabetic.
{                      # From "" example.
  [ $# -eq 1 ] || return $FAILURE

  case $1 in
  *[!a-zA-Z]*|"") return $FAILURE;;
  *) return $SUCCESS;;
  esac                 # Thanks, S.C.
}

abs ()                           # Absolute value.
{                                # Caution: Max return value = 255.
  E_ARGERR=-999999

  if [ -z "$1" ]                 # Need arg passed.
  then
    return $E_ARGERR             # Obvious error value returned.
  fi

  if [ "$1" -ge 0 ]              # If non-negative,
  then                           #
    absval=$1                    # stays as-is.
  else                           # Otherwise,
    let "absval = (( 0 - $1 ))"  # change sign.
  fi

  return $absval
}

tolower ()             #  Converts string(s) passed as argument(s)
{                      #+ to lowercase.

  if [ -z "$1" ]       #  If no argument(s) passed,
  then                 #+ send error message
    echo "(null)"      #+ (C-style void-pointer error message)
    return             #+ and return from function.
  fi

  echo "$@" | tr A-Z a-z
  # Translate all passed arguments ($@).

  return
}

# Use command substitution to set a variable to function output.
# For example:
#    oldvar="A seT of miXed-caSe LEtTerS"
#    newvar=`tolower "$oldvar"`
#    echo "$newvar"    # a set of mixed-case letters
# Exercise: Rewrite this function to change lowercase passed argument(s)
#           to uppercase ... toupper()  [easy].

     * Use special-purpose comment headers to increase clarity and
       legibility in scripts.

## Caution.
rm -rf *.zzy   ##  The "-rf" options to "rm" are very dangerous,
               ##+ especially with wild cards.

#+ Line continuation.
#  This is line 1
#+ of a multi-line comment,
#+ and this is the final line.

#* Note.

#o List item.

#> Another point of view.
while [ "$var1" != "end" ]    #> while test "$var1" != "end"

     * Dotan Barak contributes template code for a progress bar in a
       script.

   Example 36-15. A Progress Bar


# Author: Dotan Barak (very minor revisions by ABS Guide author).
# Used in ABS Guide with permission (thanks!).

BAR_WIDTH=50
BAR_CHAR_START="["
BAR_CHAR_END="]"
BAR_CHAR_EMPTY="."
BAR_CHAR_FULL="="
BRACKET_CHARS=2
LIMIT=100

print_progress_bar()
{
        # Calculate how many characters will be full.
        let "full_limit = ((($1 - $BRACKET_CHARS) * $2) / $LIMIT)"

        # Calculate how many characters will be empty.
        let "empty_limit = ($1 - $BRACKET_CHARS) - ${full_limit}"

        # Prepare the bar.
        bar_line="${BAR_CHAR_START}"
        for ((j=0; j<full_limit; j++)); do
                bar_line="${bar_line}${BAR_CHAR_FULL}"
        done

        for ((j=0; j<empty_limit; j++)); do
                bar_line="${bar_line}${BAR_CHAR_EMPTY}"
        done

        bar_line="${bar_line}${BAR_CHAR_END}"

        printf "%3d%% %s" $2 ${bar_line}
}

# Here is a sample of code that uses it.
MAX_PERCENT=100
for ((i=0; i<=MAX_PERCENT; i++)); do
        usleep 10000
        # ... Or run some other commands ...
        print_progress_bar ${BAR_WIDTH} ${i}
        echo -en "\r"
done

echo ""

exit 0


     * A particularly clever use of if-test constructs is for comment
       blocks.

COMMENT_BLOCK=
#  Try setting the above variable to some value
#+ for an unpleasant surprise.

if [ $COMMENT_BLOCK ]; then

Comment block --
This is a comment line.
This is another comment line.
This is yet another comment line.

echo "This will not echo."

Comment blocks are error-free! Whee!

fi

echo "No more comments, please."

exit 0

       Compare this with using here documents to comment out code
       blocks.
     * Using the $? exit status variable, a script may test if a
       parameter contains only digits, so it can be treated as an
       integer.



SUCCESS=0
E_BADINPUT=85

test "$1" -ne 0 -o "$1" -eq 0 2>/dev/null
# An integer is either equal to 0 or not equal to 0.
# 2>/dev/null suppresses error message.

if [ $? -ne "$SUCCESS" ]
then
  echo "Usage: `basename $0` integer-input"
  exit $E_BADINPUT
fi

let "sum = $1 + 25"             # Would give error if $1 not integer.
echo "Sum = $sum"

# Any variable, not just a command-line parameter, can be tested this way.

exit 0
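       An alternative sketch of the same digit-only test uses the =~
       regex operator (Bash version 3 and later); the is_integer
       function name here is ours, not from the original example.

```shell
#!/bin/bash
#  Sketch: testing for integer input with a regular expression,
#+ instead of relying on the exit status of an arithmetic test.

is_integer ()
{
  [[ "$1" =~ ^-?[0-9]+$ ]]   # Optional leading minus, then digits only.
}

is_integer 42    && echo "42 is an integer."
is_integer -7    && echo "-7 is an integer."
is_integer 3.14  || echo "3.14 is not an integer."
is_integer abc   || echo "abc is not an integer."
```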

     * The 0 - 255 range for function return values is a severe
       limitation. Global variables and other workarounds are often
       problematic. An alternative method for a function to communicate
       a value back to the main body of the script is to have the
       function write to stdout (usually with echo) the "return value,"
       and assign this to a variable. This is actually a variant of
       command substitution.
       Example 36-16. Return value trickery


multiply ()                     # Multiplies params passed.
{                               # Will accept a variable number of args.

  local product=1

  until [ -z "$1" ]             # Until uses up arguments passed...
  do
    let "product *= $1"
    shift
  done

  echo $product                 #  Will not echo to stdout,
}                               #+ since this will be assigned to a variable.

mult1=15383; mult2=25211
val1=`multiply $mult1 $mult2`
echo "$mult1 X $mult2 = $val1"
                                # 387820813

mult1=25; mult2=5; mult3=20
val2=`multiply $mult1 $mult2 $mult3`
echo "$mult1 X $mult2 X $mult3 = $val2"
                                # 2500

mult1=188; mult2=37; mult3=25; mult4=47
val3=`multiply $mult1 $mult2 $mult3 $mult4`
echo "$mult1 X $mult2 X $mult3 X $mult4 = $val3"
                                # 8173300

exit 0

       The same technique also works for alphanumeric strings. This
       means that a function can "return" a non-numeric value.

capitalize_ichar ()          #  Capitalizes initial character
{                            #+ of argument string(s) passed.

  string0="$@"               # Accepts multiple arguments.

  firstchar=${string0:0:1}   # First character.
  string1=${string0:1}       # Rest of string(s).

  FirstChar=`echo "$firstchar" | tr a-z A-Z`
                             # Capitalize first character.

  echo "$FirstChar$string1"  # Output to stdout.

}

newstring=`capitalize_ichar "every sentence should start with a capital letter."`
echo "$newstring"          # Every sentence should start with a capital letter.

       It is even possible for a function to "return" multiple values
       with this method.
       Example 36-17. Even more return value trickery

# A function may "return" more than one value.

sum_and_product ()   # Calculates both sum and product of passed args.
{
  echo $(( $1 + $2 )) $(( $1 * $2 ))
# Echoes to stdout each calculated value, separated by space.
}

echo "Enter first number "
read first

echo "Enter second number "
read second

retval=`sum_and_product $first $second`      # Assigns output of function.
sum=`echo "$retval" | awk '{print $1}'`      # Assigns first field.
product=`echo "$retval" | awk '{print $2}'`  # Assigns second field.

echo "$first + $second = $sum"
echo "$first * $second = $product"

exit 0


   There can be only one echo statement in the function for this to
   work. If you alter the previous example:
sum_and_product ()
{
  echo "This is the sum_and_product function." # This messes things up!
  echo $(( $1 + $2 )) $(( $1 * $2 ))
}
retval=`sum_and_product $first $second`      # Assigns output of function.
# Now, this will not work correctly.

     * Next in our bag of tricks are techniques for passing an array to
       a function, then "returning" an array back to the main body of
       the script.
       Passing an array involves loading the space-separated elements of
       the array into a variable with command substitution. Getting an
       array back as the "return value" from a function uses the
       previously mentioned stratagem of echoing the array in the
       function, then invoking command substitution and the ( ... )
       operator to assign it to an array.
       Example 36-18. Passing and returning arrays

# Passing an array to a function and ...
#                   "returning" an array from a function

Pass_Array ()
{
  local passed_array   # Local variable!
  passed_array=( `echo "$1"` )
  echo "${passed_array[@]}"
  #  List all the elements of the new array
  #+ declared and set within the function.
}

original_array=( element1 element2 element3 element4 element5 )

echo "original_array = ${original_array[@]}"
#                      List all elements of original array.

# This is the trick that permits passing an array to a function.
# **********************************
argument=`echo ${original_array[@]}`
# **********************************
#  Pack a variable
#+ with all the space-separated elements of the original array.
# Attempting to just pass the array itself will not work.

# This is the trick that allows grabbing an array as a "return value".
# *****************************************
returned_array=( `Pass_Array "$argument"` )
# *****************************************
# Assign 'echoed' output of function to array variable.

echo "returned_array = ${returned_array[@]}"

echo "============================================================="

#  Now, try it again,
#+ attempting to access (list) the array from outside the function.
Pass_Array "$argument"

# The function itself lists the array, but ...
#+ accessing the array from outside the function is forbidden.
echo "Passed array (within function) = ${passed_array[@]}"
# NULL VALUE since the array is a variable local to the function.



# And here is an even more explicit example:

ret_array ()
{
  for element in {11..20}
  do
    echo "$element "   #  Echo individual elements
  done                 #+ of what will be assembled into an array.
}

arr=( $(ret_array) )   #  Assemble into array.

echo "Capturing array \"arr\" from function ret_array () ..."
echo "Third element of array \"arr\" is ${arr[2]}."   # 13  (zero-indexed)
echo -n "Entire array is: "
echo ${arr[@]}                # 11 12 13 14 15 16 17 18 19 20


exit 0

       For a more elaborate example of passing arrays to functions, see
       Example A-10.
     * Using the double-parentheses construct, it is possible to use
       C-style syntax for setting and incrementing/decrementing
       variables and in for and while loops. See Example 11-12 and
       Example 11-17.
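       A minimal sketch of this C-style syntax, combining assignment,
       a for loop, and a while loop inside (( ... )):

```shell
# C-style arithmetic with the double-parentheses construct.

(( count = 0 ))                  # C-style assignment; no "$" needed.

for (( i = 1; i <= 5; i++ ))     # C-style for loop.
do
  (( count += i ))               # Accumulate 1+2+3+4+5 = 15.
done

while (( count > 10 ))           # C-style while condition.
do
  (( count -= 10 ))
done

echo "count = $count"            # count = 5
```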
     * Setting the path and umask at the beginning of a script makes it
       more "portable" -- more likely to run on a "foreign" machine
       whose user may have bollixed up the $PATH and umask.

PATH=/bin:/usr/bin:/usr/local/bin ; export PATH
umask 022   # Files that the script creates will have 644 permission
            #+ (directories, 755).

# Thanks to Ian D. Allen, for this tip.

     * A useful scripting technique is to repeatedly feed the output of
       a filter (by piping) back to the same filter, but with a
       different set of arguments and/or options. Especially suitable
       for this are tr and grep.

# From "" example.

wlist=`strings "$1" | tr A-Z a-z | tr '[:space:]' Z | \
tr -cs '[:alpha:]' Z | tr -s '\173-\377' Z | tr Z ' '`

       Example 36-19. Fun with anagrams

# Playing games with anagrams.

# Find anagrams of...
LETTERSET=etaoinshrdlu
FILTER='.......'       # How many letters minimum?
#       1234567

anagram "$LETTERSET" | # Find all anagrams of the letterset...
grep "$FILTER" |       # With at least 7 letters,
grep '^is' |           # starting with 'is'
grep -v 's$' |         # no plurals
grep -v 'ed$'          # no past tense verbs
# Possible to add many combinations of conditions and filters.

#  Uses "anagram" utility
#+ that is part of the author's "yawl" word list package.

exit 0                 # End of code.


#  Exercises:
#  ---------
#  Modify this script to take the LETTERSET as a command-line parameter.
#  Parameterize the filters in lines 11 - 13 (as with $FILTER),
#+ so that they can be specified by passing arguments to a function.

#  For a slightly different approach to anagramming,
#+ see the script.

       See also Example 29-4, Example 16-25, and Example A-9.
     * Use "anonymous here documents" to comment out blocks of code, to
       save having to individually comment out each line with a #. See
       Example 19-11.
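       In outline, the technique looks like this (a minimal sketch; the
       ran flag is ours, added only to show that the block never
       executes):

```shell
#  An anonymous here document fed to the ":" no-op builtin.
#+ Everything between the limit strings is swallowed unexecuted.

ran=no

: << 'COMMENTBLOCK'
ran=yes
echo "This will not echo."
Arbitrary free-form text is fine here, too.
COMMENTBLOCK

echo "ran = $ran"    # ran = no  -- the commented-out block never ran.
```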
     * Running a script on a machine that relies on a command that might
       not be installed is dangerous. Use whatis to avoid potential
       problems with this.

CMD=command1                 # First choice.
PlanB=command2               # Fallback option.

command_test=$(whatis "$CMD" | grep 'nothing appropriate')
#  If 'command1' is not found on the system, 'whatis' will return
#+ "command1: nothing appropriate."
#  A safer alternative is:
#     command_test=$(whereis "$CMD" | grep \/)
#  But then the sense of the following test would have to be reversed,
#+ since the $command_test variable holds content only if
#+ the $CMD exists on the system.
#     (Thanks, bojster.)

if [[ -z "$command_test" ]]  # Check whether command present.
then
  $CMD option1 option2       #  Run command1 with options.
else                         #  Otherwise,
  $PlanB                     #+ run command2.
fi

     * An if-grep test may not return expected results in an error case,
       when text is output to stderr, rather than stdout.

if ls -l nonexistent_filename | grep -q 'No such file or directory'
  then echo "File \"nonexistent_filename\" does not exist."
fi

       Redirecting stderr to stdout fixes this.

if ls -l nonexistent_filename 2>&1 | grep -q 'No such file or directory'
#                             ^^^^
  then echo "File \"nonexistent_filename\" does not exist."
fi

# Thanks, Chris Martin, for pointing this out.

     * If you absolutely must access a subshell variable outside the
       subshell, here's a way to do it.

TMPFILE=tmpfile                  # Create a temp file to store the variable.

(   # Inside the subshell ...
inner_variable=Inner
echo $inner_variable
echo $inner_variable >>$TMPFILE  # Append to temp file.
)
    # Outside the subshell ...

echo; echo "-----"; echo
echo $inner_variable             # Null, as expected.
echo "-----"; echo

# Now ...
read inner_variable <$TMPFILE    # Read back shell variable.
rm -f "$TMPFILE"                 # Get rid of temp file.
echo "$inner_variable"           # It's an ugly kludge, but it works.
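       Where only a single value is needed, plain command substitution
       on the subshell's output avoids the temp file entirely. A
       minimal sketch, not from the original example:

```shell
#  Capture a subshell's "inner" value via command substitution,
#+ instead of a temp file.

inner_variable=$(
  # Inside the subshell ...
  inner_variable=Inner
  echo "$inner_variable"    # Whatever is echoed becomes the "return value."
)

echo "$inner_variable"      # Inner -- no temp file needed.
```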

     * The run-parts command is handy for running a set of command
       scripts in a particular sequence, especially in combination with
       cron or at.
     * For doing multiple revisions on a complex script, use the rcs
       Revision Control System package.
       Among the benefits of this are automatically updated ID header
       tags. The co command in rcs does a parameter replacement of
       certain reserved key words, for example, replacing # $Id$ in a
       script with something like:

# $Id:,v 1.1 2004/10/16 02:43:05 bozo Exp $

36.7.2. Widgets

   It would be nice to be able to invoke X-Windows widgets from a shell
   script. There happen to exist several packages that purport to do so,
   namely Xscript, Xmenu, and widtools. The first two of these no longer
   seem to be maintained. Fortunately, it is still possible to obtain
   widtools (widtools-2.0.tgz).


   The widtools (widget tools) package requires the XForms library to be
   installed. Additionally, the Makefile needs some judicious editing
   before the package will build on a typical Linux system. Finally,
   three of the six widgets offered do not work (and, in fact, segfault).

   The dialog family of tools offers a method of calling "dialog"
   widgets from a shell script. The original dialog utility works in a
   text console, but its successors, gdialog, Xdialog, and kdialog use
   X-Windows-based widget sets.

   Example 36-20. Widgets invoked from a shell script
# Using 'gdialog' widgets.

# Must have 'gdialog' installed on your system to run this script.
# Or, you can replace all instances of 'gdialog' below with 'kdialog' ...
# Version 1.1 (corrected 04/05/05)

# This script was inspired by the following article.
#     "Scripting for X Productivity," by Marco Fioretti,
#      LINUX JOURNAL, Issue 113, September 2003, pp. 86-9.
# Thank you, all you good people at LJ.

# Input error in dialog box.
E_INPUT=65

# Dimensions of display, input widgets.
HEIGHT=50
WIDTH=60

# Output file name (constructed out of script name).
OUTFILE=$0.output

# Display this script in a text widget.
gdialog --title "Displaying: $0" --textbox $0 $HEIGHT $WIDTH

# Now, we'll try saving input in a file.
echo -n "VARIABLE=" > $OUTFILE
gdialog --title "User Input" --inputbox "Enter variable, please:" \
$HEIGHT $WIDTH 2>> $OUTFILE

if [ "$?" -eq 0 ]
# It's good practice to check exit status.
then
  echo "Executed \"dialog box\" without errors."
else
  echo "Error(s) in \"dialog box\" execution."
        # Or, clicked on "Cancel", instead of "OK" button.
  rm $OUTFILE
  exit $E_INPUT
fi

# Now, we'll retrieve and display the saved variable.
. $OUTFILE   # 'Source' the saved file.
echo "The variable input in the \"input box\" was: "$VARIABLE""

rm $OUTFILE  # Clean up by removing the temp file.
             # Some applications may need to retain this file.

exit $?

# Exercise: Rewrite this script using the 'zenity' widget set.

   The xmessage command is a simple method of popping up a message/query
   window. For example:
   xmessage Fatal error in script! -button exit

   The latest entry in the widget sweepstakes is zenity. This utility
   pops up GTK+ dialog widgets-and-windows, and it works very nicely
   within a script.
get_info ()
{
  zenity --entry       #  Pops up query window . . .
                       #+ and prints user entry to stdout.

                       #  Also try the --calendar and --scale options.
}

answer=$( get_info )   #  Capture stdout in $answer variable.

echo "User entered: "$answer""

   For other methods of scripting with widgets, try Tk or wish (Tcl
   derivatives), PerlTk (Perl with Tk extensions), tksh (ksh with Tk
   extensions), XForms4Perl (Perl with XForms extensions), Gtk-Perl
   (Perl with Gtk extensions), or PyQt (Python with Qt extensions).

36.8. Security Issues

36.8.1. Infected Shell Scripts

   A brief warning about script security is indicated. A shell script
   may contain a worm, trojan, or even a virus. For that reason, never
   run as root a script (or permit it to be inserted into the system
   startup scripts in /etc/rc.d) unless you have obtained said script
   from a trusted source or you have carefully analyzed it to make
   certain it does nothing harmful.

   Various researchers at Bell Labs and other sites, including M.
   Douglas McIlroy, Tom Duff, and Fred Cohen have investigated the
   implications of shell script viruses. They conclude that it is all
   too easy for even a novice, a "script kiddie," to write one. [122]

   Here is yet another reason to learn scripting. Being able to look at
   and understand scripts may protect your system from being compromised
   by a rogue script.

36.8.2. Hiding Shell Script Source

   For security purposes, it may be necessary to render a script
   unreadable. If only there were a utility to create a stripped binary
   executable from a script. Francisco Rosales' shc -- generic shell
   script compiler does exactly that.

   Unfortunately, according to an article in the October, 2005 Linux
   Journal, the binary can, in at least some cases, be
   decrypted to recover the original script source. Still, this could be
   a useful method of keeping scripts secure from all but the most
   skilled hackers.

36.8.3. Writing Secure Shell Scripts

   Dan Stromberg suggests the following guidelines for writing
   (relatively) secure shell scripts.

     * Don't put secret data in environment variables.
     * Don't pass secret data in an external command's arguments (pass
       them in via a pipe or redirection instead).
     * Set your $PATH carefully. Don't just trust whatever path you
       inherit from the caller if your script is running as root. In
       fact, whenever you use an environment variable inherited from the
       caller, think about what could happen if the caller put something
       misleading in the variable, e.g., if the caller set $HOME to
       /etc.
36.9. Portability Issues


   It is easier to port a shell than a shell script.

   --Larry Wall

   This book deals specifically with Bash scripting on a GNU/Linux
   system. All the same, users of sh and ksh will find much of value
   here.

   As it happens, many of the various shells and scripting languages
   seem to be converging toward the POSIX 1003.2 standard. Invoking Bash
   with the --posix option or inserting a set -o posix at the head of a
   script causes Bash to conform very closely to this standard. Another
   alternative is to use a #!/bin/sh sha-bang header in the script,
   rather than #!/bin/bash. [123] Note that /bin/sh is a link to
   /bin/bash in Linux and certain other flavors of UNIX, and a script
   invoked this way disables extended Bash functionality.

   Most Bash scripts will run as-is under ksh, and vice-versa, since
   Chet Ramey has been busily porting ksh features to the latest
   versions of Bash.

   On a commercial UNIX machine, scripts using GNU-specific features of
   standard commands may not work. This has become less of a problem in
   the last few years, as the GNU utilities have pretty much displaced
   their proprietary counterparts even on "big-iron" UNIX. Caldera's
   release of the source to many of the original UNIX utilities has
   accelerated the trend.

   Bash has certain features that the traditional Bourne shell lacks.
   Among these are:

     * Certain extended invocation options
     * Command substitution using $( ) notation
     * Brace expansion
     * Certain array operations, and associative arrays
     * The double brackets extended test construct
     * The double-parentheses arithmetic-evaluation construct
     * Certain string manipulation operations
     * Process substitution
     * A Regular Expression matching operator
     * Bash-specific builtins
     * Coprocesses

   See the Bash F.A.Q. for a complete listing.

36.9.1. A Test Suite

   Let us illustrate some of the incompatibilities between Bash and the
   classic Bourne shell. Download and install the "Heirloom Bourne
   Shell" and run the following script, first using Bash, then the
   classic sh.

   Example 36-21. Test Suite
# A partial Bash compatibility test suite.

# Double brackets (test)
String="Double brackets supported?"
echo -n "Double brackets test: "
if [[ "$String" = "Double brackets supported?" ]]
then
  echo "PASS"
else
  echo "FAIL"
fi

# Double brackets and regex matching
String="Regex matching supported?"
echo -n "Regex matching: "
if [[ "$String" =~ R.....matching* ]]
then
  echo "PASS"
else
  echo "FAIL"
fi

# Arrays
Array=( If supports arrays will print PASS )
test_arr=${Array[5]}
echo "Array test: $test_arr"

#  Completing this script is an exercise for the reader.
#  Add to the above similar tests for double parentheses,
#+ brace expansion, $() command substitution, etc.

exit $?

36.10. Shell Scripting Under Windows

   Even users running that other OS can run UNIX-like shell scripts, and
   therefore benefit from many of the lessons of this book. The Cygwin
   package from Cygnus and the MKS utilities from Mortice Kern
   Associates add shell scripting capabilities to Windows.

   There have been intimations that a future release of Windows will
   contain Bash-like command-line scripting capabilities, but that
   remains to be seen.

Chapter 37. Bash, versions 2, 3, and 4

37.1. Bash, version 2

   The current version of Bash, the one you have running on your
   machine, is most likely version 2.xx.yy, 3.xx.yy, or 4.xx.yy.
bash$ echo $BASH_VERSION

   The version 2 update of the classic Bash scripting language added
   array variables, string and parameter expansion, and a better method
   of indirect variable references, among other features.

   Example 37-1. String expansion

# String expansion.
# Introduced with version 2 of Bash.

#  Strings of the form $'xxx'
#+ have the standard escaped characters interpreted.

echo $'Ringing bell 3 times \a \a \a'
     # May only ring once with certain terminals.
     # Or ...
     # May not ring at all, depending on terminal settings.
echo $'Three form feeds \f \f \f'
echo $'10 newlines \n\n\n\n\n\n\n\n\n\n'
echo $'\102\141\163\150'
     #   B   a   s   h
     # Octal equivalent of characters.


   Example 37-2. Indirect variable references - the new way

# Indirect variable referencing.
# This has a few of the attributes of references in C++.


a=letter_of_alphabet
letter_of_alphabet=z

echo "a = $a"           # Direct reference.

echo "Now a = ${!a}"    # Indirect reference.
#  The ${!variable} notation is more intuitive than the old
#+ eval var1=\$$var2


t=table_cell_3
table_cell_3=24

echo "t = ${!t}"                      # t = 24

table_cell_3=387
echo "Value of t changed to ${!t}"    # 387
# No 'eval' necessary.

#  This is useful for referencing members of an array or table,
#+ or for simulating a multi-dimensional array.
#  An indexing option (analogous to pointer arithmetic)
#+ would have been nice. Sigh.

exit 0

# See also, example.

   Example 37-3. Simple database application, using indirect variable
   referencing
# Simple database / table-lookup application.

# ============================================================== #
# Data

B1723_value=470                                   # Ohms
B1723_powerdissip=.25                             # Watts
B1723_colorcode="yellow-violet-brown"             # Color bands
B1723_loc=173                                     # Where they are
B1723_inventory=78                                # How many

B1724_value=1000
B1724_powerdissip=.25
B1724_colorcode="brown-black-red"
B1724_loc=24N
B1724_inventory=243

B1725_value=10000
B1725_powerdissip=.25
B1725_colorcode="brown-black-orange"
B1725_loc=24N
B1725_inventory=89

# ============================================================== #


PS3='Enter catalog number: '


select catalog_number in "B1723" "B1724" "B1725"
do
  Inv=${catalog_number}_inventory
  Val=${catalog_number}_value
  Pdissip=${catalog_number}_powerdissip
  Loc=${catalog_number}_loc
  Ccode=${catalog_number}_colorcode

  echo "Catalog number $catalog_number:"
  # Now, retrieve value, using indirect referencing.
  echo "There are ${!Inv} of  [${!Val} ohm / ${!Pdissip} watt]\
  resistors in stock."  #        ^             ^
  # As of Bash 4.2, you can replace "ohm" with \u2126 (using echo -e).
  echo "These are located in bin # ${!Loc}."
  echo "Their color code is \"${!Ccode}\"."

  break
done


echo; echo

# Exercises:
# ---------
# 1) Rewrite this script to read its data from an external file.
# 2) Rewrite this script to use arrays,
#+   rather than indirect variable referencing.
#    Which method is more straightforward and intuitive?
#    Which method is easier to code?

# Notes:
# -----
#  Shell scripts are inappropriate for anything except the most simple
#+ database applications, and even then it involves workarounds and kludges.
#  Much better is to use a language with native support for data structures,
#+ such as C++ or Java (or even Perl).

exit 0

   Example 37-4. Using arrays and other miscellaneous trickery to deal
   four random hands from a deck of cards

# Deals four random hands from a deck of cards.

UNPICKED=0
PICKED=1
DUPE_CARD=99
LOWER_LIMIT=0
UPPER_LIMIT=51
CARDS_IN_SUIT=13
CARDS=52

declare -a Deck
declare -a Suits
declare -a Cards
#  It would have been easier to implement and more intuitive
#+ with a single, 3-dimensional array.
#  Perhaps a future version of Bash will support multidimensional arrays.

initialize_Deck ()
{
i=$LOWER_LIMIT
until [ "$i" -gt $UPPER_LIMIT ]
do
  Deck[i]=$UNPICKED   # Set each card of "Deck" as unpicked.
  let "i += 1"
done
}

initialize_Suits ()
{
Suits[0]=C #Clubs
Suits[1]=D #Diamonds
Suits[2]=H #Hearts
Suits[3]=S #Spades
}

initialize_Cards ()
{
Cards=(2 3 4 5 6 7 8 9 10 J Q K A)
# Alternate method of initializing an array.
}

pick_a_card ()
{
card_number=$RANDOM
let "card_number %= $CARDS" # Restrict range to 0 - 51, i.e., 52 cards.
if [ "${Deck[card_number]}" -eq $UNPICKED ]
then
  Deck[card_number]=$PICKED
  return $card_number
else
  return $DUPE_CARD
fi
}

parse_card ()
{
number=$1
let "suit_number = number / CARDS_IN_SUIT"
suit=${Suits[suit_number]}
echo -n "$suit-"
let "card_no = number % CARDS_IN_SUIT"
Card=${Cards[card_no]}
printf %-4s $Card
# Print cards in neat columns.
}

seed_random ()  # Seed random number generator.
{               # What happens if you don't do this?
seed=`eval date +%s`
let "seed %= 32766"
RANDOM=$seed
#  What are some other methods
#+ of seeding the random number generator?
}

deal_cards ()
{
cards_picked=0

while [ "$cards_picked" -le $UPPER_LIMIT ]
do
  pick_a_card
  t=$?

  if [ "$t" -ne $DUPE_CARD ]
  then
    parse_card $t

    u=$cards_picked+1
    # Change back to 1-based indexing (temporarily). Why?
    let "u %= $CARDS_IN_SUIT"
    if [ "$u" -eq 0 ]   # Nested if/then condition test.
    then
      echo; echo
    fi                  # Each hand set apart with a blank line.

    let "cards_picked += 1"
  fi
done

return 0
}

# Structured programming:
# Entire program logic modularized in functions.

#================
seed_random
initialize_Deck
initialize_Suits
initialize_Cards
deal_cards
#================

exit

# Exercise 1:
# Add comments to thoroughly document this script.

# Exercise 2:
# Add a routine (function) to print out each hand sorted in suits.
# You may add other bells and whistles if you like.

# Exercise 3:
# Simplify and streamline the logic of the script.

37.2. Bash, version 3

   On July 27, 2004, Chet Ramey released version 3 of Bash. This update
   fixed quite a number of bugs and added new features.

   Some of the more important added features:

     * A new, more generalized {a..z} brace expansion operator.


for i in {1..10}
#  Simpler and more straightforward than
#+ for i in $(seq 10)
do
  echo -n "$i "
done


# 1 2 3 4 5 6 7 8 9 10

# Or just . . .

echo {a..z}    #  a b c d e f g h i j k l m n o p q r s t u v w x y z
echo {e..m}    #  e f g h i j k l m
echo {z..a}    #  z y x w v u t s r q p o n m l k j i h g f e d c b a
               #  Works backwards, too.
echo {25..30}  #  25 26 27 28 29 30
echo {3..-2}   #  3 2 1 0 -1 -2
echo {X..d}    #  X Y Z [  ] ^ _ ` a b c d
               #  Shows (some of) the ASCII characters between Z and a,
               #+ but don't rely on this type of behavior because . . .
echo {]..a}    #  {]..a}
               #  Why?

# You can tack on prefixes and suffixes.
echo "Number #"{1..4}, "..."
     # Number #1, Number #2, Number #3, Number #4, ...

# You can concatenate brace-expansion sets.
echo {1..3}{x..z}" +" "..."
     # 1x + 1y + 1z + 2x + 2y + 2z + 3x + 3y + 3z + ...
     # Generates an algebraic expression.
     # This could be used to find permutations.

# You can nest brace-expansion sets.
echo {{a..c},{1..3}}
     # a b c 1 2 3
     # The "comma operator" splices together strings.

# Unfortunately, brace expansion does not lend itself to parameterization.
var1=1
var2=5
echo {$var1..$var2}   # {1..5}

     * The ${!array[@]} operator, which expands to all the indices of a
       given array.


Array=(element-zero element-one element-two element-three)

echo ${Array[0]}   # element-zero
                   # First element of array.

echo ${!Array[@]}  # 0 1 2 3
                   # All the indices of Array.

for i in ${!Array[@]}
do
  echo ${Array[i]} # element-zero
                   # element-one
                   # element-two
                   # element-three
done               # All the elements in Array.

     * The =~ Regular Expression matching operator within a double
       brackets test expression. (Perl has a similar operator.)


variable="This is a fine mess."

echo "$variable"

# Regex matching with =~ operator within [[ double brackets ]].
if [[ "$variable" =~ T.........fin*es* ]]
# NOTE: As of version 3.2 of Bash, expression to match no longer quoted.
then
  echo "match found"
      # match found
fi

       Or, more usefully:



input=$1

if [[ "$input" =~ "[0-9][0-9][0-9]-[0-9][0-9]-[0-9][0-9][0-9][0-9]" ]]
#                 ^ NOTE: Quoting not necessary, as of version 3.2 of Bash.
# NNN-NN-NNNN (where each N is a digit).
then
  echo "Social Security number."
  # Process SSN.
else
  echo "Not a Social Security number!"
  # Or, ask for corrected input.
fi

       For additional examples of using the =~ operator, see Example
       A-29, Example 19-14, Example A-35, and Example A-24.
     * The new set -o pipefail option is useful for debugging pipes. If
       this option is set, then the exit status of a pipe is the exit
       status of the last command in the pipe to fail (return a non-zero
       value), rather than the actual final command in the pipe.
       See Example 16-43.
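   The effect can be sketched in a few lines:

```shell
#  Sketch: how 'set -o pipefail' changes the exit status of a pipe.

set +o pipefail              # Default behavior.
false | true
status_default=$?            # 0 -- only the last command in the pipe counts.

set -o pipefail
false | true
status_pipefail=$?           # 1 -- 'false' failed somewhere in the pipe.

echo "default=$status_default pipefail=$status_pipefail"
# default=0 pipefail=1
```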


   The update to version 3 of Bash breaks a few scripts that worked
   under earlier versions. Test critical legacy scripts to make sure
   they still work!

   As it happens, a couple of the scripts in the Advanced Bash Scripting
   Guide had to be fixed up (see Example 9-4, for instance).

37.2.1. Bash, version 3.1

   The version 3.1 update of Bash introduces a number of bugfixes and a
   few minor changes.

     * The += operator is now permitted in places where previously
       only the = assignment operator was recognized.

a=1
echo $a        # 1

a+=5           # Won't work under versions of Bash earlier than 3.1.
echo $a        # 15

a+=Hello
echo $a        # 15Hello

       Here, += functions as a string concatenation operator. Note that
       its behavior in this particular context is different than within
       a let construct.

a=1
echo $a        # 1

let a+=5       # Integer arithmetic, rather than string concatenation.
echo $a        # 6

let a+=Hello   # Doesn't "add" anything to a.
echo $a        # 6

       Jeffrey Haemer points out that this concatenation operator can be
       quite useful. In this instance, we append a directory to the
       $PATH variable.

bash$ echo $PATH

bash$ PATH+=:/opt/bin

bash$ echo $PATH

37.2.2. Bash, version 3.2

   This is pretty much a bugfix update.

     * In global parameter substitutions, the pattern no longer anchors
       at the start of the string.
     * The --wordexp option disables process substitution.
     * The =~ Regular Expression match operator no longer requires
       quoting of the pattern within [[ ... ]].


   In fact, quoting in this context is not advisable as it may cause
   regex evaluation to fail. Chet Ramey states in the Bash FAQ that
   quoting explicitly disables regex evaluation. See also the Ubuntu
   Bug List and Wikinerds on Bash syntax.
   Setting shopt -s compat31 in a script causes reversion to the
   original behavior.
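   The difference in behavior, sketched:

```shell
#  Sketch: unquoted vs. quoted patterns with =~ (Bash 3.2 and later).

string="magic"

if [[ "$string" =~ ^m.*c$ ]]      # Unquoted: treated as a regex. Matches.
then echo "Unquoted pattern: regex match."
fi

if [[ "$string" =~ "^m.*c$" ]]    # Quoted: matched as a literal string.
then echo "Quoted pattern matched (unexpected)."
else echo "Quoted pattern: no match (treated literally)."
fi
```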

37.3. Bash, version 4

   Chet Ramey announced Version 4 of Bash on the 20th of February, 2009.
   This release has a number of significant new features, as well as
   some important bugfixes.

   Among the new goodies:

     * Associative arrays. [124]

   An associative array can be thought of as a set of two linked arrays
   -- one holding the data, and the other the keys that index the
   individual elements of the data array.
       Example 37-5. A simple address database


declare -A address
#       -A option declares associative array.

address[Charles]="414 W. 10th Ave., Baltimore, MD 21236"
address[John]="202 E. 3rd St., New York, NY 10009"
address[Wilma]="1854 Vermont Ave, Los Angeles, CA 90023"

echo "Charles's address is ${address[Charles]}."
# Charles's address is 414 W. 10th Ave., Baltimore, MD 21236.
echo "Wilma's address is ${address[Wilma]}."
# Wilma's address is 1854 Vermont Ave, Los Angeles, CA 90023.
echo "John's address is ${address[John]}."
# John's address is 202 E. 3rd St., New York, NY 10009.


echo "${!address[*]}"   # The array indices ...
# Charles John Wilma

       Example 37-6. A somewhat more elaborate address database

# A more elaborate version of the preceding example.

E_DB=99    # Error code for missing entry.

declare -A address
#       -A option declares associative array.

store_address ()
{
  address[$1]="$2"
  return $?
}

fetch_address ()
{
  if [[ -z "${address[$1]}" ]]
  then
    echo "$1's address is not in database."
    return $E_DB
  fi

  echo "$1's address is ${address[$1]}."
  return $?
}

store_address "Lucas Fayne" "414 W. 13th Ave., Baltimore, MD 21236"
store_address "Arvid Boyce" "202 E. 3rd St., New York, NY 10009"
store_address "Velma Winston" "1854 Vermont Ave, Los Angeles, CA 90023"
#  Exercise:
#  Rewrite the above store_address calls to read data from a file,
#+ then assign field 1 to name, field 2 to address in the array.
#  Each line in the file would have a format corresponding to the above.
#  Use a while-read loop to read from file, sed or awk to parse the fields.

fetch_address "Lucas Fayne"
# Lucas Fayne's address is 414 W. 13th Ave., Baltimore, MD 21236.
fetch_address "Velma Winston"
# Velma Winston's address is 1854 Vermont Ave, Los Angeles, CA 90023.
fetch_address "Arvid Boyce"
# Arvid Boyce's address is 202 E. 3rd St., New York, NY 10009.
fetch_address "Bozo Bozeman"
# Bozo Bozeman's address is not in database.

exit $?   # In this case, exit code = 99, since that is function return.
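
   One possible answer to the exercise above, sketched under the
   assumption of a colon-delimited data file (the file name and record
   format here are illustrative, not prescribed by the exercise):

```shell
# Requires version 4 of Bash.
#  A sketch of the exercise above, assuming a data file with one
#+ colon-delimited "name:address" record per line.

declare -A address

cat > address-data.txt <<'EOF'
Lucas Fayne:414 W. 13th Ave., Baltimore, MD 21236
Arvid Boyce:202 E. 3rd St., New York, NY 10009
EOF

while IFS=':' read -r name addr    # Field 1 --> name, field 2 --> address.
do
  address["$name"]=$addr
done < address-data.txt

echo "Lucas Fayne's address is ${address[Lucas Fayne]}."
# Lucas Fayne's address is 414 W. 13th Ave., Baltimore, MD 21236.

rm address-data.txt                # Clean up the sample data file.
```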

     * Enhancements to the case construct: the ;;& and ;& terminators.
       Example 37-7. Testing characters


test_char ()
{
  case "$1" in
    [[:print:]] )  echo "$1 is a printable character.";;&       # |
    # The ;;& terminator continues to the next pattern test.      |
    [[:alnum:]] )  echo "$1 is an alpha/numeric character.";;&  # v
    [[:alpha:]] )  echo "$1 is an alphabetic character.";;&     # v
    [[:lower:]] )  echo "$1 is a lowercase alphabetic character.";;&
    [[:digit:]] )  echo "$1 is a numeric character.";&          # |
    # The ;& terminator executes the next statement ...         # |
    %%%@@@@@    )  echo "********************************";;    # v
#   ^^^^^^^^  ... even with a dummy pattern.
  esac
}

test_char 3
# 3 is a printable character.
# 3 is an alpha/numeric character.
# 3 is a numeric character.
# ********************************

test_char m
# m is a printable character.
# m is an alpha/numeric character.
# m is an alphabetic character.
# m is a lowercase alphabetic character.

test_char /
# / is a printable character.


# The ;;& terminator can spare you from writing a series of if/then tests.
# The ;& terminator is somewhat less useful.

     * The new coproc builtin enables two parallel processes to
       communicate and interact. As Chet Ramey states in the Bash FAQ
       [125] , ver. 4.01:

         There is a new 'coproc' reserved word that specifies a coprocess:
         an asynchronous command run with two pipes connected to the
         creating shell. Coprocs can be named. The input and output file
         descriptors and the PID of the coprocess are available to the
         calling shell in variables with coproc-specific names.
         George Dimitriu explains,
         "... coproc ... is a feature used in Bash process substitution,
         which now is made publicly available."
         This means it can be explicitly invoked in a script, rather than
         just being a behind-the-scenes mechanism used by Bash.

       Coprocesses use file descriptors. File descriptors enable
       processes and pipes to communicate.

# A coprocess communicates with a while-read loop.

coproc { cat mx_data.txt; sleep 2; }
#                         ^^^^^^^
# Try running this without "sleep 2" and see what happens.

while read -u ${COPROC[0]} line    #  ${COPROC[0]} is the
do                                 #+ file descriptor of the coprocess.
  echo "$line" | sed -e 's/line/NOT-ORIGINAL-TEXT/'
done

kill $COPROC_PID                   #  No longer need the coprocess,
                                   #+ so kill its PID.

       But, be careful!


echo; echo

coproc echo "one two three"
while read -u ${COPROC[0]} a b c;  #  Note that this loop
do                                 #+ runs in a subshell.
  echo "Inside while-read loop: ";
  echo "a = $a"; echo "b = $b"; echo "c = $c"
  echo "coproc file descriptor: ${COPROC[0]}"
done

# a = one
# b = two
# c = three
# So far, so good, but ...

echo "-----------------"
echo "Outside while-read loop: "
echo "a = $a"  # a =
echo "b = $b"  # b =
echo "c = $c"  # c =
echo "coproc file descriptor: ${COPROC[0]}"
#  The coproc is still running, but ...
#+ it still doesn't enable the parent process
#+ to "inherit" variables from the child process, the while-read loop.

#  Compare this to the "" script.


   The coprocess is asynchronous, and this might cause a problem. It may
   terminate before another process has finished communicating with it.

coproc cpname { for i in {0..10}; do echo "index = $i"; done; }
#      ^^^^^^ This is a *named* coprocess.
read -u ${cpname[0]}
echo $REPLY         #  index = 0
echo ${COPROC[0]}   #+ No output ... the coprocess timed out
#  after the first loop iteration.

# However, George Dimitriu has a partial fix.

coproc cpname { for i in {0..10}; do echo "index = $i"; done; sleep 1;
echo hi > myo; cat - >> myo; }
#       ^^^^^ This is a *named* coprocess.

echo "I am main"$'\04' >&${cpname[1]}
myfd=${cpname[0]}
echo myfd=$myfd

### while read -u $myfd
### do
###   echo $REPLY;
### done

echo $cpname_PID

#  Run this with and without the commented-out while-loop, and it is
#+ apparent that each process, the executing shell and the coprocess,
#+ waits for the other to finish writing in its own write-enabled pipe.

     * The new mapfile builtin makes it possible to load an array with
       the contents of a text file without using a loop or command
       substitution.


mapfile Arr1 < $0
# Same result as     Arr1=( $(cat $0) )
echo "${Arr1[@]}"  # Copies this entire script out to stdout.

echo "--"; echo

# But, not the same as   read -a   !!!
read -a Arr2 < $0
echo "${Arr2[@]}"  # Reads only first line of script into the array.
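
   In practice, the -t option to mapfile is usually what you want: it
   strips the trailing newline from each array element. A short sketch
   (the sample file name is illustrative):

```shell
# Requires version 4 of Bash.

printf '%s\n' alpha beta gamma > mapfile-demo.txt   # Sample input file.

mapfile -t Lines < mapfile-demo.txt
#       ^^  Without -t, each element would retain its trailing newline.

echo "${#Lines[@]} lines read."     # 3 lines read.
echo "Second line: ${Lines[1]}"     # Second line: beta

rm mapfile-demo.txt                 # Clean up the sample file.
```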


     * The read builtin got a minor facelift. The -t timeout option now
       accepts (decimal) fractional values [126] and the -i option
       permits preloading the edit buffer. [127] Unfortunately, these
       enhancements are still a work in progress and not (yet) usable in
       scripts.
     * Parameter substitution gets case-modification operators.


var=veryMixedUpVariable
echo ${var}            # veryMixedUpVariable
echo ${var^}           # VeryMixedUpVariable
#         *              First char --> uppercase.
echo ${var^^}          # VERYMIXEDUPVARIABLE
#         **             All chars  --> uppercase.
echo ${var,}           # veryMixedUpVariable
#         *              First char --> lowercase.
echo ${var,,}          # verymixedupvariable
#         **             All chars  --> lowercase.
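
   A typical use is normalizing input before a comparison, so that
   capitalization does not matter. A brief sketch (the sample value is
   illustrative):

```shell
# Requires version 4 of Bash.

answer="YeS"                       # Illustrative user input.

if [[ "${answer,,}" == "yes" ]]    #  Fold to lowercase before comparing,
then                               #+ so YES, Yes, and yEs all match.
  echo "Confirmed."
fi

shouted=${answer^^}
echo "$shouted"                    # YES
```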

     * The declare builtin now accepts the -l lowercase and -c
       capitalize options.


declare -l var1            # Will change to lowercase
var1=MixedCaseVARIABLE
echo "$var1"               # mixedcasevariable
# Same effect as             echo $var1 | tr A-Z a-z

declare -c var2            # Changes only initial char to uppercase.
var2=originally_lowercase
echo "$var2"               # Originally_lowercase
# NOT the same effect as     echo $var2 | tr a-z A-Z

     * Brace expansion has more options.
       Increment/decrement, specified in the final term within braces.


echo {40..60..2}
# 40 42 44 46 48 50 52 54 56 58 60
# All the even numbers, between 40 and 60.

echo {60..40..2}
# 60 58 56 54 52 50 48 46 44 42 40
# All the even numbers, between 40 and 60, counting backwards.
# In effect, a decrement.
echo {60..40..-2}
# The same output. The minus sign is not necessary.

# But, what about letters and symbols?
echo {X..d}
# X Y Z [  ] ^ _ ` a b c d
# Does not echo the \ which escapes a space.

       Zero-padding, specified in the first term within braces, prefixes
       each term in the output with the same number of zeroes.

bash4$ echo {010..15}
010 011 012 013 014 015

bash4$ echo {000..10}
000 001 002 003 004 005 006 007 008 009 010
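
   Zero-padding makes brace expansion convenient for generating
   fixed-width sequence names, such as numbered frames or log files. A
   quick sketch (the file-name pattern is illustrative):

```shell
# Requires version 4 of Bash.

for n in {001..005}          # Every term is padded to three digits.
do
  echo "frame-$n.png"
done
# frame-001.png
# frame-002.png
# ...
# frame-005.png
```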

     * Substring extraction on positional parameters now starts with $0
       as the zero-index. (This corrects an inconsistency in the
       treatment of positional parameters.)

# show-params.bash
# Requires version 4+ of Bash.

# Invoke this script with at least one positional parameter.


if [ -z "$1" ]
then
  echo "Usage: $0 param1 ..."
  exit
fi

echo ${@:0}

# bash3 show-params.bash4 one two three
# one two three

# bash4 show-params.bash4 one two three
# show-params.bash4 one two three

# $0                $1  $2  $3

     * The new ** globbing operator matches filenames and directories
       recursively.

# filelist.bash4

shopt -s globstar  # Must enable globstar, otherwise ** doesn't work.
                   # The globstar shell option is new to version 4 of Bash.

echo "Using *"; echo
for filename in *
do
  echo "$filename"
done   # Lists only files in current directory ($PWD).

echo; echo "--------------"; echo

echo "Using **"
for filename in **
do
  echo "$filename"
done   # Lists complete file tree, recursively.


     * The new $BASHPID internal variable.
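
   Unlike $$, which keeps the parent's PID inside a subshell, $BASHPID
   always names the current Bash process. A quick sketch:

```shell
# Requires version 4 of Bash.

echo "Top level: \$\$ = $$, \$BASHPID = $BASHPID"
#  At the top level of a script, the two agree.

subshell_pid=$( echo $BASHPID )   #  Command substitution runs in a
                                  #+ subshell, so $BASHPID changes there,
                                  #+ while $$ would not.
echo "Subshell \$BASHPID = $subshell_pid"
```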
     * There is a new builtin error-handling function named
       command_not_found_handle.


command_not_found_handle ()
{ # Accepts implicit parameters.
  echo "The following command is not valid: \""$1\"""
  echo "With the following argument(s): \""$2\"" \""$3\"""   # $4, $5 ...
} # $1, $2, etc. are not explicitly passed to the function.

bad_command arg1 arg2

# The following command is not valid: "bad_command"
# With the following argument(s): "arg1" "arg2"

   Editorial comment

   Associative arrays? Coprocesses? Whatever happened to the lean and
   mean Bash we have come to know and love? Could it be suffering from
   (horrors!) "feature creep"? Or perhaps even Korn shell envy?

   Note to Chet Ramey: Please add only essential features in future Bash
   releases -- perhaps for-each loops and support for multi-dimensional
   arrays. [128] Most Bash users won't need, won't use, and likely won't
   greatly appreciate complex "features" like built-in debuggers, Perl
   interfaces, and bolt-on rocket boosters.

37.3.1. Bash, version 4.1

   Version 4.1 of Bash, released in May, 2010, was primarily a bugfix
   update.

     * The printf command now accepts a -v option for setting array
       element values.
     * Within double brackets, the > and < string comparison operators
       now conform to the locale. Since the locale setting may affect
       the sorting order of string expressions, this has side-effects on
       comparison tests within [[ ... ]] expressions.
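
   The effect is easiest to see in the C locale (always available),
   where comparison is by raw byte value and all uppercase letters sort
   before lowercase ones. A sketch:

```shell
# Requires Bash version -ge 4.1.

LC_ALL=C     #  The C locale compares by byte value,
             #+ so uppercase letters sort before lowercase ones.

if [[ "Banana" < "apple" ]]
then
  echo "C locale: 'Banana' sorts before 'apple'."
fi
#  Under a locale such as en_US.UTF-8 (if installed), collation is
#+ case-insensitive and the same test would be false.
```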
     * The read builtin now takes a -N option (read -N chars), which
       causes the read to terminate after chars characters.
       Example 37-8. Reading N characters

# Requires Bash version -ge 4.1 ...


num_chars=61

read -N $num_chars var < $0   # Read first 61 characters of script!
echo "$var"

####### Output of Script #######

# Requires Bash version -ge 4.1 ...


     * Here documents embedded in $( ... ) command substitution
       constructs may terminate with a simple ).
       Example 37-9. Using a here document to set a variable

# Requires Bash version -ge 4.1 ...

multi_line_var=$( cat <<ENDxxx
This is line 1 of the variable
This is line 2 of the variable
This is line 3 of the variable
)

#  Rather than what Bash 4.0 requires:
#+ that the terminating limit string and
#+ the terminating close-parenthesis be on separate lines.

# ENDxxx
# )

echo "$multi_line_var"

#  Bash still emits a warning, though.
#  warning: here-document at line 10 delimited
#+ by end-of-file (wanted `ENDxxx')

37.3.2. Bash, version 4.2

   Version 4.2 of Bash, released in February, 2011, contains a number of
   new features and enhancements, in addition to bugfixes.

     * Bash now supports the \u and \U Unicode escapes.

   Unicode is a cross-platform standard for encoding into numerical
   values letters and graphic symbols. This permits representing and
   displaying characters in foreign alphabets and unusual fonts.

echo -e '\u2630'   # Horizontal triple bar character.
# Equivalent to the more roundabout:
echo -e "\xE2\x98\xB0"
                   # Recognized by earlier Bash versions.

echo -e '\u220F'   # PI (Greek letter and mathematical symbol)
echo -e '\u0416'   # Capital "ZHE" (Cyrillic letter)
echo -e '\u2708'   # Airplane (Dingbat font) symbol

echo -e "The amplifier circuit requires a 100 \u2126 pull-up resistor."

unicode_var='\u2640'      # FEMALE SIGN (U+2640)
echo -e $unicode_var      # Female symbol
printf "$unicode_var \n"  # Female symbol, with newline

#  And for something a bit more elaborate . . .

#  We can store Unicode symbols in an associative array,
#+ then retrieve them by name.
#  Run this in a gnome-terminal or a terminal with a large, bold font
#+ for better legibility.

declare -A symbol  # Associative array.

#  Illustrative code-point assignments (from the Unicode "Letterlike
#+ Symbols" block), using the $' ... ' construct:
symbol[script_E]=$'\u2130'       # SCRIPT CAPITAL E
symbol[script_F]=$'\u2131'       # SCRIPT CAPITAL F
symbol[script_J]=$'\U0001D4A5'   # MATHEMATICAL SCRIPT CAPITAL J
symbol[script_M]=$'\u2133'       # SCRIPT CAPITAL M
symbol[Rx]=$'\u211E'             # PRESCRIPTION TAKE ("Rx")
symbol[TEL]=$'\u2121'            # TELEPHONE SIGN
symbol[FAX]=$'\u213B'            # FACSIMILE SIGN
symbol[care_of]=$'\u2105'        # CARE OF
symbol[account]=$'\u2100'        # ACCOUNT OF
symbol[trademark]=$'\u2122'      # TRADE MARK SIGN


echo -ne "${symbol[script_E]}   "
echo -ne "${symbol[script_F]}   "
echo -ne "${symbol[script_J]}   "
echo -ne "${symbol[script_M]}   "
echo -ne "${symbol[Rx]}   "
echo -ne "${symbol[TEL]}   "
echo -ne "${symbol[FAX]}   "
echo -ne "${symbol[care_of]}   "
echo -ne "${symbol[account]}   "
echo -ne "${symbol[trademark]}   "


   The above example uses the $' ... ' string-expansion construct.
     * When the lastpipe shell option is set, the last command in a pipe
       doesn't run in a subshell.
       Example 37-10. Piping input to a read


line=''                   # Null value.
echo "\$line = "$line""   # $line =


shopt -s lastpipe         # Error on Bash version -lt 4.2.
echo "Exit status of attempting to set \"lastpipe\" option is $?"
#     1 if Bash version -lt 4.2, 0 otherwise.


head -1 $0 | read line    # Pipe the first line of the script to read.
#            ^^^^^^^^^      Not in a subshell!!!

echo "\$line = "$line""
# Older Bash releases       $line =
# Bash version 4.2          $line = #!/bin/bash

       This option offers possible "fixups" for these example scripts:
       Example 34-3 and Example 15-8.
     * Negative array indices permit counting backwards from the end of
       an array.
       Example 37-11. Negative array indices

# Requires Bash, version -ge 4.2.

array=( zero one two three four five )   # Six-element array.

# Negative array indices now permitted.
echo ${array[-1]}   # five
echo ${array[-2]}   # four
# ...
echo ${array[-6]}   # zero
# Negative array indices count backward from the last element+1.

# But, you cannot index past the beginning of the array.
echo ${array[-7]}   # array: bad array subscript

# So, what is this new feature good for?

echo "The last element in the array is "${array[-1]}""
# Which is quite a bit more straightforward than:
echo "The last element in the array is "${array[${#array[*]}-1]}""

# And ...

index=0
let "neg_element_count = 0 - ${#array[*]}"
# Number of elements, converted to a negative number.

while [ $index -gt $neg_element_count ]; do
  ((index--)); echo -n "${array[index]} "
done  # Lists the elements in the array, backwards.
      # We have just simulated the "tac" command on this array.


     * Substring extraction uses a negative length parameter to specify
       an offset from the end of the target string.
       Example 37-12. Negative parameter in string-extraction construct

# Bash, version -ge 4.2
# Negative length-index in substring extraction.
# Important: This changes the interpretation of this construct!


stringZ=abcABC123ABCabc

echo ${stringZ}                              # abcABC123ABCabc
echo ${stringZ:2:3}                          #   cAB
#  Count 2 chars forward from string beginning, and extract 3 chars.
#  ${string:position:length}

#  So far, nothing new, but now ...

echo ${stringZ:3:-6}                         #    ABC123
#                ^
#  Index 3 chars forward from beginning and 6 chars backward from end,
#+ and extract everything in between.
#  ${string:offset-from-front:offset-from-end}
#  When the "length" parameter is negative,
#+ it serves as an "offset-from-end" parameter.

Chapter 38. Endnotes

38.1. Author's Note


   doce ut discas

   (Teach, that you yourself may learn.)

   How did I come to write a scripting book? It's a strange tale. It
   seems that a few years back I needed to learn shell scripting -- and
   what better way to do that than to read a good book on the subject? I
   was looking to buy a tutorial and reference covering all aspects of
   the subject. I was looking for a book that would take difficult
   concepts, turn them inside out, and explain them in excruciating
   detail, with well-commented examples. [129] In fact, I was looking
   for this very book, or something very much like it. Unfortunately, it
   didn't exist, and if I wanted it, I'd have to write it. And so, here
   we are, folks.

   That reminds me of the apocryphal story about a mad professor. Crazy
   as a loon, the fellow was. At the sight of a book, any book -- at the
   library, at a bookstore, anywhere -- he would become totally obsessed
   with the idea that he could have written it, should have written it
   -- and done a better job of it to boot. He would thereupon rush home
   and proceed to do just that, write a book with the very same title.
   When he died some years later, he allegedly had several thousand
   books to his credit, probably putting even Asimov to shame. The books
   might not have been any good, who knows, but does that really matter?
   Here's a fellow who lived his dream, even if he was obsessed by it,
   driven by it . . . and somehow I can't help admiring the old coot.

38.2. About the Author

   Who is this guy anyhow?

   The author claims no credentials or special qualifications, [130]
   other than a compulsion to write. [131] This book is somewhat of a
   departure from his other major work, HOW-2 Meet Women: The Shy Man's
   Guide to Relationships. He has also written the Software-Building
   HOWTO. Of late, he has been trying his (heavy) hand at short fiction.

   A Linux user since 1995 (Slackware 2.2, kernel 1.2.1), the author has
   emitted a few software truffles, including the cruft one-time pad
   encryption utility, the mcalc mortgage calculator, the judge
   Scrabble® adjudicator, the yawl word gaming list package, and the
   Quacky anagramming gaming package. He got off to a rather shaky start
   in the computer game -- programming FORTRAN IV on a CDC 3800 (on
   paper coding pads, no less) -- and is not the least bit nostalgic for
   those days.

   Living in a secluded community with wife and orange tabby, he
   cherishes human frailty, especially his own. [132]

38.3. Where to Go For Help

   The author will infrequently, if not too busy (and in a good mood),
   answer general scripting questions. [133] However, if you have a
   problem getting a specific script to work, you would be well advised
   to post to the appropriate Usenet newsgroup.

   If you need assistance with a schoolwork assignment, read the
   pertinent sections of this and other reference works. Do your best to
   solve the problem using your own wits and resources. Please do not
   waste the author's time. You will get neither help nor sympathy.

   Likewise, kindly refrain from annoying the author with solicitations,
   offers of employment, or "business opportunities." He is doing just
   fine, and requires neither help nor sympathy, thank you.


   ... sophisticated in mechanism but possibly agile operating under
   noises being extremely suppressed ...

   --CI-300 printer manual

38.4. Tools Used to Produce This Book

38.4.1. Hardware

   A used IBM Thinkpad, model 760XL laptop (P166, 104 meg RAM) running
   Red Hat 7.1/7.3. Sure, it's slow and has a funky keyboard, but it
   beats the heck out of a No. 2 pencil and a Big Chief tablet.

   Update: upgraded to a 770Z Thinkpad (P2-366, 192 meg RAM) running
   FC3. Anyone feel like donating a later-model laptop to a starving
   writer <g>?

   Update: upgraded to a A31 Thinkpad (P4-1.6, 512 meg RAM) running FC8.
   No longer starving, and no longer soliciting donations <g>.

38.4.2. Software and Printware

    i. Bram Moolenaar's powerful SGML-aware vim text editor.
   ii. OpenJade, a DSSSL rendering engine for converting SGML documents
       into other formats.
   iii. Norman Walsh's DSSSL stylesheets.
   iv. DocBook, The Definitive Guide, by Norman Walsh and Leonard
       Muellner (O'Reilly, ISBN 1-56592-580-7). This is still the
       standard reference for anyone attempting to write a document in
       Docbook SGML format.

38.5. Credits

   Community participation made this project possible. The author
   gratefully acknowledges that writing this book would have been
   unthinkable without help and feedback from all you people out there.

   Philippe Martin translated the first version
   (0.1) of this document into DocBook/SGML. While not on the job at a
   small French company as a software developer, he enjoys working on
   GNU/Linux documentation and software, reading literature, playing
   music, and, for his peace of mind, making merry with friends. You may
   run across him somewhere in France or in the Basque Country, or you
   can email him.

   Philippe Martin also pointed out that positional parameters past $9
   are possible using {bracket} notation. (See Example 4-5).

   Stéphane Chazelas sent a long list of corrections, additions, and
   example scripts. More than a contributor, he had, in effect, for a
   while taken on the role of co-editor for this document. Merci
   beaucoup!

   Paulo Marcel Coelho Aragao offered many corrections, both major and
   minor, and contributed quite a number of helpful suggestions.

   I would like to especially thank Patrick Callahan, Mike Novak, and
   Pal Domokos for catching bugs, pointing out ambiguities, and for
   suggesting clarifications and changes in the preliminary version
   (0.1) of this document. Their lively discussion of shell scripting
   and general documentation issues inspired me to try to make this
   document more readable.

   I'm grateful to Jim Van Zandt for pointing out errors and omissions
   in version 0.2 of this document. He also contributed an instructive
   example script.

   Many thanks to Jordi Sanfeliu for giving
   permission to use his fine tree script (Example A-16), and to Rick
   Boivie for revising it.

   Likewise, thanks to Michel Charpentier
   for permission to use his dc factoring script (Example 16-52).

   Kudos to Noah Friedman for
   permission to use his string function script (Example A-18).

   Emmanuel Rouat suggested corrections and additions on command
   substitution and aliases. He also contributed a very nice sample
   .bashrc file (Appendix L).

   Heiner Steven kindly gave permission to
   use his base conversion script, Example 16-48. He also made a number
   of corrections and many helpful suggestions. Special thanks.

   Rick Boivie contributed the delightfully recursive script
   (Example 36-9), revised the script (Example A-16), and
   suggested performance improvements for the script
   (Example 16-47).

   Florian Wisser enlightened me on some of the fine points of testing
   strings (see Example 7-6), and on other matters.

   Oleg Philon sent suggestions concerning cut and pidof.

   Michael Zick extended the empty array example to demonstrate some
   surprising array properties. He also contributed the isspammer
   scripts (Example 16-41 and Example A-28).

   Marc-Jano Knopp sent corrections and clarifications on DOS batch
   files.

   Hyun Jin Cha found several typos in the document in the process of
   doing a Korean translation. Thanks for pointing these out.

   Andreas Abraham sent in a long list of typographical errors and other
   corrections. Special thanks!

   Others contributing scripts, making helpful suggestions, and pointing
   out errors were Gabor Kiss, Leopold Toetsch, Peter Tillier, Marcus
   Berglof, Tony Richardson, Nick Drage (script ideas!), Rich Bartell,
   Jess Thrysoee, Adam Lazur, Bram Moolenaar, Baris Cicek, Greg
   Keraunen, Keith Matthews, Sandro Magi, Albert Reiner, Dim Segebart,
   Rory Winston, Lee Bigelow, Wayne Pollock, "jipe," "bojster," "nyal,"
   "Hobbit," "Ender," "Little Monster" (Alexis), "Mark," "Patsie,"
   "vladz," Peggy Russell, Emilio Conti, Ian. D. Allen, Hans-Joerg
   Diers, Arun Giridhar, Dennis Leeuw, Dan Jacobson, Aurelio Marinho
   Jargas, Edward Scholtz, Jean Helou, Chris Martin, Lee Maschmeyer,
   Bruno Haible, Wilbert Berendsen, Sebastien Godard, Björn Eriksson,
   John MacDonald, John Lange, Joshua Tschida, Troy Engel, Manfred
   Schwarb, Amit Singh, Bill Gradwohl, E. Choroba, David Lombard, Jason
   Parker, Steve Parker, Bruce W. Clare, William Park, Vernia Damiano,
   Mihai Maties, Mark Alexander, Jeremy Impson, Ken Fuchs, Jared Martin,
   Frank Wang, Sylvain Fourmanoit, Matthew Sage, Matthew Walker, Kenny
   Stauffer, Filip Moritz, Andrzej Stefanski, Daniel Albers, Jeffrey
   Haemer, Stefano Palmeri, Nils Radtke, Sigurd Solaas, Serghey Rodin,
   Jeroen Domburg, Alfredo Pironti, Phil Braham, Bruno de Oliveira
   Schneider, Stefano Falsetto, Chris Morgan, Walter Dnes, Linc
   Fessenden, Michael Iatrou, Pharis Monalo, Jesse Gough, Fabian Kreutz,
   Mark Norman, Harald Koenig, Dan Stromberg, Peter Knowles, Francisco
   Lobo, Mariusz Gniazdowski, Sebastian Arming, Chetankumar Phulpagare,
   Benno Schulenberg, Tedman Eng, Jochen DeSmet, Juan Nicolas Ruiz,
   Oliver Beckstein, Achmed Darwish, Dotan Barak, Richard Neill, Albert
   Siersema, Omair Eshkenazi, Geoff Lee, JuanJo Ciarlante, Cliff
   Bamford, Nathan Coulter, Ramses Rodriguez Martinez, Evgeniy Ivanov,
   George Dimitriu, Kevin LeBlanc, Antonio Macchi, Tomas Pospisek,
   Andreas Kühne, Pádraig Brady, and David Lawyer (himself an author of
   four HOWTOs).

   My gratitude to Chet Ramey and Brian Fox for writing Bash, and
   building into it elegant and powerful scripting capabilities rivaling
   those of ksh.

   Very special thanks to the hard-working volunteers at the Linux
   Documentation Project. The LDP hosts a repository of Linux knowledge
   and lore, and has, to a great extent, enabled the publication of this
   book.

   Thanks and appreciation to IBM, Red Hat, the
   Free Software Foundation, and all the good people fighting the good
   fight to keep Open Source software free and open.

   Belated thanks to my fourth grade teacher, Miss Spencer, for
   emotional support and for convincing me that maybe, just maybe I
   wasn't a total loss.

   Thanks most of all to my wife, Anita, for her encouragement,
   inspiration, and emotional support.

38.6. Disclaimer

   (This is a variant of the standard LDP disclaimer.)

   No liability for the contents of this document can be accepted. Use
   the concepts, examples and information at your own risk. There may be
   errors, omissions, and inaccuracies that could cause you to lose data
   or harm your system, so proceed with appropriate caution. The author
   takes no responsibility for any damages, incidental or otherwise.

   As it happens, it is highly unlikely that either you or your system
   will suffer ill effects. In fact, the raison d'etre of this book is
   to enable its readers to analyze shell scripts and determine whether
   they have unanticipated consequences.
