
Korn Shell (ksh) Programming

A tutorial by Philip Brown Copyright 2002

This is the top level of my "Intro to Korn shell programming" tree. Korn shell is a 'shell-scripting' language, as well as a user-level login shell. It is also a superset of the POSIX shell standard (IEEE 1003.2), which is great for ensuring portability.

Why scripting? Scripting, when done right, is a fast, easy way to "get the job done", without the usual "code, compile, test, debug" overhead of writing in C or some other compiled language. It is easier than C for multiple reasons:

1. Scripting commands tend to be more readable than low-level code (with the exception of perl).
2. Scripting languages tend to come with powerful tools attached.
3. There is no "compile" phase, so "tweaking" can be done rapidly.

UNIX tends to take #2 to extremes, since it comes standard with "powerful tools" that can be strung together with pipes or other mechanisms, to get the result you want, with a short development time. It may not be as efficient as a fully compiled solution, but quite often it can "get the job done" in a few seconds of run time, compared to 1 second or less for a compiled program.

A quick scripting solution can be used as a prototype. Then, when you have been using the prototype happily for a while, and you have evolved the behaviour your end users are happiest with, you can go back and code a faster, more robust solution in a lower-level language.

That is not to say that scripts cannot be robust! It is possible to do a great deal of error checking in scripts. Unfortunately, it is not common practice to do so.

What can you do with ksh?

A heck of a lot! You have access to the full range of UNIX utilities, plus some nifty built-in resources. Generally speaking, UNIX scripting is a matter of using the various command line utilities as appropriate, with the shell as a means of facilitating communication between each step. Unfortunately, running all these separate 'external' programs can sometimes result in things working rather slowly. Which is why ksh has a few more things "built in" to it than the older 'sh'.
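As a tiny taste of that style, here is one hedged example of stringing standard utilities together with pipes, counting the distinct login shells listed on a system:

```shell
# Each utility does one small job; the pipes carry data between them.
# cut pulls out field 7 (the login shell) of each /etc/passwd line,
# sort -u removes duplicates, and wc -l counts what is left.
cut -d: -f7 /etc/passwd | sort -u | wc -l
```

No compiling, no temporary files: just three small tools and two pipes.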


Why ksh, not XYZsh for programming?

Bourne shell has been THE "standard" UNIX shellscript language for many years. However, it does lack some things that would be very useful, at a basic level. Some of those things were added to C-shell (csh) later on. However, csh is undesirable to use in programming, for various reasons.

Happily, ksh adds most of the things that people say csh has, but sh does not. So much so, that ksh became the basis for the "POSIX shell". Which means that all properly POSIX-compliant systems MUST HAVE something compatible, even though it is now named "sh" like the old Bourne shell. (For example, /usr/xpg4/bin/sh is a LINK to /bin/ksh, on Solaris!) More precisely, the behaviour of a POSIX-compliant shell is specified in "IEEE POSIX 1003.2".

BTW: "Korn shell" was written by David Korn, around 1982, at AT&T labs. You can now freely download the full source from AT&T if you're not lucky enough to use an OS that comes with ksh already. It's "open source", even!

Chapter headings

Now we get to the good stuff. Please note: you should read these in order. If you do not understand all the stuff from the previous chapters, you may get very confused.

• Before you start: Basic skills you need.
• Ksh Basics: Loops and conditions.
• Advanced variable operators: Read This. Yes, you.
• POSIX.1 utilities: Externally usable tools on most UNIXen
• Functions: Just like in "regular" programming languages
• Built-in functions: Integrated commands that run fast.
• Redirection and pipes: Joining things together
• Interesting stuff: The 'Misc' page
• Paranoia and good programming: Be a good programmer!
• Real code: A useful program done wrong, and then done right.

This material Copyright 2000 Philip Brown

Ksh preparedness

Howdy, and welcome to the intro to Korn shell programming, AKA the POSIX shell.

This is the first part of the larger tutorial. It assumes you want to learn how to really be a programmer in ksh, as opposed to someone who just quickly throws something together in a file and stops as soon as it works.


This particular chapter assumes you have done minimal or no sh programming before, so it has a lot more general stuff. Here are the most important things to know and do, before really getting serious about shellscripting.

Step 0. Learn how to type.

No, I mean type PROPERLY, and unconsciously. You should be able to type faster than you can write, and not be sweating while doing so. The reason for this is that good programming practices involve extra typing. You won't follow the best practices, if you are too fussed up about "How can I use as few characters as possible to get all this typing done?"

Step 1. Learn how to use a text editor.

No, not Microsoft Word, or WordPerfect. Those are "word processors". I'm talking about something small, fast, and quick, that makes shoving pure text around easy. [Note that emacs does not fit either 'small', 'fast', OR 'quick' :-)]

You will have to be very comfortable with your choice of text editor, because that's how you make shellscripts. All examples given should be put into some file. You can then run it with "ksh file". Or, do it the more official way: put the directions below, exactly as-is, into a file, and follow the directions in it.

#!/bin/ksh
# The above must always be the first line. But generally, lines
# starting with '#' are comments. They don't do anything.
# This is the only time I will put in the '#!/bin/ksh' bit. But
# EVERY EXAMPLE NEEDS IT, unless you want to run the examples with
# 'ksh filename' every time.
#
# If for some odd reason, you don't have ksh in /bin/ksh, change
# the path above, as appropriate.
#
# Then do 'chmod 0755 name-of-this-file'. After that,
# you will be able to use the filename directly like a command.

echo Yeup, you got the script to work!

Step 2. Understand variables.

Hopefully, you already understand the concept of a variable. It is a place you can store a value to, and then do operations on "whatever is in this place", vs. the value directly.

In shellscripts, a variable can contain a collection of letters and/or numbers [aka a 'string'] , as well as pure numbers.

You set a variable by using

variablename="some string here"

OR


variablename=1234

You access what is IN a variable, by putting a dollar-sign in front of it.

echo $variablename
OR
echo ${variablename}

If you have JUST a number in a variable, you can do math operations on it. But that comes later on in this tutorial.

Step 3. Put everything in appropriate variables

Well, okay, not EVERYTHING :-) But properly named variables make the script more easily readable. There isn't really a 'simple' example for this, since it is only "obvious" in large scripts. So either just take my word for it, or stop reading and go somewhere else now!

An example of "proper" variable naming practice:

# Okay, this script doesn't do anything useful, it is just for demo
# purposes, and normally, I would put in more safety checks, but this
# is a quickie.
INPUTFILE="$1"
USERLIST="$2"
OUTPUTFILE="$3"
count=0
while read username ; do
    grep $username $USERLIST >> $OUTPUTFILE
    count=$(($count+1))
done < $INPUTFILE
echo user count is $count

While the script may not be totally readable to you yet, I think you'll agree it is a LOT clearer than the following;

i=0
while read line ; do
    grep $line $2 >> $3
    i=$(($i+1))
done < $1

Note that '$1' means the first argument to your script.
'$*' means "all the arguments together"
'$#' means "how many arguments are there?"
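A quick sketch to see those in action; save it in a file and run it as, say, 'ksh showargs apple banana':

```shell
# Prints the positional parameters the script was called with.
echo "first argument: $1"
echo "all arguments:  $*"
echo "argument count: $#"
```

With 'apple banana' as arguments, $1 is "apple", $* is "apple banana", and $# is 2.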

Step 4. Know your quotes

It is very important to know when, and what type, of quotes to use. Quotes are generally used to group things together into a single entity.


Single-quotes are literal quotes. Double-quotes can have their contents expanded.

echo "$PWD"

prints out your current directory.

echo '$PWD'

prints out the string $PWD.

echo $PWDplusthis

prints out NOTHING. There is no such variable "PWDplusthis".

echo "$PWD"plusthis

prints out your current directory, and the string "plusthis" immediately following it. You could also accomplish this with the alternate form of using variables,

echo ${PWD}plusthis

Ksh basics

This is a quickie page to run through basic "program flow control" commands, if you are completely new to shell programming. The basic ways to shape a program, are loops, and conditionals. Conditionals say "run this command, IF some condition is true". Loops say "repeat these commands" (usually, until some condition is met, and then you stop repeating).

Conditionals

IF

The basic type of condition is "if".

if [ $? -eq 0 ] ; then
    print we are okay
else
    print something failed
fi

IF the variable $? is equal to 0, THEN print out a message. Otherwise (else), print out a different message. FYI, "$?" checks the exit status of the last command run.

The final 'fi' is required. This is to allow you to group multiple things together. You can have multiple things between if and else, or between else and fi, or both. You can even skip the 'else' altogether, if you don’t need an alternate case.

if [ $? -eq 0 ] ; then
    print we are okay
    print We can do as much as we like here
fi

CASE

The case statement functions like 'switch' in some other languages. Given a particular variable, jump to a particular set of commands, based on the value of that variable.

While the syntax is similar to C on the surface, there are some major differences;


• The variable being checked can be a string, not just a number
• There is no "fall through". You hit only one set of commands
• To make up for no 'fall through', you can 'share' variable states
• You can use WILDCARDS to match strings

echo input yes or no
read answer
case $answer in
    yes|Yes|y)
        echo got a positive answer
        # the following ';;' is mandatory for every set
        # of comparative xxx) that you do
        ;;
    no)
        echo got a 'no'
        ;;
    q*|Q*)
        # assume the user wants to quit
        exit
        ;;
    *)
        echo This is the default clause. we are not sure why or
        echo what someone would be typing, but we could take
        echo action on it here
        ;;
esac

Loops

WHILE

The basic loop is the 'while' loop; "while" something is true, keep looping.

There are two ways to stop the loop. The obvious way is when the 'something' is no longer true. The other way is with a 'break' command.

keeplooping=1;
while [[ $keeplooping -eq 1 ]] ; do
    read quitnow
    if [[ "$quitnow" = "yes" ]] ; then
        keeplooping=0
    fi
    if [[ "$quitnow" = "q" ]] ; then
        break;
    fi
done

UNTIL

The other kind of loop in ksh is 'until'. The difference between them is that 'while' implies looping while something remains true. 'until' implies looping until something that is currently false becomes true.


until [[ $stopnow -eq 1 ]] ; do
    echo just run this once
    stopnow=1;
    echo we should not be here again.
done

FOR

A "for loop" is a "limited loop". It loops a specific number of times, to match a specific number of items. Once you start the loop, the number of times you will repeat is fixed.

The basic syntax is

for var in one two three ; do
    echo $var
done

Whatever name you put in place of 'var', will be updated by each value following "in". So the above loop will print out

one
two
three

But you can also have variables defining the item list. They will be checked ONLY ONCE, when you start the loop.

list="one two three"
for var in $list ; do
    echo $var
    # Note: Changing this does NOT affect the loop items
    list="nolist"
done

The two things to note are:

1. It still prints out "one" "two" "three"
2. Do NOT quote "$list", if you want the 'for' command to use multiple items

If you used "$list" in the 'for' line, it would print out a SINGLE LINE, "one two three"
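You can see both behaviours side by side with this little sketch:

```shell
list="one two three"
# unquoted: the shell splits $list into three separate loop items
for var in $list ; do
    echo "item: $var"
done
# quoted: the whole string is treated as ONE item
for var in "$list" ; do
    echo "item: $var"
done
```

The first loop prints three "item:" lines; the second prints just one, with the whole string in it.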

Advanced variable usage

Arrays

Yes, you CAN have arrays in ksh, unlike old bourne shell. The syntax is as follows:

# This is an OPTIONAL way to quickly null out prior values
set -A array

array[1]="one"
array[2]="two"
array[3]="three"
three=3


print ${array[1]}
print ${array[2]}
print ${array[3]}
print ${array[three]}

Special variables

There are some "special" variables that ksh itself gives values to. Here are the ones I find interesting:

• PWD - always the current directory
• RANDOM - a different number every time you access it
• $$ - the current process id (of the script, not the user's shell)
• PPID - the "parent process"'s ID (BUT NOT ALWAYS, FOR FUNCTIONS)
• $? - exit status of last command run by the script
• PS1 - your "prompt". "PS1='$PWD:> '" is interesting.
• $1 to $9 - arguments 1 to 9 passed to your script or function

Tweaks with variables

Both bourne shell and KSH have lots of strange little tweaks you can do with the ${} operator. The ones I like are below.

To give a default value if and ONLY if a variable is not already set, use this construct:

APP_DIR=${APP_DIR:-/usr/local/bin}
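The "if and ONLY if" part is worth seeing in action. In this sketch, the second default is ignored, because the variable already has a value by then:

```shell
unset APP_DIR
APP_DIR=${APP_DIR:-/usr/local/bin}    # unset, so the default applies
echo $APP_DIR
APP_DIR=${APP_DIR:-/somewhere/else}   # already set: default is IGNORED
echo $APP_DIR
```

Both echo lines print /usr/local/bin.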

(KSH only) You can also get funky, by running an actual command to generate the value. For example

DATESTRING=${DATESTRING:-$(date)}

(KSH only) To count the number of characters contained in a variable string, use ${#varname}.

echo num of chars in stringvar is ${#stringvar}

Ksh and POSIX utilities

POSIX.2 compliant systems (eg: most current versions of UNIX) come with certain incredibly useful utilities. The short list is:

cat, cut, join, paste, comm, fmt, grep, egrep, sed, awk


Any of these commands (and many others) can be used within your shellscripts to manipulate data.

Some of these are programming languages themselves. Sed is fairly complex, and AWK is actually its own mini-programming language. So I'll just skim over some basic hints and tricks.

cut

"cut" is a small, lean version of what most people use awk for. It will "cut" a file up into columns, and particularly, only the columns you specify. Its drawbacks are:

1. It is picky about argument order. You MUST use the -d argument before the -f argument.
2. It defaults to a tab, SPECIFICALLY, as its delimiter of columns.

The first one is just irritating. The second one is a major drawback, if you want to be flexible about files. This is the reason why AWK is used more, even for this trivial type of operation: awk defaults to letting ANY whitespace define columns.

join

join is similar to a "database join" command, except it works with files. If you have two files, both with information sorted by username, you can "join" them into one file, IF and ONLY IF they are also sorted by that same field. For example

john_s John Smith

in one file, and

john_s 1234 marlebone rd

will be joined to make a single line,

john_s John Smith 1234 marlebone rd

If the files do not already have a common field, you could either use the paste utility to join the two files, or give each file line numbers before joining them, with

cat -n file1 >file1.numbered
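Here is the join example above as a runnable sketch, using throwaway files in /tmp:

```shell
# Two files sharing a sorted first field (the username)
printf 'john_s John Smith\n'        > /tmp/names.txt
printf 'john_s 1234 marlebone rd\n' > /tmp/addrs.txt
# join matches lines on the first field by default
join /tmp/names.txt /tmp/addrs.txt
rm -f /tmp/names.txt /tmp/addrs.txt
```

This prints the single combined line shown earlier.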

comm

I think of "comm" as being short for "compare", in a way. But technically, it stands for "common lines". First, run any two files through "sort". Then you can run 'comm file1 file2' to tell you which lines are ONLY in file1, or ONLY in file2, or both. Or any combination.

For example

comm -1 file1 file2

means "Do not show me lines ONLY in file1." Which is the same thing as saying "Show me lines that are ONLY in file2", and also "Show me lines that are in BOTH file1 and file2".
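A small runnable sketch; the input files are invented for the demo, and are already sorted:

```shell
printf 'apple\nbanana\n'  > /tmp/left.txt
printf 'banana\ncherry\n' > /tmp/right.txt
# -1 and -2 suppress the "only in file1" and "only in file2" columns,
# leaving just the lines common to BOTH files
comm -12 /tmp/left.txt /tmp/right.txt
rm -f /tmp/left.txt /tmp/right.txt
```

Only "banana" appears in both files, so that is all this prints.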


fmt

fmt is a simple command that takes some informational text file, and word-wraps it nicely to fit within the confines of a fixed-width terminal. Okay, it isn't so useful in shellscripts, but it's cool enough that I just wanted to mention it :-)

pr is similarly useful. But where fmt was more oriented towards paragraphs, pr is more specifically toward page-by-page formatting.

grep and egrep

These are two commands that have more depth to them than is presented here. But generally speaking, you can use them to find specific lines in a file (or files) that have information you care about. One of the more obvious uses of them is to find out user information. For example,

grep joeuser /etc/passwd

will give you the line in the passwd file that is about account 'joeuser'. If you are suitably paranoid, you would actually use

grep '^joeuser:' /etc/passwd

to make sure it did not accidentally pick up information about 'joeuser2' as well.

(Note: this is just an example: often, awk is more suitable than grep, for /etc/passwd fiddling)

sed

Sed actually has multiple uses, but its simplest use is "substitute this string, where you see that string". The syntax for this is

sed 's/oldstring/newstring/'

This will look at every line of input, and change the FIRST instance of "oldstring" to "newstring".

If you want it to change EVERY instance on a line, you must use the 'global' modifier at the end:

sed 's/oldstring/newstring/g'

If you want to substitute either an oldstring or a newstring that has slashes in it, you can use a different separator character:

sed 's:/old/path:/new/path:'

awk

Awk really deserves its own tutorial, since it is its own mini-language. And, it has one! But if you don't have time to look through it, the most common use for AWK is to print out specific columns of a file. You can specify what character separates columns. The default is 'whitespace' (space, or TAB). But the canonical example is, "How do I print out the first and fifth columns/fields of the password file?"

awk -F: '{print $1,$5}' /etc/passwd

"-F:" defines the "field separator" to be ':'

The bit between single-quotes is a mini-program that awk interprets. You can tell awk filename(s), after you tell it what program to run. OR you can use it in a pipe.

You must use single-quotes for the mini-program, to avoid $1 being expanded by the shell itself. In this case, you want awk to literally see '$1'

"$x" means the 'x'th column.

The comma is a quick way to say "put a space here". If you instead did

awk -F: '{print $1 $5}' /etc/passwd

awk would not put any space between the columns!
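A runnable version of this, with one made-up passwd-style line fed in on a pipe instead of reading the real /etc/passwd:

```shell
# $1 is the login name, $5 is the comment/GECOS field
printf 'root:x:0:0:Superuser:/root:/bin/sh\n' | awk -F: '{print $1,$5}'
```

This prints "root Superuser", the two fields separated by a single space.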

If you are interested in learning more about AWK, read my AWK tutorial

Ksh Functions

Functions are the key to writing just about ANY program that is longer than a page or so of text. Other languages may call functions something else. But essentially, it's all a matter of breaking up a large program into smaller, manageable chunks. Ideally, functions are sort of like 'objects' for program flow. You pick a part of your program that is pretty much self-contained, and make it into its own 'function'.

Why are functions critical? Properly written functions can exist by themselves, and affect only small things external to themselves. You should DOCUMENT what things a function changes external to itself. Then you can look very carefully just at the function, and determine whether it actually does what you think it does :-)

When your program isn't working properly (WHEN, not if), you can then put in little debug notes to yourself in the approximate section you think is broken. If you suspect a function is not working, then all you have to verify is:

• Is the INPUT to the function correct? • Is the OUTPUT from the function correct?

Once you have done that, you then know the entire function is correct, for that particular set of input(s), and you can look for errors elsewhere.

A trivial function


printmessage() {
    echo "Hello, this is the printmessage function"
}

printmessage

The first part, from the first "printmessage()" all the way through the final '}', is the function definition. It only defines what the function does, when you decide to call it. It does not DO anything, until you actually say "I want to call this function now".

You call a function in ksh, by pretending it is a regular command, as shown above. Just have the function name as the first part of your line. Or any other place commands go. For example,

echo The message is: `printmessage`

Remember: a function acts just like its own separate shellscript. Which means if you access "$1" in a function, it is the first argument passed in to the function, not the shellscript.

Debugging your functions

If you are really having difficulties, it should be easy to copy the entire function into another file, and test it separately from the main program.

This same type of modularity can be achieved by making separate script files, instead of functions. In some ways, that is almost preferable, because it is then easier to test each part by itself. But functions run much faster than separate shellscripts.

A nice way to start a large project is to start with multiple, separate shellscripts, but then encapsulate them into functions in your main script, once you are happy with how they work.

CRITICAL ISSUE: exit vs return

THE main difference when changing between shellscripts and functions, is the use of "exit".

'exit' will exit the entire script, whether it is in a function or not. 'return' will just quit the function. Like 'exit', however, it can return the default "success" value of 0, or any number from 1-255 that you specify. You can then check the return value of a function, just in the same way you can check the return value of an external program, with the $? variable.
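As a minimal, runnable sketch of checking a function's return value through $?:

```shell
# return an exit status from a function, then inspect it with $?
lessthanfour(){
    if [ "$1" -lt 4 ] ; then
        return 0
    fi
    return 1
}

lessthanfour 2
echo "status for 2: $?"
lessthanfour 9
echo "status for 9: $?"
```

The first call returns the "success" status 0, the second returns 1, and $? reports each one in turn.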

# This is just a dummy script. It does not DO anything
fatal(){
    echo FATAL ERROR
    # This will quit the 'fatal' function, and the entire script that
    # it is in!
    exit
}
lessthanfour(){
    if [[ "$1" = "" ]] ; then
        echo "hey, give me an argument" ; return 1;
    fi
    # we should use 'else' here, but this is just a demonstration
    if [[ $1 -lt 4 ]] ; then
        echo Argument is less than 4
        # We are DONE with this function. Don't do anything else in
        # here. But the shellscript will continue at the caller
        return
    fi
    echo Argument is equal to or GREATER than 4
    echo We could do other stuff if we wanted to now
}
echo note that the above functions are not even called. They are just
echo defined

A bare "return" in a shellscript is an error. It can only be used inside a function.

CRITICAL ISSUE: "scope" for function variables!

Be warned: Functions act almost just like external scripts... except that by default, all variables are SHARED between the same ksh process! If you change a variable name inside a function... that variable's value will still be changed after you have left the function!! Run this script to see what I mean.

#!/bin/sh
# Acts the same with /bin/sh, or /bin/ksh, or /bin/bash
subfunc(){
    echo sub: var starts as $var
    var=2
    echo sub: var is now $var
}

var=1
echo var starts as $var, before calling function '"subfunc"'
subfunc  # calls the function
echo var after function is now $var

To avoid this behaviour, and give what is known as "local scope" to a variable, you can use the typeset command, to define a variable as local to the function.

#!/bin/ksh
# You must use a modern sh like /bin/ksh, or /bin/bash for this
subfunc(){
    typeset var
    echo sub: var starts as $var '(empty)'
    var=2
    echo sub: var is now $var
}

var=1
echo var starts as $var, before calling function '"subfunc"'
subfunc  # calls the function
echo var after function is now $var

Another exception to this is if you call a function in the 'background', or as part of a pipe (like 'echo val | function'). This makes the function be called in a separate ksh process, which cannot dynamically share variables back to the parent shell. Another way that this happens is if you use backticks to call the function. This treats the function like an external call, and forks a new shell. This means the variable from the parent will not be updated. Eg:

func() {
    newval=$(($1 + 1))
    echo $newval
    echo in func: newval ends as $newval
}

newval=1
echo newval in main is $newval
output=`func $newval`
func $newval
echo output is : $output
echo newval finishes in main as $newval

Write Comments!

Lastly, as mentioned in the good practices chapter, don't forget to comment your functions! While shellscripts are generally easier to read than most programming languages, you really can't beat actual human language to explain what a function is doing!

Ksh built-in functions

Calling the many UNIX programs in /bin and elsewhere can be very useful. The one drawback is speed. Every time you call a separate program, there is a lot of overhead in starting it up. So the conscientious programmer always tries to use built-in functions over external ones. In particular, ksh programmers should always try to use '[[ ]]' over '[ ]', except where [ ] is necessary.

The more useful functions in ksh I find are:

• 'read' and 'set' functions
• built-in 'test': [[ ]] (But this is apparently NOT a part of POSIX!!)
• built-in math: $(( ))
• built-in 'typeset'

See the manpages for 'test' and 'typeset', if you want full info on those beasties.

Read and Set

read varname

will set the variable varname to have the value of the next line read in from standard input.

What often comes next, is

set $varname

This sets the argument variables $1, $2, etc to be set as if the program were called with $varname as the argument string to the shellscript. So, if the value of varname is "first second third", then $1="first", $2="second", and $3="third".
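For example, this little sketch is runnable as-is:

```shell
varname="first second third"
set $varname     # unquoted on purpose, so it splits into three words
echo "arg 1 is $1"
echo "arg 2 is $2"
echo "arg 3 is $3"
```

After the 'set', the positional parameters hold "first", "second", and "third", regardless of what the script was originally called with.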


Note that you can only access up to $9. There is no '$10'. If you wish to access the 10th argument, you first have to do 'shift 9', and access it as '$1'. What was previously the 11th argument will now be $2, etc.
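A quick sketch of that dance:

```shell
# eleven arguments; $1 starts out as 'one'
set one two three four five six seven eight nine ten eleven
shift 9          # throw away the first nine
echo "tenth argument was: $1"
echo "eleventh argument was: $2"
```

After 'shift 9', the 10th and 11th arguments have slid down into $1 and $2.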

The test function

In brief, 'test' can let you check the status of files OR string values. Here are the most common uses for it.

if [[ $? -ne 0 ]] ; then echo status is bad ; fi
if [[ "$var" != "good" ]] ; then echo var is not good ; fi
if [[ ! -f /file/name ]] ; then echo /file/name is not there ; fi
if [[ ! -d /dir/name ]] ; then echo /dir/name is not a directory ; fi

Please note that [[ ]] is a special built-in version of test, that is almost, but not 100%, like the standard [ ]. The main difference is that wildcard expansion does not work within [[ ]].

Built-in math

The math evaluator is very useful. Everything inside the double-parens gets evaluated with basic math functions. For example;

four=$((2 + 2))
eight=$(($four + 4))
print $(($four * $eight))

Warning: Some versions of ksh allow you to use floating point with $(()). Most do NOT.

Also, be wary of assumptions. Being "built in" is not always faster than an external program. For example, it is trivial to write a shell-only equivalent of the trivial awk usage, "awk '{print $2}'", to print the second column. However, compare them on a long file:

# function to emulate awk '{print $2}'
sh_awk(){
    while read one two three ; do
        print $two
    done
}

# and now, compare the speed of the two methods
time sh_awk </usr/dict/words >/dev/null
time awk '{print $2}' </usr/dict/words >/dev/null

The awk version will be much much faster. This is because ksh scripts are interpreted, each and every time it executes a line. AWK, however, loads up its programming in one go, and figures out what it is doing ONE TIME. Once that overhead has been put aside, it then can repeat its instructions very fast.


Redirection and Pipes

There are lots of strange and interesting ways to connect utilities together. Most of these you have probably already seen.

The standard redirect to file;

ls > /tmp/listing

and piping output from one command to another

ls | wc -l

But bourne-shell derivatives give you even more power than that.

Most properly written programs output in one of two ways.

1. Progress messages go to stdout, error messages go to stderr
2. Data goes to stdout, error AND progress messages go to stderr
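A tiny illustration of category 2, using a fake "utility" that writes its data to stdout and a progress note to stderr; throwing away stderr leaves only the data:

```shell
# The subshell stands in for some chatty utility
( echo "the data" ; echo "working..." >&2 ) 2>/dev/null
```

Run without the 2>/dev/null, you would see both lines; with it, only "the data" survives.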

If you know which of the categories your utilities fall into, you can do interesting things.

Redirection

An uncommon program to use for this example is the "fuser" program under Solaris. It gives you a long listing of what processes are using a particular file. For example:

$ fuser /bin/sh
/bin/sh: 13067tm 21262tm

If you wanted to see just the processes using that file, you might initially groan and wonder how best to parse it with awk or something. However, fuser actually splits up the data for you already. It puts the stuff you may not care about on stderr, and the meaty 'data' on stdout. So if you throw away stderr, with the '2>' special redirect, you get

$ fuser /bin/sh 2>/dev/null
13067 21262

which is then trivially usable.

Unfortunately, not all programs are that straightforward :-) However, it is good to be aware of these things, and also of status returns. The 'grep' command actually returns a status based on whether it found a line. The status of the last command is stored in the '$?' variable. So if all you care about is, "is 'biggles' in /etc/hosts?" you can do the following:

grep biggles /etc/hosts >/dev/null
if [[ $? -eq 0 ]] ; then
    echo YES
else
    echo NO
fi

As usual, there are lots of other ways to accomplish this task, even using the same 'grep' command. However, this method has the advantage that it does not waste OS cycles with a temp file, nor does it waste memory with a potentially very long variable. (If you were looking for something that could potentially match hundreds of lines, then var=`grep something /file/name` could get very long)
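One of those other ways, for the curious: since 'if' tests a command's exit status directly, the grep can go straight onto the 'if' line, with no separate $? check. This sketch uses a throwaway file rather than the real /etc/hosts, so the answer is predictable:

```shell
printf 'biggles 10.1.1.5\n' > /tmp/hosts.txt
# grep's own exit status drives the if: 0 means "found"
if grep biggles /tmp/hosts.txt >/dev/null ; then
    echo YES
else
    echo NO
fi
rm -f /tmp/hosts.txt
```

Same result, one less line, and still no temp files or long variables.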

Inline redirection

You have seen redirection TO a file. But you can also redirect input, FROM a file. For programs that can take data on stdin, this is useful. The 'wc' command can take a filename as an argument, or use stdin. So all the following are roughly equivalent in result, although internally, different things happen:

wc -l /etc/hosts
wc -l < /etc/hosts
cat /etc/hosts | wc -l

Additionally, if there are some fixed lines you want to use, and you do not want to bother making a temporary file, you can pretend part of your script is a separate file! This is done with the special '<<' redirect operator.

command << EOF

means, "run 'command', but make its stdin come from this file right here, until you see the string 'EOF'"

EOF is the traditional string. But you can actually use any unique string you want. Additionally, you can use variable expansion in this section!

DATE=`date`
HOST=`uname -n`
mailx -s 'long warning' root << EOF
Something went horribly wrong with system $HOST at $DATE
EOF
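One related trick worth knowing (standard shell behaviour, not specific to this example): if you do NOT want variable expansion inside the here-document, quote the delimiter string.

```shell
HOST=myhost

# Because 'EOF' is quoted here, $HOST is passed through literally
cat << 'EOF'
literal $HOST here, no expansion
EOF
```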

Pipes

In case you missed it before, pipes take the output of one command, and put it on the input of another command. You can actually string these together, as seen here:

grep hostspec /etc/hosts | awk '{print $1}' | grep '^10\.1\.' | wc -l

This is a fairly easy way to find which entries in /etc/hosts both match a particular pattern in their name, AND have a particular IP address range. (Note that this must be plain grep, not fgrep: fgrep matches fixed strings, so the '^' anchor would be taken literally.)

The "disadvantage" to this, is that it is very wasteful. Whenever you use more than one pipe at a time, you should wonder if there is a better way to do it. And indeed for this case, there most certainly IS a better way:


grep '^10\.1\..*hostspec' /etc/hosts | wc -l

There is actually a way to do this with a single awk command. But this is not a lesson on how to use AWK!
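For the curious, here is a sketch of what that single awk command might look like. The sample data and the "hostspec" name are made up; the point is just that awk can do the matching AND the counting itself:

```shell
# Count lines whose IP starts with 10.1. AND which mention "hostspec",
# feeding awk some fake hosts data on stdin instead of /etc/hosts
printf '10.1.2.3 hostspec-a\n192.168.0.1 hostspec-b\n10.1.9.9 other\n' |
    awk '/^10\.1\./ && /hostspec/ { count++ } END { print count+0 }'
```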

Combining pipes and redirection

An interesting example of pipes with stdin/err and redirection is the "tar" command. If you use "tar cvf file.tar dirname", it will create a tar file, and print out all the names of the files in dirname it is putting in the tarfile. It is also possible to take the same 'tar' data, and dump it to stdout. This is useful if you want to compress at the same time you are archiving:

tar cf - dirname | compress > file.tar.Z

But it is important to note that pipes by default only take the data on stdout! So it is possible to get an interactive view of the process, by using

tar cvf - dirname | compress > file.tar.Z

stdout has been redirected to the pipe, but stderr is still being displayed on your terminal, so you will get a file-by-file progress report. Or of course, you could redirect it somewhere else, with

tar cvf - dirname 2>/tmp/tarfile.list | compress > file.tar.Z

Indirect redirection

Additionally, there is a special type of pipes+redirection. This only works on systems with /dev/fd/X support. You can automatically generate a "fake" file as the result of a command that does not normally generate a file. The names of the fake files will be /dev/fd/{somenumberhere}

Here's an example that doesn't do anything useful:

wc -l <(print "hi") <(print "there")

wc will report that it saw two files, "/dev/fd/4" and "/dev/fd/5", and that each "file" had 1 line. From its own perspective, wc was called simply as

wc -l /dev/fd/4 /dev/fd/5

There are two useful components to this:

1. You can handle MULTIPLE commands' output at once
2. It's a quick-n-dirty way to create a pipeline out of a command that "requires" a filename (as long as it only processes its input in a single continuous stream).
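A classic use of this trick, as a sketch (again, this needs /dev/fd support; the syntax below works in ksh and bash): compare the output of two commands directly, with no temporary files.

```shell
# diff normally wants two FILES; the <( ) syntax fakes them up for us
diff <(printf 'a\nb\n') <(printf 'b\na\n' | sort) && echo identical
```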

Other Stuff

And here's stuff that I can't fit anywhere else :-)


• eval
• Backticks
• Text positioning/color/curses stuff
• Number-based menus
• Raw TCP access
• Graphics and ksh

eval

The eval command is a way to pretend you typed something in directly. This is a very dangerous command. Think carefully before using it.

One way of using eval, is to use an external command to set variables that you do not know the name of beforehand. Or a GROUP of variables. A common use of this, is to set terminal-size variables on login:

eval `resize`
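To see why eval is needed for that sort of thing, here is a contrived sketch: assigning to a variable whose NAME is itself stored in another variable, which is essentially what the output of `resize` does for LINES and COLUMNS. The variable names here are made up.

```shell
varname=MYCOLS        # the NAME of the variable we want to set
eval "$varname=80"    # without eval, the shell would not perform this assignment
echo $MYCOLS
```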

Backticks

There are ways to put the output of one command as the command line of another one. There are two methods of doing this that are basically equivalent:

echo This is the uptime: `uptime`
echo This is the uptime: $(uptime)

Technically, the second one is the POSIX-preferred one.

In addition to creating dynamic output, this is also very useful for setting variables:

datestring=`date`
echo "The current date is: $datestring"
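One concrete reason the $( ) form is preferred: it nests cleanly, where backticks need backslash escaping. A small demonstration:

```shell
a=`echo \`echo hello\``    # backtick nesting: escaping required
b=$(echo $(echo hello))    # $( ) nesting: reads naturally
echo $a $b
```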

Text positioning/color games

This is actually a huge topic, and almost deserves its own tutorial. But I'm just going to mention it briefly.

Some people may be familiar with the "curses" library. It is a way to manipulate and move around text on a screen, regardless of what kind of "terminal" the user is using.

As mentioned, this is a potentially huge topic. So, I'm just going to give you a trivial example, and say "Go read the man-page on tput". Well, okay, actually, you have to read the "tput" manpage, AND either the "terminfo" or "termcap" manpage to figure out what magical 3-5 letter name to use. For example, it should tell you that "cup" is short for the "cursor_address" command. But you must use "cup", NOT "cursor_address", with tput.


tput init
tput clear
tput cup 3 2
print -n "Here is a clean screen, with these words near the top"
endline=`tput lines`
tput cup $(($endline - 2))
print "and now, back to you"
sleep 2

The above example clears the screen, prints the given line at a SPECIFIC place on the screen, then puts the cursor back down near the bottom of the screen for you.

PS: If you've been doing a lot of funky things with the screen, you might want to do a

tput reset

as the last thing before your shellscript exits.

Number-based menus

You don’t have to build your own "choose a number" function: ksh has one already! But note that it returns the value of the line, not the number of the line.

select word in one two three exit
do
    echo word is $word
    echo reply is $REPLY
    if [[ "$word" = "exit" ]] ; then
        break
    fi
done

This will print out a mini-menu like the following:

1) one
2) two
3) three
4) exit
#?

Note that this will loop between "do ... done" until you trigger a break somehow! (or until the user control-c's or whatever). So don’t forget an exit condition!

Raw TCP access

Ksh88 has a built-in virtual filesystem that looks like it is under /dev/tcp. You can use it to create connections to specific ports, if you know the IP address.

Here is a trivial example that just opens up a connection to an SMTP server. Note that the connection is half-duplex: You do NOT see data that you send to the other side.


#!/bin/ksh -p

MAILHOST=127.0.0.1

exec 3<>/dev/tcp/$MAILHOST/25 || exit 1
read -r BANNER <&3
echo BANNER is $BANNER
print -u3 HELO myhost.com
read -r REPLY <&3
echo REPLY is $REPLY

The output will look something like the following:

BANNER is 220 yourhost.domain.com ESMTP Sendmail 8.11.6+Sun/8.11.6; Tue, 3 Dec 2002 17:30:01 -0800 (PST)
REPLY is 250 yourhost.domain.com Hello localhost [127.0.0.1], pleased to meet you

Note that we use the "-r" flag to read. In this particular example, it is not necessary, but in the general case, it will give you the data "raw". Be warned that if the shell cannot open the port, it will kill your entire script, with status 1, automatically.

You can also dump the rest of the data waiting on the socket, to wherever you like, by doing

cat <&3 > somefile

Graphics and ksh

Not many people are aware of this, but there is actually a graphical version of ksh, called "dtksh". It was created as part of "CDE". Any of the modern UNIX(tm)es should come with it, in /usr/dt/bin/dtksh. If you are interested, take a look at some dtksh demos that someone else has written. And/or, you might see if you have a /usr/dt/share/examples/dtksh/ directory present on your machine.

Paranoia, and good programming practices

First, a bit of good programming practice: Comment your code. You really should at MINIMUM have some comment about every page (that's every 24 lines). Ideally, you should always comment all your functions. One-line functions can probably stand by themselves, but otherwise, a quick single-line comment is a good thing to have for small functions.

# This function cleans up the file before printing
pcleanup(){
    ....
}

For longer functions, you should really use a formal comment spec. Something like:


# Function xyz
# Usage: xyz arg1 arg2 arg3 arg4
#   arg1 is the device
#   arg2 is a file
#   arg3 is how badly you want to mangle it
#   arg4 is an optional output file
# Result: the file will be transferred to the device, with the
#   appropriate ioctls. Any err messages will be saved in the output
#   file, or to stderr otherwise
xyz(){
    ...
}

Note that shellscripts are themselves one large "function". So don't forget basic comments on your shellscript's functionality at the top of the file!

INDENT!

Every time you start a function, indent.

Every time you are in a while loop, or if clause, indent.

This makes it easier to see at a glance what level you are at.

# top level
print this is the top

somefunction(){
    # indent in functions
    print we are in the function now
    if [[ "$1" != "" ]] ; then
        # yes, ANOTHER indent!
        print "Hey, we have an argument: $1"
        print full args are $*
    else
        # indent again!!
        print "oh well. no arguments to play with"
    fi
    print leaving somefunction now
}

# And now we can clearly see that all this stuff is outside any function.
# This makes it easier to find the "main line" of the script
print original shellscript args are $0
print lets try somefunction
somefunction heres some args

Error checking

If you are using scripts to check on important systems, or perform crucial functions, it is very important to provide feedback to the calling process when and if something goes wrong.


The simplest method of doing this, is a simple

exit 1 # (but it would be nice to print out SOME message before exiting!)

Nice programs will notice that your script exited with a non-zero status. [Remember, the status of the last command is in '$?'] Ideally, they will complain. On the other hand, sometimes your own scripts are the ones that are doing the calling!

In that type of situation, it may be suitable to have a top-level script that keeps an eye on things. A simple example is:

fatal(){
    # Something went horribly wrong.
    # print out an error message if provided, then exit with an
    # "error" status
    if [[ "$1" != "" ]] ; then
        print Something went horribly wrong with "$1"
    else
        print "something went horribly wrong (Somewhere?)"
    fi
    # You might also want to SEND EMAIL, or trigger a PAGER or
    # something here:
    #   mailx -s "Arrrg! we failed checks!" [email protected] < /dev/null
    exit 1
}

check_on_modems
if [[ $? -ne 0 ]] ; then fatal modems ; fi

check_on_network
if [[ $? -ne 0 ]] ; then fatal network ; fi

check_on_servers
if [[ $? -ne 0 ]] ; then fatal servers ; fi

Note that even my paranoid 'fatal' function IS PARANOID! Normally, it is assumed you will call it with "fatal what-failed". But if you somehow don't, it notices, and provides a default.

Sometimes, making the assumption that $1 contains valid data can completely screw up the rest of your function or script. So if it is an important function, assume nothing!

This is particularly true of CGI scripts. [Gasp]. Yes, Virginia, it IS possible to write CGI in something other than perl.
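Here is a small sketch of that "assume nothing" attitude, using a made-up function: if $1 turns out to be empty, substitute a safe default instead of letting the emptiness propagate.

```shell
report(){
    if [ "$1" = "" ] ; then
        # no argument given: fall back to a default rather than misbehave
        set -- "unknown-subsystem"
    fi
    echo "reporting on $1"
}

report           # prints: reporting on unknown-subsystem
report disks     # prints: reporting on disks
```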


cron job paranoia

If you are coding a script that specifically runs as a cron job, there are two things to be aware of. The number one most important thing is:

Set your PATH variable!!!

People always forget this one. It doesn't matter what is in your .profile; cron will reset the PATH variable to something really short and stupid, like just /usr/bin. So set your PATH=/whatever:/usr/bin explicitly in your cron scripts.
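So the first few lines of a cron script typically look something like this (the directories here are only an example; use whatever your script actually needs):

```shell
#!/bin/ksh
# cron gives us almost no PATH, so set one explicitly, right at the top
PATH=/usr/local/bin:/usr/bin:/bin
export PATH

# ... rest of the cron job goes here ...
```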

The second tip is more an advisory note.

cron by default saves anything that gets sent to 'stderr', and MAILS IT to the owner of the cron job. So, sometimes, if you just want a minor error logged somewhere, it is sufficient to just do

print "Hey, this is kinda wierd" >/dev/fd/2

which will send the output of the print to stderr, which will then go to email. Another way of doing it, if your system does not have /dev/fd, could be

print "Wow, this is tortured" 1>&2

Contrariwise, if you want to throw away ALL output, use

command >/dev/null 2>&1
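One note of caution here, which trips up almost everyone eventually: redirections are processed left to right, so '2>&1 >/dev/null' does NOT do the same thing (stderr would still reach your terminal, or your email). A quick demonstration that the order shown above silences both streams:

```shell
# both the stdout line and the stderr line vanish
{ echo to-stdout ; echo to-stderr 1>&2 ; } >/dev/null 2>&1

echo "nothing escaped above this line"
```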

If you do not regularly read email for the user in question, you can either set up an alias for that user, to forward all its email to you, or do

export MAILTO=yourname@yourdomain.com

The MAILTO trick does not work on all cron daemons, however.

Example of script development

The good, the bad, and the ugly

Hopefully, you have read through all the other chapters by this point. This page will show you the "big picture" of shellscript writing.

Here are four versions of essentially the same program: a wrapper to edit a file under SCCS version control. The basic task is to use the sccs command to "check out" a file under version control, and then automatically edit the file. The script will then be used by "users", aka coders, who may not be particularly advanced UNIX users. Hence, the need for a wrapper script.

While the basic functionality is the same across all versions, the differences in safety and usability between the first version and the last version are staggering.


The first one is extremely bad: it would be written by someone who has just picked up a book on shellscripting, and has decided, "I'm a programmer now".

The second one is an improvement, showing some consideration to potential users by having safety checks.

The third one is a good, solid version. It's a positive role model for your scripts.

The final one is a full-blown, paranoid, commented program unto itself, with whiz-bang features. This is the way a professional programmer would write it. Don't let that put you off: it's the way you can and should write it too! You might start with a version like the initial dumb one, as the initial step in your code development, just to make sure you have the basic functionality for the task. But after it is shown to work, you should upgrade it to a more reasonable one immediately.

The newbie programmer version

#!/bin/ksh
sccs edit $1
if [ "$EDITOR" = "" ] ; then
    EDITOR=vi
fi
$EDITOR $1

This version makes somewhat of an attempt to be user friendly, by having a check for a user-specified EDITOR setting, and using it if available. However, there are no comments, no error checking, and no help for the user whatsoever!

The sysadmin-in-training version

#!/bin/ksh
# Author: Joe Newguy

if [ $# -lt 1 ] ; then
    print "This program will check out a file, or files, with sccs"
    exit 1
fi

# set EDITOR var to "vi" if not already set
EDITOR=${EDITOR:-vi}

sccs edit $@
$EDITOR $@


This is somewhat of a step above the prior version. It accepts multiple files as potential arguments. It's always nice to be flexible about the number of files your scripts can handle. It also has a usage message, if the script is called without arguments.

Unfortunately, there is still quite a bit lacking, as you can tell by comparing it to the next version.

The Senior Admin version

#!/bin/ksh
# SCCS editing wrapper, version 0.3
# Author - Sys Admin
# Usage: see usage() function, below

usage(){
    print sedit - a wrapper to edit files under SCCS
    print "usage: sedit file {file2 ...}"
}

# Set EDITOR var to "vi" if not already set to something
EDITOR=${EDITOR:-vi}

# Could already be in path, but it doesn't hurt to add it again.
# Sorry, I assume solaris machines everywhere: adjust as needed.
PATH=$PATH:/usr/ccs/bin

if [ ! -x /usr/ccs/bin/sccs ] ; then
    print ERROR: sccs not installed on this machine. Cannot continue.
    usage
    exit 1
fi

if [ $# -lt 1 ] ; then
    usage
    print ERROR: no files specified
    exit 1
fi

# Yes, I could use "sccs edit $@" and check for a single error, but this
# approach allows for finer error reporting
for f in $@ ; do
    sccs edit $f
    if [ $? -ne 0 ] ; then
        print ERROR checking out file $f
        if [ "$filelist" != "" ] ; then
            print "Have checked out $filelist"
        fi
        exit 1
    fi
    filelist="$filelist $f"
done

$EDITOR $filelist
if [ $? -ne 0 ] ; then


    print ERROR: $EDITOR returned error status
    exit 1
fi

This guy has been around the block a few times. He's a responsible sysadmin, who likes to be disaster-prepared. In this case, the most likely "disaster" is 100 calls from developers asking "Why doesn't it work for me?" So when things break, it's a good idea to provide as much information as possible to the user.

Nice things to note:

• Sets special variables at the top of the script, in a unified place
• Paranoid checks about EVERYTHING
• Returns an error status from the script, on non-clean exit conditions ("exit 1", vs "exit")
• Use of comments. Not only does he specify what he is doing, he clarifies what he is NOT doing.

Compare and contrast the first version of the program, to this one. Then try to make your own scripts be more like this!

The Senior Systems Programmer version

#!/bin/ksh
# SCCS editing wrapper, version 1.3
# Author - Phil Brown
# Usage: see usage() function, below

usage(){
    print sedit - a wrapper to edit files under SCCS
    print "Usage: sedit [-c|-C] [-f] file {file2 ...}"
    print "   -c   check in file(s) after edit is complete"
    print "   -C   check in all files with single revision message"
    print "   -f   ignore errors in checkout"
}

# Set EDITOR var to "vi" if not already set to something
EDITOR=${EDITOR:-vi}

# Could already be in path, but it doesn't hurt to add it again.
# Sorry, I assume solaris machines everywhere: adjust as needed.
PATH=$PATH:/usr/ccs/bin

if [ ! -x /usr/ccs/bin/sccs ] ; then
    print ERROR: sccs not installed on this machine. Cannot continue.
    usage
    exit 1
fi


while getopts "cCfh" arg
do
    case $arg in
        c) checkin="yes" ;;
        C) checkinall="yes" ;;
        f) force="yes" ;;
        h|*) usage
             exit 1
             ;;
    esac
done
shift $(($OPTIND - 1))

if [ $# -lt 1 ] ; then
    usage
    print ERROR: no files specified
    exit 1
fi

if [ "$checkinall" != "" ] && [ "$checkin" != "" ] ; then
    print WARNING: -c and -C used. Will use -C.
fi

# Yes, I could use "sccs edit $@" and check for a single error, but this
# approach allows for finer error reporting.
# "$@" is a special construct that catches spaces in filenames.
# Note that "$*" is NOT 100% the same thing.
for f in "$@" ; do
    sccs edit "$f"
    if [ $? -ne 0 ] ; then
        print ERROR checking out file $f
        if [ "$force" = "" ] ; then
            if [ "$filelist" != "" ] ; then
                print "Have checked out $filelist"
            fi
            exit 1
        fi
        # else, -f is in effect. Keep going
    fi
    filelist="$filelist $f"
done

# I would like to use "$filelist", but that does not preserve spaces
# in file names
$EDITOR "$@"
if [ $? -ne 0 ] ; then


    print ERROR: $EDITOR returned error status
    exit 1
fi

if [ "$checkinall" != "" ] ; then
    # -C option used. re-check in all files at once.
    sccs delget $filelist
    if [ $? -ne 0 ] ; then
        print "ERROR checking in files?"
        exit 1
    fi
    exit 0
fi

if [ "$checkin" != "" ] ; then
    # -c option used. re-check in each file.
    for file in $filelist ; do
        sccs delget $file
        if [ $? -ne 0 ] ; then
            print "WARNING: failed to check in $file"
        fi
        # do NOT stop after error. Keep trying to check
        # in any other files
    done
fi

This guy has been around the block a few times. Heck, he helped BUILD the block ;-)

This was originally my third and final version. It's the way I would really write the script. But I decided it might be a bit daunting to new scripting folks, so I made a new intermediate third step, above.

Additional things beyond the previous version:

• Provides optional extra functionality, where it makes sense. Shows understanding of writing scripts, AND understanding of the area of the task (SCCS version control)

• Use of the 'getopts' standard util, rather than hand-coding a custom argument parser

Summary of positive features

Here is a summary of all the positives added through the different versions.

• COMMENT YOUR SCRIPTS!!!
• Check for user-environment variables, where appropriate
• When setting/overriding special variables, do it at the top of the script, in a unified place
• Accept one OR MORE files, when it makes sense to do so
• Have a "usage" message
• Be paranoid: check return statuses
• Return correct statuses yourself ("exit 1", vs "exit")
• Take advantage of 'getopts' when adding optional functionality


Korn Shell Notes (Man pages)

The Korn shell has some of the most advanced programming capabilities of any command interpreter of its type. Although its syntax is nowhere near as elegant or consistent as that of most conventional programming languages, its power and flexibility are comparable. In fact, the Korn shell can be used as a complete environment for writing software prototypes.

A script, or file that contains shell commands, is a shell program. Your .profile and .kshrc files are shell scripts.

You can execute a shell script in four ways (besides just typing the commands at the shell prompt):

    . file
    ksh -f file.ksh
    file     (if it has execute permission)
    file &   (if it has execute permission)

The first two are basically the same, as are the last two, as far as the way they are handled by the shell. The first two are executed in the calling shell, while the last two run another copy of the shell as a subprocess, called a subshell. The subshell then takes commands from the script, runs them, and terminates, handing control back to the parent shell.

An important ramification of subshells is that some information from the parent shell is known by the subshell, but subprocesses can never make any information known to the process that created them. The pieces of information from the parent shell that are known by the subshell are:

    exported environment variables
    built-in functions
    aliases, options and variables from your .kshrc file
    functions from your .kshrc file, if they are exported (typeset -fx)

This is a good time to talk about the order of precedence for the various sources of commands. When you type a command to the shell, it looks in the following places until it finds a match:

    keywords such as function, if, for etc.
    aliases
    built-ins like cd, whence and functions
    functions
    scripts and executable programs in your PATH

To find out the exact source of a command, use:

    whence -v command

OPTIONS

The Korn shell has a number of options that can be set. If they are set


in the .kshrc file then they are global and can be passed to subshells.

To list all the Korn shell options use:    set -o
To set an option use:                      set -o optionname
To unset an option use:                    set +o optionname

FUNCTIONS

What are functions? They are sort of a script-within-a-script. They are always in the shell's memory and therefore run faster. They are similar to subroutines or procedures. You can just type them in at the command line or put them in your .kshrc file. The structure of a function is either:

function funcname {
    shell commands
}

or:

funcname () {
    shell commands
}
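Both forms behave the same for everyday use. A tiny complete example (using echo rather than print, so it also runs under a plain POSIX shell):

```shell
greet () {
    echo "hello, $1"
}

greet world     # prints: hello, world
```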

To remove a function use:           unset -f funcname
To list all current functions use:  functions
To export a function use:           typeset -fx funcname
To unexport a function use:         typeset +fx funcname

ALIASES

The syntax of an alias is:    alias aliasname=command

Notice there are no spaces on either side of the =; this is required syntax. If the command being aliased contains more than one word, it has to be surrounded by single quotes. For example:

    alias ll='ls -l'

To unset or remove an alias use:    unalias aliasname
To list all current aliases use:


    alias

DEBUGGING

There are two ways to debug a shell script and one way to debug a function. The two ways to debug a shell script are:

    set -o xtrace           - gives a low level of debugging (set +o xtrace turns it off)
    ksh -x shellprog args   - gives a much higher level of debugging

To debug a function you use the typeset command with function trace on:

    typeset -ft funcname    (typeset +ft funcname turns it off again)

PROMPTS

This brings me to the subject of prompts. There are four prompts in the Korn shell, each with a different purpose:

    PS1  the main shell prompt (default "$ ")
    PS2  the secondary or continuation prompt (default "> ")
    PS3  the selection prompt used within a select loop (default "#? ")
    PS4  the debug prompt (default "+ ")

Most people personalise PS1, but rarely touch the others. Suggestions:

    PS2='more?> ' ; export PS2
    PS3 is best set in a shell script, to reflect the variable being requested
    PS4='debug+> ' ; export PS4

BUILT-IN or COMMAND LINE VARIABLES

    $#    Number of command-line arguments
    $?    Exit value of last executed command
    $$    Process number of current process
    $!    Process number of last background command
    $0    First word, that is, the command name
    $n    Individual arguments on the command line (positional parameters).
          You can have more than nine parameters if specified as ${n}
    $*    All arguments on the command line ("$1 $2...")
    "$@"  All arguments on the command line, individually quoted ("$1" "$2" ...)
    $_    Used interactively, stores the last argument of the previous command

Logical & Arithmetic Operators

One of the more obscure parts of the Korn shell syntax allows you to combine exit statuses logically, so that you can test more than one thing at a time. The syntax

    statement1 && statement2

means: execute statement1 and, if its exit status is 0 (successful), then execute statement2.


    statement1 || statement2

means: execute statement1 and, if its exit status is not 0, then execute statement2. For example:

filename=$1
word1=$2
word2=$3

if grep $word1 $filename && grep $word2 $filename
then
    print "$word1 and $word2 are both in $filename"
fi
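The same operators also make handy one-line conditionals on their own, since they work between ANY two commands:

```shell
true  && echo "ran because the first command succeeded"
false || echo "ran because the first command failed"
```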

ARITHMETIC FUNCTIONS

The Korn shell is the only shell to have the arithmetic expression feature built in to its syntax. Korn shell arithmetic expressions are equivalent to their counterparts in the C language. The syntax for arithmetic expressions is:

    $((....))

The arithmetic operators are:

    Operator   Meaning
    +          Plus
    -          Minus
    *          Multiply
    /          Divide (with truncation)
    %          Remainder
    <<         Bit-shift left
    >>         Bit-shift right
    &          Bitwise and
    |          Bitwise or
    ~          Bitwise not
    ^          Bitwise exclusive or

Parentheses can be used to group subexpressions. The arithmetic expression syntax also (like C) supports relational operators as "truth values" of 1 for true and 0 for false. The following table shows the relational operators, and the logical operators that can be used to combine relational expressions:

    Operator   Meaning
    <          Less than
    >          Greater than
    <=         Less than or equal to
    >=         Greater than or equal to
    ==         Equal to
    !=         Not equal to
    &&         Logical and
    ||         Logical or

For example, $((3>2)) has the value 1; $(( (3>2) || (4<=1) )) also has the value 1, since at least one of the two expressions is true.

The shell also supports base N numbers, where N is between 2 and


36. The notation B#N means "N base B". Of course, if you omit the B#, the base defaults to 10. For example:

    print $((2#100101))    yields 37
    print $((16#ff))       yields 255

Another built-in variable that is not strictly part of the arithmetic functions, but is usefully dealt with here, is the RANDOM variable. Each time this variable is referenced, a random integer, uniformly distributed between 0 and 32767, is generated. The sequence of random numbers can be initialised by assigning a numeric value to RANDOM. An example of its use, to generate a random number between 1 and 10, would be:

    $(($RANDOM%10+1))

This takes the remainder of the random number when divided by 10, giving 0 to 9, and then adds 1 to get 1 to 10.
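A few more base-N conversions, following the B#N notation described above:

```shell
echo $((2#1010))    # binary 1010 -> prints 10
echo $((8#17))      # octal 17    -> prints 15
echo $((16#a))      # hex a       -> prints 10
```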

Flow Control

A command is either a simple-command or one of the following. Unless otherwise stated, the value returned by a command is that of the last simple-command executed in the command.

for identifier [ in word ... ] ;do list ;done
    Each time a for command is executed, identifier is set to the next word taken from the in word list. If in word ... is omitted, then the for command executes the do list once for each positional parameter that is set (see Parameter Substitution below). Execution ends when there are no more words in the list.

select identifier [ in word ... ] ;do list ;done
    A select command prints to standard error (file descriptor 2) the set of words, each preceded by a number. If in word ... is omitted, then the positional parameters are used instead (see Parameter Substitution below). The PS3 prompt is printed and a line is read from the standard input. If this line consists of the number of one of the listed words, then the value of the variable identifier is set to the word corresponding to this number. If this line is empty, the selection list is printed again. Otherwise the value of the variable identifier is set to NULL. (See Blank Interpretation about NULL.) The contents of the line read from standard input is saved in the shell variable REPLY. The list is executed for each selection until a break or end-of-file is encountered. If the REPLY variable is set to NULL by the execution of list, then the selection list is printed before displaying the PS3 prompt for the next selection.

case word in [ [(]pattern [ | pattern ] ... ) list ;; ] ... esac
    A case command executes the list associated with the first pattern that matches word. The form of the patterns is the same as that used for file-name generation (see File Name Generation below).

if list ;then list [ elif list ;then list ] ... [ ;else list ] ;fi
    The list following if is executed and, if it returns an exit status of zero, the list following the first then is executed. Otherwise, the list following elif is executed and, if its value is zero, the list following the next then is executed. Failing that, the else list is executed. If no else list or then list is executed, then the if command returns a zero exit status.

while list ;do list ;done
until list ;do list ;done
    A while command repeatedly executes the while list and, if the exit status of the last command in the list is zero, executes the do list; otherwise the loop terminates. If no commands in the do list are executed, then the while command returns a zero exit status; until may be used in place of while to negate the loop termination test.

(list)
    Execute list in a separate environment. Note that if two adjacent open parentheses are needed for nesting, a space must be inserted to avoid arithmetic evaluation as described below.

{list}
    list is simply executed. Note that unlike the meta-characters ( and ), { and } are reserved words and must occur at the beginning of a line or after a ; in order to be recognized.

[[expression]]
    Evaluates expression and returns a zero exit status when expression is true. See Conditional Expressions below, for a description of expression.

identifier() { list ;}
    Define a function which is referenced by identifier. The body of the function is the list of commands between { and }. (See Functions below.)

time pipeline
    The pipeline is executed and the elapsed time as well as the user and system time are printed to standard error.

The following reserved words are only recognized as the first word of a command and when not quoted:

    if then else elif fi case esac for while until do done { } function select time [[ ]]

Comments

A word beginning with # causes that word and all the following characters up to a new-line to be ignored.


Conditional Expressions

A conditional expression is used with the [[ compound command to test attributes of files and to compare strings. Word splitting and file name generation are not performed on the words between [[ and ]]. Each expression can be constructed from one or more of the following unary or binary expressions:

    -a file     True, if file exists.
    -b file     True, if file exists and is a block special file.
    -c file     True, if file exists and is a character special file.
    -d file     True, if file exists and is a directory.
    -f file     True, if file exists and is an ordinary file.
    -g file     True, if file exists and has its setgid bit set.
    -k file     True, if file exists and has its sticky bit set.
    -n string   True, if length of string is non-zero.
    -o option   True, if option named option is on.
    -p file     True, if file exists and is a fifo special file or a pipe.
    -r file     True, if file exists and is readable by the current process.
    -s file     True, if file exists and has size greater than zero.
    -t fildes   True, if file descriptor number fildes is open and associated with a terminal device.
    -u file     True, if file exists and has its setuid bit set.
    -w file     True, if file exists and is writable by the current process.
    -x file     True, if file exists and is executable by the current process. If file exists and is a directory, then the current process has permission to search in the directory.
    -z string   True, if length of string is zero.
    -L file     True, if file exists and is a symbolic link.
    -O file     True, if file exists and is owned by the effective user id of this process.
    -G file     True, if file exists and its group matches the effective group id of this process.
    -S file     True, if file exists and is a socket.

    file1 -nt file2   True, if file1 exists and is newer than file2.
    file1 -ot file2   True, if file1 exists and is older than file2.
    file1 -ef file2   True, if file1 and file2 exist and refer to the same file.


    string = pattern     True, if string matches pattern.
    string != pattern    True, if string does not match pattern.
    string1 < string2    True, if string1 comes before string2 based on the ASCII value of their characters.
    string1 > string2    True, if string1 comes after string2 based on the ASCII value of their characters.
    exp1 -eq exp2        True, if exp1 is equal to exp2.
    exp1 -ne exp2        True, if exp1 is not equal to exp2.
    exp1 -lt exp2        True, if exp1 is less than exp2.
    exp1 -gt exp2        True, if exp1 is greater than exp2.
    exp1 -le exp2        True, if exp1 is less than or equal to exp2.
    exp1 -ge exp2        True, if exp1 is greater than or equal to exp2.

In each of the above expressions, if file is of the form /dev/fd/n, where n is an integer, then the test is applied to the open file whose descriptor number is n. A compound expression can be constructed from these primitives by using any of the following, listed in decreasing order of precedence:

    (expression)                  True, if expression is true. Used to group expressions.
    ! expression                  True, if expression is false.
    expression1 && expression2    True, if expression1 and expression2 are both true.
    expression1 || expression2    True, if either expression1 or expression2 is true.
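A short sketch contrasting string ordering and pattern tests with the arithmetic comparisons. The compare function and sample words are illustrative:

```shell
# compare: string ordering, pattern match, and -lt arithmetic test in [[ ]]
compare() {
    s1=$1 s2=$2
    [[ $s1 < $s2 ]] && echo "$s1 sorts before $s2"
    [[ $s1 = a* ]] && echo "$s1 matches pattern a*"
    [[ ${#s1} -lt ${#s2} ]] && echo "$s1 is the shorter word"
    return 0
}
compare apple banana
```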

Commands

alias [ -tx ] [ name[=value] ] ...
    alias with no arguments prints the list of aliases in the form name=value on standard output. An alias is defined for each name whose value is given. A trailing space in value causes the next word to be checked for alias substitution. The -t flag is used to set and list tracked aliases. The value of a tracked alias is the full pathname corresponding to the given name. The value becomes undefined when the value of PATH is reset, but the aliases remain tracked. Without the -t flag, for each name in the argument list for which no value is given, the name and value of the alias is printed. The -x flag is used to set or print exported aliases. An exported alias is defined for scripts invoked by name. The exit status is non-zero if a name is given, but no value, and no alias has been defined for the name.

bg [ %job... ]


    This command is only on systems that support job control. Puts each specified job into the background. The current job is put in the background if job is not specified. See Jobs for a description of the format of job.

break [ n ]
    Exit from the enclosing for, while, until, or select loop, if any. If n is specified, then break n levels.

continue [ n ]
    Resume the next iteration of the enclosing for, while, until, or select loop. If n is specified, then resume at the n-th enclosing loop.

cd [ arg ]
cd old new
    This command can be in either of two forms. In the first form it changes the current directory to arg. If arg is -, the directory is changed to the previous directory. The shell variable HOME is the default arg. The variable PWD is set to the current directory. The shell variable CDPATH defines the search path for the directory containing arg. Alternative directory names are separated by a colon (:). The default path is null (specifying the current directory). Note that the current directory is specified by a null path name, which can appear immediately after the equal sign or between the colon delimiters anywhere else in the path list. If arg begins with a / then the search path is not used. Otherwise, each directory in the path is searched for arg. The second form of cd substitutes the string new for the string old in the current directory name, PWD, and tries to change to this new directory. The cd command may not be executed by rksh.

echo [ arg ... ]
    See echo(1) for usage and description.

eval [ arg ... ]
    The arguments are read as input to the shell and the resulting command(s) executed.

exec [ arg ... ]
    If arg is given, the command specified by the arguments is executed in place of this shell without creating a new process. Input/output arguments may appear and affect the current process. If no arguments are given, the effect of this command is to modify file descriptors as prescribed by the input/output redirection list. In this case, any file descriptor numbers greater than 2 that are opened with this mechanism are closed when invoking another program.

exit [ n ]
    Causes the shell to exit with the exit status specified by n. The value will be the least significant 8 bits of the specified status. If n is omitted, then the exit status is that of the last command executed. When exit occurs while executing a trap, the last command refers to the command that executed before


the trap was invoked. An end-of-file will also cause the shell to exit, except for a shell which has the ignoreeof option (see set below) turned on.

export [ name[=value] ] ...
    The given names are marked for automatic export to the environment of subsequently-executed commands.

fc [ -e ename ] [ -nlr ] [ first [ last ] ]
fc -e - [ old=new ] [ command ]
    In the first form, a range of commands from first to last is selected from the last HISTSIZE commands that were typed at the terminal. The arguments first and last may be specified as a number or as a string. A string is used to locate the most recent command starting with the given string. A negative number is used as an offset to the current command number. If the -l flag is selected, the commands are listed on standard output. Otherwise, the editor program ename is invoked on a file containing these keyboard commands. If ename is not supplied, then the value of the variable FCEDIT (default /bin/ed) is used as the editor. When editing is complete, the edited command(s) is executed. If last is not specified, then it will be set to first. If first is not specified, the default is the previous command for editing and -16 for listing. The flag -r reverses the order of the commands and the flag -n suppresses command numbers when listing. In the second form the command is re-executed after the substitution old=new is performed.

fg [ %job... ]
    This command is only on systems that support job control. Each job specified is brought to the foreground. Otherwise, the current job is brought into the foreground. See Jobs for a description of the format of job.

getopts optstring name [ arg ... ]
    Checks arg for legal options. If arg is omitted, the positional parameters are used. An option argument begins with a + or a -. An option not beginning with + or -, or the argument --, ends the options. optstring contains the letters that getopts recognizes. If a letter is followed by a :, that option is expected to have an argument. The options can be separated from the argument by blanks. getopts places the next option letter it finds inside variable name each time it is invoked, with a + prepended when arg begins with a +. The index of the next arg is stored in OPTIND. The option argument, if any, gets stored in OPTARG. A leading : in optstring causes getopts to store the letter of an invalid option in OPTARG, and to set name to ? for an unknown option and to : when a required option argument is missing. Otherwise, getopts prints an error message. The exit status is non-zero when there are no more options. See getoptcvt(1) for usage and description.

jobs [ -lnp ] [ %job ... ]
    Lists information about each given job, or all active jobs if job is omitted. The -l flag lists process ids in addition to the normal information. The -n flag only displays jobs that have stopped or exited


since last notified. The -p flag causes only the process group to be listed. See Jobs for a description of the format of job.

kill [ -sig ] %job ...
kill -l
    Sends either the TERM (terminate) signal or the specified signal to the specified jobs or processes. Signals are either given by number or by name (as given in signal(5), stripped of the prefix ``SIG'', with the exception that SIGCHLD is named CHLD). If the signal being sent is TERM (terminate) or HUP (hangup), then the job or process will be sent a CONT (continue) signal if it is stopped. The argument job can be the process id of a process that is not a member of one of the active jobs. See Jobs for a description of the format of job. In the second form, kill -l, the signal numbers and names are listed.

let arg...
    Each arg is a separate arithmetic expression to be evaluated. See Arithmetic Evaluation above for a description of arithmetic expression evaluation. The exit status is 0 if the value of the last expression is non-zero, and 1 otherwise.

newgrp [ arg ... ]
    Equivalent to exec /bin/newgrp arg ....

print [ -Rnprsu[n] ] [ arg ... ]
    The shell output mechanism. With no flags or with flag - or --, the arguments are printed on standard output as described by echo(1). The exit status is 0, unless the output file is not open for writing.
    -n suppresses a new-line from being added to the output.
    -R, -r (raw mode) ignore the escape conventions of echo. The -R option will print all subsequent arguments and options other than -n.
    -p causes the arguments to be written onto the pipe of the process spawned with |& instead of standard output.
    -s causes the arguments to be written onto the history file instead of standard output.
    -u [ n ] can be used to specify a one digit file descriptor unit number n on which the output will be placed. The default is 1.

pwd
    Equivalent to print -r - $PWD.

read [ -prsu[n] ] [ name?prompt ] [ name ... ]
    The shell input mechanism. One line is read and is broken up into fields using the characters in IFS as separators. The escape character, (\), is used to remove any special meaning for the next character and for line continuation. In raw mode, -r, the \ character is not treated specially. The


first field is assigned to the first name, the second field to the second name, etc., with leftover fields assigned to the last name. The -p option causes the input line to be taken from the input pipe of a process spawned by the shell using |&. If the -s flag is present, the input will be saved as a command in the history file. The flag -u can be used to specify a one digit file descriptor unit n to read from. The file descriptor can be opened with the exec special command. The default value of n is 0. If name is omitted, then REPLY is used as the default name. The exit status is 0 unless the input file is not open for reading or an end-of-file is encountered. An end-of-file with the -p option causes cleanup for this process so that another can be spawned. If the first argument contains a ?, the remainder of this word is used as a prompt on standard error when the shell is interactive. The exit status is 0 unless an end-of-file is encountered.

readonly [ name[=value] ] ...
    The given names are marked readonly and these names cannot be changed by subsequent assignment.

return [ n ]
    Causes a shell function or '.' script to return to the invoking script with the return status specified by n. The value will be the least significant 8 bits of the specified status. If n is omitted, then the return status is that of the last command executed. If return is invoked while not in a function or a '.' script, then it is the same as an exit.

set [ ±aefhkmnopstuvx ] [ ±o option ]... [ ±A name ] [ arg ... ]
    The flags for this command have meaning as follows:
    -A Array assignment. Unset the variable name and assign values sequentially from the list arg. If +A is used, the variable name is not unset first.
    -a All subsequent variables that are defined are automatically exported.
    -e If a command has a non-zero exit status, execute the ERR trap, if set, and exit. This mode is disabled while reading profiles.
    -f Disables file name generation.
    -h Each command becomes a tracked alias when first encountered.
    -k All variable assignment arguments are placed in the environment for a command, not just those that precede the command name.
    -m Background jobs will run in a separate process group and a line will print upon completion. The exit status of background jobs is reported in a completion message. On systems with job control, this flag is turned on automatically for interactive shells.


    -n Read commands and check them for syntax errors, but do not execute them. Ignored for interactive shells.
    -o The following argument can be one of the following option names:
        allexport Same as -a.
        errexit Same as -e.
        bgnice All background jobs are run at a lower priority. This is the default mode.
        emacs Puts you in an emacs style in-line editor for command entry.
        gmacs Puts you in a gmacs style in-line editor for command entry.
        ignoreeof The shell will not exit on end-of-file. The command exit must be used.
        keyword Same as -k.
        markdirs All directory names resulting from file name generation have a trailing / appended.
        monitor Same as -m.
        noclobber Prevents redirection > from truncating existing files. Requires >| to truncate a file when turned on.
        noexec Same as -n.
        noglob Same as -f.
        nolog Do not save function definitions in the history file.
        nounset Same as -u.
        privileged Same as -p.
        verbose Same as -v.
        trackall Same as -h.
        vi Puts you in insert mode of a vi style


in-line editor until you hit the escape character 033. This puts you in control mode. A return sends the line.
        viraw Each character is processed as it is typed in vi mode.
        xtrace Same as -x.
        If no option name is supplied, then the current option settings are printed.
    -p Disables processing of the $HOME/.profile file and uses the file /etc/suid_profile instead of the ENV file. This mode is on whenever the effective uid is not equal to the real uid, or when the effective gid is not equal to the real gid. Turning this off causes the effective uid and gid to be set to the real uid and gid.
    -s Sort the positional parameters lexicographically.
    -t Exit after reading and executing one command.
    -u Treat unset parameters as an error when substituting.
    -v Print shell input lines as they are read.
    -x Print commands and their arguments as they are executed.
    - Turns off the -x and -v flags and stops examining arguments for flags.
    -- Do not change any of the flags; useful in setting $1 to a value beginning with -. If no arguments follow this flag, then the positional parameters are unset.
    Using + rather than - causes these flags to be turned off. These flags can also be used upon invocation of the shell. The current set of flags may be found in $-. Unless -A is specified, the remaining arguments are positional parameters and are assigned, in order, to $1 $2 .... If no arguments are given, then the names and values of all variables are printed on the standard output.

shift [ n ]
    The positional parameters from $n+1 ... are renamed $1 ..., default n is 1. The parameter n can be any arithmetic expression that evaluates to a non-negative number less than or equal to $#.

stop %jobid ...


stop pid ...
    stop stops the execution of a background job(s) by using its jobid, or of any process by using its pid (see ps(1)).

suspend
    Stops the execution of the current shell (but not if it is the login shell).

times
    Print the accumulated user and system times for the shell and for processes run from the shell.

trap [ arg ] [ sig ] ...
    arg is a command to be read and executed when the shell receives signal(s) sig. (Note that arg is scanned once when the trap is set and once when the trap is taken.) Each sig can be given as a number or as the name of the signal. trap commands are executed in order of signal number. Any attempt to set a trap on a signal that was ignored on entry to the current shell is ineffective. If arg is omitted or is -, then the trap(s) for each sig are reset to their original values. If arg is the NULL string (the empty string, e.g., ""), then this signal is ignored by the shell and by the commands it invokes. If sig is ERR, then arg will be executed whenever a command has a non-zero exit status. If sig is DEBUG, then arg will be executed after each command. If sig is 0 or EXIT and the trap statement is executed inside the body of a function, then the command arg is executed after the function completes. If sig is 0 or EXIT for a trap set outside any function, then the command arg is executed on exit from the shell. The trap command with no arguments prints a list of commands associated with each signal number.

typeset [ ±HLRZfilrtux[n] ] [ name[=value] ] ...
    Sets attributes and values for shell variables and functions. When typeset is invoked inside a function, a new instance of the variable name is created. The variable's value and type are restored when the function completes. The following list of attributes may be specified:
    -H This flag provides UNIX to host-name file mapping on non-UNIX machines.
    -L Left justify and remove leading blanks from value. If n is non-zero, it defines the width of the field; otherwise, it is determined by the width of the value of the first assignment. When the variable is assigned to, it is filled on the right with blanks or truncated, if necessary, to fit into the field. Leading zeros are removed if the -Z flag is also set. The -R flag is turned off.
    -R Right justify and fill with leading blanks. If n is non-zero, it defines the width of the field; otherwise, it is determined by the width of the value of the first assignment. The field is left filled with blanks or truncated from the end if the variable is reassigned. The -L flag is turned off.
    -Z Right justify and fill with leading zeros if the first non-blank character is a digit and the -L flag has not been set. If n is non-zero, it


defines the width of the field; otherwise, it is determined by the width of the value of the first assignment.
    -f The names refer to function names rather than variable names. No assignments can be made and the only other valid flags are -t, -u and -x. The flag -t turns on execution tracing for this function. The flag -u causes this function to be marked undefined. The FPATH variable will be searched to find the function definition when the function is referenced. The flag -x allows the function definition to remain in effect across shell procedures invoked by name.
    -i Parameter is an integer. This makes arithmetic faster. If n is non-zero, it defines the output arithmetic base; otherwise, the first assignment determines the output base.
    -l All upper-case characters are converted to lower-case. The upper-case flag, -u, is turned off.
    -r The given names are marked readonly and these names cannot be changed by subsequent assignment.
    -t Tags the variables. Tags are user definable and have no special meaning to the shell.
    -u All lower-case characters are converted to upper-case characters. The lower-case flag, -l, is turned off.
    -x The given names are marked for automatic export to the environment of subsequently-executed commands.
    The -i attribute cannot be specified along with -R, -L, -Z, or -f. Using + rather than - causes these flags to be turned off. If no name arguments are given but flags are specified, a list of names (and optionally the values) of the variables which have these flags set is printed. (Using + rather than - keeps the values from being printed.) If no names and flags are given, the names and attributes of all variables are printed.

ulimit [ -HSacdfmnpstv ] [ limit ]
    Set or display a resource limit. The available resource limits are listed below. Many systems do not contain one or more of these limits. The limit for a specified resource is set when limit is specified. The value of limit can be a number in the unit specified below with each resource, or the value unlimited. The H and S flags specify whether the hard limit or the soft limit for the given resource is set. A hard limit cannot be increased once it is set. A soft limit can be increased up to the value of the hard limit. If neither the H nor S option is specified, the limit applies to both. The current resource limit is printed when limit is omitted. In this case the soft limit is


printed unless H is specified. When more than one resource is specified, the limit name and unit is printed before the value.
    -a Lists all of the current resource limits.
    -c The number of 512-byte blocks on the size of core dumps.
    -d The number of K-bytes on the size of the data area.
    -f The number of 512-byte blocks on files written by child processes (files of any size may be read).
    -m The number of K-bytes on the size of physical memory.
    -n The number of file descriptors plus 1.
    -p The number of 512-byte blocks for pipe buffering.
    -s The number of K-bytes on the size of the stack area.
    -t The number of seconds to be used by each process.
    -v The number of K-bytes for virtual memory.
    If no option is given, -f is assumed.

umask [ mask ]
    The user file-creation mask is set to mask (see umask(2)). mask can either be an octal number or a symbolic value as described in chmod(1). If a symbolic value is given, the new umask value is the complement of the result of applying mask to the complement of the previous umask value. If mask is omitted, the current value of the mask is printed.

unalias name...
    The aliases given by the list of names are removed from the alias list.

unset [ -f ] name ...
    The variables given by the list of names are unassigned, i.e., their values and attributes are erased. readonly variables cannot be unset. If the -f flag is set, then the names refer to function names. Unsetting ERRNO, LINENO, MAILCHECK, OPTARG, OPTIND, RANDOM, SECONDS, TMOUT, and _ removes their special meaning even if they are subsequently assigned to.

wait [ job ]
    Wait for the specified job and report its termination status. If job is not given, then all currently active child processes are waited for. The exit status from this command is that of the process waited for. See Jobs for a description of the format of job.

whence [ -pv ] name ...
    For each name, indicate how it would be interpreted if used as a command name. The -v flag produces a more verbose report. The -p flag does a path search for name even if name is


an alias, a function, or a reserved word.
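The getopts entry above is easiest to absorb as a concrete parsing loop. A minimal sketch, with illustrative option letters v and o (o taking an argument), using a leading : in optstring so errors are reported through the ? and : cases:

```shell
parse() {
    OPTIND=1 verbose=0 outfile=""
    while getopts ':vo:' opt; do
        case $opt in
        v)  verbose=1 ;;
        o)  outfile=$OPTARG ;;
        :)  echo "option -$OPTARG requires an argument"; return 1 ;;
        \?) echo "unknown option -$OPTARG"; return 1 ;;
        esac
    done
    shift $((OPTIND - 1))    # drop the parsed options; $* is now the operands
    echo "verbose=$verbose outfile=$outfile args=$*"
}
parse -v -o out.txt one two
```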

Escape Sequences

Cursor positioning

    cursor up (CUU)         ESC [ Pn A        moves cursor up Pn lines - same column
    cursor down (CUD)       ESC [ Pn B        moves cursor down Pn lines - same column
    cursor forward (CUF)    ESC [ Pn C        moves cursor right Pn columns
    cursor backward (CUB)   ESC [ Pn D        moves cursor left Pn columns
    cursor position (CUP)   ESC [ Pl ; Pc H   moves cursor to line Pl, column Pc

Select Graphic Rendition (SGR)

You can select one or more character renditions at a time using the following format:

    ESC [ Ps ; ... Ps m

When you use multiple parameters, they are executed in sequence. The effects are cumulative. For example, to change to blinking-underlined, you can use:

    ESC [ 0 ; 4 ; 5 m

    Ps    Action
    0     All attributes off
    1     Display at increased intensity
    4     Display underscored
    5     Display blinking
    7     Display negative (reverse) image
    22    Display normal intensity
    24    Display not underlined
    25    Display not blinking
    27    Display positive image

Erasing

Erase in line

    ESC [ K      erases from the cursor to end of line
    ESC [ 1 K    erases from beginning of line to cursor
    ESC [ 2 K    erases the complete line

Erase in display

    ESC [ J      erases from cursor to end of screen
    ESC [ 1 J    erases from beginning of screen to cursor


    ESC [ 2 J    erases the complete display

Xterm title

To set the title bar of an xterm:

    ESC ] 2 ; title ^G

For example, type this in an xterm window:

    echo "^[]2;test title^G"

where ^[ is an escape and ^G is a control-G. REMEMBER: when entering any control sequence, type ctrl-V then the control sequence. In this case type ctrl-V ESC .... ctrl-V ctrl-G. Also remember that these settings are like toggle switches: they stay on till they are switched off. If you are using them, be careful if you allow yourself to ctrl-c out of the program between setting and unsetting. You can end up with your window in a weird state.
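In a script these sequences are easiest to emit with printf, writing ESC as \033 so no literal control character needs to be typed. A small sketch (the rendition only shows on an ANSI-capable terminal; the title sequence is harmless elsewhere):

```shell
# ESC [ 1 m = increased intensity, ESC [ 0 m = all attributes off
bold=$(printf '\033[1m')
off=$(printf '\033[0m')
printf '%s\n' "${bold}important${off} normal"
# set an xterm title: ESC ] 2 ; title BEL
printf '\033]2;my window title\007'
```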

Redirection

REDIRECTIONS

    File Desc    Name          Abbrev    Default
    0            Std Input     stdin     keyboard
    1            Std Output    stdout    terminal
    2            Std Error     stderr    terminal

The usual input source or output source can be changed as follows:

Simple Redirection

    cmd > file     Send output of cmd to file (overwrite)
    cmd >| file    Send output of cmd to file (overwrite, even with noclobber option set)
    cmd >> file    Send output of cmd to file (append)
    cmd < file     Take input for cmd from file
    cmd << text    Read standard input up to a line identical to text (text can be stored in a shell variable)

Input is usually typed on the screen or in the shell program. Commands that typically use this syntax include cat, echo, ex and sed. (If <<- is used, leading tabs are ignored when comparing input with the end-of-input text marker.)

Redirection Using File Descriptors

    cmd >&n     Send cmd output to file descriptor n
    cmd m>&n    Same, except that output that would normally go to file descriptor m is sent to file descriptor n instead
    cmd >&-     Close standard output
    cmd <&n     Take input for cmd from file descriptor n


    cmd m<&n    Same, except that input that would normally come from file descriptor m comes from file descriptor n instead
    cmd <&-     Close standard input

Multiple Redirection

    cmd 2> file        Send standard error to file; standard output remains the same, e.g., to the screen
    cmd > file 2>&1    Send both standard error and standard output to file
    (cmd > f1) 2>f2    Send standard output to file f1; standard error to file f2
    cmd | tee files    Send output of cmd to standard output and to files

Coprocesses

    cmd1 | cmd2 |&     Coprocess; execute the pipeline in the background. The shell sets up a two-way pipe, allowing redirection of both standard input and standard output
    read -p var        Read coprocess input into variable var
    print -p string    Write string to the coprocess
    cmd <&p            Take input for cmd from the coprocess
    cmd >&p            Send output of cmd to the coprocess
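A brief sketch combining the forms above: both streams into one file, then reading it back through a numbered descriptor opened with exec (the temp file via mktemp is an illustrative choice):

```shell
tmp=$(mktemp)
# > file first, then 2>&1: stderr now goes wherever stdout goes
{ echo "to stdout"; echo "to stderr" >&2; } > "$tmp" 2>&1
exec 3< "$tmp"      # open descriptor 3 for reading
read first <&3      # each read consumes the next line from fd 3
read second <&3
exec 3<&-           # close descriptor 3
echo "$first / $second"
rm -f "$tmp"
```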

Signals

    HUP      1    hangup
    INT      2    interrupt (rubout)
    QUIT     3    quit (ASCII FS)
    ILL      4    illegal instruction (not reset when caught)
    TRAP     5    trace trap (not reset when caught)
    IOT      6    IOT instruction
    ABRT     6    used by abort; replaces SIGIOT in the future
    EMT      7    EMT instruction
    FPE      8    floating point exception
    KILL     9    kill (cannot be caught or ignored)
    BUS      10   bus error
    SEGV     11   segmentation violation
    SYS      12   bad argument to system call
    PIPE     13   write on a pipe with no one to read it
    ALRM     14   alarm clock
    TERM     15   software termination signal from kill
    USR1     16   user defined signal 1
    USR2     17   user defined signal 2
    CLD      18   child status change
    CHLD     18   child status change alias (POSIX)
    PWR      19   power-fail restart
    WINCH    20   window size change
    URG      21   urgent socket condition
    POLL     22   pollable event occurred
    IO       22   socket I/O possible (SIGPOLL alias)
    STOP     23   stop (cannot be caught or ignored)
    TSTP     24   user stop requested from tty
    CONT     25   stopped process has been continued
    TTIN     26   background tty read attempted
    TTOU     27   background tty write attempted


    VTALRM   28   virtual timer expired
    PROF     29   profiling timer expired
    XCPU     30   exceeded cpu limit
    XFSZ     31   exceeded file size limit
    WAITING  32   process's lwps are blocked
    LWP      33   special signal used by thread library
    FREEZE   34   special signal used by CPR
    THAW     35   special signal used by CPR
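The names in this table, minus the SIG prefix, are what trap expects. A small sketch using the EXIT pseudo-signal, run inside a command substitution so the subshell's exit fires the trap (the messages are illustrative):

```shell
# the subshell's EXIT trap fires when the command substitution finishes
result=$(
    trap 'echo "trap ran"' EXIT
    echo "work done"
)
echo "$result"
# ignore INT around a critical section, then restore the default action
trap '' INT
: critical work would go here
trap - INT
```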

Scripts

------------------------------------------------------------------
File: calc

#!/bin/ksh
#
# A very simple calculator - one expression per command
#
print $(($*))

------------------------------------------------------------------
File: calc2

#!/bin/ksh
#
# A more complex calculator - multiple expressions till ctrl-c
#
trap 'print Thank you for calculating!' EXIT
while read expr'?expression> '; do
    print $(($expr))
done

------------------------------------------------------------------
File: conj

#!/bin/ksh
#
# A program to convert tiff to jpeg - with checking
#
print there are $# files to convert
print $*
print Is this correct
done=false
while [[ $done = false ]]; do
    done=true
    {
        print 'Enter y for yes'
        print 'Enter n for no'
    } >&2
    read REPLY?'answer? '
    case $REPLY in
        y ) GO=y ;;
        n ) GO=n ;;
        * ) print 'invalid.'
            done=false ;;
    esac
done


if [[ "$GO" = "y" ]]
then
    for filename in "$@" ; do
        newfile=${filename%.tif}.jpg
        eval convert $filename $newfile
    done
fi

------------------------------------------------------------------
File: conjx

#!/bin/ksh
#
# A simple program to convert tiff to jpeg
#
for filename in "$@" ; do
    newfile=${filename%.tif}.jpg
    eval convert $filename $newfile
done

------------------------------------------------------------------
File: copro

#!/bin/ksh
ed - memo |&
print -p /word/
read -p search
print "$search"

------------------------------------------------------------------
File: copro2

#!/bin/ksh
search=$(eval echo /word/ | ed - memo)
print $search

------------------------------------------------------------------
File: files

#!/bin/ksh
#
# A program to give information about a file
#
if [[ ! -a $1 ]]; then
    print "file $1 does not exist."
    return 1
fi
if [[ -d $1 ]]; then
    print -n "$1 is a directory that you may"
    if [[ ! -x $1 ]]; then
        print -n " not "
    fi
    print "search."
elif [[ -f $1 ]]; then
    print "$1 is a regular file."
else
    print "$1 is a special type of file."
fi
if [[ -O $1 ]]; then
    print 'you own the file.'
else
    print 'you do not own the file.'
fi
if [[ -r $1 ]]; then
    print 'you have read permission on the file.'
fi


if [[ -w $1 ]]; then
    print 'you have write permission on the file.'
fi
if [[ -x $1 && ! -d $1 ]]; then
    print 'you have execute permission on the file.'
fi

------------------------------------------------------------------
File: flist

#!/bin/ksh
#
# A program to list multiple files separated with file name as
# a sub-header and the date as the header
#
narg=$#
if test $# -eq 0
then
    echo "No files requested for listing"
    exit
fi
if test $# -eq 2
then
    head=$1
    shift
fi
echo `date`
for i in $*
do
    echo "------------------------------------------------------------------"
    if test $narg -eq 1
    then
        head=$i
    fi
    echo $head
    echo
    cat $i
done

------------------------------------------------------------------
File: lower

#!/bin/ksh
#
# A program to convert file names to lower case
#
for filename in "$@" ; do
    typeset -l newfile=$filename
    eval mv $filename $newfile
done

------------------------------------------------------------------
File: term1

#!/bin/ksh
#
# An example of using select and setting terminal options
#
PS3='terminal? '
select term in vt100 vt102 vt220 xterm; do
    if [[ -n $term ]]; then
        TERM=$term
        print TERM is $TERM
        break
    else


        print 'invalid.'
    fi
done

------------------------------------------------------------------
File: term2

#!/bin/ksh
#
# An example of using select and case to set terminal type
#
print 'Select your terminal type:'
PS3='terminal? '
select term in \
    'DEC vt100' \
    'DEC vt102' \
    'DEC vt220' \
    'xterm'
do
    case $REPLY in
        1 ) TERM=vt100 ;;
        2 ) TERM=vt102 ;;
        3 ) TERM=vt220 ;;
        4 ) TERM=xterm ;;
        * ) print 'invalid.' ;;
    esac
    if [[ -n $term ]]; then
        print TERM is $TERM
        break
    fi
done

------------------------------------------------------------------
File: testit

#!/bin/ksh
#
# Script to test functions inside a shell program
#
testopt $*

------------------------------------------------------------------
File: upper

#!/bin/ksh
#
# A program to convert file names to upper case
#
for filename in "$@" ; do
    typeset -u newfile=$filename
    eval mv $filename $newfile
done

------------------------------------------------------------------
File: usernames

#!/bin/ksh
#
# A program to generate email addresses of users sorted by surname
#
niscat passwd.org_dir |
    gawk 'BEGIN {FS=":"} /area/ && !/ftp/ && !/cccb/ && !/africa/ {print $5,$1}' |
    gawk '{print $(NF-1),$0 | "sort"}' |
    gawk 'ORS=" "{for ( i=2;i

------------------------------------------------------------------
File: guess

#!/bin/ksh
#
# A number guessing program
#
magicnum=$(($RANDOM%100+1))
print 'Guess a number between 1 and 100:'
while read guess'?number> '; do
    sleep 1


    if (( $guess == $magicnum )); then
        print 'Right!'
        exit
    fi
    print 'Wrong!'
done

------------------------------------------------------------------
File: guesshl

#!/bin/ksh
#
# Another number guessing program
#
magicnum=$(($RANDOM%100+1))
print 'Guess a number between 1 and 100:'
while read guess'?number> '; do
    if (( $guess == $magicnum )); then
        print 'Right!'
        exit
    fi
    if (( $guess < $magicnum )); then
        print 'Too low!'
    else
        print 'Too high!'
    fi
done

------------------------------------------------------------------
File: message

#!/bin/ksh
#
# An example of using select to read messages
#
eval /disk2/bin/msgs
print 'Select your option:'
PS3='(1-3 or q)? '
select opt in \
    'quit' \
    'list header of all messages' \
    'read all available messages'
do
    case $REPLY in
        1|q ) break ;;
        2 ) eval /disk2/bin/msgs -h 1 ;;
        3 ) eval /disk2/bin/msgs 1 ;;
        * ) print "type q to quit" ;;
    esac
done

------------------------------------------------------------------
File: minute

#!/bin/ksh
#
# A simple program to count a minute
#
i=0
date
while test $i -lt 60;do
    print $i
    sleep 1


    let i=i+1
done
date

------------------------------------------------------------------
File: minute2

#!/bin/ksh
#
# A slightly more elegant version of minute
#
i=0
date
print
while test $i -lt 60;do
    print "^[[1A"$i     # ^[ is a literal escape character (type ctrl-V ESC)
    sleep 1
    let i=i+1
done
date

------------------------------------------------------------------
File: infloop

#!/bin/ksh
#
# A simple program that loops indefinitely
#
trap 'print "You hit control-C"' INT
trap 'print "You hit control-\"' QUIT
trap 'print "You tried to kill me!"' TERM
while true; do
    sleep 60
done
