
The UNIX way

UNIX history embeds the UNIX way with the notion of software tools, also the title of a work by UNIX gurus Brian Kernighan and P. J. Plauger, as a key concept: small, well designed, text-based tools operating from the command line which do one job very well and can readily be connected to accomplish more complex tasks. UNIX software tools, each with multiple options, can thus be considered as building blocks, allowing for a vast number of combinations of commands and creating programmatic structures which can well outdo the possibilities of any GUI.

By way of the central notion of the shell, our command line interface to the OS and these software tools, we are not so much programmed by a conditioning and so-called intuitive graphical interface; we are rather programming. As soon as we put two commands together, connected by way of the ubiquitous pipes of UNIX, we are coding; we've made the jump from slave of the mouse towards customised automation, the promise of computation belied by the desktop. It's an exciting major step towards coding our own environment. Commands can be bundled, then programmed either interactively or by way of shell scripts, again with a dizzying array of possibilities and options.

Indeed, the sheer range of common UNIX commands, with an emphasis on textual manipulation, alongside the interactive possibilities of the shell, within which we can manipulate our own command history, our working past, to great effect, can prove quite simply overwhelming. Alongside key knowledge of active help facilities, redirection and piping, a short rundown of the top ten UNIX or GNU/Linux commands, worthy of attention from any OS naturalist, should assist.

UNIX flattens data by way of files and active streams, with the idea of the file encompassing all things. The usual state of affairs is to input our commands to the shell, the command line, by way of the keyboard, with such entry referred to as standard input or stdin. Output to our terminal is to standard output, stdout. Yet we can choose to redirect output by way of the > operator, or input by way of its counterpart <. By way of example, cat /usr/src/linux/* > /dev/dsp throws the Linux kernel source code as raw ASCII data straight at the sound card. The >> operator appends the stream to existing data. Yet what's really revolutionary is the pipeline, core feature of any UNIX-like OS and key to its flexibility; we can quite simply pipe data from the output of one small, well honed tool to the input of another. A classic example, provided by Tom Truscott, co-inventor of Usenet, prints out the frequencies of each word in files named file1, file2 and so on:

cat file* | tr -cs '[A-Z][a-z]' '\012' | sort | uniq -c | sort -nr | more
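Run stage by stage on throwaway data, the pipeline above unpicks nicely; the scratch directory, file names and sample text below are invented purely for illustration:

```shell
# Hedged sketch: the word-frequency pipeline run on invented sample files.
mkdir -p /tmp/wordfreq
printf 'the cat sat on the mat\n' > /tmp/wordfreq/file1
printf 'the dog sat on the rug\n' > /tmp/wordfreq/file2
cd /tmp/wordfreq
# tr -cs complements and squeezes: every run of non-letters becomes a
# single newline, leaving one word per line; sort groups repeats so that
# uniq -c can prefix each word with its count; sort -nr ranks by frequency.
cat file* | tr -cs '[A-Z][a-z]' '\012' | sort | uniq -c | sort -nr | more
```

The most frequent word, here the with four occurrences, surfaces at the top of the listing.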

And thus first up in a rundown of a selective top ten UNIX or GNU/Linux commands, complete with a short description and sample use, is cat:

cat [options] files

cat, short for concatenate, quite simply reads the named files (or stdin) and writes them to stdout, though it is commonly used with redirection as in the above example. Its reversing brother tac is also worthy of attention. Both serve well in splicing together files. For example:

cat file1 file2 file3 > onebigfile
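And tac, for its part, reverses the line order of its input; a quick hedged illustration on invented sample text, assuming the GNU coreutils version:

```shell
# tac prints the lines of its input last-first:
printf 'one\ntwo\nthree\n' | tac
```

The three lines emerge in the order three, two, one.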

man [options] [section] command

man displays information on the chosen command, culled from the online reference manuals which have hopefully been installed on the system. A fine example to take things further:

man man

Not strictly a command but worth listing: Bash is the most common shell under GNU/Linux, exhibiting complex features such as command line completion, history and automation. Zsh pushes the interactive envelope with more intelligent completion facilities and an enhanced wildcard or globbing feature set.

zsh -l
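The globbing difference can be felt even from a scratch directory; a hedged sketch, with invented paths, noting that the recursive ** glob is native to zsh while bash must first enable its globstar option:

```shell
# Invented scratch layout to compare ordinary and recursive globbing.
mkdir -p /tmp/glob/sub
touch /tmp/glob/a.txt /tmp/glob/sub/b.txt
cd /tmp/glob
ls *.txt                                # ordinary glob: a.txt only
# zsh matches **/*.txt recursively out of the box; bash first needs:
shopt -s globstar 2>/dev/null || true
ls **/*.txt                             # with globstar or zsh: both files
```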

grep represents one of the fastest and most versatile pattern matching or text search tools around. Grep also provides an excellent example from UNIX development history of the UNIX way. Frustrated by having to repeatedly enter complex text search, or regular expression matching, commands into the ed editor, Bell Labs researcher Doug McIlroy asked Ken Thompson to lift the regex code from the editor to furnish a standalone tool. Grep, standing for the ed command that it simulated, g/re/p (global regular expression print), was produced by the very next morning, heralding a long succession of versatile software tools. Grep even occupies a place within a generic hacker lexicon, referring to real world search.

grep -r kill /usr/src/linux
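Beyond recursive searches, grep's regular expressions anchor and shape matches; a small hedged example on invented input:

```shell
# ^ anchors the pattern to the start of the line; -n numbers the matches.
printf 'kill\nkilled\noverkill\nskill\n' | grep -n '^kill'
```

Only kill and killed survive the anchored pattern; overkill and skill carry their match mid-line and so fall through.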

I thought 'dd' was for 'device dump'

I thought it stood for 'device-to-device'.

I heard it was 'death and destruction', for what happens if you screw up the options. :)

Ken Keys, comp.unix.misc

In common with a host of simple UNIX tools, dd, dating back to mainframe days, appears so deceptively simple that it's tempting to ask how useful such a command could be. It quite simply copies an input file (if=) to an output file (of=), or stdout if none is specified. Yet dd is quite special in keeping its hands off the data unless you specifically demand conversion, for example from lower to upper case or swapping byte order. dd is all about the raw transfer of data between often quite different physical devices; making exact copies of a filesystem, recovering deleted data in forensic work, throwing data at the soundcard or creating disk images to boot under virtual machines.

dd if=/dev/hda1 of=/dev/dsp
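dd's character can also be sketched without touching real devices; the image path below is invented, bs=1M is the GNU spelling of the block size, and conv=ucase is the standard case-folding conversion:

```shell
# Create a 4 MiB blank disk image, ready for loopback or VM experiments:
dd if=/dev/zero of=/tmp/disk.img bs=1M count=4 2>/dev/null
# dd keeps its hands off the data unless conversion is demanded:
printf 'silence is golden' | dd conv=ucase 2>/dev/null
```

The second command shouts its input back in upper case, the only change being the one explicitly requested.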

With artistic applications often running across multiple networked machines, perhaps salvaged from dump or skip, rsh, standing for remote shell, makes testing and administration much easier, allowing such a network of parallel PCs to transparently resemble a single computer. Though deprecated in favour of the more secure ssh, rsh is simpler for offline or firewalled networks. With all machines running rshd, and with changes made to the files under /etc/pam.d, commands can be issued transparently to headless machines without thinking of messy passwords and the like.

rsh hostname cat onepiece >> collector
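Such password-free trust also typically rests on host equivalence entries; a minimal sketch of the contents of ~/.rhosts on each headless machine, readable only by its owner, assuming an invented client host nodeone and user vm:

```
nodeone vm
```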

Sed and awk, or gawk under GNU, a pair which are hard to separate, form mainstays of the UNIX world. Both qualify as programming languages which are commonly used straight from the command line. Sed is a stream adaptation of the venerable text editor ed, and is all about automated pattern matching and text replacement. Awk, whose name is derived from the surname initials of authors Alfred V. Aho, Peter J. Weinberger, and Brian W. Kernighan, is more concerned with control structures. Their complex use together, which pushes textual processing to new heights, has received good attention in book-length works, and it's difficult to give a good idea of the range of these versatile tools. At the same time a working knowledge of regular expressions, again the central topic of many books, is essential for the use of sed and awk. Sed and awk feature prominently in the key text UNIX for Poets from Kenneth Ward Church at AT&T.

sed -e 's/black/white/g' file

ls files | awk '{print "mv "$1" "$1".new"}' | sh
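A hedged taste of the pair in concert, on invented input; sed performs the substitution while awk supplies fields, line numbers and rearrangement:

```shell
# sed rewrites the pattern; awk then reorders the fields per line.
printf 'black cat\nblack dog\n' \
  | sed -e 's/black/white/' \
  | awk '{ print NR ": " $2 " is now " $1 }'
```

Each input line is first recoloured by sed, then restated by awk as a numbered sentence.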

Old school UNIX tool talk really does put IRC and friends to shame. Talk allows for split screen interactive textual conversation between two users. It's fantastic fun on an old serial terminal, with the screen constantly updated as either user types. A new aesthetics of conversation is opened up by talk, which transmits each keystroke in real time as it is typed; errors appear and are retyped by the other user live, ghost statements vanish slowly as they are retracted. Such immediacy is strangely lacking in today's increasingly networked landscape, and talk spawned a rich culture. A successor to talk, ytalk, allows for multiple conversationalists. With the other user's machine running the talkd daemon we can issue:

talk someone@somedomain.com

at provides yet another example of a deceptively simple tool which beautifully illustrates the UNIX way. Rather than firing up a bulky PIM (Personal Information Manager) which may include some kind of reminder functionality, we can use at to schedule an automated job or mailout, or mail ourselves a reminder at a certain time and date. We can simply pipe commands to at using the equally versatile echo, or specify a file containing commands:

echo "mail -s 'go directly to jail' m@1010.com < /dev/null" | at 15:25

Though not perhaps as versatile as some of the other tools, ps can well be defined as belonging to a family of interrogative apps which allow us to peer inside the running system, in this instance providing a report on all active processes. ps takes a vast array of options which allow for the sorting of processes, such as by CPU usage, and for showing additional information. The f option presents an interesting family tree of processes with ASCII art style linkage. Somewhat neater is:

pstree
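Option spellings vary across ps implementations; a hedged example, assuming the GNU procps version, which reports every process sorted by CPU usage:

```shell
# -e selects every process; -o picks the output columns; --sort orders
# them, the leading minus giving descending order by CPU usage.
ps -eo pid,comm,%cpu --sort=-%cpu | head -n 5
```

The head simply trims the report to the four hungriest processes plus the column header.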
Other software tools worthy of attention and artistic exploration include sort, diff, csplit, strings, apropos, yes, strace and top.