Speaking UNIX

10 great tools for any UNIX system


Much like a vernacular, the universe of UNIX tools changes almost perpetually. New tools crop up frequently, while others are eternally modernized and adapted to suit emerging best practices. Certain tools are used commonly; others are used more infrequently. Some tools are perennial; occasionally, some are obsoleted outright. To speak UNIX fluently, you have to keep up with the "lingo."

Table 1 lists 11 of the significant packages previously discussed in the Speaking UNIX series.

Table 1. Prominent UNIX tools

Cygwin: A UNIX-like shell and build environment for the Windows® operating system
fish: A highly interactive shell with automatic expansion and colored syntax for command names, options, and file names
locate: Build and search a database of all files
rename: Rename large collections of files en masse
rsync: Efficiently synchronize files and directories, locally and remotely
Screen: Create and manage virtual, persistent consoles
Squirrel: A cross-platform scripting shell
tac: Print input in reverse order, last line first (tac is the reverse of cat)
type: Reveal whether a command is an alias, an executable, a shell built-in, or a script
wget: Download files using the command line
zsh: An advanced shell featuring automatic completion, advanced redirection operands, and advanced substitutions

This month, let's look at 10 more utilities and applications that expand or improve on an existing or better-known UNIX package. The list runs a wide gamut, from a universal archive translator to a high-speed Web server.

In some cases, depending on your flavor of UNIX, you will have to install a new software package. You can build from source as instructed, or you can save time and effort if your package-management software provides an equivalent binary bundle. For example, if you use a Debian flavor of Linux®, many of the utilities mentioned this month can be installed directly using apt-get .

Find a command with apropos

UNIX has so many commands, it is easy to forget the name of a utility—especially if you do not use the tool frequently. If you find yourself scratching your head trying to recall a name, run apropos (or the equivalent man -k ). For example, if you're hunting for a calculator, simply type apropos calculator :

$ apropos calculator
bc (1) - An arbitrary precision calculator language
dc (1) - An arbitrary precision calculator

Both bc and dc are command-line calculators.

Each UNIX manual page has a short description, and apropos searches the corpus of descriptions for instances of the specified keyword. The keyword can be a literal, such as calculator, or a regular expression, such as calc* . If you use the latter form, be sure to wrap the expression in quotation marks ( "" ) to prevent the shell from interpreting special characters:

$ apropos "calcu*"
allcm (1) - force the most important Computer-Modern-fonts to be calculated
allec (1) - force the most important Computer-Modern-fonts to be calculated
allneeded (1) - force the calculation of all fonts now needed
bc (1) - An arbitrary precision calculator language
dc (1) - An arbitrary precision calculator

Run a calculation on the command line

As shown above, dc is a capable calculator found on every UNIX system. If you run dc without arguments, you enter interactive mode, where you can write and evaluate Reverse Polish Notation (RPN) expressions:

$ dc
5 6 * 10 / p
3

However, you can do all that work right on the command line. Specify the -e option and provide an expression to evaluate. Again, wrap the expression in quotation marks to prevent interpolation by the shell:

$ dc -e "5 6 * 10 / p"
3

Find processes with pgrep

How many times have you hunted for a process with ps aux | grep ...? Countless times, probably. Sure, it works, but there is a much more effective way to search for processes: try pgrep .

As an example, this command finds all instances of strike's login shell (where strike is the name of a user):

$ pgrep -l -u strike zsh
10331 zsh
10966 zsh

The pgrep command provides options to filter processes by user name (the -u shown above), process group, group ID, and more. A companion utility, pkill , takes all the options of pgrep and accepts a signal to send to all processes that match the given criteria.

For instance, the command pkill -9 -u strike zsh is the equivalent of pgrep -u strike zsh | xargs kill -9 .
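To see the pair in action, the sketch below uses a throwaway sleep process as a stand-in for a real workload; the -f option (match against the full command line) and the sleep pattern are illustrative choices, not requirements:

```shell
# Launch a disposable background process to hunt for
sleep 300 &

# -f matches the full command line rather than just the process name;
# -l lists the process name next to each matching PID
pgrep -f -l "sleep 300"

# pkill takes the same filters, plus the signal to deliver
pkill -TERM -f "sleep 300"
```

Because pgrep prints bare PIDs by default (without -l), its output drops cleanly into command substitution or xargs.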

Generate secure passwords with pwgen

Virtually every important subsystem in UNIX requires its own password. To wit, e-mail, remote login, and superuser privileges all require a password—preferably disparate and each difficult to guess or derive using an automated attack. Moreover, if you want to develop scripts to generate accounts, you want a reliable source of random, secure passwords.

The pwgen utility is a small tool that generates gobs of passwords. You can tailor the passwords to be memorable or secure, and you can specify whether to include numbers, symbols, vowels, and capital letters.

Many UNIX systems have pwgen . If not, it is simple to build:

$ # As of March 2009, the latest version is 2.06
$ wget http://voxel.dl.sourceforge.net/sourceforge/\
pwgen/pwgen-2.06.tar.gz
$ tar xzf pwgen-2.06.tar.gz
$ cd pwgen-2.06
$ ./configure && make && sudo make install

Here are some sample uses:

Print a collection of easy-to-recall passwords:

$ pwgen -C
ue2Ahnga Soom0Lu0 Hie8aiph gei9mooD eiXeex7N Wid4Ueng taShee3v Ja3shii8
iNg0viSh iegh5ouF
...
zoo8Ahzu Iefev0ch MoVu4Pae goh1Ak6m EiJup5ei

Generate a single, secure password:

$ pwgen -s -1
oYvy9WWa

Generate a single, secure password with no ambiguous (easily confused) characters and at least one non-alphanumeric character:

$ pwgen -s -B -1 -y
7gEqT_V[

To see all the available options, type pwgen --help .

Watch many files with multitail

Whether you're a developer debugging new code or a systems administrator monitoring a system, you often have to keep an eye on many things at once. If you're a developer, you might watch a debug log and stdout to track down a bug; if you're an administrator, you might want to police activity to intercede as necessary. Usually, both tasks require oodles of windows tiled on screen to keep a watchful eye—perhaps tail in one window, less in another window, and a command prompt in yet another.

If you have to monitor several files at once, consider multitail . As its name implies, this utility divides a console window into multiple sections, one section per log file. Even better, multitail can colorize well-known formats (and you can define custom color schemes, too) and can merge multiple files into a single stream.

To build multitail , download the source, unpack it, and run make . (The options in the distribution's generic makefile should suffice for most UNIX systems. If the make fails, look in the topmost directory for a makefile specific to your system.)

$ # As this article was written, the latest version of multitail was 5.2.2
$ wget http://www.vanheusden.com/multitail/multitail-5.2.2.tgz
$ tar xzf multitail-5.2.2.tgz
$ cd multitail-5.2.2
$ make
$ sudo make install

Here are some uses of multitail to consider:

To watch a list of log files in the same window, launch the utility with a list of file names, as in multitail /var/log/apache2/{access,error}.log .

To watch a pair of files in the same window and buffer everything that's read, use the -I option to merge the named file into another, as in multitail -M 0 /var/log/apache/access.log -I /var/log/apache/error.log . Here, the Apache error log and access log are interlineated. The -M 0 option records all incoming data; you can see the buffer at any time by pressing the B key.

You can also mix and match commands and files. To watch a log file and monitor the output of ping , try multitail logfile -l "ping 192.168.1.3" . This creates two views in the same console: one view shows the contents of logfile , while the other shows the ongoing output of ping 192.168.1.3 .

In addition to command-line options, multitail provides a collection of interactive commands to affect the current state of the display. For instance, press the A key in the display to add a new log file. The B key displays the save buffer. The Q key quits multitail . See the man page for multitail for the complete list of commands.

Compress and extract almost anything with 7zip

Between Windows and UNIX alone, there are dozens of popular archive formats. Windows has long had .zip and .cab, for instance, while UNIX has had .tar, .cpio, and .gz. UNIX and its variants also employ .rpm, .deb, and .dmg. All these formats are commonly found online, making for something of a Babel of bits.

To save or extract data in any particular format, you could install a bevy of specialized utilities, or you can install 7zip , a kind of universal translator that can compress and extract virtually any archive. Further, 7zip proffers its own format, featuring one of the highest compression ratios of any common scheme, gigantic capacity reaching into terabytes, and strong data encryption.

To build 7zip , download the source for p7zip , a port of 7zip to UNIX, from its project page on SourceForge (see Related topics). Unpack the tarball, change to the source directory, and run make . (As with multitail , the generic makefile should suffice; if not, choose one of the specialized makefiles provided.)

$ wget http://voxel.dl.sourceforge.net/sourceforge/p7zip/\
p7zip_4.65_src_all.tar.bz2
$ tar xjf p7zip_4.65_src_all.tar.bz2
$ cd p7zip_4.65
$ make
$ sudo make install

The build produces and installs the utility 7za . Type 7za with no arguments to see a list of available commands and options. Each command is a letter—akin to tar —such as a to add a file to the archive and x to extract.

To try the utility, create an archive of the p7zip source itself in a variety of formats, and extract each archive with 7za :

$ zip -r p7.zip p7zip_4.65
$ 7za x -ozip p7.zip
$ tar cvf p7.tar p7zip_4.65
$ 7za x -otar p7.tar
$ bzip2 p7.tar
$ 7za x -so p7.tar.bz2 | tar tf -

In order from top to bottom, 7za extracted a .zip, .tar, and .bz2 archive. In the last command, 7za extracted the .bz2 archive and wrote the output to stdout, where tar decompressed and cataloged the files. Like tar , 7za can be the source or destination of a pipe ( | ), making it easy to combine with other utilities.

View compressed files with zcat

Per-disk capacity now exceeds a terabyte, but a disk can nonetheless fill up quickly with large data files, lengthy log files, images, and media files such as movies. To conserve space, many files can be compressed to a fraction of their original size. For example, an Apache log file, which is simply text, can shrink to one-tenth of its original size.

Although compression saves disk space, it can add effort. If you need to analyze a compressed Apache log file, for instance, you must decompress it, process the data, then re-compress it. If you have a great number of log files, which is typical if you keep records to establish trends, the overhead can become excessive.

Luckily, the gzip suite includes a number of utilities to process compressed files in situ. The utilities zcat , zgrep , zless , and zdiff , among others, serve the same purpose as cat , grep , less , and diff , respectively, but operate on compressed files.

Here, two source files are compressed with gzip and compared with zdiff :

$ cat old
This is
Monday.
$ cat new
This is
Tuesday.
$ gzip old new
$ zdiff -c old.gz new.gz
*** - 2009-03-30 22:26:34.518217647 +0000
--- /tmp/new.10874 2009-03-30 22:26:34.000000000 +0000
***************
*** 1,3 ****
This is
! Monday.
--- 1,3 ----
This is
! Tuesday.
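The same in-place convenience applies to searching. A minimal sketch, with a fabricated log file standing in for a real one:

```shell
# Fabricate a tiny "log", compress it, then search it without decompressing on disk
printf 'GET /index.html 200\nGET /nope.html 404\nGET /about.html 200\n' > access.log
gzip -f access.log

# zgrep passes its options straight through to grep; -c counts matching lines
zgrep -c ' 404' access.log.gz
# prints 1
```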

Surf the Web, conquer the Internet, make world peace with cURL

A prior Speaking UNIX column recommended wget to download files directly from the command line. Ideal for shell scripts, wget is great for those times when you do not have ready access to a Web browser. For example, if you are trying to install new software on a remote server, wget can be a real life-saver.

If you like wget , then you'll love cURL. Like wget , cURL can download files, but it can also post data to a Web page form, upload a file via the File Transfer Protocol (FTP), act as a proxy, set Hypertext Transfer Protocol (HTTP) headers, and a whole lot more. In many ways, cURL is a command-line surrogate for the browser and other clients. Thus, it has many potential applications.

The cURL utility is readily built using the tried-and-true ./configure && make && sudo make install process. Download, extract, and proceed:

$ wget http://curl.haxx.se/download/curl-7.19.4.tar.gz
$ tar xzf curl-7.19.4.tar.gz
$ cd curl-7.19.4
$ ./configure && make && sudo make install

The cURL utility has so many options, it's best to read over its lengthy man page. Here are some common cURL uses:

To download a file—say, the cURL tarball itself—use:

$ curl -o curl.tgz http://curl.haxx.se/download/curl-7.19.4.tar.gz

Unlike wget , cURL emits what it downloads to stdout. Use the -o option to save the download to a named file.

To download a number of files, you can provide a sequence, a set, or both. A sequence is a range of numbers in brackets ( [] ); a set is a comma-delimited list in braces ( {} ). For example, the following command would download all files named parta.html, partb.html, and partc.html from the directories named archive1996/vol1 through archive1999/vol4, inclusive, for a total of 48 files:

$ curl http://any.org/archive[1996-1999]/vol[1-4]/part{a,b,c}.html \
  -o "archive#1_vol#2_part#3.html"

When a sequence or set is specified, you can provide the -o option with a template, where #1 is replaced with the current value of the first sequence or set, #2 is a placeholder for the second, and so on. As an alternative, you can provide -O to keep each file name intact.

To upload a suite of images to a server, use the -T option:

$ curl -T "img[1-1000].png" ftp://ftp.example.com/upload/

Here, the glob img[1-1000].png is captured in quotation marks to prevent the shell from interpreting the pattern. This command uploads img1.png through img1000.png to the named server and path.

You can even use cURL to look up words in the dictionary:

$ curl dict://dict.org/d:stalwart
220 miranda.org dictd 1.9.15/rf on Linux 2.6.26-bpo.1-686 <auth.mime> <400549.18119.1238445667@miranda.org>
250 ok
150 1 definitions retrieved
151 "Stalwart" gcide "The Collaborative International Dictionary of English v.0.48"
Stalwart \Stal"wart\ (st[o^]l"w[~e]rt or st[add]l"-; 277),
   Stalworth \Stal"worth\ (-w[~e]rth), a. [OE. stalworth, AS.
   staelwyr[eth] serviceable, probably originally, good at
   stealing, or worth stealing or taking, and afterwards extended
   to other causes of estimation. See {Steal}, v. t., {Worth}, a.]
   Brave; bold; strong; redoubted; daring; vehement; violent. "A
   stalwart tiller of the soil." --Prof. Wilson. [1913 Webster]

         Fair man he was and wise, stalworth and bold. --R. of Brunne. [1913 Webster]

   Note: Stalworth is now disused, or but little used, stalwart
         having taken its place. [1913 Webster]
.
250 ok [d/m/c = 1/0/20; 0.000r 0.000u 0.000s]
221 bye [d/m/c = 0/0/0; 0.000r 0.000u 0.000s]

Replace the word stalwart with the word you'd like to define.
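The form-posting and header-setting capabilities mentioned earlier follow the same command-line pattern. A sketch (the www.example.org URL is a placeholder, not a live endpoint; the file:// line simply shows that cURL speaks more schemes than HTTP):

```shell
# cURL understands many URL schemes, including file://, handy for offline testing
printf 'hello from cURL\n' > /tmp/curl-demo.txt
curl -s -o /tmp/curl-copy.txt file:///tmp/curl-demo.txt
cat /tmp/curl-copy.txt

# Posting form data (-d) and setting a custom header (-H) look like this;
# the URL below is a placeholder:
#   curl -d "name=strike&action=subscribe" -H "X-Client: speaking-unix" \
#        http://www.example.org/form
```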

In addition to its command-line personality, all of cURL's capabilities are available from a library aptly named libcurl. Many programming languages include interfaces to libcurl to automate tasks such as transmitting a file via FTP. For example, this PHP snippet uses libcurl to deposit a file uploaded via a form to an FTP server:

<?php
...
$ch = curl_init();
$localfile = $_FILES['upload']['tmp_name'];
$fp = fopen($localfile, 'r');

curl_setopt($ch, CURLOPT_URL,
    'ftp://ftp_login:password@ftp.domain.com/' . $_FILES['upload']['name']);
curl_setopt($ch, CURLOPT_UPLOAD, 1);
curl_setopt($ch, CURLOPT_INFILE, $fp);
curl_setopt($ch, CURLOPT_INFILESIZE, filesize($localfile));
curl_exec($ch);

$error_no = curl_errno($ch);
curl_close($ch);
...
?>

If you have to automate any sort of Web access, consider cURL.

SQLite: A database for most occasions

UNIX offers a slew of databases—many of them open source, some for general application, and some highly specialized. Most databases, though, tend to be large, independent applications—MySQL, for example, requires a separate installation, some configuration, and its own daemon—and may be overkill for a large class of software. Consider an address book accessory for the desktop: Is it appropriate to deploy MySQL to persist names and phone numbers? Probably not.

And what if the application is intended to run on a very small device or on a modest computer? Such hardware may not be suited to multiprocessing, a large memory footprint, or significant demands on physical storage. Certainly, an embedded database is an alternative. Typically, an embedded database is packaged as a library and is linked directly to application code. Such a solution makes the application independent of an external service, albeit at a cost: Queries aren't typically expressed in Structured Query Language (SQL).

SQLite combines the best of all worlds: The software is tiny, you can embed it in virtually any application, and you can query your data with vanilla SQL. PHP and Ruby on Rails use SQLite as the default storage engine, as does the Apple iPhone.

To build SQLite, download the source amalgamation (a single file combining all the source) from the SQLite download page, extract it, and run ./configure && make && sudo make install .

$ # As of March 2009, the latest version was 3.6.11.
$ wget http://www.sqlite.org/sqlite-amalgamation-3.6.11.tar.gz
$ tar xzf sqlite-amalgamation-3.6.11.tar.gz
$ cd sqlite-3.6.11
$ ./configure && make
$ sudo make install

The build produces a library and associated application programming interface (API) header files as well as a stand-alone command-line utility named sqlite3 that's useful for exploring features. To create a database, launch sqlite3 with the name of the database. You can even place SQL right on the command line, which is great for scripting:

$ sqlite3 comics.db "CREATE TABLE issues \
  (issue INT PRIMARY KEY, \
  title TEXT NOT NULL)"
$ sqlite3 comics.db "INSERT INTO issues (issue, title) \
  VALUES ('1', 'Amazing Adventures')"
$ sqlite3 comics.db "SELECT * FROM issues"
1|Amazing Adventures

The first command creates the database (if it does not already exist) as well as a table with two columns: an issue number and a title. The middle command inserts a row, and the final command shows the contents of the table.

SQLite offers triggers, logging, and sequences. SQLite is also typeless, unless you specify a type. For example, the issues table declared works fine without types:

$ sqlite3 comics.db "create table issues (issue primary key, title)"
$ sqlite3 comics.db "INSERT INTO issues (issue, title) \
  VALUES (1, 'Amazing Adventures')"
$ sqlite3 comics.db "SELECT * FROM issues"
1|Amazing Adventures

Lack of type is considered a feature, not a bug, and has many applications.
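Because the entire engine hides behind one command, sqlite3 also slots neatly into shell scripts. A sketch (the database path and column names are invented for illustration; the -header and -csv flags belong to the sqlite3 shell):

```shell
# Create a throwaway database, then emit a query as CSV for other tools to consume
rm -f /tmp/comics-demo.db
sqlite3 /tmp/comics-demo.db "CREATE TABLE issues (issue PRIMARY KEY, title)"
sqlite3 /tmp/comics-demo.db "INSERT INTO issues VALUES (1, 'Amazing Adventures')"

# -header prints column names; -csv switches the output mode to comma-separated
sqlite3 -header -csv /tmp/comics-demo.db "SELECT * FROM issues"
```

The CSV output pipes cleanly into awk, sort, or a spreadsheet import.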

Grab XAMPP, an off-the-shelf Web stack

If you want to use your UNIX machine as a Web server, you have oodles of choices to compose a Web stack. Of course, there's the Apache HTTP Server, MySQL, Perl, PHP, Python, and Ruby on Rails, and this article recommends some components you may not have heard of previously, including SQLite and lighttpd.

But building a stack from scratch isn't everyone's cup of tea. Configuring Apache and other software packages to interoperate can be maddening at times, and you may not want the onus of maintaining the source yourself, recompiling each time a new patch is issued. For those good reasons, you might opt for an off-the-shelf stack. Just install and go!

XAMPP is one of many pre-packaged Web stacks you can find online. It includes Apache and compatible builds of MySQL, PHP, and Perl. A version of XAMPP is available for Linux, Sun Solaris, Windows, and Mac OS X. You download XAMPP, extract it, and start:

$ # The latest version for Linux was 1.7
$ wget http://www.apachefriends.org/download.php?xampp-linux-1.7.tar.gz
$ sudo tar xzf xampp-linux-1.7.tar.gz -C /opt
$ sudo /opt/lampp/lampp start
Starting XAMPP 1.7...
LAMPP: Starting Apache...
LAMPP: Starting MySQL...
LAMPP started.

The second command extracts the XAMPP distribution and places it directly in /opt (thus the need to preface the command with sudo ). If you want to locate XAMPP elsewhere, change the argument to -C . The last command launches Apache and MySQL, the two daemons required to serve a Web site. To test the installation, simply point your browser to http://localhost. You should see something like Figure 1.

Figure 1. The XAMPP stack start page

Click Status to see how things are operating. XAMPP provides phpMyAdmin and webalizer to create and manage MySQL databases on the server and measure Web traffic, respectively.

By the way, XAMPP also provides the entire source code to the stack, so you can apply customizations or add to the stack if you need to. If nothing else, the XAMPP source code reveals how to build a stack, if you want to eventually tackle or customize the process yourself.

Go small with the lighttpd server

XAMPP and many bundles like it package the Apache HTTP Server. Apache is certainly capable—by most measures, it still powers the majority of sites worldwide—and an enormous number of extensions are available to add wholesale subsystems and integrate tightly with programming languages.

But Apache isn't the only Web server available, and in some cases, it isn't preferable. A complex Apache instance can require an immense memory footprint, which limits throughput. Further, even a small Apache instance may be excessive compared to the return.

"Security, speed, compliance, and flexibility" describe lighttpd (pronounced "lighty"), a small and very efficient alternative to Apache. Better yet, the lighttpd configuration file isn't the morass that Apache's is.

Building lighttpd from scratch is a little more involved, because it depends on other libraries. At a minimum, you need the development version (the version that includes the header files) of the Perl Compatible Regular Expression (PCRE) library and the Zlib compression library. After you've installed those libraries (or built the libraries from scratch), compiling lighttpd is straightforward:

$ # Lighttpd requires libpcre3-dev and zlib1g-dev
$ wget http://www.lighttpd.net/download/lighttpd-1.4.22.tar.gz
$ tar xzf lighttpd-1.4.22.tar.gz
$ cd lighttpd-1.4.22
$ ./configure && make && sudo make install

Next, you must create a configuration. The most minimal configuration possible sets the document root, server port, a few Multipurpose Internet Mail Extension (MIME) types, and the default user and group for the daemon:

server.document-root = "/var/www/lighttpd/host1"
server.groupname = "www"
server.port = 3000
server.username = "www"

mimetype.assign = (
  ".html" => "text/html",
  ".txt"  => "text/plain",
  ".jpg"  => "image/jpeg",
  ".png"  => "image/png"
)

static-file.exclude-extensions = ( ".fcgi", ".php", ".rb", "~", ".inc" )
index-file.names = ( "index.html" )

Assuming that you saved the text to a file named /opt/etc/lighttpd.conf, you start lighttpd with lighttpd -D -f /opt/etc/lighttpd.conf .

Like Apache, lighttpd can serve virtual hosts. All it takes is three lines, using a conditional:

$HTTP["host"] == "www2.example.org" {
  server.document-root = "/var/www/lighttpd/host2"
}

Here, if the host is named www2.example.org, an alternate document root is used.

Lighttpd is especially adept at managing large numbers of parallel requests. You can readily mix lighttpd with Rails, PHP, and more.
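For example, wiring PHP into lighttpd typically takes only a mod_fastcgi stanza in the configuration file. A sketch, assuming a php-cgi binary at the path shown (adjust bin-path and the socket location for your system):

```
server.modules += ( "mod_fastcgi" )

fastcgi.server = ( ".php" =>
  (( "bin-path" => "/usr/bin/php-cgi",
     "socket"   => "/tmp/php-fastcgi.socket" ))
)
```

With this in place, lighttpd spawns php-cgi workers itself and hands every .php request to them over the named socket.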

Better, smarter, faster

Yet another "Speaking UNIX" draws to a close. Break out those keyboards, fire up the Wi-Fi, and start downloading!


Related topics