APLawrence.com -  Resources for Unix and Linux Systems, Bloggers and the self-employed

The Linux of our Understanding

© February 2009 Anthony Lawrence

We techy folks are sometimes quick to pull the trigger when we spot people using tools without proper understanding. We laugh at Windows "point and click" mentality. Sometimes it may be justified, but I have to wonder if our own houses can stand deep inspection.

With that in mind, I decided to test my own knowledge of something I use every day: the Bash shell. I worked from a standard man page and read every line very carefully, asking myself just how completely I understood what I just read. I think that's a fair test: although I prefer Perl for day to day scripting, I do quite a bit with Bash. If you asked me to grade myself on Bash knowledge ahead of this self test, I would have given myself better than passing marks. I'm no stranger to the Bash man page, but I've never gone through it with this level of concentration.


I hit my first snag almost immediately:

Bash is intended to be a conformant implementation of the IEEE POSIX Shell and Tools specification (IEEE Working Group 1003.2). Bash can be configured to be POSIX-conformant by default.

Would you like to define a POSIX shell, Mr. Lawrence? No, I would not, or at least not without Google handy. Oh, yeah, I have a vague idea that the "P" stands for "Portable" and that the "OSIX" implies a Unix-like operating system. But what features are required of a POSIX shell? You've caught me slack-jawed there..

Moving on.. what's this?

If the -s option is present, or if no arguments remain after option processing, then commands are read from the standard input. This option allows the positional parameters to be set when invoking an interactive shell.

OK, that was a little hard to follow. I understand it, but I'm not clear on WHY you'd want to set parameters here. Why would I need to? Right after that comes a mention of the -D option, which supposedly prints out the "strings that are subject to language translation when the current locale is not C or POSIX". Really? It just seems to give me a shell that does nothing at all. A bit of Google research turns up the reason: it's not meant to be used interactively; it's supposed to examine a script (see the Advanced Bash-Scripting Guide's Localization section if you care).
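As best I can tell, the point of -s is exactly this: you're feeding commands on standard input but still want positional parameters set. A minimal sketch:

```shell
# Commands come from stdin, but the remaining arguments
# after -s become $1 and $2:
echo 'echo "args: $1 $2"' | bash -s foo bar
# → args: foo bar
```

Without -s here, bash would try to run "foo" as a script file instead.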

We then skip along happily for a bit. Or *I* skip along happily. I have to pause again at that "--posix":

Change the behavior of bash where the default operation differs from the POSIX 1003.2 standard to match the standard (posix mode).

Err, what? Oh, yeah, make it fully POSIX. Vague memories start to come back.. it's stuff like "echo -n" and "echo \c", isn't it? Yeah, something like that..
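If you just want to see the switch without restarting bash, posix mode can also be flipped at runtime. This only shows the toggle; the actual list of behavior differences is long and lives in the reference manual:

```shell
# Turn on posix mode in a running shell and confirm it took:
set -o posix
shopt -o posix    # should report the posix option as "on"
```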

Restricted shells, no problem. What's this?


Produce the list of files that are required for the shell script to run. This implies '-n' and is subject to the same limitations as compile time error checking; backticks, [] tests, and evals are not parsed so some dependencies may be missed.

Does that mean it picks out the executables in the script and finds out their dependencies? Apparently so.. I sure didn't notice that before, and it doesn't exist everywhere. I happened to pick a RedHat-based man page - I feel fine in ignoring this.

I almost got clear to "Definitions". Unfortunately, I ran into this:

If the shell is started with the effective user (group) id not equal to the real user (group) id, and the -p option is not supplied, no startup files are read, shell functions are not inherited from the environment, the SHELLOPTS variable, if it appears in the environment, is ignored, and the effective user id is set to the real user id. If the -p option is supplied at invocation, the startup behavior is the same, but the effective user id is not reset.

OK, I see what happens, but why? And why wait until now to suddenly mention "-p"? This has to be something to do with security, but I don't know what.. I found a little clue in the "Effective UID changing... but why and who" thread. I see the value - it prevents privileges picked up from a set[ug]id program from being passed on to a shell that the program hands you (or that you manage to steal from it). The program would have to purposely give you a "bash -p" to pass on its powers. OK, that's clear now.
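My own shorthand for this: in an ordinary shell the real and effective IDs already match, so none of it matters until a setuid program gets involved. A sketch (the setuid part is comments only - you can't demonstrate it without an actual setuid binary):

```shell
# In an ordinary shell, real and effective UIDs are the same:
echo "UID=$UID EUID=$EUID"
# A setuid binary that exec's plain "bash" gets its effective uid
# quietly reset to the real uid (startup files skipped, SHELLOPTS
# ignored). Only a deliberate exec of "bash -p" would keep the
# elevated effective uid.
```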

Sheesh: I'm not even 10 pages in and I've already hit things I didn't know. But at least I UNDERSTOOD everything, right? Big pat on the back, let's take a look at those "Definitions", shall we?

Shrug. Piece of cake. Next? Reserved Words? Ditto. Do your worst, bash, I'm steaming up river now.

A clog in the pipe

Aaargh. What the heck is this?


A pipeline is a sequence of one or more commands separated by the character |. The format for a pipeline is:
[time [-p]] [ ! ] command [ | command2 ... ]
The standard output of command is connected via a pipe to the standard input of command2. This connection is performed before any redirections specified by the command (see REDIRECTION below).

You know what stopped me there? It's the innocent "time [-p]" at the beginning. Why is that there? That has nothing to do with pipes. And why the (optional) "!"? A paragraph or so down I learn that the "!" is used to negate the normal pipeline exit status - hmm.. if I ever knew that, I'd forgotten it. The paragraph after that attempts to explain why they mentioned "time" but fails to do the job. Are they telling me that bash is helping me out here by not just running "time" on the first element of the pipe and passing the output to the rest? Well, of course, but who asked for such special treatment? To me that violates protocol - it's not how it should work at all. I'd say that if you want to time a pipeline, you ought to put it in parens or in a script file and time that. I guess too many folks thought otherwise, right?
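A quick illustration of both of those pipeline extras, for the record:

```shell
# "!" negates the exit status of the whole pipeline:
false; echo $?      # 1
! false; echo $?    # 0
# "time" prefixes the whole pipeline, not just its first command:
time sleep 1 | sleep 1   # wall clock is about 1s - both run concurrently
```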

We went along happily for a while. I paused briefly at the description of BASH_REMATCH. I vaguely remember this. Bash's "=~" is familiar for someone used to Perl, so I can't claim not to understand this. Really, I'm doing better than I thought I would.
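For the record, the Perl-ish bit I mean is capture groups landing in the BASH_REMATCH array:

```shell
ver="bash-4.2"
# Unquoted right-hand side so it's treated as a regex, not a literal:
if [[ $ver =~ ^([a-z]+)-([0-9.]+)$ ]]; then
    echo "name: ${BASH_REMATCH[1]}"   # name: bash
    echo "vers: ${BASH_REMATCH[2]}"   # vers: 4.2
fi
```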

The section that begins with "Words of the form $'string' are treated specially." made me stop. Another "I did not know that" moment quickly followed by confusion upon reaching:

A double-quoted string preceded by a dollar sign ($) will cause the string to be translated according to the current locale. If the current locale is C or POSIX, the dollar sign is ignored. If the string is translated and replaced, the replacement is double-quoted.

So I set and exported LC_ALL as "POSIX" and it doesn't seem to work. But I don't know what strings POSIX is supposed to translate, do I? So does this go down as "I don't know", "I don't understand", or both?
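The two quoting forms side by side, as far as I can make them behave (with no message catalog installed, the $"..." form just passes the string through, which is presumably why my LC_ALL experiment looked like a no-op):

```shell
# $'...' expands backslash escapes (ANSI-C quoting):
printf '%s\n' $'one\ttwo'     # prints "one", a tab, then "two"
# $"..." asks for locale translation; in the C/POSIX locale
# (or with no catalog to consult) it falls through unchanged:
echo $"Hello"                 # Hello
```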

Here's a definite "did not know":

(Describing "$@")
If the double-quoted expansion occurs within a word, the expansion of the first parameter is joined with the beginning part of the original word, and the expansion of the last parameter is joined with the last part of the original word.

In other words:

$ cat t
echo "foo$@foo";
$ ./t a b c d
fooa b c dfoo

The explanation of $BASH_LINENO left me puzzled. I wrote some simple scripts that showed me what it does, but the man page says:

${BASH_LINENO[$i]} is the line number in the source file where ${FUNCNAME[$ifP]} was called.

What's $ifP ? Never mentioned before this, and never after. This has to go down as "No idea what you mean".
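Treating that as a typo for ${FUNCNAME[$i]}, one of the simple scripts I tried looked something like this:

```shell
#!/bin/bash
whereami() {
    # BASH_LINENO[0] is the line the current function was called from;
    # FUNCNAME[1] is the caller ("main" at the top level of a script).
    echo "called from line ${BASH_LINENO[0]} in ${FUNCNAME[1]:-main}"
}
whereami    # reports: called from line 7 in main
```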

An unexpected trip-up

I reached the section on file descriptors and expected to sail right through. But wait, what's this?


If host is a valid hostname or Internet address, and port is an integer port number or service name, bash attempts to open a TCP connection to the corresponding socket.

Cool! It works, even on Mac OS X with no visible /dev/tcp: here's an example script. OK, another "did not know".
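Something along these lines is what I mean - with the caveat that it needs network access and a reachable host (example.com here is just a placeholder):

```shell
#!/bin/bash
# Open file descriptor 3 as a TCP connection via bash's magic
# /dev/tcp/<host>/<port> path (handled by bash itself, no device node):
exec 3<>/dev/tcp/example.com/80
printf 'HEAD / HTTP/1.0\r\nHost: example.com\r\n\r\n' >&3
head -1 <&3        # first line of the HTTP response
exec 3>&-          # close the connection
```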

I ignored all the "readline" stuff. It's probably crammed with things I do not know, but I just can't raise my interest at all.

Did you like that lie? How about something closer to the truth: Most of that whole section leaves me stunned and blinking. Oh, I know bits and pieces and can fake my way through some of the rest, but it's a big weak area. OK, that's definitely something I have to spend some time on. It's a bit overwhelming, but how hard can it be? Famous last words, yeah..

That brings us to the end of the bash manual. It certainly opened my eyes to some knowledge gaps. I'd hate to do the same thing with the big Perl Camel book!

Got something to add? Send me email.



Wed Feb 25 09:46:52 2009: 5504   bob4linux

A friend of mine is a web site programmer. He was forced onto PHP as a consequence of that so he just uses it for his shell scripts as well so he avoided bash altogether. For myself I am happy to write a C program rather than muck around with bash unless the required logic is utterly trivial.
For systematic text replacement jobs I reckon that any job you can do in one hour manually using some kind of editor, you can do in 2 - 3 hours writing a C program, or in 20 - 30 hours using shell utilities. And with the last you can trigger several kilobytes of correspondence on some Linux LUG mailing list over the next several weeks. Perl is just too ugly for me to contemplate.


Wed Feb 25 13:08:09 2009: 5505   TonyLawrence

Well, I'm not going to agree with you without reservation. I suppose it comes down to the definition of "trivial". If, for example, I need to do certain edits to some of 14,000 web pages, I think it's plain that a shell or Perl script is going to be much faster than doing it by hand. A C program might do it faster still, but then we come down to the effort and time required.

If you've been doing C day and night, maybe you can write the C program faster than I can write a Perl script. I haven't done C in years, so I'll use Perl.

Thu Feb 26 09:02:09 2009: 5518   bob4linux

Well "trivial" depends quite a lot on personal knowledge. The Bash man page is really some of the worst technical writing that I have read in my 70 years on Earth. "K & R The C Programming Language" in contrast is among the best. There is a learning curve to use C but once you get on top of pointer usage it becomes very clean. A few years ago I downloaded the Gutenberg DVD ISO only to discover that all of the links were broken in Linux because the target file names differed in case from their names in the hypertext links. Using C I was able to rectify that within 2 or 3 hours. So I use C where others might use Bash because 1. I can, and 2. I like to avoid rummaging through that garbage can called 'man bash' whenever possible.


