Back at 1e9 seconds, Rob Pike gave a talk on what he thought the strengths and weaknesses of UNIX were. He listed a bunch of strengths, including pipes, files, tools, and a separable shell.
He left out the big one for me, which is rapid development of automation. With a good shell, one can quickly grab some data, mangle it as needed, and send it on its merry way as you see fit. While you pay a performance penalty (it won't run as fast as C), the quick interactive development you get in return is more than worth it.
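For instance, a one-off report that would be a chore in C is a single line at the prompt (the log path, field position, and address here are all illustrative):

    # top clients in a web log, mailed off in one line
    cut -d' ' -f1 access.log | sort | uniq -c | sort -rn | head | mail -s 'top talkers' webmaster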
Almost all of a shell's power comes from building a pipeline of commands interactively and incrementally. Because we write from left to right, shells help us out: we start with something we know (e.g. a file) and transform it into what we want as we type. So the flow of the typing (oldest typing left-most, newest right-most) matches the flow of the data.
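A sketch of that incremental flow, each refinement tacked on at the right (the file name and field position are made up for the example):

    grep 'login' auth.log                               # start with lines we know how to find
    grep 'login' auth.log | awk '{print $3}'            # keep just the user name
    grep 'login' auth.log | awk '{print $3}' | sort -u  # one entry per user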
I don't want to hamper that flow, which is why each() and into() rock. each() allows me to keep hacking instead of cursoring around to create a for or while loop. And because we pay a performance penalty, sometimes output can take a while and we want to save it instead of regenerating it, so it makes sense to keep up the flow and pipe it into() a variable.
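A usage sketch of the pair (into() as described on BashInto; I'm assuming here that it stashes the pipe's output in the named variable):

    # the slow part runs once; its output is saved, then reused
    find /var/log -name '*.gz' | into oldlogs
    echo "$oldlogs" | each ls -l {}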
Returning to Pike's talk, he complained that we appeared to be moving away from flat files. With structured text like HTML, XML, and Apache-style configs, it'd be useful to have a command-line utility capable of splitting structured text into sections and pulling those sections out. I.e. an updated version of 'cut'.
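As a crude stand-in, sed can already pull one kind of section out of an Apache-style config by pattern-matching the tags (file name illustrative); a real tool would understand the structure instead:

    # print everything between <VirtualHost ...> and </VirtualHost>
    sed -n '/<VirtualHost/,/<\/VirtualHost>/p' httpd.conf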
Reading around further, I found that bash has variable transformations (man bash and search with / for Parameter Expansion), so I refactored each() into the following (see BashEach for the updated code):
each() {
    local line=""
    while read line; do
        eval "${@/\{\}/${line}}"    # $@ =~ s/ {} / $line /
    done
}
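The expansion ${@/\{\}/$line} substitutes the first {} in each positional parameter with the current line. To see just that mechanism in isolation:

    set -- mv {} {}.bak          # pretend these were each()'s arguments
    line=report.txt
    echo "${@/\{\}/$line}"       # prints: mv report.txt report.txt.bak

So a pipeline like ls *.txt | each mv {} {}.bak expands, line by line, into the matching mv commands.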