[Richard Suchenwirth] 2003-07-09 - After having written the following code repeatedly, I just declare it to be an [idiom]. The ''[filter]'' function is frequently seen in Unix (and even DOS) tools, and I think was published in the Software Tools book(s):

   * read from stdin (or from files specified on the command line)
   * write to stdout

Such tools (grep, sed, awk, sort, ...) can easily be combined ("glued") in pipes to do powerful things. A little framework for Tcl scripts with the filter function:

 set about "usage: myFilter ?file...?
    Does something meaningful with the specified files, or stdin.
 "
 #-- This handler contains the real functionality for one stream (or file)
 proc myHandler channel {
     # ... process the stream here, return the result as a string ...
 }

 #-- This prevents lengthy errors if a filter|more pipe is terminated with 'q'
 proc puts! string {catch {puts stdout $string}}

 if {[lsearch $argv "--help"] >= 0} {puts $about; exit}
 if {[llength $argv] == 0} {
     puts! [myHandler stdin]
 } else {
     foreach file $argv {
         set fp [open $file]
         puts! [myHandler $fp]
         close $fp
     }
 }
----
[Mike Tuxford] is a little confused here, although that in itself is not unusual. It appears to me that you provide a method of repeating a single set of functions on multiple files, whereas Unix pipes apply multiple functions to the data passed between them: the first command passes its stdout to the second command as its stdin, and so on... Perhaps an example of usage might clarify things for me.

[RS]: Well, the above is the framework for one filter, which you can put into a pipe, but which can also draw its input from files specified on the command line, like this (and similar to e.g. ''cat'' or ''more''):

 echo Data | myFilter | more
 cat data.file | myFilter | more
 myFilter *.file | more
 more data.File
----
[Arts and crafts of Tcl-Tk programming]
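----
To make the idiom concrete, here is a minimal sketch (not from the page above) of one possible ''myHandler'': a line-numbering filter in the spirit of ''cat -n''. It assumes only what the framework provides, namely that the handler receives an open channel and returns the transformed text as a string:

 #-- Hypothetical example handler: prefix each input line with its number
 proc myHandler channel {
     set result ""
     set n 0
     while {[gets $channel line] >= 0} {
         append result [format "%6d  %s" [incr n] $line] \n
     }
     return $result
 }

Saved together with the framework as ''myFilter'', this supports both styles of use:

 echo hello | myFilter
 myFilter data.file | more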