boilerplate

standardized text or formula [1]

Here's a Wiki location for those code snippets and idioms that might well belong in every script you write.

LV hopes that people explain the why for the novices who stumble across the page...


Here's a way to have a procedure accept its arguments either "whole" (as a list) or in pieces, which may sometimes be helpful (RS):

 proc foo { args } {
      if {[llength $args] == 1} {set args [lindex $args 0]}
      ...
 }

I've been thinking about this a lot and still can't figure out what is so cool about this code. Can you explain a bit more what the idea behind this is? --mjk - RS: It's very simple. Say you have a max function that returns the numeric maximum of a list:

 proc max list {lindex [lsort -real $list] end}

Now if you want to call this with discrete values, you'd have to listify them:

 puts [max [list $foo $bar $grill]]

But with the above boilerplate code, you can have it both ways:

 proc max args {
    if {[llength $args]==1} {set args [lindex $args 0]}
    lindex [lsort -real $args] end
 }
 puts [max $foo $bar $grill]
 puts [max $myValueList]

OK. Now I got it. Thanks. --mjk

AMG: Or you could just use {*}.

 proc max {args} {
     lindex [lsort -real $args] end ;# Nice trick, by the way!
 }
 puts [max {*}$myValueList]

Phil Ehrens says nearly every executable script should start with this:

 if {[info exists ::env(GROUP)] && [regexp {(root|wheel)} $::env(GROUP)]} {
    puts stderr "DID YOU REALLY MEAN TO RUN $::argv0 AS ROOT?"
    exit 666
 }

LV notes for the novice that it is seldom a good idea to accidentally run miscellaneous commands while in the root or wheel group on Unix, as those groups often have write permission on files that should not be changed accidentally. There's also the possibility of intentionally doing something that should not be done - like the story of the novice who deleted all the files out of the root file system that he believed were not needed... resulting in a system that no longer ran...

RJ For the record, I was not a novice.


RS 2006-04-20: I have taken up the habit to structure stand-alone scripts as follows:

 #!/usr/bin/env tclsh
 set usage {$Id$
    ...
 }
 if {[llength $argv] != ...} {puts stderr $usage; exit}
 proc main argv {
    foreach {...} $argv break
    ...
 }
 proc ...
 ...
 main $argv

The usage message comes first, both for source code documentation and for "online help", which is displayed when the script is called with the wrong number of arguments. With the main proc I can design things top-down; also, it is byte-compiled, and its variables are local, both of which should improve performance.

2006-12-12: added $Id$ to the usage message - this string gets replaced by CVS with the current filename, version, date, author etc., which is a good yield for 4 bytes more to type :^)
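A minimal concrete instance of this skeleton might look like the following (the script name, arguments, and behavior are made up for illustration):

 #!/usr/bin/env tclsh
 set usage {usage: greet.tcl name count
    Prints a greeting to <name>, <count> times.
 }
 if {[llength $argv] != 2} {puts stderr $usage; exit}
 proc main argv {
    foreach {name count} $argv break
    for {set i 0} {$i < $count} {incr i} {
        puts "Hello, $name!"
    }
 }
 main $argv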


Donald Arseneau: What reminds me most of "boilerplate" is the long-recommended incantation

 #!/bin/sh
 #  Restart with tclsh \
 exec tclsh "$0" ${1+"$@"}

placed at the top of a Tcl program, in order to invoke a tclsh installed in an unknown location.

RLH I found I had to use the above to start a script from cron.

More recently, many people prefer the simpler

 #!/usr/bin/env tclsh

but /usr/bin/env still does not exist on several systems. See exec magic for full explanation of these and similar forms.

I have recently come to question the value of the former, older method, as Emacs and some other editors recognize the sh invocation and provide syntax highlighting appropriate for shell scripts, not Tcl.

(Thanks for the push Lars H.)

AMG: Using a .tcl extension usually clears that up. But since file names often get mutated when making backups, e.g. "frobozz.tcl.12-15-2006.old", I put an "ft=tcl" in my Vim modeline, just to be sure. Emacs has a similar trick, a -*- mode: tcl -*- line, if memory serves. But nearly everybody at work uses Vim, so I don't bother. My boilerplate modeline is

 # vim: set ts=4 sts=4 sw=4 tw=80 et ft=tcl:

with the ts/sts/sw values changing to match the existing indent style if I'm not creating a new file. Vim's Tcl highlighting is not very good, but that's another story.

One more tip. I used to use "# Restart with tclsh.\" until someone else broke my code by adding comments between that line and the exec. Now I say "# The next line restarts with tclsh.\" to make it clear that the two must not be separated.
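Putting that together, the complete restart header reads:

 #!/bin/sh
 # The next line restarts with tclsh.\
 exec tclsh "$0" ${1+"$@"}

The shell executes the exec line; Tcl sees the trailing backslash as continuing the comment onto that line, so it ignores the exec entirely. Any line inserted between the comment and the exec breaks this pairing.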


Donald Arseneau: Another good habit to get into, which fits the boilerplate theme, is

  package require Tcl 8.4

where the version number is the version you use for development, or which you consciously target for support. Then, on the chance that the program gets run under, say, Tcl 8.0, there is a clear message for why it fails. There is no complaint when the installed Tcl version is higher than requested.
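If you want to control the wording of that failure message yourself, the requirement can be wrapped in a catch - a sketch, not a necessity, since the default error is already informative:

 if {[catch {package require Tcl 8.4} msg]} {
     puts stderr "$::argv0 needs Tcl 8.4 or newer: $msg"
     exit 1
 }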


Likewise, adding a

 package require Tk 8.4

(or whatever version is appropriate) at the appropriate spot within an application that is going to use Tk, even if you expect to be running the application with wish. That way, someone using tclkit or some other arrangement gets the necessary library loaded.


AMG: I put most of my startup/"main" type code in a "main" proc, then call main after creating all needed procs. I hear this is good for performance, something about the byte code compiler not kicking in for code not in procs. (Details, anyone?) Sometimes I wrap the call to main in a [catch] to shield users from the full traceback, instead printing only the one-line error summary, accompanied with a hint like "try 'progname -help' for more information". main [error]s if it receives bad input or if a datafile is unavailable or whatever. And if the error is due to my bad code, just add a "puts stderr $errorInfo" right at the end of the file, rerun the program, and debug.

Yeah, maybe it would be better to have a separate return code for bad input or other runtime exceptions, and only hide the stack trace for those cases. On a genuine error (return code TCL_ERROR), print the full stack trace. But I'm too lazy for this.

 proc whatever {...} {...}
 proc whatever_2 {...} {...}
 proc main {args} {
     if {[bad $args]} {
         error "bad args!"
     }
 }
 if {[catch [list main {*}$argv] err]} {
     puts stderr $err
     puts stderr "try '[file tail $argv0] -help' for more information"
 }
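One way to do what the previous paragraph describes, sketched with a made-up USER error code (requires Tcl 8.5+ for {*} and the catch options dictionary):

 proc main {args} {
     if {[llength $args] == 0} {
         # an expected runtime problem, marked with a custom error code
         return -code error -errorcode {USER BADARGS} "no arguments given"
     }
     # ... real work ...
 }
 if {[catch {main {*}$argv} err opts]} {
     if {[lindex [dict get $opts -errorcode] 0] eq "USER"} {
         # bad input: short message only
         puts stderr $err
         puts stderr "try '[file tail $argv0] -help' for more information"
     } else {
         # genuine bug: full stack trace
         puts stderr [dict get $opts -errorinfo]
     }
 }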

aspect: Many good ideas above. One trick I use to allow my scripts to be sourced into an interactive tclsh for debugging is:

 if {!$tcl_interactive} {
   main {*}$argv
 } else {
   puts "[info script] loaded:  run \[main\] to start"
 }

Alternatively, for event-driven programs:

 main {*}$argv
 if {!$tcl_interactive} {
   vwait forever
 }

The latter allows using an interactive tclsh to debug the running program .. very handy!